Science.gov

Sample records for probabilistic risk-based management

  1. Risk-based decision making in water management using probabilistic forecasts: results from a game experiment

    NASA Astrophysics Data System (ADS)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian; van Andel, Schalk-Jan; Wood, Andy

    2014-05-01

    Probabilistic streamflow forecasts have been increasingly used or requested by practitioners in the operation of multipurpose water reservoirs. They usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, mitigate droughts, and provide flood and ecological control, among other purposes. In this paper, we present an experiment conducted to investigate the use of probabilistic forecasts to make decisions on water reservoir outflows. The experiment was set up as a risk-based decision-making game. In the game, each participant acted as a water manager. A sequence of probabilistic inflow forecasts was presented to be used to make a reservoir release decision at a monthly time step, subject to a few constraints. After each decision, the actual inflow was presented and the consequences of the decisions made were discussed. Results from the application of the game to different groups of scientists and operational managers during conferences and meetings in 2013 (a total of about 150 participants) illustrate the different strategies adopted by the players. This game experiment allowed participants to experience firsthand the challenges of probabilistic, quantitative decision-making.
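
    To make the decision setting concrete, here is a minimal sketch, with invented inflow numbers, constraints, and penalty weights rather than the game's actual rules, of how a player might choose a monthly release that minimizes an expected penalty over an ensemble inflow forecast while respecting a maximum release and a storage target.

    import numpy as np

    def expected_penalty(release, storage, inflows, target_storage=100.0,
                         max_storage=150.0, min_downstream_flow=5.0):
        """Average penalty over ensemble members for one candidate release."""
        end_storage = storage + inflows - release                # one value per member
        spill = np.maximum(end_storage - max_storage, 0.0)       # flood-damage proxy
        shortfall = max(min_downstream_flow - release, 0.0)      # environmental flow
        deviation = np.abs(np.minimum(end_storage, max_storage) - target_storage)
        return np.mean(10.0 * spill + 20.0 * shortfall + deviation)

    def choose_release(storage, inflow_ensemble, max_release=60.0):
        candidates = np.linspace(0.0, max_release, 121)
        costs = [expected_penalty(r, storage, inflow_ensemble) for r in candidates]
        return candidates[int(np.argmin(costs))]

    rng = np.random.default_rng(1)
    forecast = rng.lognormal(mean=3.0, sigma=0.4, size=50)       # ensemble inflow, hm3
    print("suggested release:", round(choose_release(80.0, forecast), 1), "hm3")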

  2. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine
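
    The following sketch illustrates the probabilistic idea in miniature. It is not NEPP or WATE, and the cycle surrogate and uncertainty ranges are assumptions: uncertain component efficiencies are sampled and propagated by Monte Carlo to a notional performance metric, which is reported as percentiles and an exceedance probability rather than a single point value.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    eta_compressor = rng.normal(0.88, 0.01, n)    # assumed efficiency uncertainty
    eta_turbine    = rng.normal(0.90, 0.01, n)
    burner_loss    = rng.uniform(0.02, 0.05, n)   # assumed pressure-loss fraction

    # Toy surrogate: specific fuel consumption worsens as efficiencies drop.
    sfc = 0.55 / (eta_compressor * eta_turbine * (1.0 - burner_loss))

    for q in (5, 50, 95):
        print(f"SFC p{q:02d}: {np.percentile(sfc, q):.4f} (notional units)")
    print("probability SFC exceeds 0.72:", np.mean(sfc > 0.72))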

  3. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
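
    A hedged sketch of that workflow, with hypothetical parameter distributions and a simple area-weighted flux expression standing in for the full transient three-phase model: sample the uncertain inputs, combine the three routes, then rank parameter importance with standardized regression coefficients as a simple stand-in for the paper's stepwise regression.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 20_000
    D_sc   = rng.lognormal(np.log(1e-10), 0.5, n)   # stratum corneum diffusivity (assumed)
    D_duct = rng.lognormal(np.log(1e-9), 0.5, n)    # sweat-duct (aqueous) diffusivity
    D_foll = rng.lognormal(np.log(1e-9), 0.5, n)    # hair-follicle (oil) diffusivity
    K      = rng.lognormal(np.log(5.0), 0.3, n)     # partition coefficient
    C0, path = 1.0, 2e-5                             # surface concentration, path length (m)

    # Area-weighted steady flux through the three routes (illustrative weights).
    flux = K * C0 / path * (0.95 * D_sc + 0.04 * D_duct + 0.01 * D_foll)

    # Standardized regression coefficients on log-transformed inputs.
    X = np.column_stack([np.log(D_sc), np.log(D_duct), np.log(D_foll), np.log(K)])
    Xs = (X - X.mean(0)) / X.std(0)
    y = (np.log(flux) - np.log(flux).mean()) / np.log(flux).std()
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    for name, c in zip(["D_sc", "D_duct", "D_foll", "K"], coef):
        print(f"{name:7s} SRC = {c:+.2f}")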

  4. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    SciTech Connect

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

    This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The specific region that is within the scope of this study is the Fayetteville Shale Play. This is an unconventional, tight-formation natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play encompasses a region extending from approximately Fort Smith, AR, east to Little Rock, AR, and is approximately 50 miles wide (from north to south). The initial estimates for this field put it almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of a proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  5. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  6. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
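
    A minimal cost-loss sketch of the willingness-to-pay question (not the workshop game itself; the flood frequency, the costs, and the synthetic forecast-skill model are all assumptions): a user protects whenever the forecast probability exceeds the cost-loss ratio C/L, and the value of the forecast is taken as the reduction in expected expense relative to the best climatology-only policy.

    import numpy as np

    rng = np.random.default_rng(7)
    C, L = 10.0, 100.0            # protection cost, flood loss (assumed)
    climatology = 0.15            # long-run flood frequency (assumed)
    n = 100_000

    occurs = rng.random(n) < climatology
    # Imperfect but skilful probabilistic forecast: truth shrunk toward climatology plus noise.
    forecast_p = np.clip(0.7 * occurs + 0.3 * climatology
                         + rng.normal(0, 0.05, n), 0, 1)

    def expected_expense(protect):
        return np.mean(np.where(protect, C, occurs * L))

    always  = expected_expense(np.ones(n, bool))
    never   = expected_expense(np.zeros(n, bool))
    with_fc = expected_expense(forecast_p > C / L)

    baseline = min(always, never)                 # best climatology-only policy
    print("willingness-to-pay per decision <=", round(baseline - with_fc, 2))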

  7. A risk-based decision-making game relevant to water management. Try it yourself!

    NASA Astrophysics Data System (ADS)

    Pappenberger, Florian; van Andel, Schalk Jan; Wood, Andy; Ramos, Maria-Helena

    2013-04-01

    Monthly or seasonal streamflow forecasts are essential to improve water planning (e.g., water allocation) and anticipate severe events like droughts. Additionally, multipurpose water reservoirs usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, mitigate droughts, and provide flood and ecological control, among other purposes. Given the need to take into account uncertainties at long lead times to allow for optimal risk-based decisions, the use of probabilistic forecasts in this context is inevitable. In this presentation, we will engage participants in a risk-based decision-making game, where each participant will act as a water manager. A sequence of probabilistic inflow forecasts will be presented to be used to make a reservoir release decision at a monthly time-step, subject to a few constraints -- e.g., an end-of-year target pool elevation, a maximum release and a minimum downstream flow. After each decision, the actual inflow will be presented and the consequences of the decisions made will be discussed together with the participants of the session. This will allow participants to experience firsthand the challenges of probabilistic, quantitative decision-making.

  8. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
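
    The core calculation is simple enough to show in a few lines. The sketch below uses invented segments and numbers (it is not the API methodology itself) to illustrate ranking piping by risk = likelihood of failure x consequence of failure, which is what lets inspection effort concentrate on the few segments carrying most of the risk.

    from dataclasses import dataclass

    @dataclass
    class PipeSegment:
        tag: str
        annual_failure_prob: float     # likelihood of failure per year
        consequence_cost: float        # cost in dollars if it fails

        @property
        def risk(self) -> float:
            return self.annual_failure_prob * self.consequence_cost

    segments = [
        PipeSegment("P-101", 1e-3, 5_000_000),
        PipeSegment("P-102", 5e-4, 200_000),
        PipeSegment("P-103", 2e-2, 1_000_000),
        PipeSegment("P-104", 1e-5, 10_000_000),
    ]

    ranked = sorted(segments, key=lambda s: s.risk, reverse=True)
    total = sum(s.risk for s in ranked)
    for s in ranked:
        print(f"{s.tag}: risk ${s.risk:,.0f}/yr ({100 * s.risk / total:.0f}% of total)")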

  9. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  10. Applicability of risk-based management and the need for risk-based economic decision analysis at hazardous waste contaminated sites.

    PubMed

    Khadam, Ibrahim; Kaluarachchi, Jagath J

    2003-07-01

    Decision analysis in subsurface contamination management is generally carried out through a traditional engineering economic viewpoint. However, new advances in human health risk assessment, namely, the probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process, require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and the cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulation of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability in subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.

  11. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  12. Risk-based principles for defining and managing water security.

    PubMed

    Hall, Jim; Borgomeo, Edoardo

    2013-11-13

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  13. Risk-based principles for defining and managing water security.

    PubMed

    Hall, Jim; Borgomeo, Edoardo

    2013-11-13

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors.

  14. Probabilistic objective functions for sensor management

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.; Zajic, Tim R.

    2004-08-01

    This paper continues the investigation of a foundational and yet potentially practical basis for control-theoretic sensor management, using a comprehensive, intuitive, system-level Bayesian paradigm based on finite-set statistics (FISST). In this paper we report our most recent progress, focusing on multistep look-ahead -- i.e., allocation of sensor resources throughout an entire future time-window. We determine future sensor states in the time-window using a "probabilistically natural" sensor management objective function, the posterior expected number of targets (PENT). This objective function is constructed using a new "maxi-PIMS" optimization strategy that hedges against unknowable future observation-collections. PENT is used in conjunction with approximate multitarget filters: the probability hypothesis density (PHD) filter or the multi-hypothesis correlator (MHC) filter.

  15. Towards risk-based drought management in the Netherlands: quantifying the welfare effects of water shortage

    NASA Astrophysics Data System (ADS)

    van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein

    2016-04-01

    It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained into integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. Furthermore, for some
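
    One common way to fold a range of return-period events into a single risk number is an expected-annual-damage integral over exceedance probability. The sketch below shows that arithmetic with made-up drought damages; it is not data or a method taken from the Dutch study.

    import numpy as np

    return_periods = np.array([2, 5, 10, 20, 50, 100])     # years
    damages = np.array([0, 20, 60, 150, 400, 700])         # M EUR per event (invented)

    p = 1.0 / return_periods                               # annual exceedance probability
    order = np.argsort(p)                                  # integrate from small to large p
    p, d = p[order], damages[order]

    # Trapezoidal integration of damage over exceedance probability.
    expected_annual_damage = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
    print(f"expected annual damage ~ {expected_annual_damage:.1f} M EUR/yr")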

  16. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  17. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  18. Managing physician lipid management: a population wide, risk-based decision support approach.

    PubMed

    Rubenstein, Lisa V

    2015-01-01

    Successful implementation of clinical guidelines for preventing complications of dyslipidemias has been an ongoing challenge. The article by Vinker and colleagues in this journal investigates the results of implementing risk-based guidelines for LDL (Low Density Lipoprotein) management in comparison to the prior approach of using the same LDL cutoff for patients at all levels of risk. Results show LDL levels dropped across the primary care population using the new risk-based approach, suggesting that clinical decision aids that link to individual patient characteristics, rather than promoting a universal target for all, may provide a particularly strong stimulus for changing provider and patient behavior. Results also challenge healthcare organizations, providers and patients to learn more about the pathway from guidelines to clinical reminders and from reminders to lower LDL levels and better population health. PMID:26175893

  19. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques - notably Monte Carlo simulations - and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform analysis over long-term horizons - space and time scales are somewhat linked; (ii) profiling of the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with the scale which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time, theme and space-wise) is widened, intrinsic parameters of the project tend to carry less weight. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can

  20. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  1. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verifies a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods [1, 2], and a probabilistic algorithm RPI (recursive probability integration) [3-9] considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subjected to various uncertainties such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulations (MCS), which require a rerun when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection scheduling, POD, and repair/replacement strategies. Since the MC simulations are time-consuming, they were run in parallel on DoD High Performance Computing (HPC) systems using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulations.
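
    As a rough illustration of the kind of baseline simulation RPI reuses (with invented growth rates, flaw sizes and POD curve, and far fewer samples than the verification study), the sketch below runs a plain Monte Carlo of crack growth with periodic inspections, where the detection probability depends on crack size and detected cracks are repaired.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 200_000
    a_crit, a_repair = 25.0, 1.0                   # critical / as-repaired crack size, mm
    flights, inspect_every = 20_000, 5_000

    a0     = rng.lognormal(np.log(0.5), 0.3, n)    # initial flaw size, mm (assumed)
    growth = rng.lognormal(np.log(1e-3), 0.4, n)   # crack growth per flight, mm (assumed)

    def pod(a_mm):
        """Assumed probability-of-detection curve vs crack size (log-logistic)."""
        return 1.0 / (1.0 + np.exp(-(np.log(a_mm) - np.log(2.0)) / 0.4))

    a, failed = a0.copy(), np.zeros(n, bool)
    for _ in range(flights // inspect_every):
        a = a + growth * inspect_every             # grow cracks over the interval
        failed |= a >= a_crit                      # failures before this inspection
        detected = (~failed) & (rng.random(n) < pod(a))
        a = np.where(detected, a_repair, a)        # repair what the inspection finds

    print("P(failure) with inspections   :", failed.mean())
    print("P(failure) without inspections:", np.mean(a0 + growth * flights >= a_crit))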

  2. Managing long-term polycyclic aromatic hydrocarbon contaminated soils: a risk-based approach.

    PubMed

    Duan, Luchun; Naidu, Ravi; Thavamani, Palanisami; Meaklim, Jean; Megharaj, Mallavarapu

    2015-06-01

    Polycyclic aromatic hydrocarbons (PAHs) are a family of contaminants that consist of two or more aromatic rings fused together. Soils contaminated with PAHs pose significant risk to human and ecological health. Over the last 50 years, significant research has been directed towards the cleanup of PAH-contaminated soils to background level. However, this has achieved only limited success, especially with high-molecular-weight compounds. Notably, during the last 5-10 years, the approach to remediate PAH-contaminated soils has changed considerably. A risk-based prioritization of remediation interventions has become a valuable step in the management of contaminated sites. The hydrophobicity of PAHs underlines that their phase distribution in soil is strongly influenced by factors such as soil properties and ageing of PAHs within the soil. A risk-based approach recognizes that exposure and environmental effects of PAHs are not directly related to the commonly measured total chemical concentration. Thus, a bioavailability-based assessment using a combination of chemical analysis with toxicological assays and nonexhaustive extraction techniques would serve as a valuable tool in a risk-based approach to the remediation of PAH-contaminated soils. In this paper, the fate and availability of PAHs in contaminated soils and their relevance to risk-based management of long-term contaminated soils are reviewed. This review may serve as guidance for the use of site-specific risk-based management methods.

  3. Method for Water Management Considering Long-term Probabilistic Forecasts

    NASA Astrophysics Data System (ADS)

    Hwang, J.; Kang, J.; Suh, A. S.

    2015-12-01

    This research aims to predict the monthly inflow of the Andong-dam basin in South Korea using long-term probabilistic forecasts, in order to apply such forecasts to water management. Forecasted Cumulative Distribution Functions (CDFs) of monthly precipitation are constructed by combining the range of monthly precipitation, based on a Probability Density Function (PDF) fitted to past data, with the probabilistic forecasts in each category. Ensembles of inflow are estimated by entering precipitation ensembles generated from the CDFs into the 'abcd' water budget model. The bias and RMSE between past-data averages and observed inflow are compared with those of the forecasted ensembles. In our results, the bias and RMSE of average precipitation in the forecasted ensemble are larger than in past data, whereas the average inflow in the forecasted ensemble is smaller than in past data. This result could serve as reference material for applying long-term forecasts to water management, given the limited number of forecast data available for verification and the differences between the Andong-dam basin and the forecasted regions. This research is significant in that it suggests a method for applying probabilistic information on climate variables from long-term forecasts to water management in Korea. In the future, the original output of the climate model that produces the long-term probabilistic forecasts should be verified directly as input to the water budget model, so that a more scientific water management response to the uncertainty of climate change can be achieved.
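
    A minimal sketch of the verification step with synthetic numbers (not the Andong-dam data or the 'abcd' model): compare the bias and RMSE of ensemble-mean monthly values against observations, for a climatology ensemble and for a forecast ensemble that is nudged toward the observations.

    import numpy as np

    rng = np.random.default_rng(11)
    months = 24
    observed = rng.gamma(shape=4.0, scale=30.0, size=months)          # mm/month (synthetic)

    # Ensembles: climatology resamples history; the "forecast" is nudged toward observations.
    climatology = rng.gamma(4.0, 30.0, size=(50, months))
    forecast = 0.6 * observed + 0.4 * rng.gamma(4.0, 30.0, size=(50, months))

    def bias_rmse(ensemble, obs):
        mean = ensemble.mean(axis=0)
        return mean.mean() - obs.mean(), np.sqrt(np.mean((mean - obs) ** 2))

    for name, ens in [("climatology", climatology), ("forecast", forecast)]:
        b, r = bias_rmse(ens, observed)
        print(f"{name:11s} bias={b:+6.1f} mm  RMSE={r:6.1f} mm")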

  4. The effects of climate model similarity on probabilistic climate projections and the implications for local, risk-based adaptation planning

    NASA Astrophysics Data System (ADS)

    Steinschneider, Scott; McCrary, Rachel; Mearns, Linda O.; Brown, Casey

    2015-06-01

    Approaches for probability density function (pdf) development of future climate often assume that different climate models provide independent information, despite model similarities that stem from a common genealogy (models with shared code or developed at the same institution). Here we use an ensemble of projections from the Coupled Model Intercomparison Project Phase 5 to develop probabilistic climate information, with and without an accounting of intermodel correlations, for seven regions across the United States. We then use the pdfs to estimate midcentury climate-related risks to a water utility in one of the regions. We show that the variance of climate changes is underestimated across all regions if model correlations are ignored, and in some cases, the mean change shifts as well. When coupled with impact models of the hydrology and infrastructure of a water utility, the underestimated likelihood of large climate changes significantly alters the quantification of risk for water shortages by midcentury.

  5. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  6. A risk-based probabilistic framework to estimate the endpoint of remediation: Concentration rebound by rate-limited mass transfer

    NASA Astrophysics Data System (ADS)

    Barros, F. P. J.; Fernàndez-Garcia, D.; Bolster, D.; Sanchez-Vila, X.

    2013-04-01

    Aquifer remediation is a challenging problem with environmental, social, and economic implications. As a general rule, pumping proceeds until the concentration of the target substance within the pumped water lies below a prespecified value. In this paper we estimate the a priori potential failure of the endpoint of remediation due to a rebound of concentrations driven by back diffusion. In many cases, it has been observed that once pumping ceases, a rebound in the concentration at the well takes place. For this reason, administrative approaches are rather conservative, and pumping is forced to last much longer than initially expected. While a number of physical and chemical processes might account for the presence of rebounding, we focus here on diffusion from low water mobility into high mobility zones. In this work we look specifically at the concentration rebound when pumping is discontinued while accounting for multiple mass transfer processes occurring at different time scales and parametric uncertainty. We aim to develop a risk-based optimal operation methodology that is capable of estimating the endpoint of remediation based on aquifer parameters characterizing the heterogeneous medium as well as pumping rate and initial size of the polluted area.
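
    The rebound mechanism can be illustrated with a lumped two-domain toy model. All parameters and the first-order transfer formulation below are illustrative assumptions, not the authors' probabilistic framework: the mobile zone is flushed while the well pumps, and once pumping stops, rate-limited back diffusion from the low-mobility zone drives the concentration at the well back up.

    dt, t_end, t_stop = 0.5, 2000.0, 600.0      # days; pumping stops at t_stop
    alpha = 0.002                                # 1/day, first-order mass-transfer rate
    flush = 0.01                                 # 1/day, dilution of mobile zone while pumping
    beta = 3.0                                   # immobile/mobile capacity ratio

    c_mobile, c_immobile = 1.0, 1.0              # normalized concentrations
    series, t = [], 0.0
    while t < t_end:
        transfer = alpha * (c_immobile - c_mobile)       # into mobile zone if positive
        pumping = flush * c_mobile if t < t_stop else 0.0
        c_mobile += dt * (beta * transfer - pumping)
        c_immobile += dt * (-transfer)
        t += dt
        series.append((t, c_mobile))

    at_shutdown = min(c for t, c in series if t <= t_stop)
    rebound_peak = max(c for t, c in series if t > t_stop)
    print(f"concentration at shutdown: {at_shutdown:.3f}, rebound peak after: {rebound_peak:.3f}")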

  7. A risk-based framework for water resource management under changing water availability, policy options, and irrigation expansion

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2016-08-01

    resemble nonlinear functions of changes in individual drivers. The proposed risk-based framework can be linked to any water resource system assessment scheme to quantify the risk in system performance under changing conditions, with the larger goal of proposing alternative policy options to address future uncertainties and management concerns.

  8. The role of risk-based prioritization in total quality management

    SciTech Connect

    Bennett, C.T.

    1994-10-01

    The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper will argue for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.

  9. Waste management project's alternatives: A risk-based multi-criteria assessment (RBMCA) approach

    SciTech Connect

    Karmperis, Athanasios C.; Sotirchos, Anastasios; Aravossis, Konstantinos; Tatsiopoulos, Ilias P.

    2012-01-15

    Highlights: We examine the evaluation of a waste management project's alternatives. We present a novel risk-based multi-criteria assessment (RBMCA) approach. In the RBMCA, the evaluation criteria are based on the quantitative risk analysis of the project's alternatives. Correlation between the criteria weight values and the decision makers' risk preferences is examined. Preference for multi-criteria over single-criterion evaluation is discussed. Abstract: This paper examines the evaluation of a waste management project's alternatives through a quantitative risk analysis. Cost benefit analysis is a widely used method, in which the investments are mainly assessed through the calculation of their evaluation indicators, namely benefit/cost (B/C) ratios, as well as the quantification of their financial, technical, environmental and social risks. Herein, a novel approach in the form of risk-based multi-criteria assessment (RBMCA) is introduced, which can be used by decision makers in order to select the optimum alternative of a waste management project. Specifically, decision makers use multiple criteria, which are based on the cumulative probability distribution functions of the alternatives' B/C ratios. The RBMCA system is used for the evaluation of a waste incineration project's alternatives, where the correlation between the criteria weight values and the decision makers' risk preferences is analyzed and useful conclusions are discussed.
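
    To make the quantitative core concrete, the sketch below (with entirely invented cost and benefit distributions, not the paper's case study) draws Monte Carlo B/C ratios for two alternatives and summarizes each by criteria read off its cumulative distribution, such as the median, a low percentile, and the probability that B/C falls below one.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 50_000

    def bc_ratio(benefit_mean, benefit_sd, cost_mean, cost_sd):
        benefits = rng.normal(benefit_mean, benefit_sd, n)
        costs = rng.lognormal(np.log(cost_mean), cost_sd, n)
        return benefits / costs

    alternatives = {
        "incineration A": bc_ratio(130, 25, 100, 0.15),
        "incineration B": bc_ratio(150, 45, 110, 0.25),
    }

    for name, bc in alternatives.items():
        print(f"{name}: median B/C={np.median(bc):.2f}, "
              f"P(B/C<1)={np.mean(bc < 1):.2f}, p10={np.percentile(bc, 10):.2f}")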

  10. Risk-based requirements management framework with applications to assurance cases

    NASA Astrophysics Data System (ADS)

    Feng, D.; Eyster, C.

    The current regulatory approach for assuring device safety primarily focuses on compliance with prescriptive safety regulations and relevant safety standards. This approach, however, does not always lead to a safe system design even though safety regulations and standards have been met. In the medical device industry, several high-profile recalls involving infusion pumps have prompted the regulatory agency to reconsider how device safety should be managed, reviewed and approved. An assurance case has been cited as a promising tool to address this growing concern. Assurance cases have been used in safety-critical systems for some time. Most assurance cases, if not all, in the literature today are developed in an ad hoc fashion, independent from risk management and requirement development. An assurance case is a resource-intensive endeavor that requires additional effort and documentation from equipment manufacturers. Without a well-organized requirements infrastructure in place, such "additional effort" can be substantial, to the point where the cost of adoption outweighs the benefit of adoption. In this paper, the authors present a Risk-Based Requirements and Assurance Management (RBRAM) methodology. The RBRAM is an elaborate framework that combines Risk-Based Requirements Management (RBRM) with assurance case methods. Such an integrated framework can help manufacturers leverage an existing risk management process to present a comprehensive assurance case with minimal additional effort while providing a supplementary means to reexamine the integrity of the system design in terms of the mission objective. Although the example used is from the medical industry, the authors believe that the RBRAM methodology underlines the fundamental principle of risk management, and offers a simple, yet effective framework applicable to the aerospace industry and, perhaps, to any industry.

  11. Assistance to the states with risk based data management. Quarterly technical progress report, January 1, 1995--March 31, 1995

    SciTech Connect

    1995-06-01

    This report describes activities associated with the risk based data management systems (RBDMS) project. The RBDMS applies to data pertaining to injection well operations. It describes assistance provided to the states of Alaska, Mississippi, Montana, and Nebraska for the conversion of data from existing data management systems, coding and internal testing of the RBDMS, document preparation, training, technology transfer, and project management.

  12. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    SciTech Connect

    Huq, M; Palta, J; Dunscombe, P; Thomadsen, B

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are more often caused by flaws in the overall therapy process, from initial consult through final treatment, than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform an FMEA analysis for a given process. Learn what
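
    As a small illustration of the FMEA scoring step mentioned above (the process steps, failure modes and scores below are invented examples, not TG100 values): each failure mode is scored for occurrence (O), severity (S) and lack of detectability (D), and the risk priority number RPN = O x S x D ranks where quality management effort should go first.

    failure_modes = [
        # (process step, failure mode, O, S, D) on 1-10 scales
        ("treatment planning", "wrong CT dataset imported",    2, 9, 6),
        ("treatment planning", "dose grid too coarse",         4, 5, 3),
        ("plan transfer",      "MU transcription error",       3, 8, 2),
        ("treatment delivery", "wrong patient accessory used", 5, 7, 4),
    ]

    scored = [(step, mode, o * s * d) for step, mode, o, s, d in failure_modes]
    for step, mode, rpn in sorted(scored, key=lambda x: x[2], reverse=True):
        print(f"RPN {rpn:3d}  [{step}] {mode}")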

  13. Risk based management of contaminated sediments: consideration of spatial and temporal patterns in exposure modeling.

    PubMed

    Linkov, Igor; Burmistrov, Dmitriy; Cura, Jerome; Bridges, Todd S

    2002-01-15

    This paper addresses interactions among foraging behavior, habitat preferences, site characteristics, and spatial distribution of contaminants in developing PCB exposure estimates for winter flounder at a hypothetical open water dredged material disposal site in the coastal waters of New York and New Jersey (NY-NJ). The implications of these interactions for human health risk estimates for local recreational anglers who fish for and eat flounder are described. The models implemented in this study include a spatial submodel to account for spatial and temporal characteristics of fish exposures and a probabilistic adaptation of the Gobas bioaccumulation model that accounts for temporal variation in concentrations of hydrophobic contaminants in sediment and water. We estimated the geographic distribution of a winter flounder subpopulation offshore of NY-NJ based on species biology and its vulnerability to local recreational fishing, the foraging area of individual fish, and their migration patterns. We incorporated these parameters and an estimate of differential attraction to a management site into a spatially explicit model to assess the range of exposures within the population. The output of this modeling effort, flounder PCB tissue concentrations, provided exposure point concentrations for an estimate of human health risk through ingestion of locally caught flounder. The risks obtained for the spatially nonexplicit case are as much as 1 order of magnitude higher than those obtained with explicit consideration of spatial and temporal characteristics of winter flounder foraging and seasonal migration. This practice of "defaulting" to extremely conservative estimates for exposure parameters in the face of uncertainty ill serves the decision-making process for management of contaminated sediments in general and specifically for disposal of dredged materials. Consideration of realistic spatial and temporal scales in food chain models can help support sediment management

  14. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals, of synthetic origin or derived from a natural source, that have only recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interaction and toxicological impacts on receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment through discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport and a risk-based analysis for ECs management and complementary strategies. PMID:27062002

  15. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals, of synthetic origin or derived from a natural source, that have only recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interaction and toxicological impacts on receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment through discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and their toxicological impact. This lack of information can be attributed to limited financial resources and the lack of analytical techniques for detecting their effects on ecosystems and human health on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport and a risk-based analysis for ECs management and complementary strategies.

  16. Health risk-based assessment and management of heavy metals-contaminated soil sites in Taiwan.

    PubMed

    Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang

    2010-10-01

    Risk-based assessment is a way to evaluate the potential hazards of contaminated sites and is based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying exposure of the receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses one target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of possible remediation methods to obtain rapid risk reduction is permissible and of increasing interest. This paper discusses previous soil remediation techniques applied to different sites in Taiwan and also clarifies the differences in risk assessment before and after soil remediation, obtained by applying different risk assessment models. This paper also includes many case studies on: (1) food safety risk assessment for brown rice growing in an HMs-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied in HMs-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan.

  17. National Drought Policy: Shifting the Paradigm from Crisis to Risk-based Management

    NASA Astrophysics Data System (ADS)

    Wilhite, D. A.; Sivakumar, M. K.; Stefanski, R.

    2011-12-01

    Drought is a normal part of climate for virtually all of the world's climatic regimes. To better address the risks associated with this hazard and societal vulnerability, there must be a dramatic paradigm shift in our approach to drought management in the coming decade in the light of the increasing frequency of droughts and projections of increased severity and duration of these events in the future for many regions, especially in the developing world. Addressing this challenge will require an improved awareness of drought as a natural hazard, the establishment of integrated drought monitoring and early warning systems, a higher level of preparedness that fully incorporates risk-based management, and the adoption of national drought policies that are directed at increasing the coping capacity and resilience of populations to future drought episodes. The World Meteorological Organization (WMO), in partnership with other United Nations' agencies, the National Drought Mitigation Center at the University of Nebraska, NOAA, the U.S. Department of Agriculture, and other partners, is currently launching a program to organize a High Level Meeting on National Drought Policy (HMNDP) in March 2013 to encourage the development of national drought policies through the development of a compendium of key policy elements. The key objectives of a national drought policy are to: (1) encourage vulnerable economic sectors and population groups to adopt self-reliant measures that promote risk management; (2) promote sustainable use of the agricultural and natural resource base; and (3) facilitate early recovery from drought through actions consistent with national drought policy objectives. The key elements of a drought policy framework are policy and governance, including political will; addressing risk and improving early warnings, including vulnerability analysis, impact assessment, and communication; mitigation and preparedness, including the application of effective and

  18. Toward a holistic and risk-based management of European river basins.

    PubMed

    Brack, Werner; Apitz, Sabine E; Borchardt, Dietrich; Brils, Jos; Cardoso, Ana Cristina; Foekema, Edwin M; van Gils, Jos; Jansen, Stefan; Harris, Bob; Hein, Michaela; Heise, Susanne; Hellsten, Seppo; de Maagd, P Gert-Jan; Müller, Dietmar; Panov, Vadim E; Posthuma, Leo; Quevauviller, Philippe; Verdonschot, Piet F M; von der Ohe, Peter C

    2009-01-01

    The European Union Water Framework Directive (WFD) requires a good chemical and ecological status of European surface waters by 2015. Integrated, risk-based management of river basins is presumed to be an appropriate approach to achieve that goal. The approach of focusing on distinct hazardous substances in surface waters together with investment in best available technology for treatment of industrial and domestic effluents was successful in significantly reducing excessive contamination of several European river basins. The use of the concept of chemical status in the WFD is based on this experience and focuses on chemicals for which there is a general agreement that they should be phased out. However, the chemical status, based primarily on a list of 33 priority substances and 8 priority hazardous substances, considers only a small portion of possible toxicants and does not address all causes of ecotoxicological stress in general. Recommendations for further development of this concept are 1) to focus on river basin-specific toxicants, 2) to regularly update priority lists with a focus on emerging toxicants, 3) to consider state-of-the-art mixture toxicity concepts and bioavailability to link chemical and ecological status, and 4) to add a short list of priority effects and to develop environmental quality standards for these effects. The ecological status reflected by ecological quality ratios is a leading principle of the WFD. While on the European scale the improvement of hydromorphological conditions and control of eutrophication are crucial to achieve a good ecological status, on a local and regional scale managers have to deal with multiple pressures. On this scale, toxic pollution may play an important role. Strategic research is necessary 1) to identify dominant pressures, 2) to predict multistressor effects, 3) to develop stressor- and type-specific metrics of pressures, and 4) to better understand the ecology of recovery. The concept of reference

  19. A Risk-Based Approach to Evaluating Wildlife Demographics for Management in a Changing Climate: A Case Study of the Lewis's Woodpecker

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyère, Cindy L.; Newlon, Karen R.

    2012-12-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.
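
    The second demographic analysis named in the abstract, the likelihood of exceeding a critical population threshold, amounts to an exceedance probability computed over the weather ensemble. The sketch below is a simplified stand-in: the logistic nest-survival response, the temperature covariate, and the critical value are invented placeholders, not the authors' fitted model.

        import numpy as np

        rng = np.random.default_rng(2)
        n_members = 500

        # Hypothetical ensemble of seasonal temperature anomalies [deg C] for one habitat.
        temp_anomaly = rng.normal(loc=1.0, scale=0.8, size=n_members)

        # Placeholder logistic nest-survival response to the climate covariate.
        nest_survival = 1.0 / (1.0 + np.exp(-(0.5 - 0.6 * temp_anomaly)))

        # Risk framing: probability that nest survival falls below a critical value.
        critical = 0.4
        risk = np.mean(nest_survival < critical)
        print(f"P(nest survival < {critical}): {risk:.2f}")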

  20. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    USGS Publications Warehouse

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyere, Cindy L.; Newlon, Karen R.

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  1. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    NASA Astrophysics Data System (ADS)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of the subsurface water resources is a required key component in this particular human-nature interaction system to evaluate the trade-offs between irrigation for agriculture and the ecosystem's low-flow requirements. This work presents an optimization scheme built upon systems reliability-based design optimization (SRBDO) analysis, which accounts for prescribed probabilistic constraint evaluation. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO has been developed in the aerospace industry and extensively applied in the field of structural engineering, but has seen only limited application in the field of hydrology. SRBDO uses probability theory to model uncertainty and to determine the probability of failure by solving a mathematical nonlinear programming problem. Furthermore, the reliability-based design optimization provides complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both the random variables and the probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable on the total
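
    The probabilistic-constraint idea at the core of SRBDO can be illustrated with a small Monte Carlo sketch: find the largest pumping rate whose probability of violating an ecological low-flow requirement stays below a target. The stream-depletion model, parameter distributions, low-flow requirement, and target failure probability below are illustrative assumptions, not values from the study.

        import numpy as np

        rng = np.random.default_rng(42)

        def baseflow_after_pumping(q_pump, samples):
            # Hypothetical stream-depletion model: each sample carries an uncertain
            # natural baseflow [m3/s] and a depletion factor (fraction of pumping
            # captured from the stream). Both are illustrative assumptions.
            natural_baseflow, depletion_factor = samples
            return natural_baseflow - depletion_factor * q_pump

        def prob_violation(q_pump, samples, low_flow_req=1.0):
            # Probability that baseflow drops below the ecological low-flow requirement.
            return np.mean(baseflow_after_pumping(q_pump, samples) < low_flow_req)

        n = 100_000
        samples = (rng.lognormal(mean=1.0, sigma=0.3, size=n),   # natural baseflow
                   rng.uniform(0.3, 0.8, size=n))                # depletion factor

        # Largest pumping rate whose violation probability stays within the target,
        # found by bisection (a stand-in for the SRBDO probabilistic constraint).
        target_pf = 0.05
        lo, hi = 0.0, 5.0
        for _ in range(40):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if prob_violation(mid, samples) <= target_pf else (lo, mid)
        print(f"max pumping rate at Pf <= {target_pf}: {lo:.3f} m3/s")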

  2. A Two-Stage Probabilistic Approach to Manage Personal Worklist in Workflow Management Systems

    NASA Astrophysics Data System (ADS)

    Han, Rui; Liu, Yingbo; Wen, Lijie; Wang, Jianmin

    The application of workflow scheduling to managing an individual actor's personal worklist is one area that can bring great improvement to business processes. However, current deterministic work cannot adapt to the dynamics and uncertainties in the management of personal worklists. To address this issue, this paper proposes a two-stage probabilistic approach which aims at assisting actors to flexibly manage their personal worklists. Specifically, at the first stage the approach analyzes every activity instance's continuous probability of satisfying its deadline. Based on this stochastic analysis result, at the second stage an innovative scheduling strategy is proposed to minimize the overall deadline violation cost for an actor's personal worklist. Simultaneously, the strategy recommends to the actor a feasible worklist of activity instances which meet the required bottom line of successful execution. The effectiveness of our approach is evaluated in a real-world workflow management system and with large-scale simulation experiments.
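
    A minimal sketch of the two ideas in the abstract: per-instance deadline-satisfaction probabilities followed by a cost-minimizing ordering. The worklist, the normally distributed activity durations, and the exhaustive reordering are all invented stand-ins for the paper's scheduling strategy.

        from itertools import permutations
        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical worklist: (name, mean duration [h], std dev [h], deadline [h], violation cost).
        worklist = [("review order", 3.0, 1.0, 6.0, 10.0),
                    ("approve invoice", 2.0, 0.5, 3.0, 25.0),
                    ("update record", 1.0, 0.3, 8.0, 5.0)]

        def deadline_probability(schedule, n=50_000):
            # Stage 1: probability that each instance meets its deadline when the
            # actor processes items sequentially in the given order.
            durations = np.column_stack([np.maximum(rng.normal(m, s, n), 0.0)
                                         for _, m, s, _, _ in schedule])
            finish = np.cumsum(durations, axis=1)
            return [np.mean(finish[:, i] <= schedule[i][3]) for i in range(len(schedule))]

        def expected_violation_cost(schedule):
            probs = deadline_probability(schedule)
            return sum((1.0 - p) * item[4] for p, item in zip(probs, schedule))

        # Stage 2 (stand-in): pick the ordering with the lowest expected violation cost.
        best = min(permutations(worklist), key=expected_violation_cost)
        print([item[0] for item in best])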

  3. How to Quantify Sustainable Development: A Risk-Based Approach to Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has proven challenging to quantify. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability of water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators, including reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a subroutine that estimates the risk-based indicators via the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model via least-cost and uniform-treatment approaches in order to identify the optimal level of pollution control cost for a desired reliability value, considering DO at two different targets. The risk-based approach developed herein, particularly via the FORM technique, appears to be an efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, together with technical advances and a change in current attitudes toward environmental protection.
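
    The reliability-resilience-vulnerability indicators named in the abstract can be computed from any simulated water quality series. The sketch below uses a synthetic DO series and a hypothetical 5 mg/L standard purely for illustration, following the commonly used Hashimoto-style definitions rather than the study's exact formulation.

        import numpy as np

        rng = np.random.default_rng(1)
        do_series = rng.normal(5.5, 0.8, size=365)   # synthetic daily DO [mg/L]
        standard = 5.0                               # hypothetical DO standard

        ok = do_series >= standard                   # satisfactory states
        reliability = ok.mean()                      # fraction of time the standard is met

        # Resilience: probability of returning to a satisfactory state right after a failure.
        failures = ~ok[:-1]
        resilience = (ok[1:] & failures).sum() / max(failures.sum(), 1)

        # Vulnerability: mean severity of the deficit during failure periods.
        vulnerability = (standard - do_series[~ok]).mean() if (~ok).any() else 0.0

        print(f"reliability={reliability:.2f}, resilience={resilience:.2f}, "
              f"vulnerability={vulnerability:.2f} mg/L")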

  4. Communicating uncertainty: managing the inherent probabilistic character of hazard estimates

    NASA Astrophysics Data System (ADS)

    Albarello, Dario

    2013-04-01

    Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena largely exceeding the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios of future occurrences remain possible (plausible), and the contribution of scientific knowledge (based on the available knowledge of the underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other terms, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, however, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition between science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., to impose expensive forms of risk-reducing activities): in these cases the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decider (be it the single citizen or the legitimate expression of a larger community) to choose among several possibilities (however characterized by different levels of likelihood). In many cases this can be uncomfortable, and the temptation is strong to delegate the responsibility for these decisions to the scientific counterpart. This "transfer" from the genuinely political field to an improper scientific context is also facilitated by the lack of a diffuse culture of "probability" outside the scientific community (and in many cases inside it as well). This is partially the effect of the generalized adoption (by media and scientific

  5. Development of a method for applying probabilistic long-term forecasts to water management

    NASA Astrophysics Data System (ADS)

    Hwang, Jin; Ryoo, Kyongsik; Suh, Aesook

    2016-04-01

    This research presents the development of a method for applying probabilistic long-term forecasts to water management. Forecast Cumulative Distribution Functions (CDFs) of monthly precipitation are constructed by combining the range of monthly precipitation, based on a Probability Density Function (PDF) fitted to past data, with the probabilistic forecasts for each category. Ensembles of inflow are estimated by feeding ensembles of precipitation generated from the CDFs into the 'abcd' water budget model. The bias and RMSE between the averages of past data and observed inflow are compared to those of the forecast ensembles. In our results, the bias and RMSE of average precipitation in the forecast ensemble are larger than those of past data, whereas the average inflow in the forecast ensemble is smaller than in past data. This result could serve as reference material for applying long-term forecasts to water management, given the limited number of forecast data available for verification and the differences between the Andong Dam basin and the forecast regions. This research is significant in that it suggests a method of applying probabilistic information on climate variables from long-term forecasts to water management in Korea. The original output of the climate model that produces the long-term probabilistic forecasts should be verified directly as input to a water budget model in the future, so that a more scientific response in water management to the uncertainty of climate change can be achieved.
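
    The coupling described in the abstract can be sketched as follows: monthly precipitation is drawn from a forecast CDF by inverse-transform sampling and routed through the 'abcd' monthly water balance model. The forecast quantiles, PET value, initial storages, and parameter values are invented for illustration; the equations follow one common statement of the abcd model (Thomas, 1981), assumed here rather than taken from the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical forecast CDF for one month: precipitation quantiles [mm].
        probs = np.array([0.0, 0.33, 0.66, 1.0])
        quantiles = np.array([20.0, 60.0, 90.0, 160.0])

        def sample_precip(n):
            # Inverse-transform sampling from the piecewise-linear forecast CDF.
            return np.interp(rng.uniform(size=n), probs, quantiles)

        def abcd_step(P, PET, S, G, a=0.98, b=250.0, c=0.6, d=0.1):
            # One month of the abcd model: W = available water, Y = evapotranspiration
            # opportunity, then partition into ET, soil storage, recharge, runoff, baseflow.
            W = P + S
            Y = (W + b) / (2 * a) - np.sqrt(((W + b) / (2 * a)) ** 2 - W * b / a)
            S_new = Y * np.exp(-PET / b)
            recharge = c * (W - Y)
            direct_runoff = (1 - c) * (W - Y)
            G_new = (G + recharge) / (1 + d)
            Q = direct_runoff + d * G_new
            return Q, S_new, G_new

        # Propagate a 1000-member precipitation ensemble through one month.
        P_ens = sample_precip(1000)
        Q_ens, _, _ = abcd_step(P_ens, PET=80.0, S=100.0, G=50.0)
        print(f"ensemble mean streamflow proxy: {Q_ens.mean():.1f} mm")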

  6. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    NASA Astrophysics Data System (ADS)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
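
    Component 2 of the framework (consistent use of probability for assimilating data) can be made concrete with a toy two-node graph: a hidden discrete "wetness state" connected by one factor to an observed "runoff class". The numbers are invented; the posterior is obtained by multiplying the factors and normalizing, exactly as the graph factorization prescribes.

        import numpy as np

        # Prior factor over the hidden wetness state: dry, normal, wet.
        prior = np.array([0.3, 0.5, 0.2])

        # Conditional factor P(runoff class | wetness state); rows = state, cols = low/medium/high.
        likelihood = np.array([[0.70, 0.25, 0.05],
                               [0.20, 0.60, 0.20],
                               [0.05, 0.25, 0.70]])

        # Joint distribution = product of the two factors (the graph's factorization).
        joint = prior[:, None] * likelihood

        # Assimilate an observation: runoff class "high" (index 2); condition and normalize.
        observed = 2
        posterior = joint[:, observed] / joint[:, observed].sum()
        print(dict(zip(["dry", "normal", "wet"], posterior.round(3))))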

  7. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios that carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementing the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores calculated using the criteria derived from the two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate how the type of operator and the risk attitudes applied to the criteria across the sub-basins ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA, to evaluate various BMP scenarios and determine the best solution for both the stakeholders and overall stream health.
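
    The SAW and OWA operators mentioned in the abstract are short to state in code. Below is a minimal sketch with invented criterion scores and weights; the OWA weight vector shown encodes a pessimistic (risk-averse) attitude by emphasizing the worst-performing criteria.

        import numpy as np

        # Invented scores of one BMP scenario against four criteria (higher is better).
        scores = np.array([0.8, 0.4, 0.6, 0.9])
        weights = np.array([0.4, 0.3, 0.2, 0.1])        # criterion importance weights

        # Simple additive weighting: weighted sum in the original criterion order.
        saw = np.dot(weights, scores)

        # Ordered weighted averaging: weights are applied to the scores sorted in
        # descending order, so the weight vector expresses a risk attitude rather
        # than criterion importance; emphasizing the tail is risk-averse.
        owa_weights = np.array([0.1, 0.2, 0.3, 0.4])
        owa = np.dot(owa_weights, np.sort(scores)[::-1])

        print(f"SAW score: {saw:.3f}, OWA score (risk-averse): {owa:.3f}")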

  8. A risk-based approach to sanitary sewer pipe asset management.

    PubMed

    Baah, Kelly; Dubey, Brajesh; Harvey, Richard; McBean, Edward

    2015-02-01

    Wastewater collection systems are an important component of the proper management of wastewater, preventing the environmental and human health consequences of mismanaged anthropogenic waste. Due to aging and inadequate asset management practices, the wastewater collection assets of many cities around the globe are in a state of rapid decline and in need of urgent attention. Risk management is a tool which can help prioritize resources to better manage and rehabilitate wastewater collection systems. In this study, a risk matrix and a weighted-sum multi-criteria decision matrix are used to assess the consequence and risk of sewer pipe failure for a mid-sized city, using ArcGIS. The methodology shows that six percent of the uninspected sewer pipe assets of the case study have a high consequence of failure, while four percent of the assets have a high risk of failure and are therefore priorities for inspection. A map incorporating the risk and consequence of sewer pipe failure is developed to facilitate future planning, rehabilitation, and maintenance programs. The consequence-of-failure assessment also includes a novel failure impact factor which captures the effect of structurally defective stormwater pipes on the assessment. The methodology recommended in this study can serve as a basis for future planning and decision making and has the potential to be applied universally by municipal sewer pipe asset managers to effectively manage the sanitary sewer pipe infrastructure within their jurisdiction.
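
    A risk matrix of the kind used in the study can be sketched as a simple lookup over likelihood and consequence classes; the class boundaries and labels below are illustrative assumptions, not the authors' calibration.

        # Hypothetical 5x5 risk matrix: risk class as a function of likelihood and
        # consequence scores (1 = very low, 5 = very high).
        def risk_class(likelihood: int, consequence: int) -> str:
            score = likelihood * consequence
            if score >= 15:
                return "high"
            if score >= 6:
                return "medium"
            return "low"

        # Example: an uninspected pipe judged likely to fail (4) with major consequence (4).
        print(risk_class(4, 4))   # -> "high"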

  9. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and introducing water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, barriers to implementing such good water management practices are the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as the increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk associated with water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability. PMID:21797262

  10. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and introducing water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, barriers to implementing such good water management practices are the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as the increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites in managing water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk associated with water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability.

  11. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred severely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the past. These four townships are the main fishpond cultivation districts in Taiwan. Groundwater has become the main water supply because of the shortage of surface water. Over-pumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater used to that of pond water, for fish farming within a range of target cancer risk (TR), especially on the order of 10^-4 to 10^-6. Our results reveal that the 50th percentile of the groundwater DRs can serve as a regulation standard for fish farm groundwater management at a TR of 10^-6. For a TR of 5 × 10^-6, we suggest using the 75th percentile of DR for groundwater management. For a TR of 10^-5, we suggest using the 95th percentile of the DR standard for groundwater management in fish farm areas. For TRs exceeding 5 × 10^-5, we do not suggest establishing groundwater management standards under these risk levels. Based on the research results, we suggest that setting the TR at 10^-5 and using the 95th percentile of DR are best for groundwater management in fish farm areas. PMID:24869949

  12. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred severely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the past. These four townships are the main fishpond cultivation districts in Taiwan. Groundwater has become the main water supply because of the shortage of surface water. Over-pumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater used to that of pond water, for fish farming within a range of target cancer risk (TR), especially on the order of 10^-4 to 10^-6. Our results reveal that the 50th percentile of the groundwater DRs can serve as a regulation standard for fish farm groundwater management at a TR of 10^-6. For a TR of 5 × 10^-6, we suggest using the 75th percentile of DR for groundwater management. For a TR of 10^-5, we suggest using the 95th percentile of the DR standard for groundwater management in fish farm areas. For TRs exceeding 5 × 10^-5, we do not suggest establishing groundwater management standards under these risk levels. Based on the research results, we suggest that setting the TR at 10^-5 and using the 95th percentile of DR are best for groundwater management in fish farm areas.

  13. Urban stormwater management planning with analytical probabilistic models

    SciTech Connect

    Adams, B.J.

    2000-07-01

    Understanding how to properly manage urban stormwater is a critical concern to civil and environmental engineers the world over. Mismanagement of stormwater and urban runoff results in flooding, erosion, and water quality problems. In an effort to develop better management techniques, engineers have come to rely on computer simulation and advanced mathematical modeling techniques to help plan and predict water system performance. This important book outlines a new method that uses probability tools to model how stormwater behaves and interacts in a combined- or single-system municipal water system. Complete with sample problems and case studies illustrating how concepts really work, the book presents a cost-effective, easy-to-master approach to analytical modeling of stormwater management systems.
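
    The analytical probabilistic approach replaces long continuous simulations with closed-form expressions derived from fitted distributions of rainfall event characteristics. As a hedged illustration, if event rainfall volume is modeled as exponentially distributed and the site abstracts a fixed initial depth, the runoff-exceedance probability takes the closed form used below; the parameter values are invented, and this is a generic textbook-style example rather than a result from the book.

        import math

        def runoff_exceedance_probability(threshold_mm: float,
                                          mean_event_rain_mm: float = 12.0,
                                          initial_abstraction_mm: float = 5.0) -> float:
            """P(event runoff > threshold) when event rainfall volume v ~ Exp(1/mean)
            and runoff = max(v - initial_abstraction, 0)."""
            return math.exp(-(threshold_mm + initial_abstraction_mm) / mean_event_rain_mm)

        # Probability that a storage unit sized for 20 mm of runoff overflows in a given event.
        print(f"{runoff_exceedance_probability(20.0):.3f}")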

  14. A two-stage inexact joint-probabilistic programming method for air quality management under uncertainty.

    PubMed

    Lv, Y; Huang, G H; Li, Y P; Yang, Z F; Sun, W

    2011-03-01

    A two-stage inexact joint-probabilistic programming (TIJP) method is developed for planning a regional air quality management system with multiple pollutants and multiple sources. The TIJP method incorporates the techniques of two-stage stochastic programming, joint-probabilistic constraint programming and interval mathematical programming, where uncertainties expressed as probability distributions and interval values can be addressed. Moreover, it can not only examine the risk of violating joint-probability constraints, but also account for economic penalties as corrective measures against any infeasibility. The developed TIJP method is applied to a case study of a regional air pollution control problem, where the air quality index (AQI) is introduced for evaluation of the integrated air quality management system associated with multiple pollutants. The joint-probability exists in the environmental constraints for AQI, such that individual probabilistic constraints for each pollutant can be efficiently incorporated within the TIJP model. The results indicate that useful solutions for air quality management practices have been generated; they can help decision makers to identify desired pollution abatement strategies with minimized system cost and maximized environmental efficiency. PMID:21067860
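
    A joint-probabilistic constraint requires that several conditions hold simultaneously with at least a prescribed probability, which is stricter than enforcing each condition individually. A minimal Monte Carlo check of such a constraint, with invented emission levels, dispersion factors, and AQI-style limits (not the TIJP formulation itself), is sketched below.

        import numpy as np

        rng = np.random.default_rng(7)

        def joint_satisfaction_probability(abatement, n=100_000):
            # Two pollutants with uncertain (lognormal) unit-impact factors; the joint
            # chance constraint asks that BOTH ambient limits hold simultaneously.
            impact_so2 = rng.lognormal(mean=0.0, sigma=0.25, size=n)
            impact_nox = rng.lognormal(mean=0.0, sigma=0.25, size=n)
            conc_so2 = (1.0 - abatement[0]) * 80.0 * impact_so2   # hypothetical units
            conc_nox = (1.0 - abatement[1]) * 60.0 * impact_nox
            both_ok = (conc_so2 <= 60.0) & (conc_nox <= 50.0)
            return both_ok.mean()

        for plan in [(0.2, 0.2), (0.4, 0.3), (0.5, 0.4)]:
            print(plan, f"joint satisfaction: {joint_satisfaction_probability(plan):.3f}")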

  15. Application of risk-based assessment and management to riverbank filtration sites in India.

    PubMed

    Bartak, Rico; Page, Declan; Sandhu, Cornelius; Grischek, Thomas; Saini, Bharti; Mehrotra, Indu; Jain, Chakresh K; Ghosh, Narayan C

    2015-03-01

    This is the first reported study of a riverbank filtration (RBF) scheme to be assessed following the Australian Guidelines for Managed Aquifer Recharge. A comprehensive staged approach to assess the risks from 12 hazards to human health and the environment has been undertaken. Highest risks from untreated ground and Ganga River water were identified with pathogens, turbidity, iron, manganese, total dissolved solids and total hardness. Recovered water meets the guideline values for inorganic chemicals and salinity but exceeds limits for thermotolerant coliforms frequently. A quantitative microbial risk assessment undertaken on the water recovered from the aquifer indicated that the residual risks of 0.00165 disability-adjusted life years (DALYs) posed by the reference bacteria Escherichia coli O157:H7 were below the national diarrhoeal incidence of 0.027 DALYs and meet the health target in this study of 0.005 DALYs per person per year, which corresponds to the World Health Organization (WHO) regional diarrhoeal incidence in South-East Asia. Monsoon season was a major contributor to the calculated burden of disease and final DALYs were strongly dependent on RBF and disinfection pathogen removal capabilities. Finally, a water safety plan was developed with potential risk management procedures to minimize residual risks related to pathogens.
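
    The DALY figures quoted in the abstract come from the standard quantitative microbial risk assessment chain (dose, infection, illness, burden). The sketch below uses the common exponential dose-response form and wholly invented exposure and burden parameters to show the shape of the calculation, not to reproduce the study's numbers.

        import math

        # Hypothetical exposure: pathogen concentration after riverbank filtration and
        # disinfection [organisms/L], and daily drinking-water consumption [L/day].
        concentration = 0.001
        consumption = 1.0

        # Exponential dose-response model: P(infection per day) = 1 - exp(-r * dose).
        r = 0.5                      # illustrative pathogen infectivity parameter
        dose = concentration * consumption
        p_inf_daily = 1.0 - math.exp(-r * dose)

        # Annual probability of at least one infection, then illness and disease burden.
        p_inf_annual = 1.0 - (1.0 - p_inf_daily) ** 365
        p_ill_given_inf = 0.3        # illustrative
        daly_per_case = 0.01         # illustrative burden per illness case
        dalys = p_inf_annual * p_ill_given_inf * daly_per_case

        print(f"burden: {dalys:.5f} DALYs per person per year (study target: 0.005)")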

  16. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle. PMID:25294001

  17. Application of risk-based assessment and management to riverbank filtration sites in India.

    PubMed

    Bartak, Rico; Page, Declan; Sandhu, Cornelius; Grischek, Thomas; Saini, Bharti; Mehrotra, Indu; Jain, Chakresh K; Ghosh, Narayan C

    2015-03-01

    This is the first reported study of a riverbank filtration (RBF) scheme to be assessed following the Australian Guidelines for Managed Aquifer Recharge. A comprehensive staged approach to assess the risks from 12 hazards to human health and the environment has been undertaken. Highest risks from untreated ground and Ganga River water were identified with pathogens, turbidity, iron, manganese, total dissolved solids and total hardness. Recovered water meets the guideline values for inorganic chemicals and salinity but exceeds limits for thermotolerant coliforms frequently. A quantitative microbial risk assessment undertaken on the water recovered from the aquifer indicated that the residual risks of 0.00165 disability-adjusted life years (DALYs) posed by the reference bacteria Escherichia coli O157:H7 were below the national diarrhoeal incidence of 0.027 DALYs and meet the health target in this study of 0.005 DALYs per person per year, which corresponds to the World Health Organization (WHO) regional diarrhoeal incidence in South-East Asia. Monsoon season was a major contributor to the calculated burden of disease and final DALYs were strongly dependent on RBF and disinfection pathogen removal capabilities. Finally, a water safety plan was developed with potential risk management procedures to minimize residual risks related to pathogens. PMID:25719477

  18. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.

  19. A risk-based approach to managing active pharmaceutical ingredients in manufacturing effluent.

    PubMed

    Caldwell, Daniel J; Mertens, Birgit; Kappler, Kelly; Senac, Thomas; Journel, Romain; Wilson, Peter; Meyerhoff, Roger D; Parke, Neil J; Mastrocco, Frank; Mattson, Bengt; Murray-Smith, Richard; Dolan, David G; Straub, Jürg Oliver; Wiedemann, Michael; Hartmann, Andreas; Finan, Douglas S

    2016-04-01

    The present study describes guidance intended to assist pharmaceutical manufacturers in assessing, mitigating, and managing the potential environmental impacts of active pharmaceutical ingredients (APIs) in wastewater from manufacturing operations, including those from external suppliers. The tools are not a substitute for compliance with local regulatory requirements but rather are intended to help manufacturers achieve the general standard of "no discharge of APIs in toxic amounts." The approaches detailed in the present study identify practices for assessing potential environmental risks from APIs in manufacturing effluent and outline measures that can be used to reduce the risk, including selective application of available treatment technologies. These measures either are commonly employed within the industry or have been implemented to a more limited extent based on local circumstances. Much of the material is based on company experience and case studies discussed at an industry workshop held on this topic. PMID:26183919

  20. A risk-based approach to managing active pharmaceutical ingredients in manufacturing effluent.

    PubMed

    Caldwell, Daniel J; Mertens, Birgit; Kappler, Kelly; Senac, Thomas; Journel, Romain; Wilson, Peter; Meyerhoff, Roger D; Parke, Neil J; Mastrocco, Frank; Mattson, Bengt; Murray-Smith, Richard; Dolan, David G; Straub, Jürg Oliver; Wiedemann, Michael; Hartmann, Andreas; Finan, Douglas S

    2016-04-01

    The present study describes guidance intended to assist pharmaceutical manufacturers in assessing, mitigating, and managing the potential environmental impacts of active pharmaceutical ingredients (APIs) in wastewater from manufacturing operations, including those from external suppliers. The tools are not a substitute for compliance with local regulatory requirements but rather are intended to help manufacturers achieve the general standard of "no discharge of APIs in toxic amounts." The approaches detailed in the present study identify practices for assessing potential environmental risks from APIs in manufacturing effluent and outline measures that can be used to reduce the risk, including selective application of available treatment technologies. These measures either are commonly employed within the industry or have been implemented to a more limited extent based on local circumstances. Much of the material is based on company experience and case studies discussed at an industry workshop held on this topic.

  1. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
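
    The MPN is the maximum likelihood estimate of concentration from presence/absence counts in a serial dilution, assuming organisms are Poisson-distributed in the aliquots. The score function below is the standard one for that model; the tube layout and counts are an invented example, not data from the study.

        import numpy as np
        from scipy.optimize import brentq

        # Serial dilution design: aliquot volumes [mL], tubes per dilution, positive tubes.
        volumes = np.array([10.0, 1.0, 0.1])
        n_tubes = np.array([5, 5, 5])
        positives = np.array([5, 3, 1])

        def score(conc):
            # Derivative of the log-likelihood when P(tube positive) = 1 - exp(-conc * volume).
            p_pos = 1.0 - np.exp(-conc * volumes)
            return np.sum(positives * volumes * np.exp(-conc * volumes) / p_pos
                          - (n_tubes - positives) * volumes)

        # The MPN is the root of the score function (organisms per mL here).
        mpn = brentq(score, 1e-6, 100.0)
        print(f"MPN estimate: {mpn:.3f} organisms/mL ({mpn * 100:.0f} per 100 mL)")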

  2. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis; Vesely, William; Youngblood, Robert

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  3. Probabilistic scenario-based water resource planning and management:A case study in the Yellow River Basin, China

    NASA Astrophysics Data System (ADS)

    Dong, C.; Schoups, G.; van de Giesen, N.

    2012-04-01

    Water resource planning and management is subject to large uncertainties with respect to the impact of climate change and socio-economic development on water systems. In order to deal with these uncertainties, probabilistic climate and socio-economic scenarios were developed based on the Principle of Maximum Entropy, as defined within information theory, and used as inputs to hydrological models to construct probabilistic water scenarios through Monte Carlo simulation. Probabilistic scenarios provide more explicit information than equally likely scenarios for decision-making in water resource management. A case study was developed for the Yellow River Basin, China, where future water availability and water demand are affected by both climate change and socio-economic development. Climate scenarios of future precipitation and temperature were developed based on the results of multiple global climate models, and socio-economic scenarios were downscaled from existing large-scale scenarios. Probability distributions were assigned to these scenarios to explicitly represent a full set of future possibilities. The probabilistic climate scenarios were used as input to a rainfall-runoff model to simulate future river discharge, and the socio-economic scenarios were used to calculate water demand. A full set of possible future water supply-demand scenarios and their associated probability distributions were generated. This set can feed further analysis of the future water balance, which can serve as a basis for planning and managing water resources in the Yellow River Basin. Key words: probabilistic scenarios, climate change, socio-economic development, water management

  4. Towards risk-based drought management in the Netherlands: making water supply levels transparent to water users

    NASA Astrophysics Data System (ADS)

    Ter Maat, Judith; Mens, Marjolein; Van Vuren, Saskia; Van der Vat, Marnix

    2016-04-01

    Within the project Improving Predictions and Management of Hydrological Extremes (IMPREX), running from 2016 to 2019, a consortium of the Dutch research institute Deltares and the Dutch water management consultant HKV will design and build a tool to support quantitative risk-informed decision-making for fresh water management in the Netherlands, in particular the decision on water supply service levels. The research will be conducted in collaboration with the Dutch Ministry of Infrastructure and the Environment, the Freshwater Supply Programme Office, the Dutch governmental organisation responsible for water management (Rijkswaterstaat), the Foundation for Applied Water Research (STOWA, the knowledge centre of the water boards), and a number of water boards. In the session we will present the conceptual framework for a risk-based approach to water shortage management and share thoughts on how the proposed tool can be applied in the Dutch water management context.

  5. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.

  6. Spatial probabilistic multi-criteria decision making for assessment of flood management alternatives

    NASA Astrophysics Data System (ADS)

    Ahmadisharaf, Ebrahim; Kalyanapu, Alfred J.; Chung, Eun-Sung

    2016-02-01

    Flood management alternatives are often evaluated on the basis of flood parameters such as depth and velocity. As these parameters are uncertain, so is the evaluation of the alternatives. It is thus important to incorporate the uncertainty of flood parameters into decision-making frameworks. This research develops a spatial probabilistic multi-criteria decision making (SPMCDM) framework to demonstrate the impact of design rainfall uncertainty on the evaluation of flood management alternatives. The framework employs a probabilistic rainfall-runoff transformation model, a two-dimensional flood model and a spatial MCDM technique. Thereby, the uncertainty of decision making can be determined alongside the best alternative. A probability-based map is produced to show the discrete probability distribution function (PDF) of selecting each competing alternative; the overall best alternative at each grid cell is the mode of this PDF. The framework is demonstrated on the Swannanoa River watershed in North Carolina, USA, and its results are compared to those of the deterministic approach. While the deterministic framework fails to provide the uncertainty of selecting an alternative, the SPMCDM framework showed that, overall, the selection of flood management alternatives in the watershed is "moderately uncertain". Moreover, three comparison metrics, the F fit measure, the κ statistic, and the Spearman rank correlation coefficient (ρ), are computed to compare the results of the two approaches. An F fit measure of 62.6%, a κ statistic of 15.4-45.0%, and a spatial mean ρ value of 0.48 imply a significant difference in decision making when design rainfall uncertainty is incorporated through the presented SPMCDM framework. The SPMCDM framework can help decision makers understand the uncertainty in the selection of flood management alternatives.
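
    The probability-based map described above reduces, per grid cell, to counting how often each alternative wins across the rainfall ensemble and taking the mode. A small sketch with invented per-member MCDM scores:

        import numpy as np

        rng = np.random.default_rng(11)
        n_members, n_cells, n_alternatives = 50, 4, 3

        # Invented MCDM scores per ensemble member, grid cell and alternative (higher = better).
        scores = rng.random((n_members, n_cells, n_alternatives))

        winners = scores.argmax(axis=2)                      # best alternative per member and cell
        pdf = np.stack([(winners == a).mean(axis=0)          # discrete PDF of selection per cell
                        for a in range(n_alternatives)], axis=1)
        best = pdf.argmax(axis=1)                            # mode of the PDF = overall best per cell

        for cell in range(n_cells):
            print(f"cell {cell}: selection PDF {np.round(pdf[cell], 2)}, best alternative {best[cell]}")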

  7. Management of the Area 5 Radioactive Waste Management Site using Decision-based, Probabilistic Performance Assessment Modeling

    SciTech Connect

    Carilli, J.; Crowe, B.; Black, P.; Tauxe, J.; Stockton, T.; Catlett, K.; Yucel, V.

    2003-02-27

    Low-level radioactive waste from cleanup activities at the Nevada Test Site and from multiple sites across the U.S. Department of Energy (DOE) complex is disposed at two active Radioactive Waste Management Sites (RWMS) on the Nevada Test Site. These facilities, which are managed by the DOE National Nuclear Security Administration Nevada Site Office, were recently designated as one of two regional disposal centers and yearly volumes of disposed waste now exceed 50,000 m3 (> 2 million ft3). To safely and cost-effectively manage the disposal facilities, the Waste Management Division of Environmental Management has implemented decision-based management practices using flexible and problem-oriented probabilistic performance assessment modeling. Deterministic performance assessments and composite analyses were completed originally for the Area 5 and Area 3 RWMSs located in, respectively, Frenchman Flat and Yucca Flat on the Nevada Test Site. These documents provide the technical bases for issuance of disposal authorization statements for continuing operation of the disposal facilities. Both facilities are now in a maintenance phase that requires testing of conceptual models, reduction of uncertainty, and site monitoring all leading to eventual closure of the facilities and transition to long-term stewardship.

  8. Inexact joint-probabilistic stochastic programming for water resources management under uncertainty

    NASA Astrophysics Data System (ADS)

    Li, Y. P.; Huang, G. H.

    2010-11-01

    In this study, an inexact two-stage integer program with joint-probabilistic constraint (ITIP-JPC) is developed for supporting water resources management under uncertainty. This method can tackle uncertainties expressed as joint probabilities and interval values, and can reflect the reliability of satisfying (or the risk of violating) system constraints under uncertain events and/or parameters. Moreover, it can be used for analysing various policy scenarios that are associated with different levels of economic consequences when the pre-regulated targets are violated. The developed ITIP-JPC is applied to a case study of water resources allocation within a multi-stream, multi-reservoir and multi-user context, where joint probabilities exist in both water availabilities and storage capacities. The results indicate that reasonable solutions have been generated for both binary and continuous variables. They can help generate desired policies for water allocation and flood diversion with a maximized economic benefit and a minimized system-disruption risk.

  9. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.

  10. Improved water allocation utilizing probabilistic climate forecasts: Short-term water contracts in a risk management framework

    NASA Astrophysics Data System (ADS)

    Sankarasubramanian, A.; Lall, Upmanu; Souza Filho, Francisco Assis; Sharma, Ashish

    2009-11-01

    Probabilistic, seasonal to interannual streamflow forecasts are becoming increasingly available as the ability to model climate teleconnections is improving. However, water managers and practitioners have been slow to adopt such products, citing concerns with forecast skill. Essentially, a management risk is perceived in "gambling" with operations using a probabilistic forecast, while a system failure upon following existing operating policies is "protected" by the official rules or guidebook. In the presence of a prescribed system of prior allocation of releases under different storage or water availability conditions, the manager has little incentive to change. Innovation in allocation and operation is hence key to improved risk management using such forecasts. A participatory water allocation process that can effectively use probabilistic forecasts as part of an adaptive management strategy is introduced here. Users can express their demand for water through statements that cover the quantity needed at a particular reliability, the temporal distribution of the "allocation," the associated willingness to pay, and compensation in the event of contract nonperformance. The water manager then assesses feasible allocations using the probabilistic forecast that try to meet these criteria across all users. An iterative process between users and water manager could be used to formalize a set of short-term contracts that represent the resulting prioritized water allocation strategy over the operating period for which the forecast was issued. These contracts can be used to allocate water each year/season beyond long-term contracts that may have precedence. Thus, integrated supply and demand management can be achieved. In this paper, a single period multiuser optimization model that can support such an allocation process is presented. The application of this conceptual model is explored using data for the Jaguaribe Metropolitan Hydro System in Ceara, Brazil. The performance

  11. Use Of Probabilistic Risk Assessment (PRA) In Expert Systems To Advise Nuclear Plant Operators And Managers

    NASA Astrophysics Data System (ADS)

    Uhrig, Robert E.

    1988-03-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts or knowledge that has been codified in publications, books, or regulations to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. Nuclear power plants have many redundant systems and can continue to operate when one or more of these systems is disabled or removed from service for maintenance or testing. PRAs provide a means of evaluating the risk to the public associated with the operation of nuclear power plants with components or systems out of service. While the choice of the "source term" and methodology in a PRA may influence the absolute probability and consequences of a core melt, the ratio of two PRA calculations for two configurations of the same plant, carried out on a consistent basis, can readily identify the increase in risk associated with going from one configuration to the other. PRISIM, a personal computer program to calculate the ratio of core melt probabilities described above (based on previously performed PRAs), has been developed under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC). When one or several components are removed from service, PRISIM calculates the ratio of the core melt probabilities. The inference engine of the expert system then uses this ratio and a constant risk criterion, along with information from its knowledge base (which includes information from the PRA), to advise plant personnel as to what action, if any, should be taken.
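    As a hedged illustration of the constant-risk reasoning sketched above (hypothetical numbers, not the PRISIM code), the fragment below compares the core-melt frequency of a degraded configuration against the baseline and converts the increase into an allowable exposure time under an assumed risk budget.

    ```python
    # Hypothetical PRA results: core-melt frequency (per year) for the baseline
    # configuration and for a configuration with a component out of service.
    baseline_cmf = 4.0e-5
    degraded_cmf = 1.2e-4

    risk_ratio = degraded_cmf / baseline_cmf          # relative increase in risk
    delta_cmf = degraded_cmf - baseline_cmf           # absolute increase (per year)

    # Constant-risk criterion (assumed): allow at most this incremental core-melt
    # probability to accumulate while the component is out of service.
    risk_budget = 1.0e-6

    allowed_days = risk_budget / delta_cmf * 365.0
    print(f"risk ratio: {risk_ratio:.1f}x")
    print(f"allowed configuration time: {allowed_days:.1f} days")
    ```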

  12. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    SciTech Connect

    Khadam, Ibrahim M.; Kaluarachchi, Jagath J

    2003-10-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost-benefit analysis. In this paper, we discuss some of the limitations associated with cost-benefit analysis, especially its definition of risk, its definition of the cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis through a multi-criteria decision analysis framework. The methodology introduces several important concepts and definitions in decision analysis related to subsurface contamination. These are the trade-off between population risk and individual risk, the trade-off between the residual risk and the cost of risk reduction, and cost-effectiveness as a justification for remediation. The proposed decision analysis framework integrates probabilistic health risk assessment into a comprehensive, yet simple, cost-based multi-criteria decision analysis framework. The methodology focuses on developing decision criteria that provide insight into the common questions of the decision-maker that involve a number of remedial alternatives. The paper then explores three potential approaches for alternative ranking: a structured explicit decision analysis, a heuristic approach based on the order of importance of the criteria, and a fuzzy logic approach based on fuzzy dominance and similarity analysis. Using formal alternative ranking procedures, the methodology seeks to present a structured decision analysis framework that can be applied consistently across many different and complex remediation settings. A simple numerical example is presented to demonstrate the proposed methodology. The results showed the importance of using an integrated approach for decision-making considering both costs and risks. Future work should focus on the application of the methodology to a variety of complex field conditions to

  13. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the production process level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment can be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of an initial contamination event in the milk, the fresh cheese, or the process environment is simulated over time, across space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated, allowing the identification of the major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.
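    A minimal, purely illustrative Monte Carlo fragment of the exposure-to-risk chain used in quantitative microbiological risk assessment (all distributions and the dose-response parameter are invented, not those of the cited model):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    # Hypothetical contamination at packaging (log10 CFU/g) and growth during
    # cold storage; serving size fixed at 25 g of soft cheese.
    log_conc0 = rng.normal(-2.0, 1.0, n)            # initial concentration
    growth = rng.uniform(0.0, 3.0, n)               # log10 growth before consumption
    dose = 10 ** (log_conc0 + growth) * 25.0        # CFU ingested per serving

    # Exponential dose-response model: P(illness) = 1 - exp(-r * dose),
    # with a hypothetical r value.
    r = 1e-10
    p_ill = 1.0 - np.exp(-r * dose)

    print("mean risk per serving:", p_ill.mean())
    print("risk at 99th percentile of exposure:", np.quantile(p_ill, 0.99))
    ```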

  15. A risk-based decision tool for the management of organic waste in agriculture and farming activities (FARMERS).

    PubMed

    Río, Miguel; Franco-Uría, Amaya; Abad, Emilio; Roca, Enrique

    2011-01-30

    Currently, specific management guidelines must be implemented to guarantee the safe reuse of organic waste in agriculture. With that aim, this work focused on the development of a decision support tool for the safe and sustainable management of cattle manure as fertiliser in pastureland, to control and limit metal accumulation in soil and to reduce metal biotransfer from soil to other compartments. The system was developed on the basis of a multi-compartment environmental risk assessment model. In contrast to other management tools, a long-term dynamic modelling approach was selected, considering the persistence of metals in the environment. A detailed description of the underlying flow equations, which account for the distribution, human exposure, and risk characterisation of metals in the assessed scenario, was presented, as well as the model parameterization. The tool was implemented in Visual C++ and is structured around a database, where all required data are stored, the risk assessment model, and a GIS module for visualization of the scenario characteristics and of the results obtained (risk indices). The decision support system allows the user to choose among three estimation options, depending on their needs, and provides information to both farmers and policy makers. The first option is useful for evaluating the adequacy of the current management practices of the different farms, and the remaining ones provide information on the measures that can be taken to carry out a fertilising plan without exceeding acceptable risk to human health. Among other results, maximum manure application rates, maximum permissible metal content of manure, and maximum application times for a particular scenario can be estimated by the system. To illustrate the application of the tool, a real case study with data from different farms of a milk production cooperative was presented.

  16. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic change from human influences such as land management practices and urbanization. Flood risk analysis and assessment are required to provide information on current or future flood hazard and risks in order to support flood risk mitigation and to propose, evaluate, and select measures to reduce it. Both components of risk can be mapped individually, and both they and the joint estimate of flood risk are affected by multiple uncertainties. Major sources of uncertainty include the statistical analysis of extreme events, the definition of the hydrological input, the representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model and, in accordance with the iso-frequency criterion, assign to the flood the same return period as the originating rainfall. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again a single probability is associated with the flood discharge and the corresponding risk. Moreover, since flood peaks and the corresponding flood volumes are variables of the same phenomenon, they are directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from (a) a classical univariate approach and (b) a bivariate statistical analysis using copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
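    The following sketch illustrates the general idea of sampling correlated flood peaks and volumes through a copula; for simplicity it uses a Gaussian copula with lognormal margins and invented parameters, whereas the study above may use a different copula family.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 5000
    rho = 0.7                                   # assumed peak-volume dependence

    # Gaussian copula: correlated standard normals -> uniforms -> marginal quantiles.
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)

    peak = stats.lognorm.ppf(u[:, 0], s=0.5, scale=300.0)    # m^3/s (hypothetical)
    volume = stats.lognorm.ppf(u[:, 1], s=0.6, scale=20.0)   # hm^3 (hypothetical)

    # Each (peak, volume) pair can parameterize a synthetic design hydrograph
    # that is then routed through the 2D hydraulic model.
    print("Spearman correlation:", round(stats.spearmanr(peak, volume)[0], 2))
    ```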

  17. Groundwater contamination from waste management sites: The interaction between risk-based engineering design and regulatory policy: 1. Methodology

    NASA Astrophysics Data System (ADS)

    Massmann, Joel; Freeze, R. Allan

    1987-02-01

    This paper puts in place a risk-cost-benefit analysis for waste management facilities that explicitly recognizes the adversarial relationship that exists in a regulated market economy between the owner/operator of a waste management facility and the government regulatory agency under whose terms the facility must be licensed. The risk-cost-benefit analysis is set up from the perspective of the owner/operator. It can be used directly by the owner/operator to assess alternative design strategies. It can also be used by the regulatory agency to assess alternative regulatory policy, but only in an indirect manner, by examining the response of an owner/operator to the stimuli of various policies. The objective function is couched in terms of a discounted stream of benefits, costs, and risks over an engineering time horizon. Benefits are in the form of revenues for services provided; costs are those of construction and operation of the facility. Risk is defined as the cost associated with the probability of failure, with failure defined as the occurrence of a groundwater contamination event that violates the licensing requirements established for the facility. Failure requires a breach of the containment structure and contaminant migration through the hydrogeological environment to a compliance surface. The probability of failure can be estimated on the basis of reliability theory for the breach of containment and with a Monte-Carlo finite-element simulation for the advective contaminant transport. In the hydrogeological environment the hydraulic conductivity values are defined stochastically. The probability of failure is reduced by the presence of a monitoring network operated by the owner/operator and located between the source and the regulatory compliance surface. The level of reduction in the probability of failure depends on the probability of detection of the monitoring network, which can be calculated from the stochastic contaminant transport simulations. While
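    A hedged numerical sketch of a discounted risk-cost-benefit objective of the general form described above (all cash flows, failure probabilities, and the discount rate are invented for illustration):

    ```python
    import numpy as np

    # Owner/operator objective: sum over the engineering time horizon of
    # discounted (benefits - costs - risk), with risk = P(failure) * cost of failure.
    years = np.arange(1, 21)                               # 20-year horizon
    rate = 0.08                                            # assumed discount rate
    benefits = np.full_like(years, 2.0e6, dtype=float)     # revenues per year ($)
    costs = np.full_like(years, 0.8e6, dtype=float)        # operating costs per year ($)
    p_fail = np.full_like(years, 0.002, dtype=float)       # annual failure probability
    cost_of_failure = 50.0e6                               # penalties, remediation ($)

    risk = p_fail * cost_of_failure
    npv = np.sum((benefits - costs - risk) / (1.0 + rate) ** years)
    print(f"net present value of the design alternative: ${npv:,.0f}")
    ```

    A monitoring network that raises the probability of detection would lower p_fail in such a calculation, which is how the trade-off described in the paper enters the objective.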

  18. Network analysis of swine shipments in Ontario, Canada, to support disease spread modelling and risk-based disease management.

    PubMed

    Dorjee, S; Revie, C W; Poljak, Z; McNab, W B; Sanchez, J

    2013-10-01

    Understanding contact networks is important for modelling and managing the spread and control of communicable diseases in populations. This study characterizes the swine shipment network of a multi-site production system in southwestern Ontario, Canada. Data were extracted from a company's database listing swine shipments among 251 swine farms, including 20 sow, 69 nursery and 162 finishing farms, for the 2-year period 2006 to 2007. Several network metrics were generated. The number of shipments per week between pairs of farms ranged from 1 to 6. The medians (and ranges) of out-degree over the entire 2-year study period were: sow 6 (1-21), nursery 8 (0-25), and finishing 0 (0-4). Corresponding estimates of in-degree for nursery and finishing farms were 3 (0-9) and 3 (0-12), respectively. Outgoing and incoming infection chains (OIC and IIC) were also measured. The medians (ranges) of the monthly OIC and IIC were 0 (0-8) and 0 (0-6), respectively, with very similar measures observed for 2-week intervals. Nursery farms exhibited high measures of centrality, indicating that they pose a greater risk of disease spread in the network. They should therefore be given high priority for disease prevention and control measures affecting all age groups alike. The network demonstrated scale-free and small-world topologies, as observed in other livestock shipment studies. This heterogeneity in contacts among farm types, and the network topologies, should be incorporated in simulation models to improve their validity. In conclusion, this study provided useful epidemiological information and parameters for the control and modelling of disease spread among swine farms, for the first time from Ontario, Canada.
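    The following sketch (hypothetical shipment records, assuming the networkx package) illustrates the kind of degree and centrality metrics reported above.

    ```python
    import networkx as nx

    # Hypothetical directed shipment network: edges point from sending to receiving farm.
    shipments = [("sow_1", "nursery_1"), ("sow_1", "nursery_2"),
                 ("nursery_1", "finish_1"), ("nursery_1", "finish_2"),
                 ("nursery_2", "finish_2"), ("sow_2", "nursery_1")]

    G = nx.DiGraph()
    G.add_edges_from(shipments)

    out_deg = dict(G.out_degree())            # number of distinct receiving farms
    in_deg = dict(G.in_degree())              # number of distinct sending farms
    btw = nx.betweenness_centrality(G)        # proxy for a farm's role as a bridge

    print("out-degree:", out_deg)
    print("in-degree:", in_deg)
    print("most central farm:", max(btw, key=btw.get))
    ```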

  19. Undiscovered Locatable Mineral Resources in the Bay Resource Management Plan Area, Southwestern Alaska: A Probabilistic Assessment

    USGS Publications Warehouse

    Schmidt, J.M.; Light, T.D.; Drew, L.J.; Wilson, F.H.; Miller, M.L.; Saltus, R.W.

    2007-01-01

    The Bay Resource Management Plan (RMP) area in southwestern Alaska, north and northeast of Bristol Bay contains significant potential for undiscovered locatable mineral resources of base and precious metals, in addition to metallic mineral deposits that are already known. A quantitative probabilistic assessment has identified 24 tracts of land that are permissive for 17 mineral deposit model types likely to be explored for within the next 15 years in this region. Commodities we discuss in this report that have potential to occur in the Bay RMP area are Ag, Au, Cr, Cu, Fe, Hg, Mo, Pb, Sn, W, Zn, and platinum-group elements. Geoscience data for the region are sufficient to make quantitative estimates of the number of undiscovered deposits only for porphyry copper, epithermal vein, copper skarn, iron skarn, hot-spring mercury, placer gold, and placer platinum-deposit models. A description of a group of shallow- to intermediate-level intrusion-related gold deposits is combined with grade and tonnage data from 13 deposits of this type to provide a quantitative estimate of undiscovered deposits of this new type. We estimate that significant resources of Ag, Au, Cu, Fe, Hg, Mo, Pb, and Pt occur in the Bay Resource Management Plan area in these deposit types. At the 10th percentile probability level, the Bay RMP area is estimated to contain 10,067 metric tons silver, 1,485 metric tons gold, 12.66 million metric tons copper, 560 million metric tons iron, 8,100 metric tons mercury, 500,000 metric tons molybdenum, 150 metric tons lead, and 17 metric tons of platinum in undiscovered deposits of the eight quantified deposit types. At the 90th percentile probability level, the Bay RMP area is estimated to contain 89 metric tons silver, 14 metric tons gold, 911,215 metric tons copper, 330,000 metric tons iron, 1 metric ton mercury, 8,600 metric tons molybdenum and 1 metric ton platinum in undiscovered deposits of the eight deposit types. Other commodities, which may occur in the

  20. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and arise from interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to affect them and thereby the frequency and intensity of such events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled to 25 km over Europe, are being fed into hydrological models of various complexity and used to estimate the drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (tens of thousands) of Global Climate Model (GCM) simulations, downscaled to 25 km over Europe by a nested Regional Climate Model (RCM). Simulations cover the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather that are consistent with boundary forcings such as the ocean state, greenhouse gases, and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its
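    A minimal sketch of one way to extract meteorological drought events from a long simulated precipitation series (the threshold and minimum duration are invented; the project's actual event definition may differ):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Hypothetical monthly precipitation anomalies for a 100-year model run (mm).
    anomaly = rng.normal(0.0, 30.0, 1200)

    threshold = -20.0        # months drier than this count toward a drought
    min_length = 3           # require at least 3 consecutive dry months

    events, start = [], None
    for i, a in enumerate(anomaly):
        if a < threshold and start is None:
            start = i
        elif a >= threshold and start is not None:
            if i - start >= min_length:
                events.append((start, i, anomaly[start:i].sum()))  # duration & deficit
            start = None

    print(f"{len(events)} drought events extracted from one member")
    ```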

  1. Effectiveness of chemical amendments for stabilisation of lead and antimony in risk-based land management of soils of shooting ranges.

    PubMed

    Sanderson, Peter; Naidu, Ravi; Bolan, Nanthi

    2015-06-01

    This study aims to examine the effectiveness of amendments for risk-based land management of shooting range soils and to explore the effectiveness of amendments applied to sites with differing soil physicochemical parameters. A series of amendments with differing stabilisation mechanisms were applied to four shooting range soils and aged for 1 year. Chemical stabilisation was monitored by pore water extraction, the toxicity characteristic leaching procedure (TCLP) and the physiologically based extraction test (PBET) over 1 year. The performance of amendments applied in conditions reflecting field application did not match their performance in the batch studies. Pore water-extractable metals were not greatly affected by amendment addition. TCLP-extractable Pb was reduced significantly by amendments, particularly lime and magnesium oxide. Antimony leaching was reduced by red mud but mobilised by some of the other amendments. Bioaccessible Pb measured by PBET increased with time after an initial decrease, owing to the presence of metallic fragments in the soil. Amendments were able to reduce bioaccessible Pb by up to 50%. Bioaccessible Sb was not readily reduced by soil amendments. Soil amendments were not equally effective across the four soils. PMID:23807560

  3. A family of analytical probabilistic models for urban stormwater management planning

    SciTech Connect

    Papa, F.; Adams, B.J.; Guo, Y.

    1998-07-01

    This paper presents the synthesis of over fifteen years of research on the topic of analytical probabilistic models, as an alternative approach to continuous simulation, that have been derived for the performance analysis of urban runoff quantity and quality control systems. These models overcome the limitations imposed by single event modeling through the use of long term rainfall records and are significantly more computationally efficient and less cumbersome than other methods of continuous analysis. These attributes promote the comprehensive analysis of drainage system design alternatives at the screening and planning levels.

  4. State Assistance with Risk-Based Data Management: Inventory and needs assessment of 25 state Class II Underground Injection Control programs. Phase 1

    SciTech Connect

    Not Available

    1992-07-01

    As discussed in Section I of the attached report, state agencies must decide where to direct their limited resources in an effort to make optimum use of their available manpower and address those areas that pose the greatest risk to valuable drinking water sources. The Underground Injection Practices Research Foundation (UIPRF) proposed a risk-based data management system (RBDMS) to provide states with the information they need to effectively utilize staff resources, provide dependable documentation to justify program planning, and enhance environmental protection capabilities. The UIPRF structured its approach to environmental risk management to include data and information from production, injection, and inactive wells in its RBDMS project. Data from each of these well types are critical to a complete statistical evaluation of environmental risk and to selected automated functions. This comprehensive approach allows state Underground Injection Control (UIC) programs to effectively evaluate the risk of contaminating underground sources of drinking water, while alleviating the additional work and associated problems that often arise when separate databases are used. CH2M Hill and Digital Design Group, through a DOE grant to the UIPRF, completed an inventory and needs assessment of 25 state Class II UIC programs. The states selected for participation by the UIPRF were generally chosen based on interest and whether an active Class II injection well program was in place. The inventory and needs assessment provided an effective means of collecting and analyzing the interest, commitment, design requirements, utilization, and potential benefits of implementing a RBDMS in individual state UIC programs. Personal contacts were made with representatives from each state to discuss the applicability of a RBDMS in their respective states.

  5. Coastal cliff recession: the use of probabilistic prediction methods

    NASA Astrophysics Data System (ADS)

    Lee, E. M.; Hall, J. W.; Meadowcroft, I. C.

    2001-10-01

    A range of probabilistic methods for predicting coastal cliff recession is introduced, providing a means of demonstrating the potential variability in such predictions. These methods form the basis for risk-based land-use planning, cliff management and engineering decision-making. Examples of probabilistic models are presented for a number of different cliff settings: the simulation of recession on eroding cliffs; the use of historical records and statistical experiments to model the behaviour of cliffs affected by rare, episodic landslide events; the adaptation of an event tree approach to assess the probability of failure of protected cliffs, taking into account the residual life of the existing defences; and the evaluation of the probability of landslide reactivation in areas of pre-existing landslide systems. These methods are based on a geomorphological assessment of the episodic nature of the recession process, together with historical records.
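    A hedged sketch in the spirit of the simulation methods above (event frequency and size distributions are invented): episodic landslide events arrive as a Poisson process and each removes a random amount of cliff top.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_sims, horizon = 10_000, 50          # Monte Carlo runs, years

    # Hypothetical episodic recession: events arrive at ~0.4 per year on average,
    # and each event removes a lognormally distributed cliff-top retreat (m).
    rate = 0.4
    n_events = rng.poisson(rate * horizon, size=n_sims)
    total = np.array([rng.lognormal(mean=0.5, sigma=0.6, size=k).sum() for k in n_events])

    print("median 50-year recession (m):", round(np.median(total), 1))
    print("95th-percentile recession (m):", round(np.quantile(total, 0.95), 1))
    ```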

  6. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decisionmakers to determine the feasibility and return-on-investment of a new aircraft engine.

  7. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airline operations, i.e. supply and demand management. The supply aspect focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airline, while the demand aspect focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: first, a fuzzy Analytic Hierarchy Process is adopted to capture mode choice modeling and to quantify the probability of the probable phenomena (for the aircraft acquisition/leasing decision); then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand over the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that incorporating mode choice modeling can affect the operational profit and fleet management decisions of the airline to varying degrees.

  8. Probabilistic ecosystem model for predicting the nutrient concentrations in the Gulf of Finland under diverse management actions.

    PubMed

    Vanhatalo, Jarno P; Tuomi, Laura M; Inkala, Arto T; Helle, S Inari; Pitkänen, J Heikki

    2013-01-01

    Many countries define legislative targets for the ecological status of aquatic ecosystems. Fulfilling these legally binding targets requires often large scale and expensive management actions. The expected benefits from alternative actions are commonly compared with deterministic ecosystem models. However, from a practical management point of view the uncertainty in model predictions and the probability to achieve the targets are as essential as the point estimates provided by the deterministic models. For this reason, we extend a deterministic ecosystem model into a probabilistic form. We use the model for predicting the probability to achieve the targets set by EU's Water Framework Directive (WFD) in Finnish coastal waters in the Gulf of Finland, one of the most eutrophicated areas of the Baltic Sea, under alternative management scenarios. Our results show that the probability to reach the WFD objectives for total phosphorus is generally less than or equal to 0.51 in all areas. However, for total nitrogen the probability varies substantially as it is practically zero in the western areas but almost 0.80 or higher in the eastern areas. It seems that especially with phosphorus, international co-operation is needed in order for Finland to fulfill the objectives of the WFD. PMID:23190405
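    As a trivially small illustration of turning a probabilistic prediction into the probability of meeting a legislative target (all numbers invented), assuming predictive samples of total phosphorus are available:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # Hypothetical posterior predictive samples of summer-mean total phosphorus (ug/l)
    # in one coastal water body under a given nutrient-reduction scenario.
    tp_samples = rng.lognormal(mean=np.log(28.0), sigma=0.25, size=20_000)

    wfd_target = 25.0   # hypothetical good-status threshold (ug/l)
    p_achieve = np.mean(tp_samples <= wfd_target)
    print(f"probability of achieving the TP target: {p_achieve:.2f}")
    ```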

  9. The future of host cell protein (HCP) identification during process development and manufacturing linked to a risk-based management for their control.

    PubMed

    Bracewell, Daniel G; Francis, Richard; Smales, C Mark

    2015-09-01

    The use of biological systems to synthesize complex therapeutic products has been a remarkable success. However, during product development, great attention must be devoted to defining acceptable levels of impurities that derive from that biological system; heading this list are host cell proteins (HCPs). Recent advances in proteomic analytics have shown how diverse this class of impurities is; as such knowledge and capability grow, inevitable questions have arisen about how thorough current approaches to measuring HCPs are. The fundamental issue is how to adequately measure (and in turn monitor and control) such a large number of protein species (potentially thousands of components) to ensure safe and efficacious products. A rather elegant solution is to use an immunoassay (enzyme-linked immunosorbent assay [ELISA]) based on polyclonal antibodies raised against the host cell (biological system) used to synthesize a particular therapeutic product. However, the measurement is entirely dependent on the antibody serum used, which dictates the sensitivity of the assay and the degree of coverage of the HCP spectrum. It provides one summed analog value for HCP amount; a positive if all HCP components can be considered equal, a negative in the more likely event that one associates greater risk with certain components of the HCP proteome. In a thorough risk-based approach, one would wish to be able to account for this. These issues have led to the investigation of orthogonal analytical methods, most prominently mass spectrometry. These techniques can potentially both identify and quantify HCPs. The ability to measure and monitor thousands of proteins proportionally increases the amount of data acquired. Significant benefits exist if the information can be used to determine critical HCPs and thereby create an improved basis for risk management. We describe a nascent approach to risk assessment of HCPs based upon such data, drawing attention to timeliness in relation to biosimilar

  11. Handbook of methods for risk-based analysis of Technical Specification requirements

    SciTech Connect

    Samanta, P.K.; Vesely, W.E.

    1993-12-31

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can quantitatively evaluate the risk impact and justify changes based on objective risk arguments, and can provide a defensible basis for these requirements in regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods being prepared summarizes such risk-based methods. The scope of the handbook includes reliability- and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.

  12. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km²) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  13. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides for nuclear power plants that have been subjected to a probabilistic risk assessment (PRA) have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  14. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 7 2011-01-01 2011-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  15. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  16. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 8 2014-01-01 2014-01-01 false Risk-based capital requirement. 932.3 Section 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement....

  17. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    PubMed

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results.
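    A hedged sketch of a stochastic benefit-cost calculation of the general kind described above (all distributions invented): because avoided losses are uncertain, the benefit-cost ratio is itself a distribution rather than a single number.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 50_000

    # Hypothetical flood-protection intervention evaluated over 20 years at 5% discount.
    years = np.arange(1, 21)
    discount = (1.0 / 1.05) ** years
    cost = 1.0e6                                           # up-front investment ($)

    # Uncertain annual avoided losses: event occurrence is Bernoulli, loss is lognormal.
    event = rng.random((n, years.size)) < 0.10             # assumed 10% annual chance
    avoided = rng.lognormal(np.log(4.0e5), 0.5, (n, years.size)) * event

    bcr = (avoided * discount).sum(axis=1) / cost
    print("median benefit-cost ratio:", round(np.median(bcr), 2))
    print("probability BCR > 1:", round(np.mean(bcr > 1.0), 2))
    ```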

  18. A risk-based focused decision-management approach for justifying characterization of Hanford tank waste. June 1996, Revision 1; April 1997, Revision 2

    SciTech Connect

    Colson, S.D.; Gephart, R.E.; Hunter, V.L.; Janata, J.; Morgan, L.G.

    1997-12-31

    This report describes a disciplined, risk-based decision-making approach for determining characterization needs and resolving safety issues during the storage and remediation of radioactive waste stored in Hanford tanks. The strategy recommended uses interactive problem evaluation and decision analysis methods commonly used in industry to solve problems under conditions of uncertainty (i.e., lack of perfect knowledge). It acknowledges that problem resolution comes through both the application of high-quality science and human decisions based upon preferences and sometimes hard-to-compare choices. It recognizes that to firmly resolve a safety problem, the controlling waste characteristics and chemical phenomena must be measurable or estimated to an acceptable level of confidence tailored to the decision being made.

  19. Do probabilistic forecasts lead to better decisions?

    NASA Astrophysics Data System (ADS)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  20. Improving nutrient management practices in agriculture: The role of risk-based beliefs in understanding farmers' attitudes toward taking additional action

    NASA Astrophysics Data System (ADS)

    Wilson, Robyn S.; Howard, Gregory; Burnett, Elizabeth A.

    2014-08-01

    A recent increase in the amount of dissolved reactive phosphorus (DRP) entering the western Lake Erie basin is likely due to increased spring storm events in combination with issues related to fertilizer application and timing. These factors in combination with warmer lake temperatures have amplified the spread of toxic algal blooms. We assessed the attitudes of farmers in northwest Ohio toward taking at least one additional action to reduce nutrient loss on their farm. Specifically, we (1) identified to what extent farm and farmer characteristics (e.g., age, gross farm sales) as well as risk-based beliefs (e.g., efficacy, risk perception) influenced attitudes, and (2) assessed how these characteristics and beliefs differ in their predictive ability based on unobservable latent classes of farmers. Risk perception, or a belief that negative impacts to profit and water quality from nutrient loss were likely, was the most consistent predictor of farmer attitudes. Response efficacy, or a belief that taking action on one's farm made a difference, was found to significantly influence attitudes, although this belief was particularly salient for the minority class of farmers who were older and more motivated by profit. Communication efforts should focus on the negative impacts of nutrient loss to both the farm (i.e., profit) and the natural environment (i.e., water quality) to raise individual perceived risk among the majority, while the minority need higher perceived efficacy or more specific information about the economic effectiveness of particular recommended practices.

  1. Risk-Based Management of Contaminated Groundwater: The Role of Geologic Heterogeneity, Exposure and Cancer Risk in Determining the Performance of Aquifer Remediation

    SciTech Connect

    Maxwell, R.M.; Carle, S.F.; Tompson, A.F.B.

    2000-04-07

    The effectiveness of aquifer remediation is typically expressed in terms of a reduction in contaminant concentrations relative to a regulated maximum contaminant level (MCL), and is usually confined by sparse monitoring data and/or simple model calculations. Here, the effectiveness of remediation is examined from a risk-based perspective that goes beyond the traditional MCL concept. A methodology is employed to evaluate the health risk to individuals exposed to contaminated household water that is produced from groundwater. This approach explicitly accounts for differences in risk arising from variability in individual physiology and water use, the uncertainty in estimating chemical carcinogenesis for different individuals, and the uncertainties and variability in contaminant concentrations within groundwater. A hypothetical contamination scenario is developed as a case study in a saturated, alluvial aquifer underlying a real Superfund site. A baseline (unremediated) human exposure and health risk scenario, as induced by contaminated groundwater pumped from this site, is predicted and compared with a similar estimate based upon pump-and-treat exposure intervention. The predicted reduction in risk in the remediation scenario is not an equitable one; that is, it is not uniform across all individuals within the population and varies according to the level of uncertainty in the prediction. The importance of understanding the detailed hydrogeologic connections established in the heterogeneous geologic regime between the contaminated source, municipal receptors, and remediation wells, and their relationship to this uncertainty, is demonstrated. Using two alternative pumping rates, we develop cost-benefit curves based upon reduced exposure and risk to different individuals within the population, under the presence of uncertainty.
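    A minimal Monte Carlo sketch of the individual-risk calculation outlined above (exposure factors, concentration statistics, and the cancer slope factor are all invented): variability in physiology and water use is sampled jointly with uncertainty in the tap-water concentration.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100_000

    # Hypothetical distributions for household exposure to a groundwater contaminant.
    conc = rng.lognormal(np.log(5.0), 0.8, n)       # tap-water concentration (ug/L)
    intake = rng.lognormal(np.log(1.4), 0.3, n)     # drinking-water intake (L/day)
    body_wt = rng.normal(70.0, 12.0, n).clip(40)    # body weight (kg)
    slope = rng.lognormal(np.log(1.5e-2), 0.5, n)   # cancer slope factor ((mg/kg-day)^-1)

    # Lifetime average daily dose for 30 years of exposure over a 70-year lifetime.
    ladd = conc * 1e-3 * intake * (30.0 / 70.0) / body_wt   # mg/kg-day
    risk = slope * ladd                                      # incremental lifetime cancer risk

    print("median individual risk:", f"{np.median(risk):.1e}")
    print("fraction above 1e-6:", round(np.mean(risk > 1e-6), 2))
    ```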

  2. Corrosion risk assessment and risk based inspection for sweet oil and gas corrosion -- Practical experience

    SciTech Connect

    Pursell, M.J.; Sehnan, C.; Naen, M.F.

    1999-11-01

    Successful and cost-effective corrosion risk assessment depends on sensible use of prediction methods and a good understanding of process factors; both are discussed with examples. Practical semi-probabilistic Risk Based Inspection planning methods that measure risk directly as cost and personnel hazard are compared with traditional methods and discussed.

  3. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, explore the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  4. Probabilistic Tsunami Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates not only the design of effective seismically resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full waveform tsunami computation in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
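    A toy sketch of the Green's-function summation idea described above (invented waveforms, two subfaults, one coastal point): precomputed unit-slip waveforms are scaled by the scenario slip and summed to give the scenario waveform.

    ```python
    import numpy as np

    t = np.linspace(0, 3600, 721)                    # one hour of record, 5 s steps

    def unit_waveform(arrival_s, period_s):
        """Hypothetical precomputed coastal waveform for 1 m of slip on one subfault."""
        ramp = np.clip((t - arrival_s) / period_s, 0, None)
        return np.exp(-ramp / 3.0) * np.sin(2 * np.pi * ramp) * (t >= arrival_s)

    # Green's functions for two subfaults of a larger rupture, stored ahead of time.
    greens = np.vstack([unit_waveform(900.0, 600.0), unit_waveform(1100.0, 600.0)])

    slip = np.array([2.5, 4.0])                      # scenario slip (m) on each subfault
    eta = slip @ greens                              # linear summation of scaled waveforms

    print("peak coastal amplitude for this scenario (m):", round(eta.max(), 2))
    ```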

  5. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  6. Overview of the co-ordinated risk-based approach to science and management response and recovery for the 2012 eruptions of Tongariro volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, G. E.; Keys, H. J. R.; Procter, J. N.; Deligne, N. I.

    2014-10-01

    Tongariro volcano, New Zealand, lies wholly within the Tongariro National Park (TNP), one of New Zealand's major tourist destinations. Two small eruptions of the Te Maari vents on the northern flanks of Tongariro, on 6 August 2012 and 21 November 2012, each produced a small ash cloud rising to less than 8 km, accompanied by pyroclastic density currents and ballistic projectiles. The most popular day hike in New Zealand, the Tongariro Alpine Crossing (TAC), runs within 2 km of the Te Maari vents. The larger of the two eruptions (6 August 2012) severely impacted the TAC and resulted in its closure, affecting the local economy and potentially national tourism. In this paper, we document the science and risk management response to the eruption, and detail how quantitative risk assessments were applied in a rapidly evolving situation to inform robust decision-making about when the TAC would be re-opened. The partnership between volcanologists and risk managers highlights the value of open communication between scientists and stakeholders during the response to, and subsequent recovery from, a volcanic eruption.

  7. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    SciTech Connect

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.

  8. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon, France (1.3 million inhabitants, 650 km²), is subject to recurring floods with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have developed since 1988 a database built by the field teams, which specifically records all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, from the emergency services and the local newspaper, by georeferencing the events in a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two is not negligible and usefully complements the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolated by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration, and depth of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses and permeability and the types of local sewer network and urban water management. Finally, floods recorded in the database are compared spatially, in a GIS, with flooding simulated from the sewer network model (using the Canoe software). A strong spatial similarity between observed and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives for better understanding floods and coping with flood risk.

  9. Assessment of environmental risks from toxic and nontoxic stressors; a proposed concept for a risk-based management tool for offshore drilling discharges.

    PubMed

    Smit, Mathijs G D; Jak, Robbert G; Rye, Henrik; Frost, Tone Karin; Singsaas, Ivar; Karman, Chris C

    2008-04-01

    In order to improve the ecological status of aquatic systems, both toxic (e.g., chemical) and nontoxic stressors (e.g., suspended particles) should be evaluated. This paper describes an approach to environmental risk assessment of drilling discharges to the sea. These discharges might lead to concentrations of toxic compounds and suspended clay particles in the water compartment and concentrations of toxic compounds, burial of biota, change in sediment structure, and oxygen depletion in marine sediments. The main challenges were to apply existing protocols for environmental risk assessment to nontoxic stressors and to combine risks arising from exposure to these stressors with risk from chemical exposure. The defined approach is based on species sensitivity distributions (SSDs). In addition, precautionary principles from the EU-Technical Guidance Document were incorporated to assure that the method is acceptable in a regulatory context. For all stressors a protocol was defined to construct an SSD for no observed effect concentrations (or levels; NOEC(L)-SSD) to allow for the calculation of the potentially affected fraction of species from predicted exposures. Depending on the availability of data, a NOEC-SSD for toxicants can either be directly based on available NOECs or constructed from the predicted no effect concentration and the variation in sensitivity among species. For nontoxic stressors a NOEL-SSD can be extrapolated from an SSD based on effect or field data. Potentially affected fractions of species at predicted exposures are combined into an overall risk estimate. The developed approach facilitates environmental management of drilling discharges and can be applied to define risk-mitigating measures for both toxic and nontoxic stress.
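
    As a rough illustration of the SSD-based calculation described above, the sketch below reads the potentially affected fraction (PAF) of species from a log-normal SSD at a predicted exposure and combines two stressors assuming independent modes of action; the SSD parameters and exposure levels are illustrative assumptions, not values from the paper.

    import numpy as np
    from scipy.stats import norm

    def paf(exposure, ssd_log10_mean, ssd_log10_sd):
        """Potentially affected fraction of species from a log-normal (base-10) SSD."""
        return norm.cdf((np.log10(exposure) - ssd_log10_mean) / ssd_log10_sd)

    # One toxic and one nontoxic stressor, each with its own NOEC(L)-SSD (parameters assumed).
    paf_toxicant  = paf(exposure=0.8,  ssd_log10_mean=1.0, ssd_log10_sd=0.7)   # mg/L
    paf_particles = paf(exposure=20.0, ssd_log10_mean=2.0, ssd_log10_sd=0.5)   # mg/L suspended clay

    # Combine into an overall risk assuming independent modes of action (msPAF-style assumption).
    overall_risk = 1.0 - (1.0 - paf_toxicant) * (1.0 - paf_particles)
    print(paf_toxicant, paf_particles, overall_risk)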

  10. Assessment of environmental risks from toxic and nontoxic stressors; a proposed concept for a risk-based management tool for offshore drilling discharges.

    PubMed

    Smit, Mathijs G D; Jak, Robbert G; Rye, Henrik; Frost, Tone Karin; Singsaas, Ivar; Karman, Chris C

    2008-04-01

    In order to improve the ecological status of aquatic systems, both toxic (e.g., chemical) and nontoxic stressors (e.g., suspended particles) should be evaluated. This paper describes an approach to environmental risk assessment of drilling discharges to the sea. These discharges might lead to concentrations of toxic compounds and suspended clay particles in the water compartment and concentrations of toxic compounds, burial of biota, change in sediment structure, and oxygen depletion in marine sediments. The main challenges were to apply existing protocols for environmental risk assessment to nontoxic stressors and to combine risks arising from exposure to these stressors with risk from chemical exposure. The defined approach is based on species sensitivity distributions (SSDs). In addition, precautionary principles from the EU-Technical Guidance Document were incorporated to assure that the method is acceptable in a regulatory context. For all stressors a protocol was defined to construct an SSD for no observed effect concentrations (or levels; NOEC(L)-SSD) to allow for the calculation of the potentially affected fraction of species from predicted exposures. Depending on the availability of data, a NOEC-SSD for toxicants can either be directly based on available NOECs or constructed from the predicted no effect concentration and the variation in sensitivity among species. For nontoxic stressors a NOEL-SSD can be extrapolated from an SSD based on effect or field data. Potentially affected fractions of species at predicted exposures are combined into an overall risk estimate. The developed approach facilitates environmental management of drilling discharges and can be applied to define risk-mitigating measures for both toxic and nontoxic stress. PMID:18232721

  11. Risk-based system refinement

    SciTech Connect

    Winter, V.L.; Berg, R.S.; Dalton, L.J.

    1998-06-01

    When designing a high consequence system, considerable care should be taken to ensure that the system cannot easily be placed into a high consequence failure state. A formal system design process should include a model that explicitly shows the complete state space of the system (including failure states) as well as those events (e.g., abnormal environmental conditions, component failures, etc.) that can cause a system to enter a failure state. In this paper the authors present such a model and formally develop a notion of risk-based refinement with respect to the model.
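
    A minimal sketch of this kind of explicit state-space model (not the authors' formalism) is shown below: states, including failure states, are connected by event-labelled transitions, and a search enumerates the event sequences that can drive the system into a failure state. All state and event names are illustrative.

    from collections import deque

    transitions = {                      # state -> {event: next state}, all names illustrative
        "nominal":       {"sensor_fault": "degraded", "power_loss": "safe_shutdown"},
        "degraded":      {"operator_error": "failure", "repair": "nominal"},
        "safe_shutdown": {},
        "failure":       {},
    }
    failure_states = {"failure"}

    def event_paths_to_failure(start="nominal"):
        """Breadth-first search for event sequences that reach a failure state."""
        paths, queue = [], deque([(start, [], {start})])
        while queue:
            state, events, visited = queue.popleft()
            if state in failure_states:
                paths.append(events)
                continue
            for event, nxt in transitions[state].items():
                if nxt not in visited:   # skip already-visited states to avoid cycles
                    queue.append((nxt, events + [event], visited | {nxt}))
        return paths

    print(event_paths_to_failure())      # e.g. [['sensor_fault', 'operator_error']]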

  12. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  13. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One decision-support technique is the cost-loss approach, a risk-based method that defines whether or not to issue a warning or implement mitigation measures. With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it is motivated only by economic values and is relatively static (no reasoning, a yes/no decision). Nevertheless it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method more applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with these situations were analysed and possibly applicable concepts were chosen. From this analysis the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is ultimately made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
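
    The cost-loss decision rule described above can be written down in a few lines; the threshold values in the example are illustrative assumptions, not from the study.

    def issue_warning(response_cost: float, avoided_damage: float,
                      flood_probability: float) -> bool:
        """Act when the cost-loss ratio C/L does not exceed the forecast probability."""
        return (response_cost / avoided_damage) <= flood_probability

    # Acting costs 20 k, prevents 200 k of damage (C/L = 0.1), forecast gives a 15 % chance.
    print(issue_warning(20_000, 200_000, 0.15))   # True -> issue the warning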

  14. Risk Based Security Management at Research Reactors

    SciTech Connect

    Ek, David R.

    2015-09-01

    This presentation provides a background of what led to the international emphasis on nuclear security and describes how nuclear security is effectively implemented so as to preserve the societal benefits of nuclear and radioactive materials.

  15. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  16. The Evidence for a Risk-Based Approach to Australian Higher Education Regulation and Quality Assurance

    ERIC Educational Resources Information Center

    Edwards, Fleur

    2012-01-01

    This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…

  17. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology.
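
    A minimal sketch of the alpha-cut idea underlying MFFP-style methods is shown below: a triangular fuzzy parameter is reduced to an interval at each alpha level, and the interval endpoints ('vertices') would feed separate optimization runs. The fuzzy shape and numbers are illustrative assumptions, not from the paper.

    def alpha_cut(low, mode, high, alpha):
        """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
        return (low + alpha * (mode - low), high - alpha * (high - mode))

    # Water availability treated as a fuzzy parameter and examined at several alpha-cut levels.
    for a in (0.0, 0.5, 1.0):
        print(a, alpha_cut(80.0, 100.0, 130.0, a))
    # 0.0 -> (80.0, 130.0), 0.5 -> (90.0, 115.0), 1.0 -> (100.0, 100.0)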

  18. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs to the society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage seeking for cost-effective preventive measures, and emphasize the importance of the inclusion of the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills as well as to other fields that need to cope with the challenging combination of low probabilities, high losses and major uncertainties. PMID:25983196
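
    A stripped-down illustration (not the paper's Bayesian network) of the probabilistic cost-benefit logic: the expected benefit of a preventive measure is the reduction in expected spill costs (combating, clean-up, waste treatment, environmental losses) minus the measure's own cost. All scenario probabilities and costs below are assumed for illustration.

    spill_scenarios = {            # scenario: (annual probability, total cost if it occurs, M EUR)
        "no_spill":    (0.970,   0.0),
        "small_spill": (0.025,  30.0),
        "large_spill": (0.005, 400.0),
    }

    def expected_cost(scenarios):
        return sum(p * cost for p, cost in scenarios.values())

    baseline = expected_cost(spill_scenarios)

    # Hypothetical preventive measure: halves the probability of a large spill.
    with_measure = dict(spill_scenarios)
    with_measure["large_spill"] = (0.0025, 400.0)
    with_measure["no_spill"]    = (0.9725,   0.0)

    measure_cost = 0.8             # M EUR per year, assumed
    net_benefit = (baseline - expected_cost(with_measure)) - measure_cost
    print(f"expected net benefit: {net_benefit:.2f} M EUR/yr")   # 0.20 with these numbers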

  19. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs to the society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage seeking for cost-effective preventive measures, and emphasize the importance of the inclusion of the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills as well as to other fields that need to cope with the challenging combination of low probabilities, high losses and major uncertainties.

  20. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology. PMID:26922500

  1. Learning Probabilistic Logic Models from Probabilistic Examples.

    PubMed

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  2. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.
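
    The risk-allocation step described above can be illustrated with a one-line calculation: given the conditional core damage probability associated with a component rupture (from the PRA) and an allowable contribution of that component to core damage frequency, back out the target rupture frequency that inspection must maintain. The numbers below are illustrative assumptions, not values from the task force.

    def target_rupture_frequency(allowable_cdf_contribution, p_core_damage_given_rupture):
        """Rupture frequency (per year) consistent with the allocated core damage risk."""
        return allowable_cdf_contribution / p_core_damage_given_rupture

    # Allocate 1e-7/yr of core damage frequency to this component; the PRA gives a
    # 1e-3 conditional core damage probability if it ruptures.
    print(target_rupture_frequency(1e-7, 1e-3))   # -> 1e-4 ruptures per year to be maintained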

  3. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement. Each Bank shall maintain at all times permanent capital in an amount at least equal to the sum of its...

  4. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 932.3 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK RISK MANAGEMENT AND CAPITAL STANDARDS FEDERAL HOME LOAN BANK CAPITAL REQUIREMENTS § 932.3 Risk-based capital requirement. Each Bank shall maintain at all times permanent capital in an amount at least equal to the sum of its...

  5. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
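
    A toy version (not the paper's model) of a linear program that trades development benefit against expected flood damage when allocating floodplain land is sketched below; the per-hectare coefficients and the runoff-offset rule are illustrative assumptions.

    import numpy as np
    from scipy.optimize import linprog

    # Decision variables: hectares of [residential, commercial, flood channel, green runoff mgmt].
    # Net expected annual benefit per hectare = development benefit minus expected flood damage.
    net_benefit = np.array([12.0, 18.0, -1.0, 2.0])    # k$/ha/yr, assumed

    A_ub = [
        [1.0, 1.0, 1.0, 1.0],       # total allocation cannot exceed the available area
        [0.2, 0.3, -1.0, -0.5],     # channel + green area must offset development runoff (assumed rule)
    ]
    b_ub = [500.0, 0.0]             # 500 ha available, assumed

    res = linprog(c=-net_benefit, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * 4, method="highs")
    print(res.x, -res.fun)          # optimal allocation (ha) and expected net benefit (k$/yr)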

  6. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering. PMID:25500464

  7. Probabilistic record linkage.

    PubMed

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842
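
    A minimal Fellegi-Sunter-style sketch of the calculation described above: each field contributes a match weight log2(m/u), the summed weight is a log-likelihood ratio, and Bayes' theorem converts it into a posterior match probability given a prior match rate. The m/u probabilities and the prior are illustrative assumptions.

    import math

    fields = {                 # field: (m = P(agree | match), u = P(agree | non-match)), assumed
        "surname":       (0.95, 0.010),
        "date_of_birth": (0.97, 0.005),
        "postcode":      (0.90, 0.050),
    }

    def match_weight(agreements):
        """Sum of per-field log2 likelihood ratios for the observed agreement pattern."""
        w = 0.0
        for field, agrees in agreements.items():
            m, u = fields[field]
            w += math.log2(m / u) if agrees else math.log2((1 - m) / (1 - u))
        return w

    def posterior_match_probability(weight, prior):
        """Convert a summed match weight into a posterior probability via Bayes' theorem."""
        posterior_odds = (prior / (1 - prior)) * 2.0 ** weight
        return posterior_odds / (1 + posterior_odds)

    w = match_weight({"surname": True, "date_of_birth": True, "postcode": False})
    print(w, posterior_match_probability(w, prior=1e-4))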

  8. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  9. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distributions. Based on GIS, the suitability maps of different land uses are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with the IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  10. Risk based inspection for atmospheric storage tank

    NASA Astrophysics Data System (ADS)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack that occurs on a metallic material as a result of its reaction with the environment. It causes atmospheric storage tank leakage, material loss, environmental pollution, and equipment failure, affects the age of process equipment, and finally results in financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant for operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries is discussed prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, is discussed. The corrosion risk analysis outcome is used to formulate a Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on 'High Risk' and 'Medium Risk' items and less on 'Low Risk' shells. Risk categories of the evaluated equipment are illustrated through the case study analysis outcome.
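
    A hedged sketch of a semi-quantitative RBI ranking of the kind described above: each component receives likelihood and consequence scores, and the risk-matrix category drives inspection priority. The scoring, category boundaries and component data are illustrative assumptions, not API 581 values.

    def risk_category(likelihood, consequence):
        """Map 1-5 likelihood and consequence scores to a risk-matrix category (assumed boundaries)."""
        score = likelihood * consequence
        if score >= 15:
            return "High Risk"
        if score >= 6:
            return "Medium Risk"
        return "Low Risk"

    components = {                 # component: (likelihood, consequence), assumed scores
        "shell course 1": (4, 4),
        "bottom plate":   (3, 3),
        "roof plate":     (2, 2),
    }
    for name, (l, c) in components.items():
        print(f"{name}: {risk_category(l, c)}")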

  11. Risk-based inservice testing program modifications at Palo Verde nuclear generating station

    SciTech Connect

    Knauf, S.; Lindenlaub, B.; Linthicum, R.

    1996-12-01

    Arizona Public Service Company (APS) is investigating changes to the Palo Verde Inservice Testing (IST) Program that are intended to result in the reduction of the required test frequency for various valves in the American Society of Mechanical Engineers (ASME) Section XI IST program. The analytical techniques employed to select candidate valves and to demonstrate that these frequency reductions are acceptable are risk based. The results of the Palo Verde probabilistic risk assessment (PRA), updated in June 1994, and the risk significant determination performed as part of the implementation efforts for 10 CFR 50.65 (the maintenance rule) were used to select candidate valves for extended test intervals. Additional component level evaluations were conducted by an 'expert panel.' The decision to pursue these changes was facilitated by the ASME Risk-Based Inservice Testing Research Task Force for which Palo Verde is participating as a pilot plant. The NRC's increasing acceptance of cost beneficial licensing actions and risk-based submittals also provided incentive to seek these changes. Arizona Public Service is pursuing the risk-based IST program modification in order to reduce the unnecessary regulatory burden of the IST program through qualitative and quantitative analysis consistent with maintaining a high level of plant safety. The objectives of this project at Palo Verde are as follows: (1) Apply risk-based technologies to IST components to determine their risk significance (i.e., high or low). (2) Apply a combination of deterministic and risk-based methods to determine appropriate testing requirements for IST components including improvement of testing methods and frequency intervals for high-risk significant components. (3) Apply risk-based technologies to high-risk significant components identified by the 'expert panel' and outside of the IST program to determine whether additional testing requirements are appropriate.

  12. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g. development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. The power of this technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R and D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  13. Probabilistic record linkage

    PubMed Central

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-01-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a ‘black box’ research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. PMID:26686842

  14. Probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2002-06-01

    A microcell is a cell with a radius of 1 km or less, suitable for heavily urbanized areas such as a metropolitan city. This paper deals with a microcell prediction model of propagation loss that uses probabilistic techniques. The RSL (Received Signal Level) is the factor by which the performance of a microcell can be evaluated, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. We combine probabilistic methods to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and SPC (Statistical Process Control) to get the parameters of the distribution. This probabilistic solution gives us a better measure of the performance factors. In addition, it allows probabilistic optimization of strategies such as the number of cells, cell location, capacity of cells, range of cells and so on. In particular, the probabilistic optimization techniques by themselves can be applied to real-world problems such as computer networking, human resources and manufacturing processes.

  15. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321
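
    A minimal illustration of a numerical routine that returns an uncertainty with its answer, in the spirit of the probabilistic numerics agenda described above (though not one of the paper's Bayesian constructions), is a Monte Carlo integrator that reports its standard error:

    import numpy as np

    def mc_integrate(f, a, b, n=100_000, seed=0):
        """Monte Carlo integral of f on [a, b], returned with its standard error."""
        rng = np.random.default_rng(seed)
        y = (b - a) * f(rng.uniform(a, b, size=n))
        return y.mean(), y.std(ddof=1) / np.sqrt(n)

    estimate, uncertainty = mc_integrate(np.sin, 0.0, np.pi)   # true value is 2
    print(f"{estimate:.4f} +/- {uncertainty:.4f}")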

  16. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  17. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  18. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, and its first goal must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgements in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements and risk reduction/pollution prevention activities.

  19. Probabilistic composite analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Murthy, P. L. N.

    1991-01-01

    Formal procedures are described which are used to computationally simulate the probabilistic behavior of composite structures. The computational simulation starts with the uncertainties associated with all aspects of a composite structure (constituents, fabrication, assembling, etc.) and encompasses all aspects of composite behavior (micromechanics, macromechanics, combined stress failure, laminate theory, structural response, and tailoring) optimization. Typical cases are included to illustrate the formal procedure for computational simulation. The collective results of the sample cases demonstrate that uncertainties in composite behavior and structural response can be probabilistically quantified.

  20. Risk based microbiological criteria for Campylobacter in broiler meat in the European Union.

    PubMed

    Nauta, Maarten J; Sanaa, Moez; Havelaar, Arie H

    2012-09-01

    Quantitative microbiological risk assessment (QMRA) allows evaluating the public health impact of food safety targets to support the control of foodborne pathogens. We estimate the risk reduction of setting microbiological criteria (MCs) for Campylobacter on broiler meat in 25 European countries, applying quantitative data from the 2008 EU baseline survey. We demonstrate that risk based MCs can be derived without explicit consideration of Food Safety Objectives or Performance Objectives. Published QMRA models for the consumer phase and dose response provide a relation between Campylobacter concentration on skin samples and the attending probability of illness for the consumer. Probabilistic modelling is used to evaluate a set of potential MCs. We present the percentage of batches not complying with the potential criteria, in relation to the risk reduction attending totally efficient treatment of these batches. We find different risk estimates and different impacts of MCs in different countries, which offers a practical and flexible tool for risk managers to select the most appropriate MC by weighing the costs (i.e. non-compliant batches) and the benefits (i.e. reduction in public health risk). Our analyses show that the estimated percentage of batches not complying with the MC is better correlated with the risk estimate than surrogate risk measures like the flock prevalence or the arithmetic mean concentration of bacteria on carcasses, and would therefore be a good measure for the risk of Campylobacter on broiler meat in a particular country. Two uncertain parameters in the model are the ratio of within- and between-flock variances in concentrations, and the transition factor of skin sample concentrations to concentrations on the meat. Sensitivity analyses show that these parameters have a considerable effect on our results, but the impact of their uncertainty is small compared to that of the parameters defining the Microbiological Criterion and the concentration
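
    The batch-compliance calculation discussed above can be roughed out as follows: given an assumed between-batch distribution of Campylobacter concentrations, estimate the percentage of batches that would fail a candidate concentration limit. The lognormal parameters and the limit below are illustrative assumptions, not the EU baseline survey data.

    import numpy as np

    rng = np.random.default_rng(1)
    n_batches = 50_000

    # Between-batch variation of mean log10 concentration on skin samples (log10 CFU/g), assumed.
    batch_log10_conc = rng.normal(loc=1.5, scale=1.0, size=n_batches)

    limit_log10 = 3.0      # candidate criterion: 1000 CFU/g, assumed
    non_compliant = np.mean(batch_log10_conc > limit_log10)
    print(f"batches not complying with the MC: {100 * non_compliant:.1f}%")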

  1. Risk-Based Object Oriented Testing

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Stapko, Ruth; Gallo, Albert

    2000-01-01

    Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
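
    A hedged sketch of the risk-based prioritization described above: classes with higher complexity are treated as more failure-prone and are tested first. The metrics, weights and class names are illustrative assumptions, not the proposed methodology itself.

    classes = {                    # class: (cyclomatic complexity, weighted methods per class), assumed
        "OrbitPropagator": (48, 35),
        "TelemetryParser": (30, 22),
        "ConfigLoader":    (8, 5),
    }

    def risk_score(cyclomatic, wmc):
        return 0.6 * cyclomatic + 0.4 * wmc     # assumed weighting of the two metrics

    for name, metrics in sorted(classes.items(), key=lambda kv: risk_score(*kv[1]), reverse=True):
        print(f"{name}: risk score {risk_score(*metrics):.1f}")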

  2. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  3. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  4. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  5. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to pure hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood prone areas and assets at risk in the protected areas are rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? This presentation outlines the additional required workflows towards risk-based flood

  6. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication. However, in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum as well as classical communicational steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  7. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  8. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  9. Geothermal probabilistic cost study

    SciTech Connect

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  10. Probabilistic river forecast methodology

    NASA Astrophysics Data System (ADS)

    Kelly, Karen Suzanne

    1997-09-01

    The National Weather Service (NWS) operates deterministic conceptual models to predict the hydrologic response of a river basin to precipitation. The output from these models are forecasted hydrographs (time series of the future river stage) at certain locations along a river. In order for the forecasts to be useful for optimal decision making, the uncertainty associated with them must be quantified. A methodology is developed for this purpose that (i) can be implemented with any deterministic hydrologic model, (ii) receives a probabilistic forecast of precipitation as input, (iii) quantifies all sources of uncertainty, (iv) operates in real-time and within computing constraints, and (v) produces probability distributions of future river stages. The Bayesian theory which supports the methodology involves transformation of a distribution of future precipitation into one of future river stage, and statistical characterization of the uncertainty in the hydrologic model. This is accomplished by decomposing total uncertainty into that associated with future precipitation and that associated with the hydrologic transformations. These are processed independently and then integrated into a predictive distribution which constitutes a probabilistic river stage forecast. A variety of models are presented for implementation of the methodology. In the most general model, a probability of exceedance associated with a given future hydrograph specified. In the simplest model, a probability of exceedance associated with a given future river stage is specified. In conjunction with the Ohio River Forecast Center of the NWS, the simplest model is used to demonstrate the feasibility of producing probabilistic river stage forecasts for a river basin located in headwaters. Previous efforts to quantify uncertainty in river forecasting have only considered selected sources of uncertainty, been specific to a particular hydrologic model, or have not obtained an entire probability
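
    A crude Monte Carlo illustration of the idea described above (not the NWS methodology itself): propagate a probabilistic precipitation forecast through a toy rainfall-stage relation, add hydrologic model error, and read exceedance probabilities off the resulting stage distribution. The rainfall distribution, rating relation and error model are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    precip_mm = rng.gamma(shape=2.0, scale=15.0, size=n)   # probabilistic precipitation forecast, assumed
    stage_m = 1.0 + 0.05 * precip_mm                        # toy deterministic rainfall-stage relation
    stage_m += rng.normal(0.0, 0.3, size=n)                 # hydrologic model uncertainty, assumed

    flood_stage_m = 4.0                                     # assumed flood stage
    print(f"P(stage > flood stage) = {np.mean(stage_m > flood_stage_m):.3f}")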

  11. Probabilistic simple splicing systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2014-06-01

    A splicing system, one of the early theoretical models for DNA computing, was introduced by Head in 1987. Splicing systems are based on the splicing operation which, informally, cuts two strings of DNA molecules at specific recognition sites and attaches the prefix of the first string to the suffix of the second string, and the prefix of the second string to the suffix of the first string, thus yielding new strings. For a specific type of splicing system, namely the simple splicing system, the recognition sites are the same for both strings of DNA molecules. It is known that splicing systems with finite sets of axioms and splicing rules generate only regular languages. Hence, different types of restrictions have been considered for splicing systems in order to increase their computational power. Recently, probabilistic splicing systems have been introduced, where probabilities are initially associated with the axioms, and the probabilities of the generated strings are computed from the probabilities of the initial strings. In this paper, some properties of probabilistic simple splicing systems are investigated. We prove that probabilistic simple splicing systems can also increase the computational power of the splicing languages generated.
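
    The splicing operation itself is easy to make concrete. The sketch below cuts two strings at a shared recognition site and exchanges their suffixes; the probability bookkeeping (taking the probability of an offspring as the product of its parents' probabilities) is an illustrative assumption and not necessarily the exact definition used in the paper. The axioms, the recognition site, and the probabilities are invented.

      def splice(x, y, site):
          # Simple splicing: cut x and y just after the first occurrence of the shared
          # recognition site and exchange the suffixes, yielding two new strings.
          i = x.index(site) + len(site)
          j = y.index(site) + len(site)
          return x[:i] + y[j:], y[:j] + x[i:]

      # Axioms with initial probabilities (illustrative values).
      axioms = {"aaTTcc": 0.6, "ggTTaa": 0.4}
      site = "TT"

      language = dict(axioms)
      (x, px), (y, py) = axioms.items()
      for w in splice(x, y, site):
          # Assumed rule: offspring probability = product of the parents' probabilities.
          language[w] = language.get(w, 0.0) + px * py

      for word, prob in sorted(language.items()):
          print(word, prob)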

  12. Probabilistic Failure Assessment For Fatigue

    NASA Technical Reports Server (NTRS)

    Moore, Nicholas; Ebbeler, Donald; Newlin, Laura; Sutharshana, Sravan; Creager, Matthew

    1995-01-01

    Probabilistic Failure Assessment for Fatigue (PFAFAT) is a package of software that applies probabilistic failure-assessment (PFA) methodology to model high- and low-cycle-fatigue modes of failure of structural components. It consists of nine programs. Three programs perform probabilistic fatigue analysis by means of Monte Carlo simulation; the other six are used for generating random processes, characterizing fatigue-life data pertaining to materials, and processing the outputs of computational simulations. The package is written in FORTRAN 77.
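
    PFAFAT itself is not reproduced here, but the core pattern of probabilistic fatigue analysis by Monte Carlo simulation can be sketched: sample the scattered material and loading quantities, evaluate a life relation for each sample, and read failure probabilities off the resulting life distribution. The Basquin-type S-N relation and all distribution parameters below are assumptions made purely for illustration; the sketch assumes NumPy.

      import numpy as np

      rng = np.random.default_rng(42)
      n_samples = 100_000

      # Assumed Basquin-type S-N law: N = C * S**(-m), with scatter in the coefficient C
      # and in the applied stress amplitude S (all parameter values are hypothetical).
      C = rng.lognormal(mean=np.log(1.0e13), sigma=0.3, size=n_samples)
      m = 3.0
      S = rng.normal(loc=180.0, scale=10.0, size=n_samples)   # stress amplitude, MPa

      cycles_to_failure = C * S**(-m)

      service_cycles = 1.0e6
      prob_failure = np.mean(cycles_to_failure < service_cycles)
      print("median life: %.2e cycles" % np.median(cycles_to_failure))
      print("P(failure before %.0e cycles) = %.4f" % (service_cycles, prob_failure))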

  13. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (the MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties, and dependencies among cost elements and risks, are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @RISK and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the
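
    A minimal Monte Carlo sketch of these ingredients follows: three-parameter Weibull cost elements, a two-parameter correlation structure (one coefficient within a subsystem, one across subsystems) imposed through a Gaussian copula, and the MAIMS principle applied by treating each element's allocated budget as a cost floor. The work-breakdown structure, all distribution parameters, and the choice of the element medians as allocated budgets are hypothetical; the sketch assumes NumPy and SciPy rather than @RISK or Crystal Ball.

      import numpy as np
      from scipy.stats import norm, weibull_min

      rng = np.random.default_rng(7)
      n_sims = 50_000

      # Hypothetical WBS: (subsystem, Weibull shape c, scale, location) per cost element, in $M.
      elements = [("A", 2.0, 4.0, 10.0), ("A", 2.0, 6.0, 15.0),
                  ("B", 2.5, 5.0, 20.0), ("B", 2.5, 3.0, 8.0)]
      rho_same, rho_diff = 0.6, 0.3      # two-parameter correlation structure (assumed values)

      k = len(elements)
      corr = np.array([[1.0 if i == j else
                        (rho_same if elements[i][0] == elements[j][0] else rho_diff)
                        for j in range(k)] for i in range(k)])

      # Gaussian copula: correlated normals -> uniforms -> three-parameter Weibull marginals.
      z = rng.multivariate_normal(np.zeros(k), corr, size=n_sims)
      u = norm.cdf(z)
      costs = np.column_stack([weibull_min.ppf(u[:, i], c, loc=loc, scale=s)
                               for i, (_, c, s, loc) in enumerate(elements)])

      # Allocated budgets (illustratively the element medians); MAIMS: allocated money is spent,
      # so each element contributes at least its budget while overruns are passed through.
      budgets = np.array([weibull_min.median(c, loc=loc, scale=s) for _, c, s, loc in elements])
      ideal_total = costs.sum(axis=1)
      maims_total = np.maximum(costs, budgets).sum(axis=1)

      print("P80 total cost, ideal roll-up: %.1f" % np.percentile(ideal_total, 80))
      print("P80 total cost, under MAIMS  : %.1f" % np.percentile(maims_total, 80))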

  14. Risk-based monitored natural attenuation--a case study.

    PubMed

    Khan, F I; Husain, T

    2001-08-17

    The term "monitored natural attenuation" (MNA) refers to a reliance on natural attenuation (NA) processes for remediation through the careful monitoring of the behavior of a contaminant source in time and space domains. In recent years, policymakers are shifting to a risk-based approach where site characteristics are measured against the potential risk to human health and the environment, and site management strategies are prioritized to be commensurate with that risk. Risk-based corrective action (RBCA), a concept developed by the American Society for Testing Materials (ASTM), was the first indication of how this approach could be used in the development of remediation strategies. This paper, which links ASTM's RBCA approach with MNA, develops a systematic working methodology for a risk-based site evaluation and remediation through NA. The methodology is comprised of seven steps, with the first five steps intended to evaluate site characteristics and the feasibility of NA. If NA is effective, then the last two steps will guide the development of a long-term monitoring plan and approval for a site closure. This methodology is used to evaluate a site contaminated with oil from a pipeline spill. The case study concluded that the site has the requisite characteristics for NA, but it would take more than 80 years for attenuation of xylene and ethylbenzene, as these chemicals appear in the pure phase. If fast remediation is sought, then efforts should be made to remove the contaminant from the soil. Initially, the site posed a serious risk to both on-site and off-site receptors, but it becomes acceptable after 20 years, as the plume is diluted and drifts from its source of origin.

  15. Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Not Available

    1988-01-01

    The purpose of Probabilistic Seismic Hazard Analysis (PSHA) is to evaluate the hazard of seismic ground motion at a site by considering all possible earthquakes in the area, estimating the associated shaking at the site, and calculating the probabilities of these occurrences. The Panel on Seismic Hazard Analysis is charged with assessment of the capabilities, limitations, and future trends of PSHA in the context of alternatives. The report identifies and discusses key issues of PSHA and is addressed to decision makers with a modest scientific and technical background and to the scientific and technical community. 37 refs., 19 figs.
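
    The PSHA recipe sketched above (enumerate possible earthquakes, estimate the shaking each produces at the site, and sum exceedance probabilities weighted by occurrence rates) can be illustrated numerically for a single, highly simplified source. The Gutenberg-Richter recurrence parameters and the toy ground-motion relation below are hypothetical, chosen only to make the hazard integral concrete; the sketch assumes NumPy and SciPy.

      import numpy as np
      from scipy.stats import norm

      # Simplified single-source hazard integral: the annual rate of exceeding a PGA level a is
      #   lambda(a) = sum over magnitude bins of  nu(m) * P(PGA > a | m, r)
      # with a Gutenberg-Richter recurrence and a lognormal ground-motion model (values hypothetical).
      m_edges = np.linspace(5.0, 7.5, 26)              # magnitude bins
      a_gr, b_gr = 3.0, 1.0                            # Gutenberg-Richter a- and b-values
      cum_rate = 10.0 ** (a_gr - b_gr * m_edges)       # annual rate of events >= m
      nu = cum_rate[:-1] - cum_rate[1:]                # annual rate of events in each bin
      m_mid = 0.5 * (m_edges[:-1] + m_edges[1:])
      r_km = 20.0                                      # site-to-source distance

      def ln_median_pga(mag, r):
          # Toy ground-motion relation with hypothetical coefficients (PGA in g).
          return -3.5 + 0.9 * mag - 1.2 * np.log(r)

      sigma_ln = 0.6
      for a in (0.05, 0.1, 0.2, 0.4):
          p_exceed = 1.0 - norm.cdf(np.log(a), loc=ln_median_pga(m_mid, r_km), scale=sigma_ln)
          lam = np.sum(nu * p_exceed)
          print("PGA > %.2f g: annual exceedance rate %.2e (return period ~%.0f yr)" % (a, lam, 1.0 / lam))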

  16. Put risk-based remediation to work

    SciTech Connect

    Johl, C.J.; Feldman, L.; Rafferty, M.T.

    1995-09-01

    Risk-based site cleanups are gaining prominence in environmental remediation. In particular, the "brownfields" program in the US--designed to promote the redevelopment of contaminated industrial sites rather than the development of pristine sites--is bringing this new remediation approach to the forefront on a national basis. The traditional approach to remediating a contaminated site is dubbed the remedial investigation and feasibility study (RI-FS) approach. Using an RI-FS approach, site operators and environmental consultants conduct a complete site characterization, using extensive air, water and soil sampling, and then evaluate all potential remediation alternatives. In many cases, the traditional remediation goal has been to return contaminant levels to background or "non-detect" levels--with little or no regard to the potential future use of the site. However, with cleanup costs on the rise, and a heightened awareness of the "how clean is clean" debate, many are beginning to view the RI-FS approach as excessive. By comparison, the goal for a focused, risk-based site remediation is to protect human health and the environment in a manner that is consistent with the planned use of the site. Compared to a standard RI-FS cleanup, the newer method can save time and money by prioritizing site-restoration activities based on risk analysis. A comparison of the two approaches for metals-laden soil is presented.

  17. A risk-based sensor placement methodology.

    PubMed

    Lee, Ronald W; Kulesz, James J

    2008-10-30

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors to protect population against the exposure to, and effects of, known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure at standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats detected by sensors placed in prior iterations are removed from consideration in subsequent iterations. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor. This is the fraction of the total risk accounted for by placement of the sensor. Thus, the criteria for halting the iterative process can be the number of sensors available, a threshold marginal utility value, and/or a minimum cumulative utility achieved with all sensors.
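
    The iterative placement loop described above behaves like a greedy cover over risk: each pass places the sensor that detects the largest remaining population at risk, records that sensor's marginal utility (its fraction of the total risk), and removes the covered threats from later passes. The risk values, candidate locations, and coverage sets below are invented for illustration.

      # Greedy, risk-based sensor placement sketch. Threat risks and detection coverage are hypothetical.
      threat_risk = {"T1": 500.0, "T2": 300.0, "T3": 150.0, "T4": 50.0}   # population at risk
      coverage = {             # candidate sensor location -> threats it would detect
          "siteA": {"T1", "T3"},
          "siteB": {"T2", "T3"},
          "siteC": {"T1", "T2", "T4"},
      }

      total_risk = sum(threat_risk.values())
      remaining = set(threat_risk)
      placements = []

      while remaining and len(placements) < 2:          # e.g. only two sensors available
          best = max(coverage, key=lambda s: sum(threat_risk[t] for t in coverage[s] & remaining))
          captured = sum(threat_risk[t] for t in coverage[best] & remaining)
          placements.append((best, captured / total_risk))   # marginal utility of this sensor
          remaining -= coverage[best]                        # detected threats leave the pool

      for site, utility in placements:
          print("place sensor at %s, marginal utility = %.2f" % (site, utility))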

  18. A Risk-Based Sensor Placement Methodology

    SciTech Connect

    Lee, Ronald W; Kulesz, James J

    2008-01-01

    A risk-based sensor placement methodology is proposed to solve the problem of optimal location of sensors or detectors to protect population against the exposure to and effects of known and/or postulated chemical, biological, and/or radiological threats. Risk is calculated as a quantitative value representing population at risk from exposure against standard exposure levels. Historical meteorological data are used to characterize weather conditions as the frequency of wind speed and direction pairs. The meteorological data drive atmospheric transport and dispersion modeling of the threats, the results of which are used to calculate risk values. Sensor locations are determined via an iterative dynamic programming algorithm whereby threats captured or detected by sensors placed in prior stages are removed from consideration in subsequent stages. In addition to the risk-based placement algorithm, the proposed methodology provides a quantification of the marginal utility of each additional sensor or detector. Thus, the criterion for halting the iterative process can be the number of detectors available, a threshold marginal utility value, or the cumulative detection of a minimum factor of the total risk value represented by all threats. The methodology quantifies the effect of threat reduction measures, such as reduced probability of one or more threats due to administrative and/or engineering controls.

  19. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource for making robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  20. Risk based limits for Operational Safety Requirements

    SciTech Connect

    Cappucci, A.J. Jr.

    1993-01-18

    OSR limits are designed to protect the assumptions made in the facility safety analysis in order to preserve the safety envelope during facility operation. Normally, limits are set based on "worst case conditions" without regard to the likelihood (frequency) of a credible event occurring. In special cases where the accident analyses are based on "time at risk" arguments, it may be desirable to control the time at which the facility is at risk. A methodology has been developed to use OSR limits to control the source terms and the times these source terms would be available, thus controlling the acceptable risk to a nuclear process facility. The methodology defines a new term, "gram-days". This term represents the area under a source term (inventory) vs. time curve, which represents the risk to the facility. Using the concept of gram-days (normalized to one year) allows the use of an accounting scheme to control the risk under the inventory vs. time curve. The methodology results in at least three OSR limits: (1) control of the maximum inventory or source term, (2) control of the maximum gram-days for the period based on a source-term weighted average, and (3) control of the maximum gram-days at the individual source term levels. Basing OSR limits on risk-based safety analysis is feasible, and a basis for development of risk-based limits is defensible. However, monitoring inventories and the frequencies required to maintain facility operation within the safety envelope may be complex and time consuming.
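
    The gram-days bookkeeping reduces to integrating inventory over the time spent at each inventory level and checking the result against the limits listed above. The inventory history and limit values in the sketch below are hypothetical.

      # "Gram-days" accounting sketch: the risk metric is the area under the source-term
      # (inventory) vs. time curve over the operating year. All numbers are hypothetical.
      inventory_history = [          # (grams held, days at that level)
          (200.0, 30), (500.0, 10), (350.0, 60), (100.0, 265),
      ]

      max_inventory_limit = 600.0    # OSR limit (1): maximum instantaneous source term, g
      gram_day_limit = 60_000.0      # OSR limit (2): maximum gram-days per year

      gram_days = sum(grams * days for grams, days in inventory_history)
      peak = max(grams for grams, _ in inventory_history)

      print("peak inventory : %.0f g (limit %.0f g) -> %s"
            % (peak, max_inventory_limit, "OK" if peak <= max_inventory_limit else "EXCEEDED"))
      print("gram-days used : %.0f of %.0f (%.0f%% of the annual allowance)"
            % (gram_days, gram_day_limit, 100.0 * gram_days / gram_day_limit))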

  1. Probabilistic Cellular Automata

    PubMed Central

    Agapie, Alexandru; Giuclea, Marius

    2014-01-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge remains the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case—connecting the probability of a configuration in the stationary distribution to its number of zero-one borders—the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
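
    The Markov-chain view above can be made concrete with a small synchronous probabilistic automaton on a ring, in which each cell's next state is drawn with a probability that depends only on the number of ones among the cell and its two neighbours; running the chain for many steps gives an empirical picture of the stationary behaviour. The transition probabilities below are illustrative and are not those analyzed in the article.

      import numpy as np

      rng = np.random.default_rng(1)

      def step(config, p_one):
          # Synchronous update of a binary ring: each cell becomes 1 with a probability that
          # depends only on the number of ones among itself and its two neighbours (0..3).
          ones = config + np.roll(config, 1) + np.roll(config, -1)
          return (rng.random(config.size) < p_one[ones]).astype(int)

      n_cells = 64
      p_one = np.array([0.05, 0.3, 0.7, 0.95])   # illustrative transition probabilities

      config = rng.integers(0, 2, n_cells)
      density = []
      for _ in range(5_000):
          config = step(config, p_one)
          density.append(config.mean())

      # Long-run behaviour of the finite homogeneous Markov chain, summarized by the
      # time-averaged density of ones after a burn-in period.
      print("mean density of ones over the last 4000 steps: %.3f" % np.mean(density[1000:]))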

  2. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM, with classical probability measures defined on the Herbrand base, and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  3. Risk-based targeting: A new approach in environmental protection

    SciTech Connect

    Fox, C.A.

    1995-12-31

    Risk-based targeting has recently emerged as an effective tool to help prioritize efforts to identify and manage geographic areas, chemicals, facilities, and agricultural activities that cause the most environmental degradation. This paper focuses on how the Environmental Protection Agency (EPA) has recently used risk-based targeting to identify and screen Federal, industrial, commercial and municipal facilities which contribute to probable human health (fish consumption advisories and contaminated fish tissue) and aquatic life (contaminated sediments) impacts. Preliminary results identified several hundred potential contributors of problem chemicals to probable impacts within the same river reach in 1991--93. Analysis by industry sector showed that the majority of the facilities identified were publicly owned treatment works (POTWs), in addition to industry organic and inorganic chemical manufacturers, petroleum refineries, and electric services, coatings, engravings, and allied services, among others. Both compliant and non-compliant potentially contributing facilities were identified to some extent in all EPA regions. Additional results identifying possible linkages of other pollutant sources to probable impacts, as well as estimation of potential exposure of these contaminants to minority and/or poverty populations are also presented. Out of these analyses, a number of short and long-term strategies are being developed that EPA may use to reduce loadings of problem contaminants to impacted waterbodies.

  4. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yields better results than could have been made…
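
    The dissertation's specific aggregation schemes are not reproduced in this record, but the basic setting can be illustrated with a linear opinion pool: average the judges' probability forecasts and score everything with the Brier score (by convexity the pool's score is never worse than the judges' average score, though it can be beaten by the single best judge). The forecasts and outcomes below are invented.

      import numpy as np

      # Linear opinion pool over hypothetical probability forecasts (rows = judges, columns = events).
      forecasts = np.array([
          [0.70, 0.20, 0.90, 0.40],
          [0.60, 0.35, 0.80, 0.55],
          [0.85, 0.10, 0.95, 0.30],
      ])
      outcomes = np.array([1, 0, 1, 0])            # what actually happened

      pooled = forecasts.mean(axis=0)              # simple average = linear opinion pool

      def brier(p):
          return float(np.mean((p - outcomes) ** 2))

      for i, f in enumerate(forecasts, start=1):
          print("judge %d Brier score: %.3f" % (i, brier(f)))
      print("pooled  Brier score: %.3f" % brier(pooled))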

  5. Passage Retrieval: A Probabilistic Technique.

    ERIC Educational Resources Information Center

    Melucci, Massimo

    1998-01-01

    Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…

  6. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    PubMed

    Vilaprinyo, Ester; Forné, Carles; Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained by combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low- or moderate-risk groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.
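
    The incremental cost-effectiveness logic used to rank such strategies can be sketched with a handful of hypothetical numbers: sort strategies by effectiveness and compute each one's incremental cost-effectiveness ratio (ICER) against the previous one; a negative ICER flags a strategy that dominates its comparator (cheaper and more effective), which is the kind of result reported above for risk-based versus uniform screening. None of the figures below come from the study.

      # Minimal incremental cost-effectiveness sketch. Costs and effects (e.g. QALYs gained
      # per 100,000 women screened) are hypothetical.
      strategies = [
          ("uniform biennial 50-69",          22.0e6, 410.0),
          ("risk-based (low: 5y, high: 1y)",  20.5e6, 470.0),
          ("annual 40-74 for all",            55.0e6, 520.0),
      ]

      strategies.sort(key=lambda s: s[2])          # order by increasing effectiveness
      prev_cost, prev_eff = 0.0, 0.0
      for name, cost, eff in strategies:
          icer = (cost - prev_cost) / (eff - prev_eff)
          # A negative ICER means this strategy dominates the previous one (cheaper and more effective).
          print("%-32s cost %5.1f M, effect %4.0f, ICER vs previous: %10.0f"
                % (name, cost / 1e6, eff, icer))
          prev_cost, prev_eff = cost, eff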

  7. Cost-Effectiveness and Harm-Benefit Analyses of Risk-Based Screening Strategies for Breast Cancer

    PubMed Central

    Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies. PMID:24498285

  8. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  9. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
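
    As a loose illustration of what Monte Carlo simulation over uncertain constituent properties and volume ratios means at the ply level, the sketch below propagates assumed scatter through the longitudinal rule of mixtures; the distributions are hypothetical (loosely graphite/epoxy-like) and stand in for the fuller micromechanics relations used in the report.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000

      # Monte Carlo micromechanics sketch: longitudinal ply modulus from the rule of mixtures,
      # E1 = kf*Ef + (1 - kf - kv)*Em, with scatter in constituent properties and volume ratios.
      Ef = rng.normal(230.0, 10.0, n)       # fiber modulus, GPa (hypothetical scatter)
      Em = rng.normal(3.5, 0.2, n)          # matrix modulus, GPa
      kf = rng.normal(0.60, 0.02, n)        # fiber volume ratio
      kv = rng.uniform(0.0, 0.02, n)        # void volume ratio

      E1 = kf * Ef + (1.0 - kf - kv) * Em

      print("E1 mean = %.1f GPa, coefficient of variation = %.1f%%"
            % (E1.mean(), 100.0 * E1.std() / E1.mean()))
      print("5th-95th percentile: %.1f-%.1f GPa" % tuple(np.percentile(E1, [5, 95])))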

  10. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.

  11. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  12. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    NASA Astrophysics Data System (ADS)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S., closed-basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the

  13. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  14. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  15. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, all the more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
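
    BT-FLEMO itself is not reproduced in this record, but the way bagged decision trees turn a loss estimate into a loss distribution can be sketched with scikit-learn (version 1.2 or later assumed for the estimator argument): fit an ensemble of trees on loss data and read the spread of the per-tree predictions as the predictive distribution. The synthetic predictors, the loss-generating formula, and all parameter values below are invented for illustration.

      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(5)
      n = 2_000

      # Synthetic training data (hypothetical): water depth (m), building area (m2), precaution score.
      X = np.column_stack([
          rng.uniform(0.1, 3.0, n),
          rng.uniform(50.0, 300.0, n),
          rng.integers(0, 3, n),
      ])
      loss = 200.0 * X[:, 0] * X[:, 1] * (1.0 - 0.15 * X[:, 2]) * rng.lognormal(0.0, 0.3, n)

      model = BaggingRegressor(estimator=DecisionTreeRegressor(max_depth=6),
                               n_estimators=200, random_state=0).fit(X, loss)

      # Probabilistic prediction for one flooded building: the distribution of per-tree estimates.
      x_new = np.array([[1.5, 120.0, 1.0]])
      per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
      print("mean loss %.0f, 5th-95th percentile %.0f-%.0f"
            % (per_tree.mean(), *np.percentile(per_tree, [5, 95])))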

  16. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary

  17. Arsenic speciation driving risk based corrective action.

    PubMed

    Marlborough, Sidney J; Wilson, Vincent L

    2015-07-01

    The toxicity of arsenic depends on a number of factors including its valence state. The more potent trivalent arsenic [arsenite (As3+)] inhibits a large number of cellular enzymatic pathways involved in energy production, while the less toxic pentavalent arsenic [arsenate (As5+)] interferes with phosphate metabolism, phosphoproteins and ATP formation (uncoupling of oxidative phosphorylation). Environmental risk-based corrective action for arsenic contamination utilizes data derived from arsenite studies of toxicity to be conservative. However, depending upon environmental conditions, the arsenate species may predominate substantially, especially in well-aerated surface soils. Analyses of soil concentrations of arsenic species at two sites in northeastern Texas historically contaminated with arsenical pesticides yielded mean arsenate concentrations above 90% of total arsenic, with the majority of the remainder being the trivalent arsenite species. Ecological risk assessments based on the concentration of the trivalent arsenite species will lead to restrictive remediation requirements that do not adequately reflect the level of risk associated with the predominant species of arsenic found in the soil. The greater concentration of the pentavalent arsenate species in soils would be the more appropriate species to monitor remediation at sites that contain high arsenate to arsenite ratios.

  18. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts, probabilistic risk assessment, risk and probability theory, in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  19. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  20. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability vs. time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability vs. time graphs that were in line with aerospace industry norms.
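
    The survivability-versus-time idea reduces to comparing a distribution of time until conditions become non-survivable against candidate egress durations. The sketch below is a generic Monte Carlo illustration with a hypothetical lognormal hazard-onset distribution; it is not the LC-39B or VAB model, and all numbers are invented.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 200_000

      # Hypothetical hazard model: minutes (after egress is ordered) until conditions at the
      # pad become non-survivable, modeled with a lognormal distribution.
      time_to_lethal = rng.lognormal(mean=np.log(12.0), sigma=0.35, size=n)

      # Survivability vs. time: probability of surviving if reaching a safe location takes t minutes.
      for t in (4, 6, 8, 10, 12):
          survivability = np.mean(time_to_lethal > t)
          print("egress time %2d min -> probability of survival %.3f" % (t, survivability))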

  1. Risk-Based Data Management System design specifications and implementation plan for the Alaska Oil and Gas Conservation Commission; the Mississippi State Oil and Gas Board; the Montana Board of Oil and Gas Conservation; and the Nebraska Oil and Gas Conservation Commission

    SciTech Connect

    Not Available

    1993-09-01

    The purpose of this document is to present design specifications and an implementation schedule for the development and implementation of Risk Based Data Management Systems (RBDMSs) in the states of Alaska, Mississippi, Montana, and Nebraska. The document presents detailed design information including a description of the system database structure, the data dictionary, data entry and inquiry screen layouts, specifications for standard reports that will be produced by the system, functions and capabilities (including environmental risk analyses), and table relationships for each database table within the system. This design information provides a comprehensive blueprint of the system to be developed and presents the necessary detailed information for system development and implementation. A proposed schedule for development and implementation also is presented. The schedule presents timeframes for the development of system modules, training, implementation, and providing assistance to the states with data conversion from existing systems. However, the schedule will vary depending upon the timing of funding allocations from the United States Department of Energy (DOE) for the development and implementation phase of the project. For planning purposes, the schedule assumes that initiation of the development and implementation phase will commence November 1, 1993, somewhat later than originally anticipated.

  2. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Section 652.70 (Banks and Banking; Farm Credit Administration; Farm Credit System; Federal Agricultural Mortgage...): the risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk....

  3. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  4. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle system. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  5. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  6. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  7. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
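
    As a small companion to the Markovian-arrival material mentioned above, the sketch below checks two textbook facts empirically: inter-arrival (waiting) times of a memoryless process are exponential with mean 1/rate, and the number of arrivals in a fixed window is Poisson distributed. The rate and window length are arbitrary.

      import numpy as np

      rng = np.random.default_rng(2)

      rate = 3.0            # mean arrivals per unit time (arbitrary)
      t_window = 2.0        # observation window length

      # Waiting times between Markovian (memoryless) arrivals are exponential.
      waits = rng.exponential(1.0 / rate, size=100_000)
      print("mean waiting time: %.3f (theory %.3f)" % (waits.mean(), 1.0 / rate))

      # Counts of arrivals in a fixed window are Poisson distributed.
      counts = rng.poisson(rate * t_window, size=100_000)
      print("P(no arrivals in a window of length %.0f): %.4f (theory %.4f)"
            % (t_window, np.mean(counts == 0), np.exp(-rate * t_window)))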

  8. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality. Air quality monitoring requires us to target the credible residual risks. Constraints on monitoring devices are severe. We must transition from archival to real-time, on-board monitoring, and we must provide data to the crew in a way that they can interpret the findings. Dust management and monitoring may be a major concern for exploration-class missions.

  9. Forewarning model for water pollution risk based on Bayes theory.

    PubMed

    Zhao, Jun; Jin, Juliang; Guo, Qizhong; Chen, Yaqian; Lu, Mengxiong; Tinoco, Luis

    2014-02-01

    In order to reduce the losses caused by water pollution, a forewarning model for water pollution risk based on Bayes theory was studied. This model is built upon risk indexes in complex systems, proceeding from the whole structure and its components. In this study, principal components analysis is used to screen index systems. A hydrological model is employed to simulate index values according to the prediction principle. Bayes theory is adopted to obtain the posterior distribution from the prior distribution and sample information, so that the sample features better reflect and represent the whole population. The forewarning level is judged by the maximum probability rule, and management strategies are then proposed for local conditions so as to reduce heavy warnings to a lesser degree. This study takes Taihu Basin as an example. After application and verification of the forewarning model against actual and simulated data on water pollution risk from 2000 to 2009, the forewarning level for 2010 is given as a severe warning, which coincides well with the logistic curve. It is shown that the model is rigorous in theory with a flexible method, reasonable in result with a simple structure, and it has strong logic superiority and regional adaptability, providing a new way for warning of water pollution risk.
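
    The core Bayes step and the maximum-probability rule can be illustrated with a small discrete example: a prior over warning levels is updated with the likelihood of the observed (or simulated) risk-index reading under each level, and the level with the largest posterior probability is issued. All numbers below are hypothetical and are not taken from the Taihu Basin study.

      import numpy as np

      # Discrete Bayes update for a forewarning level (all numbers hypothetical).
      levels = ["light", "moderate", "heavy", "severe"]
      prior = np.array([0.15, 0.35, 0.35, 0.15])          # prior distribution over levels
      likelihood = np.array([0.01, 0.05, 0.24, 0.70])     # P(observed risk index | level)

      posterior = prior * likelihood
      posterior /= posterior.sum()                        # Bayes' rule (normalization)

      for level, p in zip(levels, posterior):
          print("%-8s posterior probability %.3f" % (level, p))
      print("forewarning level (maximum probability rule):", levels[int(np.argmax(posterior))])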

  10. Generalization in probabilistic RAM nets.

    PubMed

    Clarkson, T G; Guan, Y; Taylor, J G; Gorse, D

    1993-01-01

    The probabilistic RAM (pRAM) is a hardware-realizable neural device which is stochastic in operation and highly nonlinear. Even small nets of pRAMs offer high levels of functionality. The means by which a pRAM network generalizes when trained in noise is shown and the results of this behavior are described.

  11. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be either of the four modes.

  12. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  13. Enhanced probabilistic microcell prediction model

    NASA Astrophysics Data System (ADS)

    Kim, Song-Kyoo

    2005-06-01

    A microcell is a cell with a 1-km or smaller radius which is suitable not only for heavily urbanized areas such as a metropolitan city but also for in-building areas such as offices and shopping malls. This paper deals with a microcell prediction model of propagation loss focused on in-building solutions that is analyzed by probabilistic techniques. The RSL (Receive Signal Level) is the factor which can evaluate the performance of a microcell, and the LOS (Line-Of-Sight) component and the blockage loss directly affect the RSL. A combination of probabilistic methods is applied to obtain these performance factors. The mathematical methods include the CLT (Central Limit Theorem) and the SSQC (Six-Sigma Quality Control) to get the parameters of the distribution. This probabilistic solution gives a compact measure of the performance factors. In addition, it gives the probabilistic optimization of strategies such as the number of cells, cell location, capacity of cells, range of cells and so on. Moreover, the optimal strategies for antenna allocation in a building can be obtained by using this model.

  14. Designing Probabilistic Tasks for Kindergartners

    ERIC Educational Resources Information Center

    Skoumpourdi, Chrysanthi; Kafoussi, Sonia; Tatsis, Konstantinos

    2009-01-01

    Recent research suggests that children could be engaged in probability tasks at an early age and task characteristics seem to play an important role in the way children perceive an activity. In this direction, in the present article we investigate the role of some basic characteristics of probabilistic tasks in their design and implementation. In…

  15. [Some thoughts of the comparison of risk based soil environmental standards between different countries].

    PubMed

    Zhang, Hong-Zhen; Luo, Yong-Ming; Xia, Jia-Qi; Zhang, Hai-Bo

    2011-03-01

    Risk-based soil environmental standards are one of the important aspects of contaminated soil management and have already been widely used in many countries. However, because of diversity in geographical, biological, social-cultural, regulatory, and scientific aspects among countries, there are great differences in both the titles and the values of these soil environmental standards. Risk-based soil environmental standards and their derivation processes were introduced and compared in detail. The variability was analyzed and explained through comparison of the sensitive risk receptors, land uses, and exposure pathways among these countries. We suggest that the risk-based soil environmental standards among the developed countries can be classified as target values, screening values, and intervention values, which aim, respectively, to protect soil for sustainable development in the future, to determine whether there is potential unacceptable risk to specified receptors, and to determine whether further counter-actions should be conducted. Finally, risk assessment of contaminated soils and the establishment of risk-based soil environmental standards in China are proposed.

  16. How Should Risk-Based Regulation Reflect Current Public Opinion?

    PubMed

    Pollock, Christopher John

    2016-08-01

    Risk-based regulation of novel agricultural products with public choice manifest via traceability and labelling is a more effective approach than the use of regulatory processes to reflect public concerns, which may not always be supported by evidence.

  17. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
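
    The abstract does not reproduce the functional itself; for orientation only, one common textbook form of the deterministic three-field Hu-Washizu functional over displacements u, strains ε, and stresses σ is sketched below in LaTeX (displacement boundary terms omitted). In the probabilistic version described in the record, quantities such as the material tensor C, the body force b, and the tractions t̄ would be treated as random variables or fields; the notation here is an assumption, not taken from the cited report.

      \Pi_{HW}(\mathbf{u},\boldsymbol{\varepsilon},\boldsymbol{\sigma})
        = \int_{\Omega}\Big[\tfrac{1}{2}\,\boldsymbol{\varepsilon}:\mathbf{C}:\boldsymbol{\varepsilon}
        + \boldsymbol{\sigma}:\big(\nabla^{s}\mathbf{u}-\boldsymbol{\varepsilon}\big)
        - \mathbf{b}\cdot\mathbf{u}\Big]\,d\Omega
        - \int_{\Gamma_{t}}\bar{\mathbf{t}}\cdot\mathbf{u}\,d\Gamma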

  18. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers, especially in the insurance industry, the administration, and politics, on potential consequences and are the basis for appropriate risk management strategies. Thereby, (i) results based on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approach is appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee adequate reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  19. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  20. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  1. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate the uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  2. Probabilistic Cloning and Quantum Computation

    NASA Astrophysics Data System (ADS)

    Gao, Ting; Yan, Feng-Li; Wang, Zhi-Xi

    2004-06-01

    We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which the cloning offers an advantage which cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in the states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.

  3. Distribution functions of probabilistic automata

    NASA Technical Reports Server (NTRS)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k-letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix-k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable, and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) < x }. Utilizing the fixed-point semantics (denotational semantics), extended to probabilistic computations, we investigate the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much easier method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
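
    As an illustration of the construction described above (not of the paper's analytic results), the short Python sketch below simulates a two-state probabilistic automaton over the alphabet {0, 1}, maps each generated word w of fixed length to the real number X(w) via its radix-2 expansion, and estimates the distribution function F(x) empirically. The transition probabilities and the word length are arbitrary assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      # Two-state automaton over {0, 1}: p_emit1[s] is the probability of
      # emitting letter 1 in state s; emitting letter a moves the automaton to state a.
      p_emit1 = np.array([0.3, 0.8])        # assumed, illustrative

      def sample_word(length):
          state, letters = 0, []
          for _ in range(length):
              a = int(rng.random() < p_emit1[state])
              letters.append(a)
              state = a
          return letters

      def x_of(word):
          # radix-2 expansion: X(w) = sum_i w_i * 2**-(i+1)
          return sum(a * 2.0 ** -(i + 1) for i, a in enumerate(word))

      samples = np.array([x_of(sample_word(20)) for _ in range(50_000)])

      # Empirical distribution function F(x) = Prob{X(w) < x} at a few points
      for x in (0.25, 0.5, 0.75):
          print(f"F({x}) ~= {(samples < x).mean():.3f}")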

  4. Application impact analysis: a risk-based approach to business continuity and disaster recovery.

    PubMed

    Epstein, Beth; Khan, Dawn Christine

    2014-01-01

    There are many possible disruptions that can occur in business, and business continuity is easily overlooked or under-planned; planning for it takes time, understanding, and care. Business Continuity Management is far more than producing a document and declaring business continuity success. What is the recipe for businesses to achieve continuity management success? Application Impact Analysis (AIA) is a method for understanding an application's unique business attributes. The AIA cycle involves a risk-based approach to understanding business priority and considering business aspects such as financial, operational, service structure, contractual/legal, and brand. The output of this analysis provides a construct for viewing data, evaluating impact, and delivering results for an approved valuation of Recovery Time Objectives (RTOs).

  5. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as net present value distributions over various forecast horizons.
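
    The CCSI tools themselves are not described at code level in this record; purely as a sketch of the financial-risk idea, the Python snippet below draws uncertain annual revenues and costs and reports the resulting net present value distribution over an assumed forecast horizon. All cash-flow figures, the discount rate, and the distributional choices are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(42)
      n_sims, horizon_years, discount_rate = 100_000, 20, 0.08

      # Assumed uncertain cash flows (illustrative values, not from the paper)
      capital_cost = rng.normal(500e6, 50e6, n_sims)            # plant construction/retrofit, $
      annual_revenue = rng.normal(90e6, 15e6, (n_sims, horizon_years))
      annual_cost = rng.normal(55e6, 10e6, (n_sims, horizon_years))

      years = np.arange(1, horizon_years + 1)
      discount = (1.0 + discount_rate) ** -years                # discount factors per year

      npv = -capital_cost + ((annual_revenue - annual_cost) * discount).sum(axis=1)

      print(f"mean NPV: {npv.mean() / 1e6:.1f} M$")
      print(f"P(NPV < 0): {(npv < 0).mean():.3f}")
      print("5th / 95th percentiles (M$):", np.percentile(npv, [5, 95]) / 1e6)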

  6. A risk-based approach to scheduling audits.

    PubMed

    Rönninger, Stephan; Holmes, Malcolm

    2009-01-01

    The manufacture and supply of pharmaceutical products can be a very complex operation. Companies may purchase a wide variety of materials, from active pharmaceutical ingredients to packaging materials, from "in company" suppliers or from third parties. They may also purchase or contract a number of services such as analysis, data management, and audit, among others. It is very important that these materials and services are of the requisite quality in order that patient safety and company reputation are adequately protected. Such quality requirements apply throughout the product life cycle. In recent years, assurance of quality has been derived via audit of the supplier or service provider and by using periodic audits, for example, annually or at least once every 5 years. In the past, companies may have used an audit only for what they considered to be "key" materials or services and used testing on receipt, for example, as their quality assurance measure for "less important" supplies. Such approaches changed, as a result of pressure from both internal sources and regulators, to time-driven audits for all suppliers and service providers. Companies recognised that they would ultimately be responsible for the quality of the supplied product or service, and an audit, although providing only a "snapshot in time", seemed a convenient way of demonstrating that they were meeting their obligations. Problems, however, still occur with the supplied product or service and will usually be more frequent from certain suppliers. Additionally, some third-party suppliers will no longer accept routine audits from individual companies, as the overall audit load can exceed one external audit per working day. Consequently a different model is needed for assessing supplier quality. This paper presents a risk-based approach to creating an audit plan and for scheduling the frequency and depth of such audits. The approach is based on the principles and process of the Quality Risk Management
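
    The paper's actual scoring scheme is not reproduced in this record; as a hedged sketch of the general idea, the snippet below combines assumed supplier criticality, quality-history, and regulatory-findings scores into a weighted risk score and maps it to an audit interval. The supplier names, weights, score scale, and interval bands are invented for illustration only.

      # Illustrative risk-based audit scheduling (all weights and bands are assumptions)
      suppliers = {
          # name: (material criticality 1-5, quality history 1-5, regulatory findings 1-5)
          "API supplier A": (5, 2, 3),
          "Packaging supplier B": (2, 4, 1),
          "Contract lab C": (4, 3, 4),
      }

      weights = (0.5, 0.3, 0.2)   # assumed relative importance of the three factors

      def audit_interval_months(score):
          # higher risk score -> more frequent (and typically deeper) audits
          if score >= 4.0:
              return 12
          if score >= 3.0:
              return 24
          return 60

      for name, factors in suppliers.items():
          score = sum(w * f for w, f in zip(weights, factors))
          print(f"{name}: risk score {score:.1f} -> audit every "
                f"{audit_interval_months(score)} months")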

  7. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations contains uncertainties. Typical sources of uncertainty are the properties of the material and production and/or assembly inaccuracies in the geometry or the environment where the structure should be located. The paper is focused on methods for the calculation of failure probabilities in structural failure and reliability analysis, with special attention on a newly developed probabilistic method: Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and the accuracy of the solution. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software and has been used several times in probabilistic tasks and probabilistic reliability assessments.
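
    DOProC itself is not specified in this record; the sketch below only illustrates the general idea of computing a failure probability by direct numerical integration rather than by simulation, for the classical reliability margin Z = R - E with resistance R and load effect E treated as independent random variables. The normal distributions and their parameters are assumptions chosen for illustration.

      import numpy as np
      from scipy import stats

      # Assumed independent resistance R and load effect E (illustrative values)
      R = stats.norm(loc=280.0, scale=25.0)   # e.g. resistance, MPa
      E = stats.norm(loc=200.0, scale=30.0)   # e.g. load effect, MPa

      # Failure probability P(R - E < 0) by direct numerical integration of
      # P_f = integral of F_R(x) * f_E(x) dx, with no Monte Carlo sampling.
      x = np.linspace(0.0, 500.0, 20_001)
      dx = x[1] - x[0]
      pf_integration = np.sum(R.cdf(x) * E.pdf(x)) * dx

      # Closed-form check for two normals: beta = (mu_R - mu_E) / sqrt(s_R**2 + s_E**2)
      beta = (280.0 - 200.0) / np.sqrt(25.0**2 + 30.0**2)
      pf_exact = stats.norm.sf(beta)

      print(f"P_f by direct integration: {pf_integration:.3e}")
      print(f"P_f closed form:           {pf_exact:.3e}")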

  8. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for the aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with a subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping, and forced response of a blade row representing a compressor geometry are considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density functions (PDFs) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte Carlo simulation. The study shows that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on aeroelastic instabilities and response.

  9. Probabilistic models of language processing and acquisition.

    PubMed

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  10. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in the probability of failure when POD-characterized NDI is applied can be quantified. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables a better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
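
    The specific FAD formulation and POD curve used by the authors are not given in this record; the sketch below illustrates the coupling in a strongly simplified form: crack depths are sampled, a detection event drawn from an assumed POD curve removes detected flaws (assumed repaired), and the failure probability of the remaining population is estimated against an assumed critical crack size standing in for a full FAD evaluation. Every parameter and the failure criterion are placeholders.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Assumed initial crack depth population (lognormal, mm) and critical size
      crack_depth = rng.lognormal(mean=np.log(1.0), sigma=0.6, size=n)
      a_critical = 4.0   # mm; stand-in for a full FAD check

      def pod(a, a50=1.5, slope=3.0):
          # Assumed probability-of-detection curve: 50% detection at a50 mm,
          # rising with crack size (illustrative shape only)
          return 1.0 / (1.0 + (a50 / np.maximum(a, 1e-9)) ** slope)

      fails_no_ndi = crack_depth > a_critical

      detected = rng.random(n) < pod(crack_depth)       # detected flaws are repaired
      fails_with_ndi = (~detected) & (crack_depth > a_critical)

      print(f"P_f without NDI: {fails_no_ndi.mean():.3e}")
      print(f"P_f with NDI:    {fails_with_ndi.mean():.3e}")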

  11. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ..., and 211 RIN 0910-AG36 Current Good Manufacturing Practice and Hazard Analysis and Risk- Based... period for the information collection related to the proposed rule on ``Current Good Manufacturing... information collection provisions that are subject to review by the Office of Management and Budget...

  12. Practical risk-based decision making: Good decisions made efficiently

    SciTech Connect

    Haire, M.J.; Guthrie, V.; Walker, D.; Singer, R.

    1995-12-01

    The Robotics and Process Systems Division of the Oak Ridge National Laboratory and the Westinghouse Savannah River Company have teamed with JBF Associates, Inc. to address risk-based robotic planning. The objective of the project is to provide systematic, risk-based relative comparisons of competing alternatives for solving clean-up problems at DOE facilities. This paper presents the methodology developed, describes the software developed to efficiently apply the methodology, and discusses the results of initial applications for DOE. The paper also addresses current work in applying the approach to problems in other industries (including an example from the hydrocarbon processing industry).

  13. Probabilistic cloning of three nonorthogonal states

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Rui, Pinshu; Yang, Qun; Zhao, Yan; Zhang, Ziyun

    2015-04-01

    We study the probabilistic cloning of three nonorthogonal states with equal success probabilities. For simplicity, we assume that the three states belong to a special set. Analytical form of the maximal success probability for probabilistic cloning is calculated. With the maximal success probability, we deduce the explicit form of probabilistic quantum cloning machine. In the case of cloning, we get the unambiguous form of the unitary operation. It is demonstrated that the upper bound for probabilistic quantum cloning machine in (Qiu in J Phys A 35:6931, 2002) can be reached only if the three states are equidistant.

  14. A Probabilistic Tsunami Hazard Assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D.; Kongko, W.; Cipta, A.; Koetapangwa, B.; Anugrah, S.; Thio, H. K.

    2012-12-01

    We present the first national probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment considers tsunamis generated by near-field earthquake sources around Indonesia as well as regional and far-field sources, to define the tsunami hazard at the coastline. The PTHA methodology is based on the established stochastic event-based approach to probabilistic seismic hazard assessment (PSHA) and has been adapted for tsunami. The earthquake source information is primarily based on the recent Indonesian National Seismic Hazard Map and included a consensus workshop with Indonesia's leading tsunami and earthquake scientists to finalize the seismic source models and logic trees to include epistemic uncertainty. Results are presented in the form of tsunami hazard maps showing the expected tsunami height at the coast for a given return period, and also as tsunami probability maps, showing the probability of exceeding a tsunami height of 0.5 m and 3.0 m at the coast. These heights define the thresholds for different tsunami warning levels in the Indonesian Tsunami Early Warning System (Ina-TEWS). The results show that for short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra and the islands of Nias and Mentawai. For longer return periods (>500 years), the tsunami hazard in Eastern Indonesia (north Papua, north Sulawesi) is nearly as high as that along the Sunda Arc. A sensitivity analysis of input parameters is conducted by sampling branches of the logic tree using a Monte Carlo approach to constrain the relative importance of each input parameter. The results from this assessment can be used to underpin evidence-based decision making by disaster managers to prioritize tsunami mitigation, such as developing detailed inundation simulations for evacuation planning.
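
    The Indonesian source model is not included in this record; the snippet below only sketches the core PTHA arithmetic: given a stochastic event set with annual rates and modelled tsunami heights at one coastal site, the annual exceedance rate of a height threshold is the sum of the rates of exceeding events, and the probability of exceedance in T years follows from a Poisson occurrence assumption. The event rates and heights are invented for illustration.

      import numpy as np

      # Hypothetical stochastic event set at one coastal point:
      # (annual rate of the event, modelled tsunami height in metres)
      events = np.array([
          (1 / 50,   0.3),
          (1 / 200,  0.8),
          (1 / 500,  2.1),
          (1 / 1500, 3.6),
          (1 / 3000, 6.0),
      ])
      rates, heights = events[:, 0], events[:, 1]

      def prob_exceed(threshold_m, horizon_years):
          annual_rate = rates[heights >= threshold_m].sum()
          # Poisson occurrence model: P = 1 - exp(-rate * T)
          return 1.0 - np.exp(-annual_rate * horizon_years)

      for h in (0.5, 3.0):   # Ina-TEWS warning thresholds mentioned in the abstract
          print(f"P(height >= {h} m in 100 yr) = {prob_exceed(h, 100):.3f}")
          print(f"P(height >= {h} m in 500 yr) = {prob_exceed(h, 500):.3f}")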

  15. Probabilistic Analysis of Ground-Holding Strategies

    NASA Technical Reports Server (NTRS)

    Sheel, Minakshi

    1997-01-01

    The Ground-Holding Policy Problem (GHPP) has become a matter of great interest in recent years because of the high cost incurred by aircraft suffering from delays. Ground-holding keeps a flight on the ground at the departure airport if it is known it will be unable to land at the arrival airport. The GHPP is to determine how many flights should be held on the ground before take-off and for how long, in order to minimize the cost of delays. When the uncertainty associated with airport landing capacity is considered, the GHPP becomes complicated. A decision support system that incorporates this uncertainty, solves the GHPP quickly, and gives good results would be of great help to air traffic management. The purpose of this thesis is to modify and analyze a probabilistic ground-holding algorithm by applying it to two common cases of capacity reduction. A graphical user interface was developed and sensitivity analysis was done on the algorithm in order to see how it may be implemented in practice. The sensitivity analysis showed the algorithm was very sensitive to the number of probabilistic capacity scenarios used and to the cost ratio of air delay to ground delay. The algorithm was not particularly sensitive to the number of periods that the time horizon was divided into. In terms of cost savings, a ground-holding policy was the most beneficial when demand greatly exceeded airport capacity. When compared to other air traffic flow strategies, the ground-holding algorithm performed the best and was the most consistent under various situations. The algorithm can solve large problems quickly and efficiently on a personal computer.
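
    The thesis algorithm itself is not reproduced in the record; to make the decision structure concrete, the sketch below evaluates, for a single flight and a set of probabilistic arrival-capacity scenarios, the expected cost of holding it on the ground versus releasing it and risking airborne delay, echoing the abstract's point that the outcome is driven by the air-to-ground delay cost ratio. All scenario probabilities, delays, and unit costs are assumed.

      # Illustrative single-flight ground-holding decision under capacity uncertainty.
      # Each scenario: (probability, airborne delay in minutes if the flight is released).
      scenarios = [(0.6, 0), (0.3, 20), (0.1, 60)]      # assumed
      ground_delay_minutes = 25                          # assumed ground hold

      cost_ground_per_min = 1.0                          # normalised ground delay cost
      for cost_air_per_min in (1.5, 2.0, 3.0):           # assumed air/ground cost ratios
          expected_air_cost = sum(p * d * cost_air_per_min for p, d in scenarios)
          ground_cost = ground_delay_minutes * cost_ground_per_min
          decision = "hold on ground" if ground_cost < expected_air_cost else "release"
          print(f"ratio {cost_air_per_min:.1f}: expected air cost "
                f"{expected_air_cost:.1f} vs ground cost {ground_cost:.1f} -> {decision}")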

  16. Auxiliary feedwater system risk-based inspection guide for the Byron and Braidwood nuclear power plants

    SciTech Connect

    Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1991-07-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Byron and Braidwood were selected for the fourth study in this program. The product of this effort is a prioritized listing of AFW failures which have occurred at the plants and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Byron/Braidwood plants. 23 refs., 1 fig., 1 tab.

  17. Auxiliary feedwater system risk-based inspection guide for the Ginna Nuclear Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.; Moffitt, N.E.

    1991-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Ginna was selected as the eighth plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Ginna plant. 23 refs., 1 fig., 1 tab.

  18. Auxiliary feedwater system risk-based inspection guide for the J. M. Farley Nuclear Power Plant

    SciTech Connect

    Vo, T.V.; Pugh, R.; Gore, B.F.; Harrison, D.G.

    1990-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. J. M. Farley was selected as the second plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the J. M. Farley plant. 23 refs., 1 fig., 1 tab.

  19. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant.

  20. Auxiliary feedwater system risk-based inspection guide for the H. B. Robinson nuclear power plant

    SciTech Connect

    Moffitt, N.E.; Lloyd, R.C.; Gore, B.F.; Vo, T.V.; Garner, L.W.

    1993-08-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. H. B. Robinson was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the H. B. Robinson plant.

  1. Auxiliary feedwater system risk-based inspection guide for the South Texas Project nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1993-12-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. South Texas Project was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the South Texas Project plant.

  2. Health economics and outcomes methods in risk-based decision-making for blood safety.

    PubMed

    Custer, Brian; Janssen, Mart P

    2015-08-01

    Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost of implementing an intervention, both for the blood operator and in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making.

  3. Probabilistic Planning with Imperfect Sensing Actions Using Hybrid Probabilistic Logic Programs

    NASA Astrophysics Data System (ADS)

    Saad, Emad

    Effective planning in uncertain environments is important to agents and multi-agent systems. In this paper, we introduce a new logic-based approach to probabilistic contingent planning (probabilistic planning with imperfect sensing actions) by relating probabilistic contingent planning to normal hybrid probabilistic logic programs with probabilistic answer set semantics [24]. We show that any probabilistic contingent planning problem can be encoded as a normal hybrid probabilistic logic program. We formally prove the correctness of our approach. Moreover, we show that the complexity of finding a probabilistic contingent plan in our approach is NP-complete. In addition, we show that any probabilistic contingent planning problem PP can be encoded as a classical normal logic program with answer set semantics, whose answer sets correspond to valid trajectories in PP. We show that probabilistic contingent planning problems can be encoded as SAT problems. We present a new high-level probabilistic action description language that allows the representation of sensing actions with probabilistic outcomes.

  4. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  5. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  6. Development of a risk-based approach to Hanford Site cleanup

    SciTech Connect

    Hesser, W.A.; Daling, P.M.; Baynes, P.A.

    1995-06-01

    In response to a request from Mr. Thomas Grumbly, Assistant Secretary of Energy for Environmental Management, the Hanford Site contractors developed a conceptual set of risk-based cleanup strategies that (1) protect the public, workers, and environment from unacceptable risks; (2) are executable technically; and (3) fit within an expected annual funding profile of 1.05 billion dollars. These strategies were developed because (1) the US Department of Energy and Hanford Site budgets are being reduced, (2) stakeholders are dissatisfied with the perceived rate of cleanup, (3) the US Congress and the US Department of Energy are increasingly focusing on risk and risk-reduction activities, (4) the present strategy is not integrated across the Site and is inconsistent in its treatment of similar hazards, (5) the present cleanup strategy is not cost-effective from a risk-reduction or future land use perspective, and (6) the milestones and activities in the Tri-Party Agreement cannot be achieved with an anticipated funding of 1.05 billion dollars annually. The risk-based strategies described herein were developed through a systems analysis approach that (1) analyzed the cleanup mission; (2) identified cleanup objectives, including risk reduction, land use, and mortgage reduction; (3) analyzed the existing baseline cleanup strategy from a cost and risk perspective; (4) developed alternatives for accomplishing the cleanup mission; (5) compared those alternatives against cleanup objectives; and (6) produced conclusions and recommendations regarding the current strategy and potential risk-based strategies.

  7. Risk based guideline values and the development of preliminary remediation goals

    SciTech Connect

    Brothers, R.A.; Cox, D.M.; Guty, J.L.; Miller, D.B.; Motheramgari, K.; Stinnette, S.E.

    1995-02-01

    Risk managers at federal facilities often need a risk-based tool to rapidly assess the possible human health risks of large numbers of sites before completing a baseline risk assessment. Risk-based concentrations, based on Preliminary Remediation Goal (PRG) development methodology, can be used as screening guideline values. We have developed a set of guideline values (GVs) for the Mound Facility at Miamisburg, Ohio, that are risk-based decision-making tools. The GVs are used (with regulatory approval) to rapidly assess the possibility that sites may be considered for "no action" decisions. The GVs are neither PRGs nor final remedial action levels. Development of the GVs on a facilitywide basis incorporated known contaminants of potential concern, physical and chemical characteristics of contaminated media, current and potential future land uses, and exposure pathway assumptions. Because no one site was used in the development process, the GVs can be applied (after consideration of the land use and exposure potential) to any site on the facility. The facilitywide approach will streamline the PRG development process by minimizing the efforts to develop site-specific PRGs for each operable unit at a considerable saving of time and effort.

  8. Study of a risk-based piping inspection guideline system.

    PubMed

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts--the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model had been built in accordance with international standards and local government regulations, and the rational unified process was applied for reducing the discrepancy in the development of the models. The models had been designed to analyze damage factors, damage models, and potential damage positions of piping in the petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, and hence to enable effective prediction of potential piping risks and to enhance the degree of safety that petrochemical plants can be expected to achieve in operation. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risks resulted from a small number of pipelines.

  9. Risks and Risk-Based Regulation in Higher Education Institutions

    ERIC Educational Resources Information Center

    Huber, Christian

    2009-01-01

    Risk-based regulation is a relatively new mode of governance. Not only does it offer a way of controlling institutions from the outside but it also provides the possibility of making an organisation's achievements visible/visualisable. This paper comments on a list of possible risks that higher education institutions have to face. In a second…

  10. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk... Oversight. SBA supervises, examines, and regulates, and enforces laws against, SBA Supervised Lenders...

  11. Role of Context in Risk-Based Reasoning

    ERIC Educational Resources Information Center

    Pratt, Dave; Ainley, Janet; Kent, Phillip; Levinson, Ralph; Yogui, Cristina; Kapadia, Ramesh

    2011-01-01

    In this article we report the influence of contextual factors on mathematics and science teachers' reasoning in risk-based decision-making. We examine previous research that presents judgments of risk as being subjectively influenced by contextual factors and other research that explores the role of context in mathematical problem-solving. Our own…

  12. How Should Risk-Based Regulation Reflect Current Public Opinion?

    PubMed

    Pollock, Christopher John

    2016-08-01

    Risk-based regulation of novel agricultural products with public choice manifest via traceability and labelling is a more effective approach than the use of regulatory processes to reflect public concerns, which may not always be supported by evidence. PMID:27266813

  13. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  14. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri, L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for Shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: what can go wrong? what is the likelihood of it occurring? and what are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  15. Uncertainty driven probabilistic voxel selection for image registration.

    PubMed

    Oreshkin, Boris N; Arbel, Tal

    2013-10-01

    This paper presents a novel probabilistic voxel selection strategy for medical image registration in time-sensitive contexts, where the goal is aggressive voxel sampling (e.g., using less than 1% of the total number) while maintaining registration accuracy and low failure rate. We develop a Bayesian framework whereby, first, a voxel sampling probability field (VSPF) is built based on the uncertainty on the transformation parameters. We then describe a practical, multi-scale registration algorithm, where, at each optimization iteration, different voxel subsets are sampled based on the VSPF. The approach maximizes accuracy without committing to a particular fixed subset of voxels. The probabilistic sampling scheme developed is shown to manage the tradeoff between the robustness of traditional random voxel selection (by permitting more exploration) and the accuracy of fixed voxel selection (by permitting a greater proportion of informative voxels).
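
    The Bayesian derivation of the voxel sampling probability field is not given in this record; the snippet below only illustrates the sampling step: given a (here synthetic) per-voxel probability field, a different small voxel subset is drawn at each iteration in proportion to that field, which is what permits aggressive subsampling without committing to one fixed subset. In the paper the field would come from transformation-parameter uncertainty; here it is a random placeholder.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic 64x64x64 voxel sampling probability field (stand-in for the VSPF)
      vspf = rng.random((64, 64, 64)) ** 4          # skewed: few highly informative voxels
      p = (vspf / vspf.sum()).ravel()               # normalise to a probability field

      n_voxels = p.size
      subset_fraction = 0.01                        # sample roughly 1% of voxels per iteration
      n_sample = int(subset_fraction * n_voxels)

      for iteration in range(3):
          idx = rng.choice(n_voxels, size=n_sample, replace=False, p=p)
          coords = np.unravel_index(idx, vspf.shape)
          # ... evaluate the registration similarity metric on these voxels only ...
          print(f"iteration {iteration}: drew {idx.size} voxels, "
                f"mean field value {vspf.ravel()[idx].mean():.4f}")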

  16. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).

  17. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance; thus, assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  18. Is Probabilistic Evidence a Source of Knowledge?

    ERIC Educational Resources Information Center

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  19. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  20. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  1. Risk-based prioritization methodology for the classification of groundwater pollution sources.

    PubMed

    Pizzol, Lisa; Zabeo, Alex; Critto, Andrea; Giubilato, Elisa; Marcomini, Antonio

    2015-02-15

    Water management is one of the EU environmental priorities and it is one of the most serious challenges that today's major cities are facing. The main European regulation for the protection of water resources is represented by the Water Framework Directive (WFD) and the Groundwater Directive (2006/118/EC), which require the identification, risk-based ranking, and management of sources of pollution and the identification of those contamination sources that threaten the achievement of groundwater's good quality status. The aim of this paper is to present a new risk-based prioritization methodology to support the determination of a management strategy for the achievement of the good quality status of groundwater. The proposed methodology encompasses the following steps: 1) hazard analysis, 2) pathway analysis, 3) receptor vulnerability analysis, and 4) relative risk estimation. Moreover, by integrating GIS functionalities and Multi Criteria Decision Analysis (MCDA) techniques, it allows one to: i) deal with several sources and multiple impacted receptors within the area of concern; ii) identify different receptors' vulnerability levels according to specific groundwater uses; iii) assess the risks posed by all contamination sources in the area; and iv) provide a risk-based ranking of the contamination sources that can threaten the achievement of the groundwater good quality status. The application of the proposed framework to a well-known industrialized area located in the surroundings of Milan (Italy) is illustrated in order to demonstrate its effectiveness in supporting the identification of intervention priorities. Among the 32 sources analyzed in the case study, three sources received the highest relevance score, due to the medium-high relative risks estimated for Chromium (VI) and Perchloroethylene. The case study application showed that the developed methodology is flexible and easy to adapt to different contexts, thanks to the possibility to
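
    The actual MCDA aggregation used by the authors is not detailed in the record; the following sketch mirrors the four listed steps in a simplified scalar form, scoring each hypothetical source by hazard, pathway, and receptor-vulnerability factors and ranking sources by the resulting relative risk. The source names, factor values, and the multiplicative aggregation rule are assumptions for illustration only.

      # Simplified relative-risk ranking of groundwater contamination sources.
      # Each source: hazard (0-1), pathway factor (0-1, higher = easier migration),
      # receptor vulnerability (0-1). All values are illustrative.
      sources = {
          "Source 1 (Cr VI plume)":  {"hazard": 0.9, "pathway": 0.7, "vulnerability": 0.8},
          "Source 2 (PCE spill)":    {"hazard": 0.8, "pathway": 0.6, "vulnerability": 0.9},
          "Source 3 (fuel storage)": {"hazard": 0.4, "pathway": 0.5, "vulnerability": 0.3},
      }

      def relative_risk(s):
          # multiplicative aggregation of the three components (assumed rule)
          return s["hazard"] * s["pathway"] * s["vulnerability"]

      ranking = sorted(sources.items(), key=lambda kv: relative_risk(kv[1]), reverse=True)
      for name, s in ranking:
          print(f"{name}: relative risk {relative_risk(s):.2f}")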

  2. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  3. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  4. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  5. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  6. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  7. 12 CFR 1022.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false Content, form, and timing of risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing...

  8. 12 CFR 1022.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false Content, form, and timing of risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice. (1) In general. The risk-based pricing...

  9. 16 CFR 640.3 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 1 2012-01-01 2012-01-01 false General requirements for risk-based pricing... DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.3 General requirements for risk-based pricing... notice (“risk-based pricing notice”) in the form and manner required by this part if the person both—...

  10. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Content, form, and timing of risk-based... REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  11. Is probabilistic evidence a source of knowledge?

    PubMed

    Friedman, Ori; Turri, John

    2015-07-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B). Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by "judgment and decision making" researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case (Experiments 3A and 3B).

  12. Risk-based decision support tools: protecting rail-centered transit corridors from cascading effects.

    PubMed

    Greenberg, Michael R; Lowrie, Karen; Mayer, Henry; Altiok, Tayfur

    2011-12-01

    We consider the value of decision support tools for passenger rail system managers. First, we call for models that follow events along main rail lines and then into the surrounding environment where they can cascade onto connected light rail, bus, auto, truck, and other transport modes. Second, we suggest that both probabilistic risk assessment (PRA-based) and agent-based models have a role to play at different scales of analysis and for different kinds of risks. Third, we argue that economic impact tools need more systematic evaluation. Fourth, we note that developers of decision support tools must balance their desire for theoretical elegance, and the tendency to focus only on high-consequence events, against decision makers' mistrust of complex tools that they and their staff cannot manage and incorporate into routine operations, and against the high costs of developing, updating, and applying decision support tools to transport systems undergoing budget cuts and worker and service reductions.

  13. Protecting the Smart Grid: A Risk Based Approach

    SciTech Connect

    Clements, Samuel L.; Kirkham, Harold; Elizondo, Marcelo A.; Lu, Shuai

    2011-10-10

    This paper describes a risk-based approach to security that has been used for years in protecting physical assets, and shows how it could be modified to help secure the digital aspects of the smart grid and control systems in general. One way the smart grid has been said to be vulnerable is that mass load fluctuations could be created by quickly turning off and on large quantities of smart meters. We investigate the plausibility of this scenario.

  14. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  15. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
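
    The two damage estimates described above can be illustrated with a short Python sketch: Miner's rule summed over a discretized Rayleigh distribution of stress amplitudes, and the closed-form expression obtained by integrating Miner's rule over all amplitudes, which introduces the Gamma function. The S-N parameters and rms stress below are hypothetical, and the sketch is not the FATIG code itself.

      import numpy as np
      from scipy.special import gamma
      from scipy.stats import rayleigh

      sigma = 50.0      # stress rms (MPa), hypothetical
      n_total = 1e7     # total number of stress cycles
      A, m = 1e12, 3.0  # S-N curve N(S) = A * S**(-m), hypothetical parameters

      # (a) Miner's rule summed over a discretized Rayleigh amplitude distribution
      s = np.linspace(1e-3, 6 * sigma, 2000)
      cycles_per_bin = n_total * rayleigh.pdf(s, scale=sigma) * (s[1] - s[0])
      damage_sum = np.sum(cycles_per_bin * s**m / A)          # sum of n_i / N_i

      # (b) Closed form from integrating Miner's rule over all amplitudes:
      #     D = n_total / A * (sqrt(2) * sigma)**m * Gamma(1 + m/2)
      damage_closed = n_total / A * (np.sqrt(2) * sigma)**m * gamma(1 + m / 2)

      print(f"Miner's rule, discretized Rayleigh: D = {damage_sum:.3f}")
      print(f"Closed-form Gamma expression:       D = {damage_closed:.3f}")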

  16. Introducing a probabilistic Budyko framework

    NASA Astrophysics Data System (ADS)

    Greve, P.; Gudmundsson, L.; Orlowsky, B.; Seneviratne, S. I.

    2015-04-01

    Water availability is of importance for a wide range of ecological, climatological, and socioeconomic applications. Over land, the partitioning of precipitation into evapotranspiration and runoff essentially determines the availability of water. At mean annual catchment scales, the widely used Budyko framework provides a simple, deterministic, first-order relationship to estimate this partitioning as a function of the prevailing climatic conditions. Here we extend the framework by introducing a method to specify probabilistic estimates of water availability that account for the nonlinearity of the underlying phase space. The new framework makes it possible to evaluate the predictability of water availability that is related to varying catchment characteristics and conditional on the underlying climatic conditions. Corresponding results support the practical experience of low predictability of river runoff in transitional climates.
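
    For orientation, the sketch below evaluates the classical deterministic Budyko curve for a hypothetical catchment and adds a toy probabilistic spread by sampling the catchment parameter of Fu's parametric form of the relation; this is only an illustration of the general idea, not the probabilistic formulation developed by the authors.

      import numpy as np

      def budyko(phi):
          # Budyko (1974) curve: E/P as a function of the aridity index phi = PET/P
          return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

      def fu(phi, omega):
          # Fu's parametric Budyko curve; omega encodes catchment characteristics
          return 1.0 + phi - (1.0 + phi**omega) ** (1.0 / omega)

      P, PET = 800.0, 1000.0                 # mm/yr, hypothetical forcing
      phi = PET / P
      Q_det = P - budyko(phi) * P            # deterministic runoff estimate

      rng = np.random.default_rng(0)
      omega = rng.normal(2.6, 0.4, 10000).clip(1.05)   # hypothetical parameter spread
      Q_samples = P - fu(phi, omega) * P
      lo, hi = np.percentile(Q_samples, [5, 95])

      print(f"Deterministic runoff: {Q_det:.0f} mm/yr")
      print(f"90% interval from catchment-parameter uncertainty: [{lo:.0f}, {hi:.0f}] mm/yr")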

  17. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
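
    A minimal sketch of the optimization objective described above (overtopping mode only, with hypothetical costs and a hypothetical flood-stage distribution) is given below: pick the levee height that minimizes annualized construction cost plus expected annual flood damage.

      import numpy as np
      from scipy.stats import gumbel_r

      water_level = gumbel_r(loc=4.0, scale=1.0)   # annual-maximum flood stage (m), hypothetical
      damage_if_flooded = 50e6                     # $ damage when the levee is overtopped
      cost_per_m = 1.5e6                           # annualized construction cost per metre

      heights = np.linspace(4.0, 10.0, 61)
      p_overtop = water_level.sf(heights)                  # annual overtopping probability
      expected_damage = p_overtop * damage_if_flooded      # expected annual damage
      total_cost = expected_damage + cost_per_m * heights  # annual expected total cost

      best = np.argmin(total_cost)
      print(f"Optimal height {heights[best]:.1f} m, overtopping probability "
            f"{p_overtop[best]:.4f}, annual expected total cost ${total_cost[best]/1e6:.1f}M")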

  18. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.

  19. A pilot application of risk-based methods to establish in-service inspection priorities for nuclear components at Surry Unit 1 Nuclear Power Station

    SciTech Connect

    Vo, T.; Gore, B.; Simonen, F.; Doctor, S.

    1994-08-01

    As part of the Nondestructive Evaluation Reliability Program sponsored by the US Nuclear Regulatory Commission, the Pacific Northwest Laboratory is developing a method that uses risk-based approaches to establish in-service inspection plans for nuclear power plant components. This method uses probabilistic risk assessment (PRA) results and Failure Modes and Effects Analysis (FMEA) techniques to identify and prioritize the most risk-important systems and components for inspection. The Surry Nuclear Power Station Unit 1 was selected for pilot applications of this method. The specific systems addressed in this report are the reactor pressure vessel, the reactor coolant system, the low-pressure injection system, and the auxiliary feedwater system. The results provide a risk-based ranking of components within these systems and relate the target risk to target failure probability values for individual components. These results will be used to guide the development of improved inspection plans for nuclear power plants. To develop inspection plans, the acceptable level of risk from structural failure for important systems and components will be apportioned as a small fraction (i.e., 5%) of the total PRA-estimated risk for core damage. This process will determine target (acceptable) risk and target failure probability values for individual components. Inspection requirements will be set at levels to assure that acceptable failure probabilities are maintained.
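
    The risk-apportionment step can be illustrated with a small sketch. All numbers are hypothetical, and the equal split of the structural-failure risk allotment among components is an assumption made here for illustration rather than the report's actual apportionment scheme.

      # All numbers hypothetical: apportion 5% of the PRA-estimated core damage
      # frequency equally among risk-important components, then back out a
      # target failure frequency from each component's conditional core damage
      # probability (CCDP).
      total_cdf = 5.0e-5          # core damage frequency, per reactor-year
      structural_share = 0.05     # fraction of risk allotted to structural failures

      components = {
          "reactor_coolant_pipe": 1.0e-1,   # CCDP given the component fails
          "lpi_injection_line":   5.0e-3,
          "aux_feedwater_line":   2.0e-3,
      }

      target_risk_each = structural_share * total_cdf / len(components)
      for name, ccdp in components.items():
          target_failure_freq = target_risk_each / ccdp
          print(f"{name:22s} target failure frequency <= {target_failure_freq:.2e} per year")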

  20. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
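
    The general idea behind the probabilistic modules, propagating sampled input-parameter distributions through a deterministic model to obtain a dose distribution, can be sketched as follows. The dose expression and parameter distributions are hypothetical placeholders, not the RESRAD or RESRAD-BUILD models.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      soil_conc = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=n)   # Bq/g
      intake_rate = rng.triangular(50.0, 100.0, 200.0, size=n)          # mg soil/day
      dose_coeff = 1.0e-4        # (mSv/yr) per (Bq/g * mg/day), placeholder

      dose = dose_coeff * soil_conc * intake_rate                       # mSv/yr

      print(f"Mean dose:        {dose.mean():.2f} mSv/yr")
      print(f"95th percentile:  {np.percentile(dose, 95):.2f} mSv/yr")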

  1. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  2. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571
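
    A toy Monte Carlo sketch of the effect described above, comparing prediction-interval widths with deterministic versus probabilistic migration, is shown below. It uses a single aggregate population and hypothetical rates, not the authors' Bayesian hierarchical projection model.

      import numpy as np

      rng = np.random.default_rng(1)
      n_sim, n_years, pop0 = 10_000, 35, 50.0   # trajectories, years, initial pop (millions)

      def project(migration_uncertain):
          pops = np.full(n_sim, pop0)
          for _ in range(n_years):
              growth = rng.normal(0.005, 0.002, n_sim)     # natural growth rate
              if migration_uncertain:
                  mig = rng.normal(0.10, 0.15, n_sim)      # net migration, millions/yr
              else:
                  mig = 0.10                               # deterministic migration
              pops = pops * (1.0 + growth) + mig
          return pops

      for flag, label in [(False, "deterministic migration"), (True, "probabilistic migration")]:
          lo, hi = np.percentile(project(flag), [2.5, 97.5])
          print(f"{label:25s} 95% interval width: {hi - lo:.1f} million")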

  3. Probabilistic cognition in two indigenous Mayan groups.

    PubMed

    Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Girotto, Vittorio

    2014-12-01

    Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K'iche', two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition.

  4. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  5. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery. PMID:26017444

  6. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  7. Probabilistic cognition in two indigenous Mayan groups

    PubMed Central

    Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Girotto, Vittorio

    2014-01-01

    Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K’iche’, two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition. PMID:25368160

  8. Probabilistic and Non-probabilistic Synthetic Reliability Model for Space Structures

    NASA Astrophysics Data System (ADS)

    Hong, Dongpao; Hu, Xiao; Zhang, Jing

    2016-07-01

    As an alternative to probabilistic reliability analysis, the non-probabilistic model is an effective supplement when only interval information is available. We describe the uncertain parameters of the structures with interval variables, and establish a non-probabilistic reliability model of structures. Then, we analyze the relation between the typical interference mode and the reliability according to the structural stress-strength interference model, and propose a new measure of structural non-probabilistic reliability. Furthermore, we describe other uncertain parameters with random variables when probabilistic information also exists. For complex structures including both random variables and interval variables, we propose a probabilistic and non-probabilistic synthetic reliability model. The illustrative example shows that the presented model is feasible for structural reliability analysis and design.
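
    The two ingredients the paper combines can be contrasted in a short sketch: a probabilistic stress-strength reliability when distributions are known, and an interval-based index when only bounds are available. The numbers and the particular interval index used here are illustrative assumptions, not the paper's exact measure.

      import math

      # Probabilistic part: normally distributed strength and stress
      mu_R, sd_R = 600.0, 40.0    # strength (MPa)
      mu_S, sd_S = 450.0, 60.0    # stress (MPa)
      beta = (mu_R - mu_S) / math.sqrt(sd_R**2 + sd_S**2)
      reliability = 0.5 * (1.0 + math.erf(beta / math.sqrt(2)))   # P(strength > stress)

      # Non-probabilistic part: strength and stress known only as intervals
      R_lo, R_hi = 520.0, 680.0
      S_lo, S_hi = 330.0, 570.0
      eta = ((R_lo + R_hi) / 2 - (S_lo + S_hi) / 2) / ((R_hi - R_lo) / 2 + (S_hi - S_lo) / 2)

      print(f"Probabilistic reliability:            {reliability:.4f}")
      print(f"Interval (non-probabilistic) index:   {eta:.2f}  (eta > 1: no interference)")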

  9. Risk-based analyses in support of California hazardous site remediation

    SciTech Connect

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  10. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the higher-ranked human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  11. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Thompson, Julie; Leclaire, Rene; Edward, Bryan; Jones, Edward

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  12. Application of risk-based methods to inservice testing of check valves

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; McAllister, W.J.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice testing (IST) programs for pumps and valves in nuclear steam supply systems. This paper discusses a pilot application of these methods to the inservice testing of check valves in the emergency core cooling system of Georgia Power's Vogtle nuclear power station. The results of the probabilistic safety assessment (PSA) are used to divide the check valves into risk-significant and less-risk-significant groups. This information is reviewed by a plant expert panel along with the consideration of appropriate deterministic insights to finally categorize the check valves into more safety-significant and less safety-significant component groups. All of the more safety-significant check valves are further evaluated in detail using a failure modes and causes analysis (FMCA) to assist in defining effective IST strategies. A template has been designed to evaluate how effective current and emerging tests for check valves are in detecting failures or in finding significant conditions that are precursors to failure for the likely failure causes. This information is then used to design and evaluate appropriate IST strategies that consider both the test method and frequency. A few of the less safety-significant check valves are also evaluated using this process since differences exist in check valve design, function, and operating conditions. Appropriate test strategies are selected for each check valve that has been evaluated based on safety and cost considerations. Test strategies are inferred from this information for the other check valves based on similar check valve conditions. Sensitivity studies are performed using the PSA model to arrive at an overall IST program that maintains or enhances safety at the lowest achievable cost.

  13. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  14. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  15. PROBABILISTIC MODELING FOR ADVANCED HUMAN EXPOSURE ASSESSMENT

    EPA Science Inventory

    Human exposures to environmental pollutants widely vary depending on the emission patterns that result in microenvironmental pollutant concentrations, as well as behavioral factors that determine the extent of an individual's contact with these pollutants. Probabilistic human exp...

  16. DEVELOPMENT OF RISK-BASED AND TECHNOLOGY-INDEPENDENT SAFETY CRITERIA FOR GENERATION IV SYSTEMS

    SciTech Connect

    William E. Kastenberg; Edward Blandford; Lance Kim

    2009-03-31

    This project has developed quantitative safety goals for Generation IV (Gen IV) nuclear energy systems. These safety goals are risk based and technology independent. The foundations for a new approach to risk analysis have been developed, along with a new operational definition of risk. This project has furthered the current state-of-the-art by developing quantitative safety goals for both Gen IV reactors and for the overall Gen IV nuclear fuel cycle. The risk analysis approach developed will quantify performance measures, characterize uncertainty, and address a more comprehensive view of safety as it relates to the overall system. Appropriate safety criteria are necessary to manage risk in a prudent and cost-effective manner. This study is also important for government agencies responsible for managing, reviewing, and approving advanced reactor systems because they are charged with assuring the health and safety of the public.

  17. Application impact analysis: a risk-based approach to business continuity and disaster recovery.

    PubMed

    Epstein, Beth; Khan, Dawn Christine

    2014-01-01

    There are many possible disruptions that can occur in business, and business continuity is often overlooked or under-planned; doing it well requires time, understanding and careful planning. Business Continuity Management is far more than producing a document and declaring business continuity success. What is the recipe for businesses to achieve continuity management success? Application Impact Analysis is a method for understanding the unique Business Attributes. This AIA Cycle involves a risk-based approach to understanding the business priority and considering business aspects such as Financial, Operational, Service Structure, Contractual/Legal, and Brand. The output of this analysis provides a construct for viewing data, evaluating impact, and delivering results in support of an approved valuation of Recovery Time Objectives (RTOs). PMID:24578024

  18. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  19. Probabilistic structural analysis for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    1993-01-01

    Viewgraphs of probabilistic structural analysis for nuclear thermal propulsion are presented. The objective of the study was to develop a methodology to certify the Space Nuclear Propulsion System (SNPS) nozzle with assured reliability. Topics covered include: the advantages of probabilistic structural analysis; uncertainties in the random variables of the SNPS nozzle; SNPS nozzle natural frequency; and the sensitivity of SNPS nozzle natural frequency and shell stress to uncertainties in the primitive variables.

  20. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  1. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently. PMID:24290823
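
    The selection idea can be sketched with synthetic data: fit a logistic regression of audit outcome on bulk-milk indicators, rank farms by predicted risk, and check what share of farms must be audited to capture a given share of rejections. The data-generating model and coefficients below are hypothetical, not the study's dataset or fitted model.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(7)
      n = 5000
      scc = rng.lognormal(np.log(180.0), 0.4, n)   # somatic cell count (x1000/ml)
      tbc = rng.lognormal(np.log(15.0), 0.6, n)    # total bacterial count (x1000/ml)
      logit = -6.0 + 0.008 * scc + 0.05 * tbc      # hypothetical true relationship
      rejected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      model = LogisticRegression(max_iter=1000).fit(np.column_stack([scc, tbc]), rejected)
      risk = model.predict_proba(np.column_stack([scc, tbc]))[:, 1]

      order = np.argsort(-risk)                    # audit highest-risk farms first
      captured = np.cumsum(rejected[order]) / rejected.sum()
      share_audited = (np.searchsorted(captured, 0.5) + 1) / n
      print(f"Share of farms audited to capture 50% of rejections: {share_audited:.2f}")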

  2. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently.

  3. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    PubMed

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates.
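
    The core of the approach, propagating uncertainty in both exposure and carcinogenic potency to obtain a confidence interval for the risk estimate rather than a single value, can be sketched in a few lines. The distributions below are hypothetical placeholders, not the IPRA implementation.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000

      exposure = rng.lognormal(np.log(0.5), 0.8, n)    # intake, ug/kg bw/day
      potency = rng.lognormal(np.log(2e-4), 1.0, n)    # excess risk per ug/kg bw/day

      risk = exposure * potency                        # individual lifetime excess risk
      lo, med, hi = np.percentile(risk, [5, 50, 95])
      print(f"Median risk: {med:.1e}")
      print(f"90% confidence interval: [{lo:.1e}, {hi:.1e}]")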

  4. Probabilistic Choice, Reversibility, Loops, and Miracles

    NASA Astrophysics Data System (ADS)

    Stoddart, Bill; Bell, Pete

    We consider an addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results from a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations we give the basis for a fixed point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).

  5. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  6. Symbolic representation of probabilistic worlds.

    PubMed

    Feldman, Jacob

    2012-04-01

    Symbolic representation of environmental variables is a ubiquitous and often debated component of cognitive science. Yet notwithstanding centuries of philosophical discussion, the efficacy, scope, and validity of such representation has rarely been given direct consideration from a mathematical point of view. This paper introduces a quantitative measure of the effectiveness of symbolic representation, and develops formal constraints under which such representation is in fact warranted. The effectiveness of symbolic representation hinges on the probabilistic structure of the environment that is to be represented. For arbitrary probability distributions (i.e., environments), symbolic representation is generally not warranted. But in modal environments, defined here as those that consist of mixtures of component distributions that are narrow ("spiky") relative to their spreads, symbolic representation can be shown to represent the environment with a relatively negligible loss of information. Modal environments support propositional forms, logical relations, and other familiar features of symbolic representation. Hence the assumption that our environment is, in fact, modal is a key tacit assumption underlying the use of symbols in cognitive science. PMID:22270145

  7. Probabilistic computation by neuromine networks.

    PubMed

    Hangartner, R D; Cull, P

    2000-01-01

    In this paper, we address the question, can biologically feasible neural nets compute more than can be computed by deterministic polynomial time algorithms? Since we want to maintain a claim of plausibility and reasonableness we restrict ourselves to algorithmically easy to construct nets and we rule out infinite precision in parameters and in any analog parts of the computation. Our approach is to consider the recent advances in randomized algorithms and see if such randomized computations can be described by neural nets. We start with a pair of neurons and show that by connecting them with reciprocal inhibition and some tonic input, then the steady-state will be one neuron ON and one neuron OFF, but which neuron will be ON and which neuron will be OFF will be chosen at random (perhaps, it would be better to say that microscopic noise in the analog computation will be turned into a megascale random bit). We then show that we can build a small network that uses this random bit process to generate repeatedly random bits. This random bit generator can then be connected with a neural net representing the deterministic part of randomized algorithm. We, therefore, demonstrate that these neural nets can carry out probabilistic computation and thus be less limited than classical neural nets.
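
    The flip-flop mechanism described above can be imitated with a simple rate model: two units with tonic drive and reciprocal inhibition settle with one ON and one OFF, and small noise decides which. The equations below are a schematic stand-in for illustration, not the paper's neuron model.

      import numpy as np

      def random_bit(rng, steps=2000, dt=0.01):
          x = np.zeros(2)                        # activities of the two units
          for _ in range(steps):
              drive = 1.0 - 2.0 * x[::-1]        # tonic input minus cross-inhibition
              noise = rng.normal(0.0, 0.02, 2)   # microscopic noise breaks the symmetry
              x += dt * (-x + np.clip(drive, 0.0, None)) + np.sqrt(dt) * noise
              x = np.clip(x, 0.0, None)
          return int(x[0] > x[1])                # which unit ended up ON

      rng = np.random.default_rng()
      bits = [random_bit(rng) for _ in range(200)]
      print(f"Fraction of 1s over 200 trials: {sum(bits) / 200:.2f}")   # expected near 0.5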

  8. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and" written in cursive style as well as hand-print were extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school and determining the students who will develop different letter formations and/or variations of those letter formations and the number of different types of letter formations.

  9. Representation of probabilistic scientific knowledge.

    PubMed

    Soldatova, Larisa N; Rzhetsky, Andrey; De Grave, Kurt; King, Ross D

    2013-04-15

    The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at https://github.com/larisa-soldatova/HELO. PMID:23734675

  10. Representation of probabilistic scientific knowledge

    PubMed Central

    2013-01-01

    The theory of probability is widely used in biomedical research for data analysis and modelling. In previous work the probabilities of the research hypotheses have been recorded as experimental metadata. The ontology HELO is designed to support probabilistic reasoning, and provides semantic descriptors for reporting on research that involves operations with probabilities. HELO explicitly links research statements such as hypotheses, models, laws, conclusions, etc. to the associated probabilities of these statements being true. HELO enables the explicit semantic representation and accurate recording of probabilities in hypotheses, as well as the inference methods used to generate and update those hypotheses. We demonstrate the utility of HELO on three worked examples: changes in the probability of the hypothesis that sirtuins regulate human life span; changes in the probability of hypotheses about gene functions in the S. cerevisiae aromatic amino acid pathway; and the use of active learning in drug design (quantitative structure activity relation learning), where a strategy for the selection of compounds with the highest probability of improving on the best known compound was used. HELO is open source and available at https://github.com/larisa-soldatova/HELO PMID:23734675

  11. Dynamical systems probabilistic risk assessment.

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  12. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. The continuation of a PRA program was committed by PECo to the U.S. Nuclear Regulatory Commission (NRC) prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. The familiarization training is designed to be attended once by all nuclear group personnel so that they understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described that are tailored to the audience.

  13. Probabilistic description of traffic flow

    NASA Astrophysics Data System (ADS)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
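
    The one-step cluster process can be illustrated with a small Gillespie-style simulation of a single jam growing by attachment of free-flowing cars and shrinking by escape of the lead car. The attachment and detachment rates used here are schematic placeholders, not the empirically motivated ansatz discussed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      N, road_length = 120, 2000.0       # cars on a closed ring, ring length (m)
      car_length, tau = 6.0, 2.0         # effective car length (m), escape time (s)

      def simulate(t_end=3600.0):
          n, t = 1, 0.0                  # jam size, time
          while t < t_end:
              free = N - n
              free_density = free / (road_length - n * car_length)
              w_plus = 0.1 * free * free_density          # attachment rate (schematic)
              w_minus = 1.0 / tau if n > 0 else 0.0       # escape of the lead car
              total = w_plus + w_minus
              t += rng.exponential(1.0 / total)
              if rng.random() < w_plus / total:
                  n = min(n + 1, N)
              else:
                  n = max(n - 1, 0)
          return n

      sizes = [simulate() for _ in range(20)]
      print(f"Mean jam size after one hour: {np.mean(sizes):.1f} of {N} cars")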

  14. MOND using a probabilistic approach

    NASA Astrophysics Data System (ADS)

    Raut, Usha

    2009-05-01

    MOND has been proposed as a viable alternative to the dark matter hypothesis. In the original MOND formulation [1], a modification of Newtonian Dynamics was brought about by postulating new equations of particle motion at extremely low accelerations, as a possible explanation for the flat rotation curves of spiral galaxies. In this paper, we attempt a different approach to modify the usual force laws by trying to link gravity with the probabilistic aspects of quantum mechanics [2]. In order to achieve this, one starts by replacing the classical notion of a continuous distance between two elementary particles with a statistical probability function, π. The gravitational force between two elementary particles then can be interpreted in terms of the probability of interaction between them. We attempt to show that such a modified gravitational force would fall off a lot slower than the usual inverse square law predicts, leading to revised MOND equations. In the limit that the statistical aggregate of the probabilities becomes equal to the usual inverse square law force, we recover Newtonian/Einstein gravity. [1] Milgrom, M. 1983, ApJ, 270, 365 [2] Goradia, S. 2002, arXiv:physics/0210040

  15. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computation experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than the stochastic model with nonanticipative constraints, which means the proposed model reduces computational complexity. PMID:26180842

  16. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computation experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than the stochastic model with nonanticipative constraints, which means the proposed model reduces computational complexity. PMID:26180842
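
    A hedged sketch of the weighted biobjective idea follows; the flights, delay costs, risk values, disruption cost, and weights are invented for illustration, and the brute-force search stands in for the paper's integer-programming formulation.

```python
from itertools import product

delays = [10, 25, 40]               # delay cost (minutes) if flight i is held on the ground
risk_if_flown = [0.30, 0.10, 0.05]  # chance flight i meets a capacity shortfall if not held
disruption_cost = 200.0             # minutes of delay charged when a shortfall is met (assumption)
w_delay, w_risk = 0.5, 0.5          # objective weights

best = None
for x in product([0, 1], repeat=len(delays)):          # x[i] = 1 means flight i is held
    delay_obj = sum(d * xi for d, xi in zip(delays, x))
    risk_obj = sum(r * (1 - xi) * disruption_cost for r, xi in zip(risk_if_flown, x))
    score = w_delay * delay_obj + w_risk * risk_obj
    if best is None or score < best[0]:
        best = (score, x)

print("best holding decision:", best[1], "weighted objective:", round(best[0], 2))
```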

  17. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false Content, form, and timing of risk-based pricing... Risk-Based Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice required by § 222.72(a) or (c) must include: (i)...

  18. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false Content, form, and timing of risk-based pricing... Risk-Based Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice required by § 222.72(a) or (c) must include: (i)...

  19. National Research Needs Conference Proceedings: Risk-Based Decision Making for Onsite Wastewater Treatment

    SciTech Connect

    2001-03-01

    On May 19-20, 2000, the Research Needs Conference for "Risk-Based Decision Making for Onsite Wastewater Treatment" was convened in St. Louis, Missouri. The conference, funded by the U.S. Environmental Protection Agency (EPA), was the culmination of an eighteen-month-long effort by the National Decentralized Water Resources Capacity Development Project (NDWRCDP) to assist onsite wastewater leadership in identifying critical research gaps in the field. The five "White Papers" included in this volume of Proceedings, along with the reviewer comments for four of these papers, provided the basis for extended discussion. Topics for the papers had been determined from research needs forums convened in three different areas of the country. Four major research areas were defined at the conclusion of the regional meetings: fate and transport of nutrients; fate and transport of pathogens; long-term performance of soil-absorption systems; and the economics of decentralized wastewater systems. National leaders were then identified to prepare white papers in each of these areas, and two reviewers were also selected to critique each of the papers at the research needs conference. Other experts were asked to prepare a white paper on risk assessment and risk management, and to incorporate specific onsite wastewater examples that had been cited in the regional meetings. The resulting papers and peer review comments summarize the existing literature. They also identify gaps relevant for rigorous risk-based decision-making.

  20. Risk-based selection of SSCs at Peach Bottom

    SciTech Connect

    Krueger, G.A.; Marie, A.J.

    1993-01-01

    The purpose of identifying risk-significant systems, structures, and components (SSCs) that are within the scope of the maintenance rule is to bring a higher level of attention to a subset of those SSCs. These risk-significant SSCs will have specific performance criteria established for them, and failure to meet these performance criteria will result in establishing goals to ensure the necessary improvement in performance. The Peach Bottom individual plant examination (IPE) results were used to provide insights for the verification of proposed probabilistic risk assessment (PRA) methods set forth in the Industry Maintenance Guidelines for Implementation of the Maintenance Rule. The objective of reviewing the methods for selection of SSCs that are considered risk significant was to ensure the methods used are logical, reproducible, and can be consistently applied.

  1. 12 CFR 652.85 - When to report the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... report the risk-based capital level. (a) You must file a risk-based capital report with us each time you determine your risk-based capital level as required by § 652.80. (b) You must also report to us at once if you identify in the interim between quarterly or more frequent reports to us that you are not...

  2. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  3. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  4. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  5. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  6. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  7. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  8. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 3 2013-01-01 2013-01-01 false General requirements for risk-based pricing... Risk-Based Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general. Except... pricing notice”) in the form and manner required by this subpart if the person both— (1) Uses a...

  9. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  10. 12 CFR 1022.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 8 2012-01-01 2012-01-01 false General requirements for risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.72 General requirements for risk-based pricing notices. (a) In general. Except as otherwise provided in this subpart, a person...

  11. 12 CFR 1022.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 8 2013-01-01 2013-01-01 false General requirements for risk-based pricing... REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 1022.72 General requirements for risk-based pricing notices. (a) In general. Except as otherwise provided in this subpart, a person...

  12. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 3 2012-01-01 2012-01-01 false General requirements for risk-based pricing... Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general. Except as otherwise provided in this subpart, a person must provide to a consumer a notice (“risk-based pricing notice”) in...

  13. 16 CFR 640.4 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... pricing notices. 640.4 Section 640.4 Commercial Practices FEDERAL TRADE COMMISSION THE FAIR CREDIT REPORTING ACT DUTIES OF CREDITORS REGARDING RISK-BASED PRICING § 640.4 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice...

  14. 12 CFR 222.72 - General requirements for risk-based pricing notices.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 3 2014-01-01 2014-01-01 false General requirements for risk-based pricing... Risk-Based Pricing § 222.72 General requirements for risk-based pricing notices. (a) In general. Except... pricing notice”) in the form and manner required by this subpart if the person both— (1) Uses a...

  15. 12 CFR 652.90 - How to report your risk-based capital determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false How to report your risk-based capital determination. 652.90 Section 652.90 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT SYSTEM FEDERAL... report your risk-based capital determination. (a) Your risk-based capital report must contain at...

  16. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 3 2011-01-01 2011-01-01 false Content, form, and timing of risk-based pricing... THE FEDERAL RESERVE SYSTEM FAIR CREDIT REPORTING (REGULATION V) Duties of Users Regarding Risk-Based Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1)...

  17. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses especially inform decision makers in the insurance industry, the administration, and politicians on potential consequences, and are the basis for appropriate risk management strategies. Thereby, results (i) based on an annual or probabilistic risk comprehension have to be distinguished from (ii) scenario-based analyses. The first analyses are based on statistics of periodically or episodically occurring events, whereas the latter approach is especially applied for extreme, non-linear, stochastic events. Focusing on the needs especially of insurance companies, the first approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee accordant reinsurance coverage. Moreover, the demand for adequate loss model approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria), and adequately considers the scale dependency and balanced application of the introduced risk components. Besides the introduced analysis, an additional portfolio analysis of a regional insurance company was executed. The geocoded insurance contracts of this portfolio analysis were the basis for estimating spatially, socio-economically and functionally differentiated mean insurance values for the different risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  18. Risk-based assessment of the surety of information systems

    SciTech Connect

    Jansma, R.; Fletcher, S.; Halbgewachs, R.; Lim, J.; Murphy, M.; Sands, P.; Wyss, G.

    1995-03-01

    Correct operation of an information system requires a balance of "surety" domains -- access control (confidentiality), integrity, utility, availability, and safety. However, traditional approaches provide little help on how to systematically analyze and balance the combined impact of surety requirements on a system. The key to achieving information system surety is identifying, prioritizing, and mitigating the sources of risk that may lead to system failure. Consequently, the authors propose a risk assessment methodology that provides a framework to guide the analyst in identifying and prioritizing sources of risk and selecting mitigation techniques. The framework leads the analyst to develop a risk-based system model for balancing the surety requirements and quantifying the effectiveness and combined impact of the mitigation techniques. Such a model allows the information system designer to make informed trade-offs based on the most effective risk-reduction measures.

  19. Nuclear insurance risk assessment using risk-based methodology

    SciTech Connect

    Wendland, W.G.

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME) in which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance.

  20. Risk-based verification, validation, and accreditation process

    NASA Astrophysics Data System (ADS)

    Elele, James N.; Smith, Jeremy

    2010-04-01

    This paper presents a risk-based Verification, Validation, and Accreditation (VV&A) process for Models and Simulations (M&S). Recently, the emphasis for M&S used to support Department of Defense (DoD) acquisition has been on basing the level of resources allocated to establishing the credibility of the M&S on the risks associated with the decision being supported by the M&S. In addition, DoD VV&A regulations recommend tailoring the V&V process to allow efficient use of resources. However, one problem is that no methodology is specified for such tailoring processes. The BMV&V has developed a risk-based process that implements tailoring of the VV&A activities based on risk. Our process incorporates MIL-STD 3022 for new M&S. For legacy M&S, the process starts by assessing the current risk level of the M&S based on the credibility attributes of the M&S as defined through its Capability, Accuracy and Usability, relative to the articulated Intended Use Statement (IUS). If the risk is low, the M&S is credible for application, and no further V&V is required. If the risk is medium or high, the Accreditation Authority determines whether the M&S can be accepted as-is or if the risk should be mitigated. If the Accreditation Authority is willing to accept the risks, then a Conditional Accreditation is made. If the risks associated with using the M&S as-is are deemed too high to accept, then a Risk Mitigation/Accreditation Plan is developed to guide the process. The implementation of such a risk mitigation plan is finally documented through an Accreditation Support Package.

  1. A risk-based approach for a national assessment

    SciTech Connect

    Whelan, Gene; Laniak, Gerard F.

    1998-10-18

    The need for environmental systems modeling is growing rapidly because of 1) the combination of increasing technical scope and complexity related to questions of risk-based cause and effect and 2) the need to explicitly address cost effectiveness in both the development and implementation of environmental regulations. The nature of risk assessments is evolving with their increased complexity in assessing individual sites and collections of sites, addressing regional or national regulatory needs. These assessments require the integration of existing tools and the development of new databases and models, based on a comprehensive and holistic view of the risk assessment problem. To meet these environmental regulatory needs, multiple-media-based assessments are formulated to view and assess risks from a comprehensive environmental systems perspective, crossing the boundaries of several scientific disciplines. Given these considerations and the advanced state of computer hardware and software, it is possible to design a software system that facilitates the development and integration of assessment tools (e.g., databases and models). In this paper, a risk-based approach for supporting national risk assessments is presented. This approach combines 1) databases, 2) multiple media models, combining source-term, fate and transport, exposure, and risk/hazard, and 3) sensitivity/uncertainty capabilities within a software system capable of growing within the science of risk assessment. The design and linkages of the system are discussed. This paper also provides the rationale behind the design of the framework, as there is a recognized need to develop more holistic approaches to risk assessment.

  2. Collaborative Review: Risk-Based Prostate Cancer Screening

    PubMed Central

    Zhu, Xiaoye; Albertsen, Peter C.; Andriole, Gerald L.; Roobol, Monique J.; Schröder, Fritz H.; Vickers, Andrew J.

    2016-01-01

    Context Widespread mass screening of prostate cancer (PCa) is not recommended because the balance between benefits and harms is still not well established. The achieved mortality reduction comes with considerable harm such as unnecessary biopsies, overdiagnoses, and overtreatment. Therefore, patient stratification with regard to PCa risk and aggressiveness is necessary to identify those men who are at risk and may actually benefit from early detection. Objective This review critically examines the current evidence regarding risk-based PCa screening. Evidence acquisition A search of the literature was performed using the Medline database. Further studies were selected based on manual searches of reference lists and review articles. Evidence synthesis Prostate-specific antigen (PSA) has been shown to be the single most significant predictive factor for identifying men at increased risk of developing PCa. Especially in men with no additional risk factors, PSA alone provides an appropriate marker up to 30 yr into the future. After assessment of an early PSA test, the screening frequency may be determined based on individualized risk. A limited list of additional factors such as age, comorbidity, prostate volume, family history, ethnicity, and previous biopsy status have been identified to modify risk and are important for consideration in routine practice. In men with a known PSA, risk calculators may hold the promise of identifying those who are at increased risk of having PCa and are therefore candidates for biopsy. Conclusions PSA testing may serve as the foundation for a more risk-based assessment. However, the decision to undergo early PSA testing should be a shared one between the patient and his physician based on information balancing its advantages and disadvantages. PMID:22134009

  3. Homeland security R&D roadmapping : risk-based methodological options.

    SciTech Connect

    Brandt, Larry D.

    2008-12-01

    The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

  4. Risk-based decision-making framework for the selection of sediment dredging option.

    PubMed

    Manap, Norpadzlihatun; Voulvoulis, Nikolaos

    2014-10-15

    The aim of this study was to develop a risk-based decision-making framework for the selection of a sediment dredging option. The newly integrated, holistic, and staged framework is then described using case studies. The first stage incorporates historical dredging monitoring data and contamination levels in environmental media into Ecological Risk Assessment phases, which have been adapted for benefits in cost, time, and simplicity. The next stage describes how Multi-Criteria Decision Analysis (MCDA) can be used to analyze and prioritize dredging areas based on environmental, socio-economic, and managerial criteria. The results from MCDA will be integrated into Ecological Risk Assessment to characterize the degree of contamination in the prioritized areas. The last stage uses these findings, again analyzed with MCDA, to identify the best sediment dredging option, accounting for the economic, environmental, and technical aspects of dredging, which is beneficial for the dredging and sediment management industries.
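
    The MCDA prioritization step described above can be illustrated with a simple weighted-sum scoring sketch; the areas, criteria scores, and weights below are hypothetical.

```python
criteria_weights = {"environmental": 0.5, "socio_economic": 0.3, "managerial": 0.2}

areas = {
    "Area A": {"environmental": 0.8, "socio_economic": 0.4, "managerial": 0.6},
    "Area B": {"environmental": 0.5, "socio_economic": 0.9, "managerial": 0.7},
    "Area C": {"environmental": 0.6, "socio_economic": 0.6, "managerial": 0.3},
}

# Rank candidate dredging areas by their weighted score (higher = higher priority).
ranked = sorted(
    areas.items(),
    key=lambda item: sum(criteria_weights[c] * s for c, s in item[1].items()),
    reverse=True,
)
for name, scores in ranked:
    total = sum(criteria_weights[c] * s for c, s in scores.items())
    print(f"{name}: priority score {total:.2f}")
```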

  5. Options for improving hazardous waste cleanups using risk-based criteria

    SciTech Connect

    Elcock, D.

    1995-06-01

    This paper explores how risk- and technology-based criteria are currently used in the RCRA and CERCLA cleanup programs. It identifies ways in which risk could be further incorporated into RCRA and CERCLA cleanup requirements and the implications of risk-based approaches. The more universal use of risk assessment as embodied in the risk communication and risk improvement bills before Congress is not addressed. Incorporating risk into the laws and regulations governing hazardous waste cleanup will allow the use of the best scientific information available to further the goal of environmental protection in the United States while containing costs, and may help set an example for other countries that may be developing cleanup programs, thereby contributing to enhanced global environmental management.

  6. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis Hastings algorithm, which is a well-known Markov chain Monte Carlo method (MCMC). This multiple hypothesis testing
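
    A minimal bootstrap particle filter sketch in the spirit of the abrupt-change setting described above is shown below; the state model, jump probability, and noise levels are illustrative assumptions, not the models used in the work.

```python
import math
import random

random.seed(0)
N, T = 500, 60
true_x = [1.0 if t < 30 else 0.6 for t in range(T)]            # abrupt degradation at t = 30
obs = [x + random.gauss(0.0, 0.05) for x in true_x]            # noisy measurements

particles = [1.0] * N
for t in range(T):
    # Propagate: small drift, plus a small chance of an abrupt jump to a degraded level.
    particles = [
        (random.uniform(0.4, 0.9) if random.random() < 0.02 else p) + random.gauss(0.0, 0.01)
        for p in particles
    ]
    # Weight by a Gaussian measurement likelihood, then resample (bootstrap filter).
    weights = [math.exp(-0.5 * ((obs[t] - p) / 0.05) ** 2) for p in particles]
    particles = random.choices(particles, weights=weights, k=N)
    if t in (25, 29, 31, 35, 59):
        print(f"t={t:2d}  estimate={sum(particles) / N:.3f}  truth={true_x[t]:.2f}")
```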

  7. Probabilistic Cue Combination: Less is More

    PubMed Central

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2012-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the dilution effect, predictions made from the combination of two cues of different strengths are less accurate than those made from the stronger cue alone. Here we show that dilution is an adult problem; 11-month-old infants combine strong and weak predictors normatively. These results extend and add support for the less is more hypothesis: limited cognitive resources can lead children to represent probabilistic information differently from adults, and this difference in representation can have important downstream consequences for prediction. PMID:23432826

  8. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  9. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunami generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
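
    The Monte Carlo logic behind such a probabilistic hazard assessment can be sketched as follows; the source rates, magnitude ranges, and height scaling are invented placeholders, not the PTHA's source model.

```python
import random

random.seed(42)
YEARS = 100_000                  # length of the synthetic event catalogue
sources = [                      # (name, annual rate, min magnitude, max magnitude, height scale)
    ("near-field", 0.05, 7.0, 9.0, 1.0),
    ("far-field", 0.20, 7.5, 9.3, 0.2),
]

heights = []                     # one entry per synthetic event: tsunami height at the site (m)
for name, rate, m_min, m_max, scale in sources:
    n_events = sum(1 for _ in range(YEARS) if random.random() < rate)
    for _ in range(n_events):
        magnitude = random.uniform(m_min, m_max)                  # crude magnitude sampling
        heights.append(scale * 10 ** (0.5 * (magnitude - 7.0)))   # toy height-magnitude relation

for threshold in (0.5, 3.0):
    exceedances = sum(1 for h in heights if h > threshold)
    print(f"mean annual rate of height > {threshold} m: {exceedances / YEARS:.4f}")
```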

  10. Probabilistic modeling of subgrade soil strengths

    NASA Astrophysics Data System (ADS)

    Chou, Y. T.

    1981-09-01

    A concept of spatial average in probabilistic modeling of subgrade soil strength is presented. The advantage of the application of spatial average to pavement engineering is explained. The link between the concept and the overall probability-based pavement design procedure is formulated and explained. In the earlier part of the report, a literature review of the concept and procedure of probabilistic design of pavements, which includes the concepts of variations and reliability, is presented. Finally, an outline of a probability based pavement design procedure for the Corps of Engineers is presented.

  11. Impact of Probabilistic Weather on Flight Routing Decisions

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Sridhar, Banavar; Mulfinger, Daniel

    2006-01-01

    Flight delays in the United States have been found to increase year after year, along with the increase in air traffic. During the four-month period from May through August of 2005, weather-related delays accounted for roughly 70% of all reported delays. Current weather prediction in the tactical (within 2 hours) timeframe is at manageable levels; however, the state of forecasting weather for the strategic (2-6 hours) timeframe is still not dependable for long-term planning. In the absence of reliable severe weather forecasts, the decision-making for flights longer than two hours is challenging. This paper deals with an approach of using probabilistic weather prediction for Traffic Flow Management use, and a general method using this prediction for estimating expected values of flight length and delays in the National Airspace System (NAS). The current state-of-the-art convective weather forecasting is employed to aid the decision makers in arriving at decisions for traffic flow and flight planning. The six-agency effort working on the Next Generation Air Transportation System (NGATS) has considered weather-assimilated decision-making as one of the principal foci out of a list of eight. The Weather Integrated Product Team has considered integrated weather information and improved aviation weather forecasts as two of the main efforts (Ref. 1, 2). Recently, research has focused on the concept of operations for strategic traffic flow management (Ref. 3) and how weather data can be integrated for improved decision-making for efficient traffic management initiatives (Ref. 4, 5). An overview of the weather data needs and benefits of various participants in the air traffic system along with available products can be found in Ref. 6. Previous work related to use of weather data in identifying and categorizing pilot intrusions into severe weather regions (Ref. 7, 8) has demonstrated a need for better forecasting in the strategic planning timeframes and moving towards a

  12. A Probabilistic Model of Melody Perception

    ERIC Educational Resources Information Center

    Temperley, David

    2008-01-01

    This study presents a probabilistic model of melody perception, which infers the key of a melody and also judges the probability of the melody itself. The model uses Bayesian reasoning: For any "surface" pattern and underlying "structure," we can infer the structure maximizing P(structure | surface) based on knowledge of P(surface,…

  13. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  14. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  15. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to a thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio, the fiber longitudinal modulus, the dynamic load, and the loading rate are the dominant uncertainties, in that order.

  16. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  17. Probabilistic classification learning in Tourette syndrome.

    PubMed

    Kéri, Szabolcs; Szlobodnyik, Csaba; Benedek, György; Janka, Zoltán; Gádoros, Júlia

    2002-01-01

    Tourette syndrome (TS) is characterised by stereotyped involuntary movements, called tics. Some evidence suggests that structural and functional abnormalities of the basal ganglia may explain these motor symptoms. In this study, the probabilistic classification learning (PCL) test was used to evaluate basal ganglia functions in 10 children with less severe tics (Yale Global Tic Severity Scale (YGTSS) scores < 30) and in 10 children with more severe symptoms (YGTSS scores > 30). In the PCL task, participants are asked to decide whether different combinations of four geometric forms (cues) predict rainy or sunny weather. Each cue is probabilistically related to a weather outcome, and feedback is provided after each decision. After completion of the probabilistic stimulus-response learning procedure, subjects received a transfer test to assess explicit knowledge about the cues. The children with TS exhibited impaired learning in the PCL task in comparison with the 20 healthy control subjects. This impairment was more pronounced in the TS patients with severe symptoms, and there was a significant negative relationship between the final classification performance and the YGTSS scores. The patients showed normal learning in the transfer test. These results suggest that the neostriatal habit learning system, which may play a central role in the acquisition of probabilistic associations, is dysfunctional in TS, especially in the case of more severe motor symptoms. The classification performance and the severity of tics were independent of the explicit knowledge obtained during the test.

  18. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249
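
    A minimal sketch of the general idea (sampling many uncertain trajectories and reporting quantiles rather than a single variant) is shown below; it uses a toy random-walk growth rate, not the Bayesian hierarchical models of the record.

```python
import random

random.seed(7)
pop0, years, n_traj = 50.0, 40, 2000      # initial population (millions), horizon, trajectories

finals = []
for _ in range(n_traj):
    pop, rate = pop0, 0.010               # start at 1.0% annual growth (assumed)
    for _ in range(years):
        rate += random.gauss(0.0, 0.002)  # random-walk uncertainty in the growth rate (assumed)
        pop *= 1.0 + rate
    finals.append(pop)

finals.sort()
for q in (0.05, 0.50, 0.95):
    value = finals[int(q * (n_traj - 1))]
    print(f"{int(q * 100):>2}th percentile after {years} years: {value:.1f} million")
```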

  19. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  20. Probabilistic Scale-Space Filtering Program

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos

    1993-01-01

    Probabilistic Scale-Space Filtering (PSF) computer program implements scale-space technique to describe input signals as collections of nested hills and valleys organized in treelike structure. Helps to construct sparse representations of complicated signals. Calculates probabilities, with extracted features corresponding to physical processes. Written in C language (49 percent) and Common Lisp (51 percent).

  1. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  2. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis, collection, and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
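
    A small example of the kind of Bayesian data analysis such guidance covers is a conjugate Beta-Binomial update of a failure probability from observed failures and demands; the prior and data values below are illustrative assumptions, not values from the document.

```python
a0, b0 = 0.5, 49.5           # Beta(a0, b0) prior on failure-on-demand probability (mean 0.01, assumed)
failures, demands = 2, 300   # observed evidence (assumed counts)

a_post, b_post = a0 + failures, b0 + (demands - failures)   # conjugate Beta-Binomial update
posterior_mean = a_post / (a_post + b_post)

print(f"posterior mean failure probability: {posterior_mean:.4f}")
print(f"posterior distribution: Beta({a_post:.1f}, {b_post:.1f})")
```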

  3. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, phi. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model to fit the historical database of measurements of protons with energy > 30 MeV (Φ30). The resultant organ doses and dose equivalents, as well as effective whole-body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPE and GCR is in support of mission design and operational planning to manage radiation risks for space exploration.
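
    The nonhomogeneous Poisson idea can be sketched as follows: given a time-varying SPE intensity over the solar cycle, the probability of at least one event in a mission window is one minus the exponential of the integrated intensity. The intensity function and mission windows below are illustrative placeholders, not the fitted model from the record.

```python
import math

def spe_intensity(t_years):
    """SPEs per year; peaks mid-cycle (illustrative 11-year shape, not the fitted model)."""
    return 6.0 * math.sin(math.pi * (t_years % 11.0) / 11.0) ** 2

def prob_at_least_one_spe(t_start, duration_years, steps=1000):
    """P(>= 1 SPE in [t_start, t_start + duration]) = 1 - exp(-integral of the intensity)."""
    dt = duration_years / steps
    integral = sum(spe_intensity(t_start + (k + 0.5) * dt) * dt for k in range(steps))
    return 1.0 - math.exp(-integral)

for start in (0.5, 5.5):   # roughly near solar minimum vs near solar maximum (illustrative)
    p = prob_at_least_one_spe(start, 0.25)
    print(f"cycle year {start}: P(>=1 SPE during a 3-month EVA-heavy period) = {p:.2f}")
```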

  4. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for the third party to access a transmission facility. From this structure, to efficiently manage the utilization of the transmission network, a new definition about Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC)’s definition, ATC depends on several parameters, i.e., Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper focuses on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. A TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limitation, various cases of transmission transfer capability and its related probabilistic nature can be calculated. By considering the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual ability of the network which may be an alternative choice for system operators to make an appropriate decision in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
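
    A hedged sketch of a probabilistic TRM calculation follows: sample the load-forecast error and the line-limit error, form the distribution of the transfer-capability shortfall, and take a margin that covers it at a chosen risk level. The distributions and numbers are assumptions for illustration, not the paper's models.

```python
import random

random.seed(3)
TTC_NOMINAL = 1000.0                  # MW, nominal total transfer capability (assumed value)
N_SAMPLES, RISK_LEVEL = 20000, 0.05   # accept a 5% chance that the margin is insufficient

shortfalls = []
for _ in range(N_SAMPLES):
    load_error = random.gauss(0.0, 30.0)     # MW, load forecast error (assumed sigma)
    limit_error = random.gauss(0.0, 20.0)    # MW, error in the transmission line limit (assumed sigma)
    available = TTC_NOMINAL + limit_error - load_error
    shortfalls.append(TTC_NOMINAL - available)   # how far real capability falls below nominal

shortfalls.sort()
trm = max(0.0, shortfalls[int((1 - RISK_LEVEL) * N_SAMPLES) - 1])
print(f"TRM covering the combined uncertainty at the {int((1 - RISK_LEVEL) * 100)}% level: {trm:.1f} MW")
```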

  5. Internal modelling under Risk-Based Capital (RBC) framework

    NASA Astrophysics Data System (ADS)

    Ling, Ang Siew; Hin, Pooi Ah

    2015-12-01

    Very often the methods for the internal modelling under the Risk-Based Capital framework make use of data which are in the form of a run-off triangle. The present research will instead extract, from a group of n customers, the historical data for the sum insured s_i of the i-th customer together with the amount paid y_ij and the amount a_ij reported but not yet paid in the j-th development year, for j = 1, 2, 3, 4, 5, 6. We model the future value (y_i,j+1, a_i,j+1) as dependent on the present-year value (y_ij, a_ij) and the sum insured s_i via a conditional distribution which is derived from a multivariate power-normal mixture distribution. For a group of given customers with different original purchase dates, the distribution of the aggregate claims liabilities may be obtained from the proposed model. The prediction interval based on the distribution for the aggregate claim liabilities is found to have good ability of covering the observed aggregate claim liabilities.

  6. Guidelines for risk-based prioritization of DOE activities

    SciTech Connect

    1998-04-01

    This standard describes issues that should be considered when comparing, selecting, or implementing risk-based prioritization (RBP) systems. It also discusses characteristics that should be used in evaluating the quality of an RBP system and its associated results. The purpose of this standard is to provide guidance for selecting or developing an RBP system so that when implemented, it will: (a) improve the quality of the RBP systems employed by DOE and its contractors; (b) improve the consistency and comparability of RBP system results; (c) satisfy DOE requests to perform RBP; (d) help ensure that limited resources are used efficiently and effectively; (e) help ensure that characteristics for evaluating RBP systems are met and properly balanced; (f) promote greater understanding, use, and acceptance of RBP systems; (g) promote greater understanding between DOE and its stakeholders and regulators; (h) improve the quality of resource allocation, planning, and scheduling decisions. This standard is applicable to any and all uses of RBP by DOE elements, including cases in which RBP is requested by DOE or is used to help allocate resources among alternatives that compete for resources. Prioritizing alternatives that compete for limited resources encompasses many policy issues that are inherent to an RBP effort. It is the position of this standard that policy issues should be determined by the decision maker(s) requesting the prioritization. For additional information on policy issues, refer to section 10 on Application Guidance for Policy Issues.

  7. Risk Classification and Risk-based Safety and Mission Assurance

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse A.

    2014-01-01

    Recent activities to revamp and emphasize the need to streamline processes and activities for Class D missions across the agency have led to various interpretations of Class D, including the lumping of a variety of low-cost projects into Class D. Sometimes terms such as Class D minus are used. In this presentation, mission risk classifications will be traced to official requirements and definitions as a measure to ensure that projects and programs align with the guidance and requirements that are commensurate with their defined risk posture. As part of this, the full suite of risk classifications, formal and informal, will be defined, followed by an introduction to the new GPR 8705.4 that is currently under review. GPR 8705.4 lays out guidance for the mission success activities performed at Classes A-D for NPR 7120.5 projects as well as for projects not under NPR 7120.5. Furthermore, the trends in stepping from Class A into higher risk posture classifications will be discussed. The talk will conclude with a discussion about risk-based safety and mission assurance at GSFC.

  8. 12 CFR 222.73 - Content, form, and timing of risk-based pricing notices.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 3 2012-01-01 2012-01-01 false Content, form, and timing of risk-based pricing... Pricing § 222.73 Content, form, and timing of risk-based pricing notices. (a) Content of the notice—(1) In general. The risk-based pricing notice required by § 222.72(a) or (c) must include: (i) A statement that...

  9. Probabilistic Structural Health Monitoring of the Orbiter Wing Leading Edge

    NASA Technical Reports Server (NTRS)

    Yap, Keng C.; Macias, Jesus; Kaouk, Mohamed; Gafka, Tammy L.; Kerr, Justin H.

    2011-01-01

    A structural health monitoring (SHM) system can contribute to the risk management of a structure operating under hazardous conditions. An example is the Wing Leading Edge Impact Detection System (WLEIDS) that monitors the debris hazards to the Space Shuttle Orbiter's Reinforced Carbon-Carbon (RCC) panels. Since Return-to-Flight (RTF) after the Columbia accident, WLEIDS was developed and subsequently deployed on board the Orbiter to detect ascent and on-orbit debris impacts, so as to support the assessment of wing leading edge structural integrity prior to Orbiter re-entry. As SHM is inherently an inverse problem, the analyses involved, including those performed for WLEIDS, tend to be associated with significant uncertainty. The use of probabilistic approaches to handle the uncertainty has resulted in the successful implementation of many development and application milestones.

  10. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.
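    To make the trajectory-ensemble idea concrete, here is a minimal sketch (not the paper's implementation) of fitting a maximum entropy distribution over a discrete set of candidate trajectories so that the expected values of two illustrative feature functions match the means of an observed set, and then sampling an ensemble from the fitted model. The trajectory representation, feature functions, and data below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical discrete trajectory space: each candidate is a sequence of
        # heading changes (degrees) over 5 time steps.
        candidates = [rng.uniform(-30, 30, size=5) for _ in range(500)]

        def features(traj):
            # Two illustrative feature functions: mean absolute turn, net heading change.
            return np.array([np.mean(np.abs(traj)), abs(np.sum(traj))])

        F = np.array([features(t) for t in candidates])            # (500, 2)

        # "Observed" trajectories stand in for the training data set.
        observed = [rng.uniform(-10, 10, size=5) for _ in range(100)]
        target = np.array([features(t) for t in observed]).mean(axis=0)

        # Fit weights of the maximum entropy model p(traj) proportional to exp(w . f(traj))
        # so that the model's expected features match the empirical means
        # (gradient ascent on the concave log-likelihood).
        w = np.zeros(2)
        for _ in range(20000):
            logits = F @ w
            p = np.exp(logits - logits.max())
            p /= p.sum()
            grad = target - p @ F          # empirical minus model expectation
            w += 5e-4 * grad

        # Sample an ensemble of trajectories reflecting the learned variability.
        idx = rng.choice(len(candidates), size=10, p=p)
        ensemble = [candidates[i] for i in idx]
        print("model feature expectation:", p @ F, "target:", target)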

  11. How probabilistic risk assessment can mislead terrorism risk analysts.

    PubMed

    Brown, Gerald G; Cox, Louis Anthony Tony

    2011-02-01

    Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems, in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do.

  12. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  13. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
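    The core idea, resolving nondeterministic choices so as to maximize the probability of reaching a target event, can be illustrated with a toy recursion over a small explicit tree. This sketch is not Symbolic PathFinder and involves no symbolic execution or the approximate algorithms described above; the tree and probabilities are invented.

        # Node kinds:
        #   ("target",)                      - the event of interest was reached
        #   ("halt",)                        - terminal state without the event
        #   ("prob", [(p1, child1), ...])    - probabilistic branch (p's sum to 1)
        #   ("choice", [child1, child2])     - nondeterministic branch (scheduler decides)

        def max_reach_prob(node):
            """Return (max probability of reaching the target, scheduler decisions)."""
            kind = node[0]
            if kind == "target":
                return 1.0, []
            if kind == "halt":
                return 0.0, []
            if kind == "prob":
                total, decisions = 0.0, []
                for p, child in node[1]:
                    q, d = max_reach_prob(child)
                    total += p * q
                    decisions += d
                return total, decisions
            # Nondeterministic choice: the synthesized scheduler picks the best branch.
            best = max(((max_reach_prob(c), i) for i, c in enumerate(node[1])),
                       key=lambda t: t[0][0])
            (q, d), i = best
            return q, [i] + d

        tree = ("choice", [
            ("prob", [(0.3, ("target",)), (0.7, ("halt",))]),
            ("prob", [(0.5, ("choice", [("target",), ("halt",)])), (0.5, ("halt",))]),
        ])
        print(max_reach_prob(tree))   # -> (0.5, [1, 0])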

  14. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or in deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  15. Cortical Correspondence with Probabilistic Fiber Connectivity

    PubMed Central

    Oguz, Ipek; Niethammer, Marc; Cates, Josh; Whitaker, Ross; Fletcher, Thomas; Vachet, Clement; Styner, Martin

    2009-01-01

    This paper presents a novel method of optimizing point-based correspondence among populations of human cortical surfaces by combining structural cues with probabilistic connectivity maps. The proposed method establishes a tradeoff between an even sampling of the cortical surfaces (a low surface entropy) and the similarity of corresponding points across the population (a low ensemble entropy). The similarity metric, however, isn’t constrained to be just spatial proximity, but uses local sulcal depth measurements as well as probabilistic connectivity maps, computed from DWI scans via a stochastic tractography algorithm, to enhance the correspondence definition. We propose a novel method for projecting this fiber connectivity information on the cortical surface, using a surface evolution technique. Our cortical correspondence method does not require a spherical parameterization. Experimental results are presented, showing improved correspondence quality demonstrated by a cortical thickness analysis, as compared to correspondence methods using spatial metrics as the sole correspondence criterion. PMID:19694301

  16. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  17. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computation tool for characterizing, by statistical description, the probabilistic structural response to stochastic environments. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is a Fast Probability Integration module by which a cumulative distribution function or a probability density function is calculated.
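    As an illustration of the kind of decomposition performed by NESSUS/PRE, the sketch below decorrelates a small set of correlated random inputs through an eigen (modal) decomposition of an assumed covariance matrix and then generates samples from independent standard normals. The covariance values are invented, and the sketch makes no claim to reproduce the actual NESSUS implementation.

        import numpy as np

        # Hypothetical covariance matrix of three spatially correlated random inputs
        # (e.g., a material property at three neighboring locations).
        cov = np.array([[1.0, 0.6, 0.3],
                        [0.6, 1.0, 0.6],
                        [0.3, 0.6, 1.0]])
        mean = np.array([10.0, 10.0, 10.0])

        # Modal (eigen) decomposition: columns of 'modes' are uncorrelated directions,
        # 'lams' are their variances.
        lams, modes = np.linalg.eigh(cov)

        # Sample independent standard normals in the uncorrelated space, then map back.
        rng = np.random.default_rng(1)
        z = rng.standard_normal((100000, 3))            # uncorrelated variables
        x = mean + (z * np.sqrt(lams)) @ modes.T        # correlated physical variables

        print(np.round(np.cov(x, rowvar=False), 2))     # approximately reproduces 'cov'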

  18. Probabilistic Safety Assessment of Tehran Research Reactor

    SciTech Connect

    Hosseini, Seyed Mohammad Hadi; Nematollahi, Mohammad Reza; Sepanloo, Kamran

    2004-07-01

    Probabilistic Safety Assessment (PSA) application is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this paper the application of Probabilistic Safety Assessment to the Tehran Research Reactor (TRR) is presented. The level 1 PSA application involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from the selected results. Quantification of the constructed models is done using the SAPHIRE software. This study shows that the obtained core damage frequency for the Tehran Research Reactor (8.368E-6 per year) is well within the IAEA criterion for existing nuclear power plants (1E-4 per year). Nevertheless, safety improvement suggestions are offered to reduce the frequency of the most probable accidents. (authors)
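    The following toy calculation illustrates how a level 1 PSA combines an initiating-event frequency with fault-tree failure probabilities, including a dependent (common-cause) term, along one event-tree sequence. All numbers are illustrative assumptions and bear no relation to the TRR model or to SAPHIRE.

        # Toy level 1 PSA quantification for a single accident sequence.
        initiating_event_freq = 1.0e-2        # initiating events per year (assumed)

        # Fault-tree style failure probabilities of the mitigating systems (assumed).
        p_pump_a = 1.0e-2
        p_pump_b = 2.0e-2
        p_common_cause = 1.0e-4               # dependent failure of both pumps
        p_emergency_cooling = 5.0e-3

        # Redundant pumps fail if both fail independently or a common-cause failure occurs.
        p_flow_restoration_fails = p_pump_a * p_pump_b + p_common_cause

        # Sequence leading to core damage: initiator AND flow restoration fails
        # AND emergency cooling fails.
        core_damage_freq = (initiating_event_freq
                            * p_flow_restoration_fails
                            * p_emergency_cooling)
        print(f"core damage frequency ~ {core_damage_freq:.2e} per year")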

  19. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that the risk for the composite built-up structure is also about 0.0001.

  20. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.

  1. Initial guidelines for probabilistic seismic hazard analysis

    SciTech Connect

    Budnitz, R.J.

    1994-10-01

    In the late 1980s, the methodology for performing probabilistic seismic hazard analysis (PSHA) was exercised extensively for eastern-U.S. nuclear power plant sites by the Electric Power Research Institute (EPRI) and Lawrence Livermore National Laboratory (LLNL) under NRC sponsorship. Unfortunately, the seismic-hazard-curve results of these two studies differed substantially for many of the eastern reactor sites, which has motivated all concerned to revisit the approaches taken. This project is that revisitation.

  2. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-DE-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.

  3. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor and an aircraft wing. The results demonstrate that the blade will fail at the highest probability path, the engine two-stage rotor will fail by fracture at the rim and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  4. 76 FR 13902 - Fair Credit Reporting Risk-Based Pricing Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... Commission propose to amend their respective risk-based pricing rules to require disclosure of credit scores and information relating to credit scores in risk-based pricing notices if a credit score of the... pricing provisions on January 15, 2010 (75 FR 2724) (January 2010 Final Rule). The January 2010 Final...

  5. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... insured status terminates, any amount of its prepaid assessment remaining (other than any amounts needed... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based...

  6. 76 FR 39885 - Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-07

    ... SECURITY Coast Guard Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs) AGENCY... Office of Vessel Activities Policy Letter 11-06, Risk-Based Targeting of Foreign Flagged Mobile Offshore Drilling Units (MODUs). This policy letter announces changes to the Coast Guard's system used to...

  7. 12 CFR 956.4 - Risk-based capital requirement for investments.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 7 2010-01-01 2010-01-01 false Risk-based capital requirement for investments. 956.4 Section 956.4 Banks and Banking FEDERAL HOUSING FINANCE BOARD FEDERAL HOME LOAN BANK ASSETS AND OFF-BALANCE SHEET ITEMS FEDERAL HOME LOAN BANK INVESTMENTS § 956.4 Risk-based capital requirement...

  8. 12 CFR 652.75 - Your responsibility for determining the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Your responsibility for determining the risk... Requirements § 652.75 Your responsibility for determining the risk-based capital level. (a) You must determine your risk-based capital level using the procedures in this subpart, appendix A to this subpart, and...

  9. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.
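    A minimal sketch of the underlying probabilistic diagnosis idea, posterior inference of a fault variable from sensor evidence in a tiny hand-built Bayesian network, is given below. The network structure, probabilities, and variable names are invented; this is not the ProDiagnose algorithm, and no arithmetic-circuit compilation is involved.

        # Toy network: Fault -> VoltageLow, Fault -> CurrentHigh (all variables Boolean).
        p_fault = 0.02
        p_vlow_given = {True: 0.90, False: 0.05}    # P(VoltageLow=True | Fault)
        p_ihigh_given = {True: 0.80, False: 0.10}   # P(CurrentHigh=True | Fault)

        def joint(fault, vlow, ihigh):
            p = p_fault if fault else 1 - p_fault
            p *= p_vlow_given[fault] if vlow else 1 - p_vlow_given[fault]
            p *= p_ihigh_given[fault] if ihigh else 1 - p_ihigh_given[fault]
            return p

        def posterior_fault(vlow, ihigh):
            # P(Fault=True | evidence) by enumerating the (tiny) joint distribution.
            num = joint(True, vlow, ihigh)
            den = sum(joint(f, vlow, ihigh) for f in (True, False))
            return num / den

        print(posterior_fault(vlow=True, ihigh=True))    # strong evidence of a fault
        print(posterior_fault(vlow=False, ihigh=False))  # fault very unlikely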

  10. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  11. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability (but not the consequences) of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
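    The flavor of such a modularized probabilistic simulation can be conveyed with a short Monte Carlo sketch that samples the number of impacts over a period, impactor sizes, and whether a populated area is struck, and accumulates a casualty distribution. Every distribution and constant below is an illustrative assumption, not one of the article's calibrated inputs.

        import numpy as np

        rng = np.random.default_rng(42)
        n_trials = 100_000
        horizon_years = 100

        impact_rate_per_year = 5e-3          # damaging NEO impacts per year (assumed)
        casualties = np.zeros(n_trials)

        for i in range(n_trials):
            n_impacts = rng.poisson(impact_rate_per_year * horizon_years)
            total = 0.0
            for _ in range(n_impacts):
                diameter_m = rng.lognormal(mean=np.log(30), sigma=0.8)   # NEO diameter
                energy_scale = (diameter_m / 30.0) ** 3                  # mass ~ d^3
                hits_populated_area = rng.random() < 0.05                # assumed chance
                if hits_populated_area:
                    total += rng.lognormal(np.log(1e3 * energy_scale), 1.0)
            casualties[i] = total

        print("P(any casualties in 100 yr):", np.mean(casualties > 0))
        print("mean casualties:", casualties.mean())
        print("99th percentile:", np.quantile(casualties, 0.99))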

  13. Probabilistic design of advanced composite structure

    NASA Technical Reports Server (NTRS)

    Gray, P. M.; Riskalla, M. G.

    1992-01-01

    Advanced composite technology offers potential for sizable improvements in many areas: weight savings, maintainability, durability, and reliability. However, there are a number of inhibitors to these improvements. One of the biggest inhibitors is the imposition of traditional metallic approaches on the design of composite structure. This is especially detrimental in composites because new materials technology demands new design approaches. Of particular importance are the decisions made regarding structural criteria. Significant changes cannot be implemented without careful consideration and exploration. The new approach is to implement changes on a controlled, verifiable basis. Probabilistic design is the methodology and the process to accomplish this. Its foundation is to base design criteria and objectives on reliability targets instead of arbitrary factors carried over from metallic structural history. The background of probabilistic design is discussed, and the results of a side-by-side comparison of generic aircraft structure designed the 'old' way and the 'new' way are presented. Activities that need to be undertaken to evolve available approaches to probabilistic design are also defined, followed by a summary and recommendations.

  14. From deterministic dynamics to probabilistic descriptions

    PubMed Central

    Misra, B.; Prigogine, I.; Courbage, M.

    1979-01-01

    The present work is devoted to the following question: What is the relationship between the deterministic laws of dynamics and probabilistic description of physical processes? It is generally accepted that probabilistic processes can arise from deterministic dynamics only through a process of “coarse graining” or “contraction of description” that inevitably involves a loss of information. In this work we present an alternative point of view toward the relationship between deterministic dynamics and probabilistic descriptions. Speaking in general terms, we demonstrate the possibility of obtaining (stochastic) Markov processes from deterministic dynamics simply through a “change of representation” that involves no loss of information provided the dynamical system under consideration has a suitably high degree of instability of motion. The fundamental implications of this finding for statistical mechanics and other areas of physics are discussed. From a mathematical point of view, the theory we present is a theory of invertible, positivity-preserving, and necessarily nonunitary similarity transformations that convert the unitary groups associated with deterministic dynamics to contraction semigroups associated with stochastic Markov processes. We explicitly construct such similarity transformations for the so-called Bernoulli systems. This construction illustrates also the construction of the so-called Lyapounov variables and the operator of “internal time,” which play an important role in our approach to the problem of irreversibility. The theory we present can also be viewed as a theory of entropy-increasing evolutions and their relationship to deterministic dynamics. PMID:16592691

  15. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition that probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  16. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, finding out whether a pair of homologous genes are orthologs - stemming from a speciation - or paralogs - stemming from a gene duplication - is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly performed using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the typically strong signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and MrBayesMPR, a probabilistic orthology approach that is based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts. PMID:26130236

  17. Probabilistic drought classification using gamma mixture models

    NASA Astrophysics Data System (ADS)

    Mallya, Ganeshchandra; Tripathi, Shivam; Govindaraju, Rao S.

    2015-07-01

    Drought severity is commonly reported using drought classes obtained by assigning pre-defined thresholds on drought indices. Current drought classification methods ignore modeling uncertainties and provide discrete drought classifications. However, the users of drought classification are often interested in knowing the inherent uncertainties in classification so that they can make informed decisions. Recent studies have used hidden Markov models (HMM) for quantifying uncertainties in drought classification. The HMM method conceptualizes drought classes as distinct hydrological states that are not observed (hidden) but affect observed hydrological variables. The number of drought classes or hidden states in the model is pre-specified, which can sometimes result in a model over-specification problem. This study proposes an alternate method for probabilistic drought classification where the number of states in the model is determined by the data. The proposed method adapts the Standardized Precipitation Index (SPI) methodology of drought classification by employing a gamma mixture model (Gamma-MM) in a Bayesian framework. The method alleviates the problem of choosing a suitable distribution for fitting data in SPI analysis, quantifies modeling uncertainties, and propagates them for probabilistic drought classification. The method is tested on rainfall data over India. Comparison of the results with standard SPI shows important differences, particularly when SPI assumptions on the data distribution are violated. Further, the new method is simpler and more parsimonious than the HMM-based drought classification method and can be a viable alternative for probabilistic drought classification.
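    A compact sketch of the gamma mixture idea follows: a two-component mixture is fit by EM with a moment-matching M-step (a simplification of the paper's Bayesian treatment, which also learns the number of components from the data), and a new observation is then assigned posterior class probabilities. The rainfall data are synthetic.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Synthetic monthly rainfall from a "dry" and a "wet" regime (illustrative only).
        x = np.concatenate([rng.gamma(2.0, 15.0, 300), rng.gamma(8.0, 20.0, 700)])

        # Two-component gamma mixture fit by EM; the M-step uses method-of-moments
        # updates (the exact M-step for the gamma shape needs a digamma equation).
        w = np.array([0.5, 0.5])
        shape = np.array([1.0, 5.0])
        scale = np.array([20.0, 20.0])

        for _ in range(200):
            # E-step: responsibility of each component for each observation.
            dens = np.vstack([w[k] * stats.gamma.pdf(x, shape[k], scale=scale[k])
                              for k in range(2)])
            r = dens / dens.sum(axis=0)
            # M-step: weighted moment matching within each component.
            w = r.mean(axis=1)
            m = (r * x).sum(axis=1) / r.sum(axis=1)
            v = (r * (x - m[:, None]) ** 2).sum(axis=1) / r.sum(axis=1)
            shape, scale = m ** 2 / v, v / m

        # Probabilistic classification of a new monthly total.
        obs = 40.0
        dens_obs = w * stats.gamma.pdf(obs, shape, scale=scale)
        print("P(class k | obs):", dens_obs / dens_obs.sum())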

  18. Risk Management of NASA Projects

    NASA Technical Reports Server (NTRS)

    Sarper, Hueseyin

    1997-01-01

    Various NASA Langley Research Center and other center projects were examined in an attempt to obtain historical data comparing the pre-phase A study and the final outcome for each project. This attempt, however, was abandoned once it became clear that very little documentation was available. Next, an extensive literature search was conducted on the role of risk and reliability concepts in project management. Probabilistic risk assessment (PRA) techniques are being used with increasing regularity both in and outside of NASA. The value and the usage of PRA techniques were reviewed for large projects. It was found that both civilian and military branches of the space industry have traditionally refrained from using PRA, which was developed and expanded by the nuclear industry. Although much has changed with the end of the cold war and the Challenger disaster, it was found that the ingrained anti-PRA culture is hard to overcome. Examples of skepticism against the use of risk management and assessment techniques were found both in the literature and in conversations with some technical staff. Program and project managers need to be convinced that the applicability and use of risk management and risk assessment techniques is much broader than just the traditional safety-related areas of application. The time has come to begin to apply these techniques uniformly. A risk-based approach can maximize the 'return on investment' that the public demands. Also, it would be very useful if all project documents of NASA Langley Research Center, pre-phase A through final report, were carefully stored in a central repository, preferably in electronic format.

  19. An evaluation of the role of risk-based decision-making in a former manufactured gas plant site remediation.

    PubMed

    Vyas, Vikram M; Gochfeld, Michael G; Georgopoulos, Panos G; Lioy, Paul J; Sussman, Nancy R

    2006-02-01

    Environmental remediation decisions are driven by the need to minimize human health and ecological risks posed by environmental releases. The Risk Assessment Guidance for Superfund Sites enunciates the principles of exposure and risk assessment that are to be used for reaching remediation decisions for sites under Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). Experience with remediation management under CERCLA has led to recognition of some crucial infirmities in the processes for managing remediation: cleanup management policies are ad hoc in character, mandates and practices are strongly conservative, and contaminant risk management occurs in an artificially narrow context. The purpose of this case study is to show how a policy of risk-based decision-making was used to avoid customary pitfalls in site remediation. This case study describes the risk-based decision-making process in a remedial action program at a former manufactured gas plant site that successfully achieved timely and effective cleanup. The remediation process operated outside the confines of the CERCLA process under an administrative consent order between the utility and the New Jersey Department of Environmental Protection. A residential use end state was negotiated as part of this agreement. The attendant uncertainties, complications, and unexpected contingencies were overcome by using the likely exposures associated with the desired end state to structure all of the remediation management decisions and by collecting site-specific information from the very outset to obtain a detailed and realistic characterization of human health risks that needed to be mitigated. The lessons from this case study are generalizable to more complicated remediation cases, when supported by correspondingly sophisticated technical approaches. PMID:16570377

  20. Hanford Mission Plan risk-based prioritization methodologies

    SciTech Connect

    Hesser, W.A.; Madden, M.S.; Pyron, N.M.; Butcher, J.L.

    1994-08-01

    Sites across the US Department of Energy (DOE) complex recognize the critical need for a systematic method for prioritizing among their work scope activities. Here at the Hanford Site, Pacific Northwest Laboratory and Westinghouse Hanford Company (WHC) conducted preliminary research into techniques to meet this need and assist managers in making financial resource allocation decisions. This research is a subtask of the risk management task of the Hanford Mission Plan as described in the WHC Integrated Planning Work Breakdown Structure 1.8.2 Fiscal Year 1994 Work Plan. The research team investigated prioritization techniques used at other DOE sites and compared them with the Priority Planning Grid (PPG), a tool used at Hanford. The authors concluded that the PPG could be used for prioritization of resource allocation, but it needed to be revised to better reflect the Site's priorities and objectives. The revised PPG was tested with three Hanford programs, the PPG was modified, and updated procedures were prepared.

  1. Probabilistic structural analysis algorithm development for computational efficiency

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1991-01-01

    The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is the development of the probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient, but approximate in nature. In the last six years, the algorithms have improved very significantly.

  2. Applications of probabilistic peak-shaving technique in generation planning

    SciTech Connect

    Malik, A.S.; Cory, B.J.; Wijayatunga, P.D.C.

    1999-11-01

    This paper presents two novel applications of the probabilistic peak-shaving technique in generation planning: (i) to simulate efficiently and accurately multiple limited-energy units probabilistically in the equivalent load duration curve method, and (ii) to simulate efficiently the candidate plants whose different configurations are tested to find the least-cost generation expansion planning solution. The applications of the technique are demonstrated with the help of two hand-calculation examples. An efficient algorithm is also presented to simulate multiple limited-energy units probabilistically, for different hydrological conditions, in a generation mix of hydro-thermal units within a probabilistic production costing framework.
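    For orientation, the sketch below walks through the standard equivalent load duration curve recursion that such probabilistic production costing builds on: each committed unit's forced outage rate is convolved into the curve, and its expected energy is the availability times the integral of the previous curve over its loading band. The load curve and unit data are assumptions, and this is not the paper's peak-shaving algorithm or its limited-energy treatment.

        import numpy as np

        # Discretized load axis (MW) and an assumed inverted load duration curve:
        # F[x] = fraction of the period during which load exceeds x MW.
        step = 10
        x = np.arange(0, 1001, step)
        F = np.clip(1.2 - x / 700.0, 0.0, 1.0)      # illustrative curve

        period_hours = 8760.0
        units = [(300, 0.05),   # (capacity MW, forced outage rate), in loading order
                 (200, 0.08)]

        loading_point = 0
        for cap, q in units:
            p = 1.0 - q
            # Expected energy: availability times integral of the current curve
            # over this unit's loading band.
            band = (x >= loading_point) & (x < loading_point + cap)
            energy = period_hours * p * np.sum(F[band]) * step
            print(f"unit {cap} MW: expected energy ~ {energy:,.0f} MWh")
            # Convolution: F_new(x) = p * F(x) + q * F(x - cap), with F = 1 for x < 0.
            shifted = np.concatenate([np.ones(cap // step), F[:-(cap // step)]])
            F = p * F + q * shifted
            loading_point += cap

        # Loss-of-load probability after committing both units.
        print("LOLP ~", F[x == loading_point][0])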

  3. Probabilistic alternatives to Bayesianism: the case of explanationism

    PubMed Central

    Douven, Igor; Schupbach, Jonah N.

    2015-01-01

    There has been a probabilistic turn in contemporary cognitive science. Far and away, most of the work in this vein is Bayesian, at least in name. Coinciding with this development, philosophers have increasingly promoted Bayesianism as the best normative account of how humans ought to reason. In this paper, we make a push for exploring the probabilistic terrain outside of Bayesianism. Non-Bayesian, but still probabilistic, theories provide plausible competitors both to descriptive and normative Bayesian accounts. We argue for this general idea via recent work on explanationist models of updating, which are fundamentally probabilistic but assign a substantial, non-Bayesian role to explanatory considerations. PMID:25964769

  4. Application of probabilistic analysis/design methods in space programs - The approaches, the status, and the needs

    NASA Technical Reports Server (NTRS)

    Ryan, Robert S.; Townsend, John S.

    1993-01-01

    The prospective improvement of probabilistic methods for space program analysis/design entails the further development of theories, codes, and tools which match specific areas of application, the drawing of lessons from previous uses of probability and statistics data bases, the enlargement of data bases (especially in the field of structural failures), and the education of engineers and managers on the advantages of these methods. An evaluation is presently made of the current limitations of probabilistic engineering methods. Recommendations are made for specific applications.

  5. Risk-Based Decision Process for Accelerated Closure of a Nuclear Weapons Facility

    SciTech Connect

    Butler, L.; Norland, R. L.; DiSalvo, R.; Anderson, M.

    2003-02-25

    Nearly 40 years of nuclear weapons production at the Rocky Flats Environmental Technology Site (RFETS or Site) resulted in contamination of soil and underground systems and structures with hazardous substances, including plutonium, uranium and hazardous waste constituents. The Site was placed on the National Priority List in 1989. There are more than 370 Individual Hazardous Substance Sites (IHSSs) at RFETS. Accelerated cleanup and closure of RFETS is being achieved through implementation and refinement of a regulatory framework that fosters programmatic and technical innovations: (1) extensive use of "accelerated actions" to remediate IHSSs, (2) development of a risk-based screening process that triggers and helps define the scope of accelerated actions consistent with the final remedial action objectives for the Site, (3) use of field instrumentation for real time data collection, (4) a data management system that renders near real time field data assessment, and (5) a regulatory agency consultative process to facilitate timely decisions. This paper presents the process and interim results for these aspects of the accelerated closure program applied to Environmental Restoration activities at the Site.

  6. Risk based in vitro performance assessment of extended release abuse deterrent formulations.

    PubMed

    Xu, Xiaoming; Gupta, Abhay; Al-Ghabeish, Manar; Calderon, Silvia N; Khan, Mansoor A

    2016-03-16

    High strength extended release opioid products, which are indispensable tools in the management of pain, are associated with serious risks of unintentional and potentially fatal overdose, as well as of misuse and abuse that might lead to addiction. The issue of drug abuse becomes increasingly prominent when the dosage forms can be readily manipulated to release a high amount of opioid or to extract the drug in certain products or solvents. One approach to deter opioid drug abuse is by providing novel abuse deterrent formulations (ADF), with properties that may be viewed as barriers to abuse of the product. However, unlike regular extended release formulations, assessment of ADF technologies is challenging, in part due to the great variety of formulation designs available to achieve deterrence of abuse by oral, parenteral, nasal and respiratory routes. With limited prior history or literature information, and lack of compendial standards, evaluation and regulatory approval of these novel drug products become increasingly difficult. The present article describes a risk-based standardized in-vitro approach that can be utilized in general evaluation of abuse deterrent features for all ADF products. PMID:26784976

  7. A Risk-based Model Predictive Control Approach to Adaptive Interventions in Behavioral Health

    PubMed Central

    Zafra-Cabeza, Ascensión; Rivera, Daniel E.; Collins, Linda M.; Ridao, Miguel A.; Camacho, Eduardo F.

    2010-01-01

    This paper examines how control engineering and risk management techniques can be applied in the field of behavioral health through their use in the design and implementation of adaptive behavioral interventions. Adaptive interventions are gaining increasing acceptance as a means to improve prevention and treatment of chronic, relapsing disorders, such as abuse of alcohol, tobacco, and other drugs, mental illness, and obesity. A risk-based Model Predictive Control (MPC) algorithm is developed for a hypothetical intervention inspired by Fast Track, a real-life program whose long-term goal is the prevention of conduct disorders in at-risk children. The MPC-based algorithm decides on the appropriate frequency of counselor home visits, mentoring sessions, and the availability of after-school recreation activities by relying on a model that includes identifiable risks, their costs, and the cost/benefit assessment of mitigating actions. MPC is particularly suited for the problem because of its constraint-handling capabilities, and its ability to scale to interventions involving multiple tailoring variables. By systematically accounting for risks and adapting treatment components over time, an MPC approach as described in this paper can increase intervention effectiveness and adherence while reducing waste, resulting in advantages over conventional fixed treatment. A series of simulations are conducted under varying conditions to demonstrate the effectiveness of the algorithm. PMID:21643450
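    A minimal receding-horizon sketch of the MPC idea is given below: a scalar risk state evolves with intervention intensity, and at each step short dose sequences are enumerated to minimize a cost that trades residual risk against resource use, with only the first move applied. The dynamics, cost weights, and dose levels are invented and do not represent the Fast Track intervention or the paper's risk-based formulation.

        import itertools

        # Assumed dynamics: risk decays toward a baseline and interventions reduce it.
        def step(risk, dose):
            return max(0.0, 0.9 * risk + 0.5 - 0.8 * dose)   # dose in {0, 1, 2}

        def cost(risk, dose):
            return 1.0 * risk + 0.3 * dose                   # residual risk vs. resources

        def mpc_action(risk, horizon=4, doses=(0, 1, 2)):
            """Enumerate dose sequences over the horizon; return the best first dose."""
            best_seq, best_cost = None, float("inf")
            for seq in itertools.product(doses, repeat=horizon):
                r, c = risk, 0.0
                for d in seq:
                    r = step(r, d)
                    c += cost(r, d)
                if c < best_cost:
                    best_seq, best_cost = seq, c
            return best_seq[0]

        risk = 5.0
        for week in range(8):
            dose = mpc_action(risk)      # receding horizon: apply only the first move
            risk = step(risk, dose)
            print(f"week {week}: dose={dose}, risk={risk:.2f}")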

  8. Health-risk-based groundwater remediation system optimization through clusterwise linear regression.

    PubMed

    He, L; Huang, G H; Lu, H W

    2008-12-15

    This study develops a health-risk-based groundwater management (HRGM) model. The model incorporates the considerations of environmental quality and human health risks into a general framework. To solve the model, a proxy-based optimization approach is proposed, where a semiparametric statistical method (i.e., clusterwise linear regression) is used to create a set of rapid-response and easy-to-use proxy modules for capturing the relations between remediation policies and the resulting human health risks. Through replacing the simulation and health risk assessment modules with the proxy ones, many orders of magnitude of computational cost can be saved. The model solutions reveal that (i) a long remediation period corresponds to a low total pumping rate, (ii) a stringent risk standard implies a high total pumping rate, and (iii) the human health risk associated with benzene would be significantly reduced if it is regarded as constraints of the model. These implications would assist decision makers in understanding the effects of remediation duration and human-health risk level on optimal remediation policies and in designing a robust groundwater remediation system. Results from postoptimization simulation show that the carcinogenic risk would decrease to satisfy the regulated risk standard under the given remediation policies.
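    The clusterwise linear regression step can be sketched as a simple alternating procedure: assign each sample to the line that fits it best, refit each line by least squares, and repeat. The synthetic data below merely stand in for the mapping from remediation policies to predicted risk; this is not the paper's semiparametric implementation.

        import numpy as np

        rng = np.random.default_rng(0)
        # Synthetic data drawn from two linear regimes (illustrative stand-in).
        x = rng.uniform(0, 10, 400)
        regime = (x > 5).astype(int)
        y = np.where(regime == 0, 2.0 + 0.5 * x, 9.0 - 0.9 * x) + rng.normal(0, 0.3, 400)

        X = np.column_stack([np.ones_like(x), x])
        K = 2
        coefs = rng.normal(size=(K, 2))                   # random initial lines

        for _ in range(50):
            # Assignment step: each point goes to the line with the smallest residual.
            resid = np.stack([(y - X @ coefs[k]) ** 2 for k in range(K)])
            assign = resid.argmin(axis=0)
            # Refit step: ordinary least squares within each cluster.
            for k in range(K):
                mask = assign == k
                if mask.sum() >= 2:
                    coefs[k], *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)

        print("fitted lines (intercept, slope):")
        print(np.round(coefs, 2))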

  9. Probabilistic, Seismically-Induced Landslide Hazard Mapping of Western Oregon

    NASA Astrophysics Data System (ADS)

    Olsen, M. J.; Sharifi Mood, M.; Gillins, D. T.; Mahalingam, R.

    2015-12-01

    Earthquake-induced landslides can generate significant damage within urban communities by damaging structures, obstructing lifeline connection routes and utilities, generating various environmental impacts, and possibly resulting in loss of life. Reliable hazard and risk maps are important to assist agencies in efficiently allocating and managing limited resources to prepare for such events. This research presents a new methodology to communicate site-specific landslide hazard assessments in a large-scale, regional map. Implementation of the proposed methodology results in seismically induced landslide hazard maps that depict the probabilities of exceeding landslide displacement thresholds (e.g., 0.1, 0.3, 1.0 and 10 meters). These maps integrate a variety of data sources including recent landslide inventories, LIDAR and photogrammetric topographic data, geologic maps, mapped NEHRP site classifications based on available shear wave velocity data in each geologic unit, and USGS probabilistic seismic hazard curves. Soil strength estimates were obtained by evaluating slopes present along landslide scarps and deposits for major geologic units. Code was then developed to integrate these layers and perform a rigid sliding-block analysis to determine the amount and associated probabilities of displacement based on each bin of peak ground acceleration in the seismic hazard curve at each pixel. The methodology was applied to western Oregon, which contains weak, weathered, and often wet soils on steep slopes. Such conditions have a high landslide hazard even without seismic events. A series of landslide hazard maps highlighting the probabilities of exceeding the aforementioned thresholds was generated for the study area. These output maps were then utilized in a performance-based design framework, enabling them to be analyzed in conjunction with other hazards for fully probabilistic hazard evaluation and risk assessment.
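    The per-pixel bookkeeping behind such a map can be illustrated as a sum over peak ground acceleration (PGA) bins of the bin probability times the conditional probability that sliding-block displacement exceeds a threshold. In the sketch below both the hazard-curve probabilities and the conditional displacement model are invented placeholders; a real analysis would use an empirical Newmark-type regression and site-specific strength data.

        import numpy as np

        # Assumed annual probabilities of PGA falling in each bin (illustrative only).
        pga_bins_g = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])
        p_pga = np.array([2e-2, 8e-3, 3e-3, 1e-3, 4e-4, 1e-4])

        def p_disp_exceeds(threshold_m, pga_g, yield_acc_g=0.15):
            """Hypothetical placeholder for P(displacement > threshold | PGA).

            A real study would substitute an empirical sliding-block regression;
            this logistic-shaped stand-in only illustrates the bookkeeping.
            """
            if pga_g <= yield_acc_g:
                return 0.0
            margin = (pga_g - yield_acc_g) / yield_acc_g
            return 1.0 / (1.0 + np.exp(-(2.0 * margin - 3.0 * threshold_m)))

        for threshold in (0.1, 0.3, 1.0):
            annual_p = sum(p * p_disp_exceeds(threshold, a)
                           for a, p in zip(pga_bins_g, p_pga))
            print(f"annual P(displacement > {threshold} m) ~ {annual_p:.2e}")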

  10. Towards probabilistic forecasts of volcanic ash transport in the atmosphere

    NASA Astrophysics Data System (ADS)

    Zidikheri, M. J.; Dare, R.; Potts, R.; Lucas, C.

    2015-12-01

    Satellite based remote sensing techniques are the primary means of identifying the location of volcanic ash during eruption events. This information is then used to initialize ash dispersion models, which employ meteorological fields to forecast the future locations of ash. Remote sensing of ash in tropical regions is especially challenging due to the difficulty of demarcating ash from the ice and water in convective clouds, which results in frequent missed and false detections of ash. Dispersion models also contain uncertainties that arise from uncertainties in the meteorological fields and in the source term, such as the height of the ash column. Communicating forecasting uncertainties is becoming increasingly important to stakeholders for the purposes of improved risk management. For these reasons the Bureau of Meteorology is engaged in research with the aim of providing probabilistic forecasts of ash in the near future based on ensembles of dispersion model simulations. The ensembles are constructed to reflect two fundamental uncertainties. Firstly, uncertainties in the meteorological fields are incorporated by the use of the Bureau's ensemble model system, which issues a set of 24 meteorological forecasts representing possible states of the atmosphere. Secondly, uncertainties in the model parameters, such as the ash column height, are also incorporated. This is accomplished by running a suite of dispersion model simulations, each with a different value of the model parameter. Pattern correlations are then used to quantify the match between the model and observations. The model parameters which provide the best matches between model and observations are then employed in the ensemble to issue probabilistic forecasts of ash. This process can also ameliorate errors due to incorrect or missing model physics. The efficacy of these techniques will be demonstrated through several case studies.
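    The parameter-selection step can be illustrated as follows: compute a pattern correlation between the observed ash field and simulations run with different candidate column heights, then weight ensemble members by how well they match the observations. The fields and candidate heights below are synthetic stand-ins, not output of an actual dispersion model.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic "observed" ash field and simulations for candidate column heights
        # (purely illustrative stand-ins for satellite retrievals and model output).
        ny, nx = 40, 60
        observed = rng.random((ny, nx))
        candidate_heights_km = [6, 9, 12, 15]
        simulations = {h: observed * (1 - abs(h - 12) / 12) + 0.5 * rng.random((ny, nx))
                       for h in candidate_heights_km}

        def pattern_correlation(a, b):
            a, b = a.ravel() - a.mean(), b.ravel() - b.mean()
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

        scores = {h: pattern_correlation(observed, sim) for h, sim in simulations.items()}
        print("pattern correlations:", {h: round(s, 2) for h, s in scores.items()})

        # Better-matching source parameters contribute more to the forecast ensemble.
        s = np.array([max(scores[h], 0.0) for h in candidate_heights_km])
        weights = s / s.sum()
        print("ensemble weights:", dict(zip(candidate_heights_km, np.round(weights, 2))))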

  11. Developing and evaluating distributions for probabilistic human exposure assessments

    SciTech Connect

    Maddalena, Randy L.; McKone, Thomas E.

    2002-08-01

    This report describes research carried out at the Lawrence Berkeley National Laboratory (LBNL) to assist the U.S. Environmental Protection Agency (EPA) in developing a consistent yet flexible approach for evaluating the inputs to probabilistic risk assessments. The U.S. EPA Office of Emergency and Remedial Response (OERR) recently released Volume 3 Part A of Risk Assessment Guidance for Superfund (RAGS) as an update to the existing two-volume set of RAGS. The update provides policy and technical guidance on performing probabilistic risk assessment (PRA). Consequently, EPA risk managers and decision-makers need to review and evaluate the adequacy of PRAs for supporting regulatory decisions. A critical part of evaluating a PRA is the problem of evaluating or judging the adequacy of its input distributions. Although the overarching theme of this report is the need to improve the ease and consistency of the regulatory review process, the specific objectives are presented in two parts. The objective of Part 1 is to develop a consistent yet flexible process for evaluating distributions in a PRA by identifying the critical attributes of an exposure factor distribution and discussing how these attributes relate to the task-specific adequacy of the input. This objective is carried out with emphasis on the perspective of a risk manager or decision-maker. The proposed evaluation procedure provides consistency to the review process without a loss of flexibility. As a result, the approach described in Part 1 provides an opportunity to apply a single review framework for all EPA regions and yet provide the regional risk manager with the flexibility to deal with site- and case-specific issues in the PRA process. However, as the number of inputs to a PRA increases, so does the complexity of the process for calculating, communicating and managing risk. As a result, there is increasing effort required of both the risk professionals performing the analysis and the risk manager.

  12. Dynamic fluctuations in dopamine efflux in the prefrontal cortex and nucleus accumbens during risk-based decision making.

    PubMed

    St Onge, Jennifer R; Ahn, Soyon; Phillips, Anthony G; Floresco, Stan B

    2012-11-21

    Mesocorticolimbic dopamine (DA) has been implicated in cost/benefit decision making about risks and rewards. The prefrontal cortex (PFC) and nucleus accumbens (NAc) are two DA terminal regions that contribute to decision making in distinct manners. However, how fluctuations of tonic DA levels may relate to different aspects of decision making remains to be determined. The present study measured DA efflux in the PFC and NAc with microdialysis in well trained rats performing a probabilistic discounting task. Selection of a small/certain option always delivered one pellet, whereas another, large/risky option yielded four pellets, with probabilities that decreased (100-12.5%) or increased (12.5-100%) across four blocks of trials. Yoked-reward groups were also included to control for reward delivery. PFC DA efflux during decision making decreased or increased over a session, corresponding to changes in large/risky reward probabilities. Similar profiles were observed from yoked-rewarded rats, suggesting that fluctuations in PFC DA reflect changes in the relative rate of reward received. NAc DA efflux also showed decreasing/increasing trends over the session during both tasks. However, DA efflux was higher during decision making on free- versus forced-choice trials and during periods of greater reward uncertainty. Moreover, changes in NAc DA closely tracked shifts in choice biases. These data reveal dynamic and dissociable fluctuations in PFC and NAc DA transmission associated with different aspects of risk-based decision making. PFC DA may signal changes in reward availability that facilitates modification of choice biases, whereas NAc DA encodes integrated signals about reward rates, uncertainty, and choice, reflecting implementation of decision policies.

  13. Risk-Based Disposal Plan for PCB Paint in the TRA Fluorinel Dissolution Process Mockup and Gamma Facilities Canal

    SciTech Connect

    R. A. Montgomery

    2008-05-01

    This Toxic Substances Control Act Risk-Based Polychlorinated Biphenyl Disposal plan was developed for the Test Reactor Area Fluorinel Dissolution Process Mockup and Gamma Facilities Waste System, located in Building TRA-641 at the Reactor Technology Complex, Idaho National Laboratory Site, to address painted surfaces in the empty canal under 40 CFR 761.62(c) for paint, and under 40 CFR 761.61(c) for PCBs that may have penetrated into the concrete. The canal walls and floor will be painted with two coats of contrasting non-PCB paint and labeled as PCB. The canal is covered with open decking; the access grate is locked shut and signed to indicate PCB contamination in the canal. Access to the canal will require facility manager permission. Protective equipment for personnel and equipment entering the canal will be required. Waste from the canal, generated during ultimate Decontamination and Decommissioning, shall be managed and disposed as PCB Bulk Product Waste.

  14. Risk-based assessment of the surety of information systems

    SciTech Connect

    Jansma, R.M.; Fletcher, S.K.; Murphy, M.D.; Lim, J.J.; Wyss, G.D.

    1996-07-01

    When software is used in safety-critical, security-critical, or mission-critical situations, it is imperative to understand and manage the risks involved. A risk assessment methodology and toolset have been developed which are specific to software systems and address a broad range of risks including security, safety, and correct operation. A unique aspect of this methodology is the use of a modeling technique that captures interactions and tradeoffs among risk mitigators. This paper describes the concepts and components of the methodology and presents its application to example systems.

  15. A risk-based approach to robotic mission requirements

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Bourke, Roger D.

    1992-01-01

    A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.

  16. Probabilistic Climate Scenario Information for Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dairaku, K.; Ueno, G.; Takayabu, I.

    2014-12-01

    Climate information and services for Impacts, Adaptation and Vulnerability (IAV) assessments are of great concern. In order to develop probabilistic regional climate information that represents the uncertainty in climate scenario experiments in Japan, we compared physics ensemble experiments using the 60 km global atmospheric model of the Meteorological Research Institute (MRI-AGCM) with multi-model ensemble experiments using the CMIP3 global atmosphere-ocean coupled models under the SRES A1b scenario. The MRI-AGCM shows relatively good skill, particularly in the tropics, for temperature and geopotential height. Variability in surface air temperature of the physics ensemble experiments with MRI-AGCM was within the range of one standard deviation of the CMIP3 models in the Asia region. On the other hand, the variability of precipitation was relatively well represented compared with the variation of the CMIP3 models. Models that show similar reproducibility of the present climate show different future climate changes; we could not find clear relationships between present climate and future climate change in temperature and precipitation. We develop a new method to produce probabilistic information on climate change scenarios by weighting model ensemble experiments based on a regression model (Krishnamurti et al., Science, 1999). The method is easily applicable to other regions and other physical quantities, and can also be used to downscale to finer scales, depending on the availability of observational datasets. The prototype of probabilistic information for Japan represents the quantified structural uncertainties of multi-model ensemble experiments of climate change scenarios. Acknowledgments: This study was supported by the SOUSEI Program, funded by the Ministry of Education, Culture, Sports, Science and Technology, Government of Japan.
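
    The regression-based weighting step can be illustrated with a minimal sketch: ensemble members are regressed against observations over a training period, in the spirit of Krishnamurti et al. (1999), and the fitted weights are applied to an independent period. All data below are synthetic and the variable names are illustrative; this is not the MRI-AGCM/CMIP3 processing itself.

    ```python
    import numpy as np

    # Superensemble-style weighting sketch with synthetic data.
    rng = np.random.default_rng(0)
    n_time, n_models = 120, 8
    models = rng.normal(size=(n_time, n_models))                  # model anomalies (synthetic)
    obs = models @ rng.uniform(0, 1, n_models) + 0.3 * rng.normal(size=n_time)

    train, test = slice(0, 90), slice(90, None)
    X = models[train] - models[train].mean(axis=0)
    y = obs[train] - obs[train].mean()
    weights, *_ = np.linalg.lstsq(X, y, rcond=None)               # per-model regression weights

    forecast = obs[train].mean() + (models[test] - models[train].mean(axis=0)) @ weights
    rmse_weighted = np.sqrt(np.mean((forecast - obs[test]) ** 2))
    rmse_equal = np.sqrt(np.mean((models[test].mean(axis=1) - obs[test]) ** 2))
    print(rmse_weighted, rmse_equal)                              # weighted vs. equal-weight mean
    ```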

  17. Predicting the onset of psychosis in patients at clinical high risk: practical guide to probabilistic prognostic reasoning.

    PubMed

    Fusar-Poli, P; Schultze-Lutter, F

    2016-02-01

    Prediction of psychosis in patients at clinical high risk (CHR) has become a mainstream focus of clinical and research interest worldwide. When using CHR instruments for clinical purposes, the predicted outcome is only a probability; consequently, any therapeutic action following the assessment is based on probabilistic prognostic reasoning. Yet, probabilistic reasoning makes considerable demands on clinicians. We provide here a scholarly practical guide summarising the key concepts to support clinicians with probabilistic prognostic reasoning in the CHR state. We review risk or cumulative incidence of psychosis, person-time rate of psychosis, Kaplan-Meier estimates of psychosis risk, measures of prognostic accuracy, sensitivity and specificity in receiver operating characteristic curves, positive and negative predictive values, Bayes' theorem, likelihood ratios, and the potentials and limits of real-life applications of prognostic probabilistic reasoning in the CHR state. Understanding the basic measures used for prognostic probabilistic reasoning is a prerequisite for successfully implementing the early detection and prevention of psychosis in clinical practice. Future refinement of these measures for CHR patients may actually influence risk management, especially as regards initiating or withholding treatment.
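
    A minimal sketch of two of the measures reviewed above, using illustrative numbers rather than CHR estimates from the paper: the positive predictive value follows from Bayes' theorem applied to sensitivity, specificity and prevalence, and likelihood ratios follow from sensitivity and specificity alone.

    ```python
    # Illustrative prognostic quantities; the inputs below are made up.
    def positive_predictive_value(sens, spec, prevalence):
        """Bayes' theorem: P(transition to psychosis | positive CHR test)."""
        true_pos = sens * prevalence
        false_pos = (1 - spec) * (1 - prevalence)
        return true_pos / (true_pos + false_pos)

    def likelihood_ratios(sens, spec):
        """Positive and negative likelihood ratios."""
        return sens / (1 - spec), (1 - sens) / spec

    print(positive_predictive_value(sens=0.80, spec=0.70, prevalence=0.20))
    print(likelihood_ratios(sens=0.80, spec=0.70))
    ```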

  18. Review of the probabilistic failure analysis methodology and other probabilistic approaches for application in aerospace structural design

    NASA Technical Reports Server (NTRS)

    Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.

    1993-01-01

    Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus, costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.

  19. Subcortical structure segmentation using probabilistic atlas priors

    NASA Astrophysics Data System (ADS)

    Gouttard, Sylvain; Styner, Martin; Joshi, Sarang; Smith, Rachel G.; Cody Hazlett, Heather; Gerig, Guido

    2007-03-01

    The segmentation of the subcortical structures of the brain is required for many forms of quantitative neuroanatomic analysis. The volumetric and shape parameters of structures such as the lateral ventricles, putamen, caudate, hippocampus, pallidus and amygdala are employed to characterize a disease or its evolution. This paper presents a fully automatic segmentation of these structures via non-rigid registration of a probabilistic atlas prior, along with a comprehensive validation. Our approach is based on an unbiased diffeomorphic atlas with probabilistic spatial priors built from a training set of MR images with corresponding manual segmentations. The atlas building computes an average image along with transformation fields mapping each training case to the average image. These transformation fields are applied to the manually segmented structures of each case in order to obtain a probabilistic map on the atlas. When applying the atlas for automatic structural segmentation, an MR image is first intensity-inhomogeneity corrected, skull stripped and intensity calibrated to the atlas. Then the atlas image is registered to the image using an affine followed by a deformable registration matching the gray-level intensity. Finally, the registration transformation is applied to the probabilistic maps of each structure, which are then thresholded at 0.5 probability. Using manual segmentations for comparison, measures of volumetric differences show high correlation with our results. Furthermore, the Dice coefficient, which quantifies the volumetric overlap, is higher than 62% for all structures and is close to 80% for the basal ganglia. The intraclass correlation coefficient computed on these same datasets shows a good inter-method correlation of the volumetric measurements. Using a dataset of a single patient scanned 10 times on 5 different scanners, reliability is shown with a coefficient of variation of less than 2 percent over the whole dataset. Overall, these validation

  20. Analytic gain in probabilistic decompression sickness models.

    PubMed

    Howle, Laurens E

    2013-11-01

    Decompression sickness (DCS) is a disease known to be related to inert gas bubble formation originating from gases dissolved in body tissues. Probabilistic DCS models, which employ survival and hazard functions, are optimized by fitting model parameters to experimental dive data. In the work reported here, I develop methods to find the survival function gain parameter analytically, thus removing it from the fitting process. I show that the number of iterations required for model optimization is significantly reduced. The analytic gain method substantially improves the condition number of the Hessian matrix which reduces the model confidence intervals by more than an order of magnitude. PMID:24209920

  1. Probabilistic Algorithm for Sampler Siting (PASS)

    2007-05-29

    PASS (Probabilistic Approach to Sampler Siting) optimizes the placement of samplers in buildings. The program exhaustively checks every sampler network that can be formed, evaluating against user-supplied simulations of the possible release scenarios. The program identifies the networks that maximize the probability of detecting a release from among the suite of user-supplied scenarios. The user may specify how many networks to report, in order to provide a number of choices in cases where many networks have very similar behavior.
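
    The exhaustive evaluation can be sketched as a brute-force search over candidate sampler networks scored by the fraction of release scenarios detected. The detection matrix below is random and purely illustrative; in PASS it would come from the user-supplied release simulations.

    ```python
    # Sketch of exhaustive sampler-network scoring with synthetic detection data.
    from itertools import combinations
    import random

    random.seed(1)
    n_scenarios, n_locations, network_size = 50, 10, 3
    # detect[i][j] is True if a sampler at location j would detect release scenario i.
    detect = [[random.random() < 0.3 for _ in range(n_locations)] for _ in range(n_scenarios)]

    def detection_probability(network):
        hits = sum(any(detect[i][j] for j in network) for i in range(n_scenarios))
        return hits / n_scenarios

    ranked = sorted(combinations(range(n_locations), network_size),
                    key=detection_probability, reverse=True)
    for network in ranked[:3]:                      # report the top few networks
        print(network, detection_probability(network))
    ```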

  2. Probabilistic computer model of optimal runway turnoffs

    NASA Technical Reports Server (NTRS)

    Schoen, M. L.; Preston, O. W.; Summers, L. G.; Nelson, B. A.; Vanderlinden, L.; Mcreynolds, M. C.

    1985-01-01

    Landing delays are currently a problem at major air carrier airports and many forecasters agree that airport congestion will get worse by the end of the century. It is anticipated that some types of delays can be reduced by an efficient optimal runway exit system allowing the increased approach volumes necessary at congested airports. A computerized Probabilistic Runway Turnoff Model which locates exits and defines path geometry for a selected maximum occupancy time appropriate for each TERPS aircraft category is defined. The model includes an algorithm for lateral ride comfort limits.

  3. Automatic probabilistic knowledge acquisition from data

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1986-01-01

    A computer program for extracting significant correlations of attributes from masses of data is outlined. This information can then be used to develop a knowledge base for a probabilistic expert system. The method determines the best estimate of joint probabilities of attributes from data put into contingency table form. A major output from the program is a general formula for calculating any probability relation associated with the data. These probability relations can be utilized to form IF-THEN rules with associated probability, useful for expert systems.
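
    A minimal sketch of turning a contingency table into joint probabilities and an IF-THEN rule probability, assuming a simple 2x2 table with illustrative counts (not the program's actual input format):

    ```python
    # Rows: attribute A present/absent; columns: attribute B present/absent.
    counts = [[30, 10],
              [5, 55]]
    total = sum(sum(row) for row in counts)

    p_joint = [[c / total for c in row] for row in counts]   # joint P(A, B)
    p_a = p_joint[0][0] + p_joint[0][1]                      # marginal P(A)
    p_b_given_a = p_joint[0][0] / p_a                        # conditional P(B | A)
    print(f"IF A THEN B (p = {p_b_given_a:.2f})")            # rule with associated probability
    ```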

  4. A periodic probabilistic photonic cluster state generator

    NASA Astrophysics Data System (ADS)

    Fanto, Michael L.; Smith, A. Matthew; Alsing, Paul M.; Tison, Christopher C.; Preble, Stefan F.; Lott, Gordon E.; Osman, Joseph M.; Szep, Attila; Kim, Richard S.

    2014-10-01

    The research detailed in this paper describes a Periodic Cluster State Generator (PCSG) consisting of a monolithic integrated waveguide device that employs four wave mixing, an array of probabilistic photon guns, single mode sequential entanglers and an array of controllable entangling gates between modes to create arbitrary cluster states. Utilizing the PCSG one is able to produce a cluster state with nearest neighbor entanglement in the form of a linear or square lattice. Cluster state resources of this type have been proven to be able to perform universal quantum computation.

  5. Probabilistic remote state preparation by W states

    NASA Astrophysics Data System (ADS)

    Liu, Jin-Ming; Wang, Yu-Zhu

    2004-02-01

    In this paper we consider a scheme for probabilistic remote state preparation of a general qubit by using W states. The scheme consists of the sender, Alice and two remote receivers Bob and Carol. Alice performs a projective measurement on her qubit in the basis spanned by the state she wants to prepare and its orthocomplement. This allows either Bob or Carol to reconstruct the state with finite success probability. It is shown that for some special ensembles of qubits, the remote state preparation scheme requires only two classical bits, unlike the case in the scheme of quantum teleportation where three classical bits are needed.

  7. Probabilistic analysis of fires in nuclear plants

    SciTech Connect

    Unione, A.; Teichmann, T.

    1985-01-01

    The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) which maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependences) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs.

  8. Ensemble postprocessing for probabilistic quantitative precipitation forecasts

    NASA Astrophysics Data System (ADS)

    Bentzien, S.; Friederichs, P.

    2012-12-01

    Precipitation is one of the most difficult weather variables to predict in hydrometeorological applications. In order to assess the uncertainty inherent in deterministic numerical weather prediction (NWP), meteorological services around the globe develop ensemble prediction systems (EPS) based on high-resolution NWP systems. With non-hydrostatic model dynamics and without parameterization of deep moist convection, high-resolution NWP models are able to describe convective processes in more detail and provide more realistic mesoscale structures. However, precipitation forecasts are still affected by displacement errors, systematic biases and fast error growth on small scales. Probabilistic guidance can be achieved from an ensemble setup which accounts for model error and uncertainty of initial and boundary conditions. The German Meteorological Service (Deutscher Wetterdienst, DWD) provides such an ensemble system based on the German-focused limited-area model COSMO-DE. With a horizontal grid-spacing of 2.8 km, COSMO-DE is the convection-permitting high-resolution part of the operational model chain at DWD. The COSMO-DE-EPS consists of 20 realizations of COSMO-DE, driven by initial and boundary conditions derived from 4 global models and 5 perturbations of model physics. Ensemble systems like COSMO-DE-EPS are often limited with respect to ensemble size due to the immense computational costs. As a consequence, they can be biased and exhibit insufficient ensemble spread, and probabilistic forecasts may not be well calibrated. In this study, probabilistic quantitative precipitation forecasts are derived from COSMO-DE-EPS and evaluated at more than 1000 rain gauges located all over Germany. COSMO-DE-EPS is a frequently updated ensemble system, initialized 8 times a day. We use the time-lagged approach to inexpensively increase ensemble spread, which results in more reliable forecasts especially for extreme precipitation events. Moreover, we will show that statistical
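
    The time-lagged pooling idea can be sketched as follows, assuming two 20-member runs at one grid point with synthetic rainfall amounts; the exceedance probability is simply the fraction of pooled members above a threshold.

    ```python
    import numpy as np

    # Time-lagged ensemble sketch at a single grid point (synthetic rainfall, mm / 6 h).
    rng = np.random.default_rng(4)
    latest_run = rng.gamma(shape=0.8, scale=3.0, size=20)   # newest 20-member run
    lagged_run = rng.gamma(shape=0.8, scale=3.0, size=20)   # members from the previous start time

    pooled = np.concatenate([latest_run, lagged_run])       # time-lagged ensemble of 40 members
    threshold = 5.0                                         # illustrative threshold
    print("P(precip > 5 mm) =", np.mean(pooled > threshold))
    ```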

  9. Holistic risk-based environmental decision making: a Native perspective.

    PubMed Central

    Arquette, Mary; Cole, Maxine; Cook, Katsi; LaFrance, Brenda; Peters, Margaret; Ransom, James; Sargent, Elvera; Smoke, Vivian; Stairs, Arlene

    2002-01-01

    Native American Nations have become increasingly concerned about the impacts of toxic substances. Although risk assessment and risk management processes have been used by government agencies to help estimate and manage risks associated with exposure to toxicants, these tools have many inadequacies and as a result have not served Native people well. In addition, resources have not always been adequate to address the concerns of Native Nations, and involvement of Native decision makers on a government-to-government basis in discussions regarding risk has only recently become common. Finally, because the definitions of health used by Native people are strikingly different from that of risk assessors, there is also a need to expand current definitions and incorporate traditional knowledge into decision making. Examples are discussed from the First Environment Restoration Initiative, a project that is working to address toxicant issues facing the Mohawk territory of Akwesasne. This project is developing a community-defined model in which health is protected at the same time that traditional cultural practices, which have long been the key to individual and community health, are maintained and restored. PMID:11929736

  10. The probabilistic cell: implementation of a probabilistic inference by the biochemical mechanisms of phototransduction.

    PubMed

    Houillon, Audrey; Bessière, Pierre; Droulez, Jacques

    2010-09-01

    When we perceive the external world, our brain has to deal with the incompleteness and uncertainty associated with sensory inputs, memory and prior knowledge. In theoretical neuroscience probabilistic approaches have received a growing interest recently, as they account for the ability to reason with incomplete knowledge and to efficiently describe perceptive and behavioral tasks. How can the probability distributions that need to be estimated in these models be represented and processed in the brain, in particular at the single cell level? We consider the basic function carried out by photoreceptor cells which consists in detecting the presence or absence of light. We give a system-level understanding of the process of phototransduction based on a bayesian formalism: we show that the process of phototransduction is equivalent to a temporal probabilistic inference in a Hidden Markov Model (HMM), for estimating the presence or absence of light. Thus, the biochemical mechanisms of phototransduction underlie the estimation of the current state probability distribution of the presence of light. A classical descriptive model describes the interactions between the different molecular messengers, ions, enzymes and channel proteins occurring within the photoreceptor by a set of nonlinear coupled differential equations. In contrast, the probabilistic HMM model is described by a discrete recurrence equation. It appears that the binary HMM has a general solution in the case of constant input. This allows a detailed analysis of the dynamics of the system. The biochemical system and the HMM behave similarly under steady-state conditions. Consequently a formal equivalence can be found between the biochemical system and the HMM. Numerical simulations further extend the results to the dynamic case and to noisy input. All in all, we have derived a probabilistic model equivalent to a classical descriptive model of phototransduction, which has the additional advantage of assigning a
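
    A minimal sketch of the binary HMM filter described above, assuming illustrative transition and emission probabilities: the belief that light is present is propagated through the two-state Markov chain and updated with each noisy photon observation.

    ```python
    # Two-state HMM filtering sketch; all probabilities are illustrative.
    p_stay = 0.95                      # P(state unchanged between time steps)
    p_obs_given_light = 0.8            # P(photon event | light present)
    p_obs_given_dark = 0.1             # P(photon event | light absent)

    def filter_step(belief, observed):
        # Prediction through the two-state Markov chain
        pred = p_stay * belief + (1 - p_stay) * (1 - belief)
        # Bayesian update with the current observation
        like_on = p_obs_given_light if observed else 1 - p_obs_given_light
        like_off = p_obs_given_dark if observed else 1 - p_obs_given_dark
        return like_on * pred / (like_on * pred + like_off * (1 - pred))

    belief = 0.5                       # prior P(light present)
    for obs in [0, 1, 1, 1, 0, 1, 1]:
        belief = filter_step(belief, obs)
        print(round(belief, 3))
    ```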

  11. 12 CFR 652.80 - When you must determine the risk-based capital level.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... activity that could have a significant effect on capital, you must determine a pro forma risk-based capital level, which must include the new business activity, and report this pro forma determination to...

  12. 12 CFR 652.80 - When you must determine the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... activity that could have a significant effect on capital, you must determine a pro forma risk-based capital level, which must include the new business activity, and report this pro forma determination to...

  13. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time performance.

  14. 12 CFR 956.4 - Risk-based capital requirement for investments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... OFF-BALANCE SHEET ITEMS FEDERAL HOME LOAN BANK INVESTMENTS § 956.4 Risk-based capital requirement for... below the second highest credit rating, in an amount equal to or greater than the outstanding balance...

  15. Enhancement of the Probabilistic CEramic Matrix Composite ANalyzer (PCEMCAN) Computer Code

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2000-01-01

    This report is the final technical report for Order No. C-78019-J, entitled "Enhancement of the Probabilistic Ceramic Matrix Composite Analyzer (PCEMCAN) Computer Code." The scope of the enhancement is the probabilistic evaluation of the D-Matrix terms in the MAT2 and MAT9 material property cards (available in the CEMCAN code) for MSC/NASTRAN. Technical activities performed during the period of June 1, 1999 through September 3, 1999 are summarized, and the final version of the enhanced PCEMCAN code and the revisions to the User's Manual are delivered with this report. The performed activities were discussed with the NASA Project Manager during the performance period. The enhanced capabilities have been demonstrated using sample problems.

  16. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Stress Test

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-Based Capital Stress Test A Appendix A to... Appendix A to Subpart B of Part 652— Risk-Based Capital Stress Test 2.0 Credit Risk. 2.1 Loss-Frequency and... Volume. 2.5 Calculation of Loss Rates for Use in the Stress Test for All Types of Loans, Except...

  17. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Stress Test

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-Based Capital Stress Test A Appendix A to... Appendix A to Subpart B of Part 652— Risk-Based Capital Stress Test 2.0 Credit Risk. 2.1 Loss-Frequency and... Volume. 2.5 Calculation of Loss Rates for Use in the Stress Test for All Types of Loans, Except...

  18. 12 CFR Appendix A to Subpart B of... - Risk-Based Capital Stress Test

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-Based Capital Stress Test A Appendix A to... Appendix A to Subpart B of Part 652— Risk-Based Capital Stress Test 2.0 Credit Risk. 2.1 Loss-Frequency and... Volume. 2.5 Calculation of Loss Rates for Use in the Stress Test for All Types of Loans, Except...

  19. Modelling structured data with Probabilistic Graphical Models

    NASA Astrophysics Data System (ADS)

    Forbes, F.

    2016-05-01

    Most clustering and classification methods are based on the assumption that the objects to be clustered are independent. However, in more and more modern applications, data are structured in a way that makes this assumption not realistic and potentially misleading. A typical example that can be viewed as a clustering task is image segmentation where the objects are the pixels on a regular grid and depend on neighbouring pixels on this grid. Also, when data are geographically located, it is of interest to cluster data with an underlying dependence structure accounting for some spatial localisation. These spatial interactions can be naturally encoded via a graph not necessarily regular as a grid. Data sets can then be modelled via Markov random fields and mixture models (e.g. the so-called MRF and Hidden MRF). More generally, probabilistic graphical models are tools that can be used to represent and manipulate data in a structured way while modeling uncertainty. This chapter introduces the basic concepts. The two main classes of probabilistic graphical models are considered: Bayesian networks and Markov networks. The key concept of conditional independence and its link to Markov properties is presented. The main problems that can be solved with such tools are described. Some illustrations are given associated with some practical work.
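
    A toy example of the factorisation implied by conditional independence in a small Bayesian network, with illustrative probabilities: two variables S and R are conditionally independent given C, so the joint distribution is P(C) P(S|C) P(R|C), and marginals follow by summing out variables.

    ```python
    # Tiny Bayesian-network sketch: C -> S and C -> R, with S and R independent given C.
    p_c = {True: 0.5, False: 0.5}
    p_s_given_c = {True: 0.1, False: 0.5}      # P(S=True | C)
    p_r_given_c = {True: 0.8, False: 0.2}      # P(R=True | C)

    def joint(c, s, r):
        ps = p_s_given_c[c] if s else 1 - p_s_given_c[c]
        pr = p_r_given_c[c] if r else 1 - p_r_given_c[c]
        return p_c[c] * ps * pr               # P(C) * P(S|C) * P(R|C)

    # Marginal P(R=True) by summing out C and S
    p_r = sum(joint(c, s, True) for c in (True, False) for s in (True, False))
    print(p_r)
    ```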

  20. Spatial planning using probabilistic flood maps

    NASA Astrophysics Data System (ADS)

    Alfonso, Leonardo; Mukolwe, Micah; Di Baldassarre, Giuliano

    2015-04-01

    Probabilistic flood maps account for uncertainty in flood inundation modelling and convey a degree of certainty in the outputs. Major sources of uncertainty include input data, topographic data, model structure, observation data and parametric uncertainty. Decision makers prefer less ambiguous information from modellers; this implies that uncertainty is suppressed to yield binary flood maps. However, suppressing information may lead to surprises or misleading decisions. Including uncertain information in the decision-making process is therefore desirable and more transparent. To this end, we utilise prospect theory and information from a probabilistic flood map to evaluate potential decisions. Consequences related to the decisions were evaluated using flood risk analysis. Prospect theory explains how choices are made given options for which probabilities of occurrence are known, and accounts for decision makers' characteristics such as loss aversion and risk seeking. Our results show that decision making is pronounced when there are high gains and losses, implying higher payoffs and penalties and therefore a higher gamble. Thus the methodology may be appropriately considered when making decisions based on uncertain information.
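
    A hedged sketch of how a prospect-theory valuation could be applied to one decision informed by a probabilistic flood map. The value and weighting functions use the classic Tversky-Kahneman parameter estimates purely for illustration; the costs, flood probability, and decision options are made up.

    ```python
    # Prospect-theory sketch for a single mitigation decision (all numbers illustrative).
    def value(x, alpha=0.88, lam=2.25):
        """Asymmetric value function: losses loom larger than gains."""
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha

    def weight(p, gamma=0.61):
        """Probability weighting: small probabilities are overweighted."""
        return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

    p_flood = 0.12                      # taken from the probabilistic flood map
    loss_if_flood = -100.0              # damage without mitigation
    mitigation_cost = -20.0             # certain cost of protecting the area

    do_nothing = weight(p_flood) * value(loss_if_flood)
    mitigate = value(mitigation_cost)
    print("do nothing:", round(do_nothing, 1), "mitigate:", round(mitigate, 1))
    ```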

  1. Probabilistic deployment for multiple sensor systems

    NASA Astrophysics Data System (ADS)

    Qian, Ming; Ferrari, Silvia

    2005-05-01

    The performance of many multi-sensor systems can be significantly improved by using a priori environmental information and sensor data to plan the movements of sensor platforms that are later deployed with the purpose of improving the quality of the final detection and classification results. However, existing path planning algorithms and ad-hoc data processing (e.g., fusion) techniques do not allow for the systematic treatment of multiple and heterogeneous sensors and their platforms. This paper presents a method that combines Bayesian network inference with probabilistic roadmap (PRM) planners to utilize the information obtained by different sensors and their level of uncertainty. The uncertainty of prior sensed information is represented by entropy values obtained from the Bayesian network (BN) models of the respective sensor measurement processes. The PRM algorithm is modified to utilize the entropy distribution in optimizing the path of posterior sensor platforms that have the following objectives: (1) improve the quality of the sensed information, i.e., through fusion, (2) minimize the distance traveled by the platforms, and (3) avoid obstacles. This so-called Probabilistic Deployment (PD) method is applied to a demining system comprised of ground-penetrating radars (GPR), electromagnetic (EMI), and infrared sensors (IR) installed on ground platforms, to detect and classify buried mines. Numerical simulations show that PD is more efficient than path planning techniques that do not utilize a priori information, such as complete coverage, random coverage method, or PRM methods that do not utilize Bayesian inference.

  2. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right below the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10 percent at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivities results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  3. Probabilistic Analysis of a Composite Crew Module

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.
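
    The Monte Carlo step can be sketched as follows, with assumed distributions and coefficients of variation that are illustrative only (the numbers here will not reproduce the CCM's 10^-11 result):

    ```python
    import numpy as np

    # Monte Carlo estimate of P(failure index > 1) with assumed input distributions.
    rng = np.random.default_rng(42)
    n = 200_000
    load = rng.normal(loc=1.0, scale=0.10, size=n)       # load scale factor (assumed)
    strength = rng.normal(loc=1.4, scale=0.10, size=n)   # strength allowable (assumed)

    failure_index = load / strength                      # failure when index exceeds 1
    p_failure = np.mean(failure_index > 1.0)
    print("estimated probability of first-ply failure:", p_failure)
    ```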

  4. Damage identification with probabilistic neural networks

    SciTech Connect

    Klenke, S.E.; Paez, T.L.

    1995-12-01

    This paper investigates the use of artificial neural networks (ANNs) to identify damage in mechanical systems. Two probabilistic neural networks (PNNs) are developed and used to judge whether or not damage has occurred in a specific mechanical system, based on experimental measurements. The first PNN is a classical type that casts Bayesian decision analysis into an ANN framework; it uses exemplars measured from the undamaged and damaged system to establish whether system response measurements of unknown origin come from the former class (undamaged) or the latter class (damaged). The second PNN establishes the character of the undamaged system in terms of a kernel density estimator of measures of system response; when presented with system response measures of unknown origin, it makes a probabilistic judgment whether or not the data come from the undamaged population. The physical system used to carry out the experiments is an aerospace system component, and the environment used to excite the system is a stationary random vibration. The results of damage identification experiments are presented along with conclusions rating the effectiveness of the approaches.
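
    A minimal sketch of the second PNN idea, assuming synthetic two-dimensional response features: a kernel density estimate is built from undamaged-system measurements, and a new measurement whose density falls below a low quantile of the training densities is flagged as possibly damaged.

    ```python
    import numpy as np

    # Kernel-density novelty detection sketch with synthetic response features.
    rng = np.random.default_rng(0)
    undamaged = rng.normal(loc=0.0, scale=1.0, size=(200, 2))   # training features

    def kde_density(x, data, bandwidth=0.5):
        # Gaussian kernel density estimate at point x
        d2 = np.sum((data - x) ** 2, axis=1)
        return np.mean(np.exp(-d2 / (2 * bandwidth ** 2))) / (2 * np.pi * bandwidth ** 2)

    threshold = np.quantile([kde_density(x, undamaged) for x in undamaged], 0.05)
    new_measurement = np.array([3.0, 2.5])
    print("possible damage:", kde_density(new_measurement, undamaged) < threshold)
    ```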

  5. Variational approach to probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Belytschko, T.; Liu, W. K.; Mani, A.; Besterfield, G.

    1991-01-01

    Probabilistic finite element methods (PFEM), synthesizing the power of finite element methods with second-moment techniques, are formulated for various classes of problems in structural and solid mechanics. Time-invariant random materials, geometric properties and loads are incorporated in terms of their fundamental statistics viz. second-moments. Analogous to the discretization of the displacement field in finite element methods, the random fields are also discretized. Preserving the conceptual simplicity, the response moments are calculated with minimal computations. By incorporating certain computational techniques, these methods are shown to be capable of handling large systems with many sources of uncertainties. By construction, these methods are applicable when the scale of randomness is not very large and when the probabilistic density functions have decaying tails. The accuracy and efficiency of these methods, along with their limitations, are demonstrated by various applications. Results obtained are compared with those of Monte Carlo simulation and it is shown that good accuracy can be obtained for both linear and nonlinear problems. The methods are amenable to implementation in deterministic FEM based computer codes.
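
    The second-moment idea can be illustrated with a first-order propagation through a toy response function, assuming independent inputs and illustrative means and standard deviations (this is a generic first-order second-moment sketch, not the PFEM discretisation itself):

    ```python
    import numpy as np

    def response(E, load):
        return load / E                      # toy displacement-like response

    mean = np.array([200e9, 1e4])            # mean modulus, mean load (illustrative)
    std = np.array([20e9, 2e3])              # standard deviations (illustrative)

    # Finite-difference sensitivities about the mean point
    grad = np.empty(2)
    for i in range(2):
        step = np.zeros(2)
        step[i] = 1e-6 * mean[i]
        grad[i] = (response(*(mean + step)) - response(*(mean - step))) / (2 * step[i])

    mu_r = response(*mean)                   # first-order mean of the response
    var_r = np.sum((grad * std) ** 2)        # first-order variance, independent inputs assumed
    print(mu_r, np.sqrt(var_r))
    ```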

  6. A probabilistic bridge safety evaluation against floods.

    PubMed

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation. PMID:27386269

  8. Probabilistic stellar rotation periods with Gaussian processes

    NASA Astrophysics Data System (ADS)

    Angus, Ruth; Aigrain, Suzanne; Foreman-Mackey, Daniel

    2015-08-01

    Stellar rotation has many applications in the field of exoplanets. High-precision photometry from space-based missions like Kepler and K2 allows us to measure stellar rotation periods directly from light curves. Stellar variability produced by rotation is usually not sinusoidal or perfectly periodic, therefore sine-fitting periodograms are not well suited to rotation period measurement. Autocorrelation functions are often used to extract periodic information from light curves, however uncertainties on rotation periods measured by autocorrelation are difficult to define. A ‘by eye’ check, or a set of heuristic criteria are used to validate measurements and rotation periods are only reported for stars that pass this vetting process. A probabilistic rotation period measurement method, with a suitable generative model bypasses the need for a validation stage and can produce realistic uncertainties. The physics driving the production of variability in stellar light curves is still poorly understood and difficult to model. We therefore use an effective model for stellar variability: a Gaussian process with a quasi-periodic covariance function. By injecting fake signals into Kepler light curves we show that the GP model is well suited to quasi-periodic, non-sinusoidal signals, is capable of modelling noise and physical signals simultaneously and provides probabilistic rotation period measurements with realistic uncertainties.
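
    A sketch of the quasi-periodic covariance function commonly used for this kind of model, with illustrative hyperparameters (not values fitted to Kepler light curves); drawing one sample from the resulting Gaussian process gives a synthetic quasi-periodic light curve.

    ```python
    import numpy as np

    # Quasi-periodic GP covariance: squared-exponential decay times a periodic term.
    def quasi_periodic_kernel(t1, t2, amp=1.0, length=20.0, gamma=1.0, period=10.0):
        dt = np.abs(t1[:, None] - t2[None, :])
        return (amp ** 2
                * np.exp(-dt ** 2 / (2 * length ** 2))
                * np.exp(-gamma * np.sin(np.pi * dt / period) ** 2))

    t = np.linspace(0, 60, 200)                                   # days, illustrative
    cov = quasi_periodic_kernel(t, t) + 1e-6 * np.eye(t.size)     # jitter for stability
    sample_light_curve = np.random.default_rng(1).multivariate_normal(np.zeros(t.size), cov)
    print(sample_light_curve[:5])
    ```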

  9. Probabilistic Dynamic Buckling of Smart Composite Shells

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2007-01-01

    A computational simulation method is presented to evaluate the deterministic and nondeterministic dynamic buckling of smart composite shells. The combined use of intraply hybrid composite mechanics, finite element computer codes, and probabilistic analysis enable the effective assessment of the dynamic buckling load of smart composite shells. A universal plot is generated to estimate the dynamic buckling load of composite shells at various load rates and probabilities. The shell structure is also evaluated with smart fibers embedded in the plies right next to the outer plies. The results show that, on the average, the use of smart fibers improved the shell buckling resistance by about 10% at different probabilities and delayed the buckling occurrence time. The probabilistic sensitivities results indicate that uncertainties in the fiber volume ratio and ply thickness have major effects on the buckling load while uncertainties in the electric field strength and smart material volume fraction have moderate effects. For the specific shell considered in this evaluation, the use of smart composite material is not recommended because the shell buckling resistance can be improved by simply re-arranging the orientation of the outer plies, as shown in the dynamic buckling analysis results presented in this report.

  10. Probabilistic Fatigue Life Analysis of High Density Electronics Packaging

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Kolawa, E. A.; Sutharshana, S.; Newlin, L. E.; Creager, M.

    1996-01-01

    The fatigue of thin film metal interconnections in high density electronics packaging subjected to thermal cycling has been evaluated using a probabilistic fracture mechanics methodology. This probabilistic methodology includes characterization of thin film stress using an experimentally calibrated finite element model and simulation of flaw growth in the thin films using a stochastic crack growth model.

  11. Perception of Speech Reflects Optimal Use of Probabilistic Speech Cues

    ERIC Educational Resources Information Center

    Clayards, Meghan; Tanenhaus, Michael K.; Aslin, Richard N.; Jacobs, Robert A.

    2008-01-01

    Listeners are exquisitely sensitive to fine-grained acoustic detail within phonetic categories for sounds and words. Here we show that this sensitivity is optimal given the probabilistic nature of speech cues. We manipulated the probability distribution of one probabilistic cue, voice onset time (VOT), which differentiates word initial labial…

  12. The Role of Language in Building Probabilistic Thinking

    ERIC Educational Resources Information Center

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  13. A framework for probabilistic atlas-based organ segmentation

    NASA Astrophysics Data System (ADS)

    Dong, Chunhua; Chen, Yen-Wei; Foruzan, Amir Hossein; Han, Xian-Hua; Tateyama, Tomoko; Wu, Xing

    2016-03-01

    Probabilistic atlas based on human anatomical structure has been widely used for organ segmentation. The challenge is how to register the probabilistic atlas to the patient volume. Additionally, there is the disadvantage that the conventional probabilistic atlas may cause a bias toward the specific patient study due to a single reference. Hence, we propose a template matching framework based on an iterative probabilistic atlas for organ segmentation. Firstly, we find a bounding box for the organ based on human anatomical localization. Then, the probabilistic atlas is used as a template to find the organ in this bounding box by using template matching technology. Comparing our method with conventional and recently developed atlas-based methods, our results show an improvement in the segmentation accuracy for multiple organs (p < 0.00001).
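
    The scoring of an atlas-based segmentation can be sketched as thresholding a registered probabilistic map at 0.5 and comparing against a manual mask with the Dice coefficient; the tiny arrays below are synthetic.

    ```python
    import numpy as np

    # Threshold a (registered) probabilistic map and score it with the Dice coefficient.
    prob_map = np.array([[0.1, 0.6, 0.8],
                         [0.2, 0.7, 0.9],
                         [0.0, 0.4, 0.45]])
    manual = np.array([[0, 1, 1],
                       [0, 1, 1],
                       [0, 0, 1]], dtype=bool)

    auto = prob_map >= 0.5
    dice = 2 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())
    print("Dice coefficient:", round(dice, 3))
    ```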

  14. Nuclear power and probabilistic safety assessment (PSA): past through future applications

    NASA Astrophysics Data System (ADS)

    Stamatelatos, M. G.; Moieni, P.; Everline, C. J.

    1995-03-01

    Nuclear power reactor safety in the United States is about to enter a new era -- an era of risk-based management and risk-based regulation. First, there was the age of 'prescribed safety assessment,' during which a series of design-basis accidents in eight categories of severity, or classes, were postulated and analyzed. Toward the end of that era, it was recognized that 'Class 9,' or 'beyond design basis,' accidents would need special attention because of the potentially severe health and financial consequences of these accidents. The accident at Three Mile Island showed that sequences of low-consequence, high-frequency events and human errors can be much more risk dominant than the Class 9 accidents. A different form of safety assessment, PSA, emerged and began to gain ground against the deterministic safety establishment. Eventually, this led to the current regulatory requirements for individual plant examinations (IPEs). The IPEs can serve as a basis for risk-based regulation and management, a concept that may ultimately transform the U.S. regulatory process from its traditional deterministic foundations to a process predicated upon PSA. Beyond the possibility of a regulatory environment predicated upon PSA lies the possibility of using PSA as the foundation for managing daily nuclear power plant operations.

  15. Opportunities of probabilistic flood loss models

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Lüdtke, Stefan; Vogel, Kristin; Merz, Bruno

    2016-04-01

    Oftentimes, traditional uni-variate damage models, such as depth-damage curves, fail to reproduce the variability of observed flood damage. However, reliable flood damage models are a prerequisite for the practical usefulness of model results. Innovative multi-variate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved and thus improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely bagging decision trees and Bayesian networks, with traditional stage-damage functions. For model evaluation we use empirical damage data compiled from computer-aided telephone interviews conducted after the floods of 2002, 2005, 2006 and 2013 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial-transfer context. Flood damage estimation is carried out at the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and sharpness of the predictions, as well as reliability, represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The comparison of the uni-variable stage-damage function and the multi-variable model approaches emphasises the importance of quantifying predictive uncertainty. With each explanatory variable, the multi-variable model reveals an additional source of uncertainty. However, the predictive performance in terms of bias (MBE), accuracy (MAE) and reliability (HR) is clearly improved.
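
    The evaluation criteria can be sketched on a synthetic hold-out set: mean bias and mean absolute error of the point prediction, plus the hit rate of the 5- to 95-quantile predictive interval. The ensemble of relative damage predictions below is simulated and only illustrative.

    ```python
    import numpy as np

    # Sketch of evaluation metrics for probabilistic damage predictions (synthetic data).
    rng = np.random.default_rng(3)
    observed = rng.uniform(0, 1, size=50)                          # observed relative damage
    ensemble = observed[:, None] + rng.normal(0, 0.15, size=(50, 100))

    point = ensemble.mean(axis=1)
    mbe = np.mean(point - observed)                                # mean bias error
    mae = np.mean(np.abs(point - observed))                        # mean absolute error
    lo, hi = np.quantile(ensemble, [0.05, 0.95], axis=1)
    hit_rate = np.mean((observed >= lo) & (observed <= hi))        # reliability of the interval
    print(mbe, mae, hit_rate)
    ```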

  16. Multivariate postprocessing techniques for probabilistic hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Hemri, Stephan; Lisniak, Dmytro; Klein, Bastian

    2016-04-01

    Hydrologic ensemble forecasts driven by atmospheric ensemble prediction systems need statistical postprocessing in order to account for systematic errors in terms of both mean and spread. Runoff is an inherently multivariate process with typical events lasting from hours in case of floods to weeks or even months in case of droughts. This calls for multivariate postprocessing techniques that yield well calibrated forecasts in univariate terms and ensure a realistic temporal dependence structure at the same time. To this end, the univariate ensemble model output statistics (EMOS; Gneiting et al., 2005) postprocessing method is combined with two different copula approaches that ensure multivariate calibration throughout the entire forecast horizon. These approaches comprise ensemble copula coupling (ECC; Schefzik et al., 2013), which preserves the dependence structure of the raw ensemble, and a Gaussian copula approach (GCA; Pinson and Girard, 2012), which estimates the temporal correlations from training observations. Both methods are tested in a case study covering three subcatchments of the river Rhine that represent different sizes and hydrological regimes: the Upper Rhine up to the gauge Maxau, the river Moselle up to the gauge Trier, and the river Lahn up to the gauge Kalkofen. The results indicate that both ECC and GCA are suitable for modelling the temporal dependences of probabilistic hydrologic forecasts (Hemri et al., 2015). References Gneiting, T., A. E. Raftery, A. H. Westveld, and T. Goldman (2005), Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation, Monthly Weather Review, 133(5), 1098-1118, DOI: 10.1175/MWR2904.1. Hemri, S., D. Lisniak, and B. Klein, Multivariate postprocessing techniques for probabilistic hydrological forecasting, Water Resources Research, 51(9), 7436-7451, DOI: 10.1002/2014WR016473. Pinson, P., and R. Girard (2012), Evaluating the quality of scenarios of short-term wind power
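
    A minimal sketch of the ECC step for a single location, assuming a synthetic raw ensemble and already-calibrated marginal samples: the calibrated samples at each lead time are reordered according to the ranks of the raw ensemble, which preserves the raw ensemble's temporal dependence structure.

    ```python
    import numpy as np

    # Ensemble copula coupling (ECC) sketch for one location (synthetic data).
    rng = np.random.default_rng(7)
    raw = rng.gamma(2.0, 2.0, size=(5, 20))              # (lead times, members), raw ensemble
    # Stand-in for calibrated marginal samples, sorted per lead time.
    calibrated = np.sort(rng.normal(raw.mean(axis=1, keepdims=True), 1.0, size=raw.shape), axis=1)

    ranks = raw.argsort(axis=1).argsort(axis=1)          # rank of each raw member per lead time
    ecc_ensemble = np.take_along_axis(calibrated, ranks, axis=1)
    print(ecc_ensemble.shape)                            # calibrated members in raw rank order
    ```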

  17. Probabilistic Flash Flood Forecasting using Stormscale Ensembles

    NASA Astrophysics Data System (ADS)

    Hardy, J.; Gourley, J. J.; Kain, J. S.; Clark, A.; Novak, D.; Hong, Y.

    2013-12-01

    Flash flooding is one of the most costly and deadly natural hazards in the US and across the globe. The loss of life and property from flash floods could be mitigated with better guidance from hydrological models, but these models have limitations. For example, they are commonly initialized using rainfall estimates derived from weather radars, but the time interval between observations of heavy rainfall and a flash flood can be on the order of minutes, particularly for small basins in urban settings. Increasing the lead time for these events is critical for protecting life and property. Therefore, this study advances the use of quantitative precipitation forecasts (QPFs) from a stormscale NWP ensemble system into a distributed hydrological model setting to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Rainfall error characteristics of the individual members are first diagnosed and quantified in terms of structure, amplitude, and location (SAL; Wernli et al., 2008). Amplitude and structure errors are readily correctable due to their diurnal nature, and the fine scales represented by the CAPS QPF members are consistent with radar-observed rainfall, mainly showing larger errors with afternoon convection. To account for the spatial uncertainty of the QPFs, we use an elliptic smoother, as in Marsh et al. (2012), to produce probabilistic QPFs (PQPFs). The elliptic smoother takes into consideration underdispersion, which is notoriously associated with stormscale ensembles, and thus, is good for targeting the approximate regions that may receive heavy rainfall. However, stormscale details contained in individual members are still needed to yield reasonable flash flood simulations. Therefore, on a case study basis, QPFs from individual members are then run through the hydrological model with their predicted structure and corrected amplitudes, but the locations of individual rainfall elements are perturbed within the PQPF elliptical regions using Monte

  18. Risk-Based Ranking Experiences for Cold War Legacy Facilities in the United States

    SciTech Connect

    Droppo, James G.

    2003-05-01

    Over the past two decades, a number of government agencies in the United States have faced increasing public scrutiny for their efforts to address the wide range of potential environmental issues related to Cold War legacies. Risk-based ranking was selected as a means of defining the relative importance of issues. Ambitious facility-wide risk-based ranking applications were undertaken. However, although facility-wide risk-based ranking efforts can build invaluable understanding of the potential issues related to Cold War legacies, conducting such efforts is difficult because of the potentially enormous scope and the potentially strong institutional barriers. The U.S. experience is that such efforts are worth undertaking to start building a knowledge base and infrastructure that are based on a thorough understanding of risk. In both the East and the West, the legacy of the Cold War includes a wide range of potential environmental issues associated with large industrial complexes of weapon production facilities. The responsible agencies or ministries are required to make decisions that could benefit greatly from information on the relative importance of these potential issues. Facility-wide risk-based ranking of potential health and environmental issues is one means to help these decision makers. The initial U.S. risk-based ranking applications described in this chapter were “ground-breaking” in that they defined new methodologies and approaches to meet the challenges. Many of these approaches fit the designation of a population-centred risk assessment. These U.S. activities parallel efforts that are just beginning for similar facilities in the countries of the former Soviet Union. As described below, conducting a facility-wide risk-based ranking has special challenges and potential pitfalls. Little guidance exists to conduct major risk-based rankings. For those considering undertaking such efforts, the material contained in this chapter should be useful

  19. EXAMPLE OF A RISK BASED DISPOSAL APPROVAL SOLIDIFICATION OF HANFORD SITE TRANSURANIC (TRU) WASTE

    SciTech Connect

    PRIGNANO AL

    2007-11-14

    The Hanford Site requested, and the U.S. Environmental Protection Agency (EPA) Region 10 approved, a Toxic Substances Control Act of 1976 (TSCA) risk-based disposal approval (RBDA) for solidifying approximately four cubic meters of waste from a specific area of the K East Basin: the North Loadout Pit (NLOP). The NLOP waste is a highly radioactive sludge that contained polychlorinated biphenyls (PCBs) regulated under TSCA. The prescribed disposal method for liquid PCB waste under TSCA regulations is either thermal treatment or decontamination. Due to the radioactive nature of the waste, however, neither thermal treatment nor decontamination was a viable option. As a result, the proposed treatment consisted of solidifying the material to comply with waste acceptance criteria at the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico, or possibly the Environmental Restoration Disposal Facility at the Hanford Site, depending on the resulting transuranic (TRU) content of the stabilized waste. The RBDA evaluated environmental risks associated with potential airborne PCBs. In addition, the RBDA made use of waste management controls already in place at the treatment unit. The treatment unit, the T Plant Complex, is a Resource Conservation and Recovery Act of 1976 (RCRA)-permitted facility used for storing and treating radioactive waste. The EPA found that the proposed activities did not pose an unreasonable risk to human health or the environment. Treatment took place from October 26, 2005, to June 9, 2006, and 332 208-liter (55-gallon) containers of solidified waste were produced. All treated drums assayed to date are TRU and will be disposed of at WIPP.

  20. A generic risk-based surveying method for invading plant pathogens.

    PubMed

    Parnell, S; Gottwald, T R; Riley, T; van den Bosch, F

    2014-06-01

    Invasive plant pathogens are increasing with international trade and travel, with damaging environmental and economic consequences. Recent examples include tree diseases such as sudden oak death in the Western United States and ash dieback in Europe. To control an invading pathogen it is crucial that newly infected sites are quickly detected so that measures can be implemented to control the epidemic. However, since sampling resources are often limited, not all locations can be inspected and locations must be prioritized for surveying. Existing approaches to achieve this are often species specific and rely on detailed data collection and parameterization, which is difficult, especially when new arrivals are unanticipated. Consequently regulatory sampling responses are often ad hoc and developed without due consideration of epidemiology, leading to the suboptimal deployment of expensive sampling resources. We introduce a flexible risk-based sampling method that is pathogen generic and enables available information to be utilized to develop epidemiologically informed sampling programs for virtually any biologically relevant plant pathogen. By targeting risk we aim to inform sampling schemes that identify high-impact locations that can be subsequently treated in order to reduce inoculum in the landscape. This "damage limitation" is often the initial management objective following the first discovery of a new invader. Risk at each location is determined by the product of the basic reproductive number (R0), as a measure of local epidemic size, and the probability of infection. We illustrate how the risk estimates can be used to prioritize a survey by weighting a random sample so that the highest-risk locations have the highest probability of selection. We demonstrate and test the method using a high-quality spatially and temporally resolved data set on Huanglongbing disease (HLB) in Florida, USA. We show that even when available epidemiological information is relatively
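
    The risk-weighted survey can be sketched as follows, with synthetic values of R0 and infection probability for each location: risk is their product, and survey locations are drawn with selection probability proportional to risk.

    ```python
    import numpy as np

    # Risk-weighted survey sampling sketch (all inputs synthetic).
    rng = np.random.default_rng(11)
    n_locations, n_to_survey = 200, 20
    r0 = rng.uniform(0.5, 3.0, n_locations)            # local epidemic-size proxy (basic reproductive number)
    p_infected = rng.beta(1, 20, n_locations)          # probability that the location is infected

    risk = r0 * p_infected                             # risk = R0 x P(infection)
    weights = risk / risk.sum()
    survey = rng.choice(n_locations, size=n_to_survey, replace=False, p=weights)
    print(sorted(survey))                              # locations selected for inspection
    ```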

  1. Thrombocytosis: Diagnostic Evaluation, Thrombotic Risk Stratification, and Risk-Based Management Strategies

    PubMed Central

    Bleeker, Jonathan S.; Hogan, William J.

    2011-01-01

    Thrombocytosis is a commonly encountered clinical scenario, with a large proportion of cases discovered incidentally. The differential diagnosis for thrombocytosis is broad and the diagnostic process can be challenging. Thrombocytosis can be spurious, attributed to a reactive process or due to clonal disorder. This distinction is important as it carries implications for evaluation, prognosis, and treatment. Clonal thrombocytosis associated with the myeloproliferative neoplasms, especially essential thrombocythemia and polycythemia vera, carries a unique prognostic profile, with a markedly increased risk of thrombosis. This risk is the driving factor behind treatment strategies in these disorders. Clinical trials utilizing targeted therapies in thrombocytosis are ongoing with new therapeutic targets waiting to be explored. This paper will outline the mechanisms underlying thrombocytosis, the diagnostic evaluation of thrombocytosis, complications of thrombocytosis with a special focus on thrombotic risk as well as treatment options for clonal processes leading to thrombocytosis, including essential thrombocythemia and polycythemia vera. PMID:22084665

  2. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    SciTech Connect

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  3. A probabilistic analysis of silicon cost

    NASA Technical Reports Server (NTRS)

    Reiter, L. J.

    1983-01-01

    Silicon materials costs represent both a cost driver and an area where improvement can be made in the manufacture of photovoltaic modules. The cost of silicon from three low-cost production processes being developed under the U.S. Department of Energy's (DOE) National Photovoltaic Program is analyzed. The approach is based on probabilistic inputs and makes use of two models developed at the Jet Propulsion Laboratory: SIMRAND (SIMulation of Research ANd Development) and IPEG (Improved Price Estimating Guidelines). The approach, assumptions, and limitations are detailed along with a verification of the cost analysis methodology. Results, presented in the form of cumulative probability distributions for silicon cost, indicate that there is a 55% chance of reaching the DOE target of $16/kg for silicon material. This is a technically achievable cost based on expert forecasts of the results of ongoing research and development and does not imply a market price for any given year.
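
    The cumulative-probability result can be illustrated with a simple Monte Carlo sketch; the cost model and input distributions below are assumptions for illustration only and are not the SIMRAND/IPEG models.

      import numpy as np

      rng = np.random.default_rng(seed=0)
      n = 100_000

      # Hypothetical uncertain inputs for a silicon production process
      process_yield = rng.triangular(0.80, 0.90, 0.97, n)       # fraction of feedstock recovered
      energy_cost = rng.normal(6.0, 1.0, n)                     # $/kg, energy share
      feedstock_cost = rng.lognormal(np.log(8.0), 0.25, n)      # $/kg, feedstock share

      cost_per_kg = (energy_cost + feedstock_cost) / process_yield

      target = 16.0
      p_meet = np.mean(cost_per_kg <= target)    # point on the cumulative distribution at the target
      print(f"P(cost <= ${target}/kg) = {p_meet:.2f}")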

  4. Retinal blood vessels extraction using probabilistic modelling.

    PubMed

    Kaba, Djibril; Wang, Chuang; Li, Yongmin; Salazar-Gonzalez, Ana; Liu, Xiaohui; Serag, Ahmed

    2014-01-01

    The analysis of retinal blood vessels plays an important role in detecting and treating retinal diseases. In this paper, we present an automated method to segment blood vessels in fundus retinal images. The proposed method could be used to support a non-intrusive diagnosis in modern ophthalmology for early detection of retinal diseases, treatment evaluation or clinical study. This study combines bias correction and adaptive histogram equalisation to enhance the appearance of the blood vessels. Then the blood vessels are extracted using probabilistic modelling that is optimised by the expectation maximisation algorithm. The method is evaluated on fundus retinal images from the STARE and DRIVE datasets. The experimental results are compared with some recently published methods of retinal blood vessel segmentation. The experimental results show that our method achieved the best overall performance, and that it is comparable to the performance of human experts.
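
    The probabilistic-modelling step can be sketched with a two-component Gaussian mixture fitted by expectation maximisation to enhanced pixel intensities; the enhancement and data below are placeholders, not the authors' pipeline.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      # Placeholder for a contrast-enhanced green-channel fundus image
      enhanced = np.random.rand(64, 64)

      pixels = enhanced.reshape(-1, 1)
      gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
      labels = gmm.fit_predict(pixels)          # EM fit plus per-pixel component assignment

      # Treat the darker component as "vessel", since vessels are darker than the background
      vessel_component = int(np.argmin(gmm.means_.ravel()))
      vessel_mask = (labels == vessel_component).reshape(enhanced.shape)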

  5. Social inequalities in probabilistic labor markets

    NASA Astrophysics Data System (ADS)

    Inoue, Jun-Ichi; Chen, He

    2015-03-01

    We discuss social inequalities in labor markets for university graduates in Japan by using the Gini and k-indices. Feature vectors, which specify the abilities of candidates (students), are built into the probabilistic labor market model. Here we systematically examine which selection processes (strategies) used by companies, based on the weighted feature vector of each candidate, induce which types of inequality in the number of informal acceptances, leading to a large mismatch between students and companies. This work was financially supported by Grant-in-Aid for Scientific Research (C) of Japan Society for the Promotion of Science (JSPS) No. 2533027803 and Grant-in-Aid for Scientific Research on Innovative Area No. 2512001313.
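
    As a small illustration of the inequality measure used above, the sketch below computes a Gini coefficient for the number of informal acceptances per student; the counts are invented, not drawn from the authors' market model.

      import numpy as np

      def gini(values):
          """Gini coefficient of a non-negative 1-D array (0 = perfectly equal, 1 = maximally unequal)."""
          x = np.sort(np.asarray(values, dtype=float))
          n = x.size
          return (2.0 * np.sum(np.arange(1, n + 1) * x)) / (n * x.sum()) - (n + 1) / n

      acceptances = [0, 0, 1, 1, 2, 3, 5, 12]    # informal acceptances per student (illustrative)
      print(f"Gini index = {gini(acceptances):.2f}")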

  6. Performing Probabilistic Risk Assessment Through RAVEN

    SciTech Connect

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: (1) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) perform both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  7. Self-insight in probabilistic category learning.

    PubMed

    Kemény, Ferenc; Lukács, Ágnes

    2013-01-01

    The Weather Prediction (WP) task is one of the most extensively used Probabilistic Category Learning tasks. Although it has been usually treated as an implicit task, its implicit nature has been questioned with focus on the structural knowledge of the acquired information. The goal of the current studies is to test if participants acquire explicit knowledge on the WP task. Experiment 1 addresses this question directly with the help of a subjective measure on self-insight in two groups: an experimental group facing the WP task and a control group with a task lacking predictive structure. Participants in the experimental group produced more explicit reports than the control group, and only on trials with explicit knowledge was their performance higher. Experiment 2 provided further evidence against the implicitness of the task by showing that decreasing stimulus presentation times extends the learning process, but does not result in more implicit processes.

  8. Probabilistic risk assessment of disassembly procedures

    SciTech Connect

    O`Brien, D.A.; Bement, T.R.; Letellier, B.C.

    1993-11-01

    The purpose of this report is to describe the use of Probabilistic Risk (Safety) Assessment (PRA or PSA) at a Department of Energy (DOE) facility. PRA is a methodology for (i) identifying combinations of events that, if they occur, lead to accidents, (ii) estimating the frequency of occurrence of each combination of events, and (iii) estimating the consequences of each accident. Specifically, the study focused on evaluating the risks associated with disassembling a hazardous assembly. The PRA for the operation included a detailed evaluation only for those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of establishing a risk-consequence goal for DOE operations.

  9. Efficient Probabilistic Diagnostics for Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Chavira, Mark; Cascio, Keith; Poll, Scott; Darwiche, Adnan; Uckun, Serdar

    2008-01-01

    We consider in this work the probabilistic approach to model-based diagnosis when applied to electrical power systems (EPSs). Our probabilistic approach is formally well-founded, as it is based on Bayesian networks and arithmetic circuits. We investigate the diagnostic task known as fault isolation, and pay special attention to meeting two of the main challenges often associated with real-world application of model-based diagnosis technologies: model development and real-time reasoning. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In essence, we introduce a high-level EPS specification language from which Bayesian networks that can diagnose multiple simultaneous failures are auto-generated, and we illustrate the feasibility of using arithmetic circuits, compiled from Bayesian networks, for real-time diagnosis on real-world EPSs of interest to NASA. The experimental system is a real-world EPS, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. In experiments with the ADAPT Bayesian network, which currently contains 503 discrete nodes and 579 edges, we find high diagnostic accuracy in scenarios where one to three faults, both in components and sensors, were inserted. The time taken to compute the most probable explanation using arithmetic circuits has a small mean of 0.2625 milliseconds and standard deviation of 0.2028 milliseconds. In experiments with data from ADAPT we also show that arithmetic circuit evaluation substantially outperforms joint tree propagation and variable elimination, two alternative algorithms for diagnosis using Bayesian network inference.
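
    A toy illustration of probabilistic fault isolation in the spirit described above, using a two-fault fragment solved by direct enumeration rather than the compiled arithmetic circuits used in the paper; all probabilities are invented.

      import itertools

      # Hypothetical EPS fragment: a relay may fail open; a voltage sensor may fail
      p_relay_fault = 0.01
      p_sensor_fault = 0.005

      def p_reading_low(relay_fault, sensor_fault):
          """Likelihood of observing a low-voltage reading given the health states."""
          if sensor_fault:
              return 0.5                      # a failed sensor is uninformative
          return 0.95 if relay_fault else 0.02

      # Posterior over joint health states given that a low reading was observed
      posterior = {}
      for relay_fault, sensor_fault in itertools.product([False, True], repeat=2):
          prior = ((p_relay_fault if relay_fault else 1 - p_relay_fault) *
                   (p_sensor_fault if sensor_fault else 1 - p_sensor_fault))
          posterior[(relay_fault, sensor_fault)] = prior * p_reading_low(relay_fault, sensor_fault)

      z = sum(posterior.values())
      for state, p in posterior.items():
          print(state, round(p / z, 4))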

  10. Entanglement and thermodynamics in general probabilistic theories

    NASA Astrophysics Data System (ADS)

    Chiribella, Giulio; Scandolo, Carlo Maria

    2015-10-01

    Entanglement is one of the most striking features of quantum mechanics, and yet it is not specifically quantum. More specific to quantum mechanics is the connection between entanglement and thermodynamics, which leads to an identification between entropies and measures of pure state entanglement. Here we search for the roots of this connection, investigating the relation between entanglement and thermodynamics in the framework of general probabilistic theories. We first address the question whether an entangled state can be transformed into another by means of local operations and classical communication. Under two operational requirements, we prove a general version of the Lo-Popescu theorem, which lies at the foundations of the theory of pure-state entanglement. We then consider a resource theory of purity where free operations are random reversible transformations, modelling the scenario where an agent has limited control over the dynamics of a closed system. Our key result is a duality between the resource theory of entanglement and the resource theory of purity, valid for every physical theory where all processes arise from pure states and reversible interactions at the fundamental level. As an application of the main result, we establish a one-to-one correspondence between entropies and measures of pure bipartite entanglement. The correspondence is then used to define entanglement measures in the general probabilistic framework. Finally, we show a duality between the task of information erasure and the task of entanglement generation, whereby the existence of entropy sinks (systems that can absorb arbitrary amounts of information) becomes equivalent to the existence of entanglement sources (correlated systems from which arbitrary amounts of entanglement can be extracted).

  11. Probabilistic properties of the Curve Number

    NASA Astrophysics Data System (ADS)

    Rutkowska, Agnieszka; Banasik, Kazimierz; Kohnova, Silvia; Karabova, Beata

    2013-04-01

    The determination of the Curve Number (CN) is fundamental for the hydrological rainfall-runoff SCS-CN method, which assesses the runoff volume in small catchments. The CN depends on geomorphologic and physiographic properties of the catchment, and traditionally it is assumed to be constant for each catchment. Many practitioners and researchers observe, however, that the parameter is characterized by variability. This sometimes causes inconsistency in the river discharge prediction using the SCS-CN model. Hence, probabilistic and statistical methods are advisable to investigate the CN as a random variable and to complement and improve the deterministic model. The results that will be presented contain determination of the probabilistic properties of the CNs for various Slovakian and Polish catchments using statistical methods. The detailed study concerns the description of empirical distributions (characteristics, QQ-plots and coefficients of goodness of fit, histograms), testing of the statistical hypotheses about some theoretical distributions (Kolmogorov-Smirnov, Anderson-Darling, Cramer-von Mises, χ2, Shapiro-Wilk), construction of confidence intervals and comparisons among catchments. The relationship between the confidence intervals and the ARC soil classification will also be examined. The comparison between the border values of the confidence intervals and the ARC I and ARC III conditions is crucial for further modeling. The response of the catchment to storm rainfall depth when the CN is allowed to vary is also of special interest. ACKNOWLEDGMENTS The investigation described in the contribution has been initiated by the first author's research visit to the Technical University of Bratislava in 2012 within an STSM of the COST Action ES0901. Data used here have been provided by research project no. N N305 396238 funded by the Polish Ministry of Science and Higher Education. The support provided by the organizations is gratefully acknowledged.
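
    The effect of treating the CN as a random variable can be sketched with the standard SCS-CN runoff equation; the Beta distribution and its bounds below are assumptions for illustration, not the fitted distributions from the study.

      import numpy as np

      rng = np.random.default_rng(seed=0)

      def scs_cn_runoff(p_mm, cn):
          """Direct runoff Q (mm) for rainfall depth P (mm) and Curve Number CN (SI form)."""
          s = 25400.0 / cn - 254.0                 # potential maximum retention (mm)
          ia = 0.2 * s                             # initial abstraction
          return np.where(p_mm > ia, (p_mm - ia) ** 2 / (p_mm - ia + s), 0.0)

      # CN sampled between assumed ARC I and ARC III bounds of 55 and 88
      cn_samples = 55.0 + 33.0 * rng.beta(4.0, 2.5, size=10_000)
      q_samples = scs_cn_runoff(60.0, cn_samples)  # runoff response to a 60 mm storm

      print(np.percentile(q_samples, [10, 50, 90]))  # spread in predicted runoff due to CN variability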

  12. Probabilistic Evaluation of Advanced Ceramic Matrix Composite Structures

    NASA Technical Reports Server (NTRS)

    Abumeri, Galib H.; Chamis, Christos C.

    2003-01-01

    The objective of this report is to summarize the deterministic and probabilistic structural evaluation results of two structures made with advanced ceramic matrix composites (CMC): an internally pressurized tube and a uniformly loaded flange. The deterministic structural evaluation includes stress, displacement, and buckling analyses. It is carried out using the finite element code MHOST, developed for the 3-D inelastic analysis of structures that are made with advanced materials. The probabilistic evaluation is performed using the integrated probabilistic assessment of composite structures computer code IPACS. The effects of uncertainties in primitive variables related to the material, fabrication process, and loadings on the material property and structural response behavior are quantified. The primitive variables considered are: thermo-mechanical properties of fiber and matrix, fiber and void volume ratios, use temperature, and pressure. The probabilistic structural analysis and probabilistic strength results are used by IPACS to perform reliability and risk evaluation of the two structures. The results will show that the sensitivity information obtained for the two composite structures from the computational simulation can be used to alter the design process to meet desired service requirements. In addition to detailed probabilistic analysis of the two structures, the following were performed specifically on the CMC tube: (1) predicted the failure load and the buckling load, (2) performed coupled non-deterministic multi-disciplinary structural analysis, and (3) demonstrated that probabilistic sensitivities can be used to select a reduced set of design variables for optimization.

  13. Probabilistic structural analysis methods for select space propulsion system components

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  14. Methodology for setting risk-based concentrations of contaminants in soil and groundwater and application to a model contaminated site.

    PubMed

    Fujinaga, Aiichiro; Uchiyama, Iwao; Morisawa, Shinsuke; Yoneda, Minoru; Sasamoto, Yuzuru

    2012-01-01

    In Japan, environmental standards for contaminants in groundwater and in leachate from soil are set with the assumption that they are used for drinking water over a human lifetime. Where there is neither a well nor groundwater used for drinking, the standard is thus too severe. Therefore, remediation based on these standards incurs excessive effort and cost. In contrast, the environmental-assessment procedure used in the United States and the Netherlands considers the site conditions (land use, existing wells, etc.); however, a risk assessment is required for each site. Therefore, this study proposes a new framework for judging contamination in Japan by considering the merits of the environmental standards used and a method for risk assessment. The framework involves setting risk-based concentrations that are attainable remediation goals for contaminants in soil and groundwater. The framework was then applied to a model contaminated site for risk management, and the results are discussed regarding the effectiveness and applicability of the new methodology.

  15. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently further significantly improved the performance of the probabilistic model of ADAPT. While these experiments are obtained for an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.

  16. NESSUS/expert and NESSUS/FPI in the Probabilistic Structural Analysis Methods (PSAM) program

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1987-01-01

    The Numerical Evaluation of Stochastic Structures under Stress (NESSUS) is the primary computer code being developed in the NASA Probabilistic Structural Analysis Methods (PSAM) project. It consists of four modules: NESSUS/EXPERT, NESSUS/FPI, NESSUS/PRE, and NESSUS/FEM. This presentation concentrates on EXPERT and FPI. To provide an effective interface between NESSUS and the user, an expert system module called NESSUS/EXPERT is being developed. That system uses the CLIPS artificial intelligence code developed at NASA-JSC. The code is compatible with FORTRAN, the standard language for codes in PSAM. The user interacts with the CLIPS inference engine, which is linked to the knowledge database. The perturbation database generated by NESSUS/FEM and managed in EXPERT is used to develop the so-called response or performance model in the random variables. Two independent probabilistic methods are available in PSAM for the computation of the probabilistic structural response. These are the Fast Probability Integration (FPI) method and Monte Carlo simulation. FPI is classified as an advanced reliability method and has been developed over the past ten years by researchers addressing the reliability of civil engineering structures. Monte Carlo is a well-established technique for computing probabilities by conducting a number of deterministic analyses with specified input distributional information.

  17. Verification of a probabilistic flood forecasting system for an Alpine Region of northern Italy

    NASA Astrophysics Data System (ADS)

    Laiolo, P.; Gabellani, S.; Rebora, N.; Rudari, R.; Ferraris, L.; Ratto, S.; Stevenin, H.

    2012-04-01

    Probabilistic hydrometeorological forecasting chains are increasingly becoming an operational tool used by civil protection centres for issuing flood alerts. One of the most important requests of decision makers is to have reliable systems; for this reason, an accurate verification of their predictive performance becomes essential. The aim of this work is to validate a probabilistic flood forecasting system: Flood-PROOFS. The system has been operating in real time since 2008 in an alpine region of northern Italy, Valle d'Aosta. It is used by the Civil Protection regional service to issue warnings and by the local water company to protect its facilities. Flood-PROOFS uses as input Quantitative Precipitation Forecasts (QPFs) derived from the Italian limited-area meteorological model (COSMO-I7) and forecasts issued by regional expert meteorologists. Furthermore, the system manages and uses real-time meteorological and satellite data as well as real-time data on the maneuvers performed by the water company on dams and river devices. The main outputs produced by the computational chain are deterministic and probabilistic discharge forecasts in different cross sections of the considered river network. The validation of the flood prediction system has been conducted over a 25-month period using different statistical methods such as the Brier score, rank histograms, and other verification scores. The results highlight the good performance of the system as a support tool for issuing warnings, although statistics are lacking, especially for large discharge events.
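
    One of the verification measures mentioned above, the Brier score, can be computed as in the sketch below; the forecast probabilities and observed exceedances are illustrative only.

      import numpy as np

      forecast_prob = np.array([0.9, 0.2, 0.7, 0.1, 0.4, 0.8])  # forecast P(discharge > threshold)
      observed = np.array([1, 0, 1, 0, 0, 1])                   # 1 if the threshold was exceeded

      brier = np.mean((forecast_prob - observed) ** 2)

      # Skill relative to a climatological reference forecast
      climatology = observed.mean()
      brier_ref = np.mean((climatology - observed) ** 2)
      brier_skill = 1.0 - brier / brier_ref

      print(f"Brier score = {brier:.3f}, Brier skill score = {brier_skill:.3f}")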

  18. Policy-driven development of cost-effective, risk-based surveillance strategies.

    PubMed

    Reist, M; Jemmi, T; Stärk, K D C

    2012-07-01

    Animal health and residue surveillance verifies the good health status of the animal population, thereby supporting international free trade of animals and animal products. However, active surveillance is costly and time-consuming. The development of cost-effective tools for animal health and food hazard surveillance is therefore a priority for decision-makers in the field of veterinary public health. The assumption of this paper is that outcome-based formulation of standards, legislation leaving room for risk-based approaches and close collaboration and a mutual understanding and exchange between scientists and policy makers are essential for cost-effective surveillance. We illustrate this using the following examples: (i) a risk-based sample size calculation for surveys to substantiate freedom from diseases/infection, (ii) a cost-effective national surveillance system for Bluetongue using scenario tree modelling and (iii) a framework for risk-based residue monitoring. Surveys to substantiate freedom from infectious bovine rhinotracheitis and enzootic bovine leucosis between 2002 and 2009 saved over 6 million € by applying a risk-based sample size calculation approach, and by taking into account prior information from repeated surveys. An open, progressive policy making process stimulates research and science to develop risk-based and cost-efficient survey methodologies. Early involvement of policy makers in scientific developments facilitates implementation of new findings and full exploitation of benefits for producers and consumers. PMID:22265642

  19. A probabilistic choice model based on Tsallis’ statistics

    NASA Astrophysics Data System (ADS)

    Takahashi, Taiki

    2007-12-01

    Decision under risk and uncertainty (probabilistic choice) has been attracting attention in econophysics and neuroeconomics. This paper proposes a probabilistic choice model based on a mathematical equivalence of delay and uncertainty in decision-making, and the deformed algebra developed in Tsallis’ non-extensive thermodynamics. Furthermore, it is shown that this model can be utilized to quantify the degree of consistency in probabilistic choice in humans and animals. Future directions in the application of the model to studies in econophysics, neurofinance, neuroeconomics, and social physics are discussed.
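
    A sketch of q-exponential (Tsallis-type) discounting of a probabilistic reward is given below; the specific functional form, with the reward discounted by the odds against winning, is a common formulation assumed here and is not quoted from the paper.

      import numpy as np

      def exp_q(x, q):
          """Tsallis q-exponential; reduces to exp(x) as q approaches 1."""
          if np.isclose(q, 1.0):
              return np.exp(x)
          base = 1.0 + (1.0 - q) * x
          return np.power(np.maximum(base, 0.0), 1.0 / (1.0 - q))

      def subjective_value(amount, p, k, q):
          """Value of receiving `amount` with probability p, discounted by the odds against winning."""
          odds_against = (1.0 - p) / p
          return amount / exp_q(k * odds_against, q)

      print(subjective_value(100.0, p=0.5, k=1.0, q=0.7))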

  20. Application of Probabilistic Analysis to Aircraft Impact Dynamics

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Padula, Sharon L.; Stockwell, Alan E.

    2003-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, laminated composites, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the uncertainty in the simulated responses. Several criteria are used to determine that a response surface method is the most appropriate probabilistic approach. The work is extended to compare optimization results with and without probabilistic constraints.

  1. Probabilistic constitutive relationships for material strength degradation models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1989-01-01

    In the present probabilistic methodology for the strength of aerospace propulsion system structural components subjected to such environmentally-induced primitive variables as loading stresses, high temperature, chemical corrosion, and radiation, time is encompassed as an interacting element, allowing the projection of creep and fatigue effects. A probabilistic constitutive equation is postulated to account for the degradation of strength due to these primitive variables which may be calibrated by an appropriately curve-fitted least-squares multiple regression of experimental data. The resulting probabilistic constitutive equation is embodied in the PROMISS code for aerospace propulsion component random strength determination.

  2. A New Scheme for Probabilistic Teleportation and Its Potential Applications

    NASA Astrophysics Data System (ADS)

    Wei, Jia-Hua; Dai, Hong-Yi; Zhang, Ming

    2013-12-01

    We propose a novel scheme to probabilistically teleport an unknown two-level quantum state when the information about the partially entangled state is available only to the sender. This is in contrast with the fact that the receiver must know the non-maximally entangled state in typical previous teleportation schemes. Additionally, we illustrate two potential applications of the novel scheme for probabilistic teleportation from a sender to a receiver with the help of an assistant, who plays distinct roles under different communication conditions, and our results show that the novel proposal could broaden the range of applications of probabilistic teleportation.

  3. Development of a new adaptive ordinal approach to continuous-variable probabilistic optimization.

    SciTech Connect

    Romero, Vicente José; Chen, Chun-Hung (George Mason University, Fairfax, VA)

    2006-11-01

    A very general and robust approach to solving continuous-variable optimization problems involving uncertainty in the objective function is through the use of ordinal optimization. At each step in the optimization problem, improvement is based only on a relative ranking of the uncertainty effects on local design alternatives, rather than on precise quantification of the effects. One simply asks ''Is that alternative better or worse than this one?'' -not ''HOW MUCH better or worse is that alternative to this one?'' The answer to the latter question requires precise characterization of the uncertainty--with the corresponding sampling/integration expense for precise resolution. However, in this report we demonstrate correct decision-making in a continuous-variable probabilistic optimization problem despite extreme vagueness in the statistical characterization of the design options. We present a new adaptive ordinal method for probabilistic optimization in which the trade-off between computational expense and vagueness in the uncertainty characterization can be conveniently managed in various phases of the optimization problem to make cost-effective stepping decisions in the design space. Spatial correlation of uncertainty in the continuous-variable design space is exploited to dramatically increase method efficiency. Under many circumstances the method appears to have favorable robustness and cost-scaling properties relative to other probabilistic optimization methods, and uniquely has mechanisms for quantifying and controlling error likelihood in design-space stepping decisions. The method is asymptotically convergent to the true probabilistic optimum, so could be useful as a reference standard against which the efficiency and robustness of other methods can be compared--analogous to the role that Monte Carlo simulation plays in uncertainty propagation.

  4. Targeted computational probabilistic corroboration of experimental knee wear simulator: the importance of accounting for variability.

    PubMed

    Strickland, M A; Dressler, M R; Render, T; Browne, M; Taylor, M

    2011-04-01

    Experimental testing is widely used to predict wear of total knee replacement (TKR) devices. Computational models cannot replace this essential in vitro testing, but they do have complementary strengths and capabilities, which make in silico models a valuable support tool for experimental wear investigations. For effective exploitation, these two separate domains should be closely corroborated together; this requires extensive data-sharing and cross-checking at every stage of simulation and testing. However, isolated deterministic corroborations provide only a partial perspective; in vitro testing is inherently variable, and relatively small changes in the environmental and kinematic conditions at the articulating interface can account for considerable variation in the reported wear rates. Understanding these variations will be key to managing uncertainty in the tests, resulting in a 'cleaner' investigation environment for further refining current theories of wear. This study demonstrates the value of probabilistic in silico methods by describing a specific, targeted corroboration of the AMTI knee wear simulator, using rigid body dynamics software models. A deterministic model of the simulator under displacement-control was created for investigation. Firstly, a large sample of experimental data (N>100) was collated, and a probabilistic computational study (N>1000 trials) was used to compare the kinetic performance envelopes for in vitro and in silico models, to more fully corroborate the mechanical model. Secondly, corresponding theoretical wear-rate predictions were compared to the experimentally reported wear data, to assess the robustness of current wear theories to uncertainty (as distinct from the mechanical variability). The results reveal a good corroboration for the physical mechanics of the wear test rig; however they demonstrate that the distributions for wear are not currently well-predicted. The probabilistic domain is found to be far more sensitive at

  5. A review of NRC staff uses of probabilistic risk assessment

    SciTech Connect

    Not Available

    1994-03-01

    The NRC staff uses probabilistic risk assessment (PRA) and risk management as important elements of its licensing and regulatory processes. In October 1991, the NRC's Executive Director for Operations established the PRA Working Group to address concerns identified by the Advisory Committee on Reactor Safeguards with respect to unevenness and inconsistency in the staff's current uses of PRA. After surveying current staff uses of PRA and identifying needed improvements, the Working Group defined a set of basic principles for staff PRA use and identified three areas for improvement: guidance development, training enhancements, and PRA methods development. For each area of improvement, the Working Group took certain actions and recommended additional work. The Working Group recommended integrating its work with other recent PRA-related activities the staff completed and improving staff interactions with PRA users in the nuclear industry. The Working Group took two key actions by developing general guidance for two uses of PRA within the NRC (that is, screening or prioritizing reactor safety issues and analyzing such issues in detail) and developing guidance on basic terms and methods important to the staff's uses of PRA.

  6. An approach for probabilistic forecasting of seasonal turbidity threshold exceedance

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Rajagopalan, Balaji; Summers, R. Scott; Yates, David

    2010-06-01

    Though climate forecasts offer substantial promise for improving water resource oversight, additional tools are needed to translate these forecasts into water-quality-based products that can be useful to water utility managers. To this end, a generalized approach is developed that uses seasonal forecasts to predict the likelihood of exceeding a prescribed water quality limit. Because many water quality standards are based on thresholds, this study utilizes a logistic regression technique, which employs nonparametric or "local" estimation that can capture nonlinear features in the data. The approach is applied to a drinking water source in the Pacific Northwest United States that has experienced elevated turbidity values that are correlated with streamflow. The main steps of the approach are to (1) obtain a seasonal probabilistic precipitation forecast, (2) generate streamflow scenarios conditional on the precipitation forecast, (3) use a local logistic regression to compute the turbidity threshold exceedance probabilities, and (4) quantify the likelihood of turbidity exceedance corresponding to the seasonal climate forecast. Results demonstrate that forecasts offer a slight improvement over climatology, but that representative forecasts are conservative and result in only a small shift in total exceedance likelihood. Synthetic forecasts are included to show the sensitivity of the total exceedance likelihood. The technique is general and could be applied to other water quality variables that depend on climate or hydroclimate.
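
    Step (3) above can be sketched with a logistic regression of threshold exceedance on streamflow; the paper uses a local (nonparametric) logistic fit, so the global fit and synthetic data below are a simplification for illustration.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(seed=0)

      # Synthetic streamflow scenarios and turbidity-limit exceedances
      flow = rng.lognormal(mean=3.0, sigma=0.5, size=300)
      p_true = 1.0 / (1.0 + np.exp(-(flow - 25.0) / 5.0))
      exceed = (rng.random(300) < p_true).astype(int)

      model = LogisticRegression().fit(np.log(flow).reshape(-1, 1), exceed)

      # Probability of exceeding the turbidity limit for a forecast flow of 40 units
      p_exceed = model.predict_proba([[np.log(40.0)]])[0, 1]
      print(f"P(turbidity > limit | flow = 40) = {p_exceed:.2f}")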

  7. Advanced neutron source reactor probabilistic flow blockage assessment

    SciTech Connect

    Ramsey, C.T.

    1995-08-01

    The Phase I Level I Probabilistic Risk Assessment (PRA) of the conceptual design of the Advanced Neutron Source (ANS) Reactor identified core flow blockage as the most likely internal event leading to fuel damage. The flow blockage event frequency used in the original ANS PRA was based primarily on the flow blockage work done for the High Flux Isotope Reactor (HFIR) PRA. This report examines potential flow blockage scenarios and calculates an estimate of the likelihood of debris-induced fuel damage. The bulk of the report is based specifically on the conceptual design of ANS with a 93%-enriched, two-element core; insights to the impact of the proposed three-element core are examined in Sect. 5. In addition to providing a probability (uncertainty) distribution for the likelihood of core flow blockage, this ongoing effort will serve to indicate potential areas of concern to be focused on in the preliminary design for elimination or mitigation. It will also serve as a loose-parts management tool.

  8. BOAST 98-MC: A Probabilistic Simulation Module for BOAST 98

    SciTech Connect

    Aiysha Sultana; Anne Oudinot; Reynaldo Gonzalez; Scott Reeves

    2006-09-08

    This work was performed by Advanced Resources International (ARI) on behalf of the United States Department of Energy (DOE) in order to develop a user-friendly, PC-based interface that couples DOE's BOAST 98 software with the Monte Carlo simulation technique. The objectives of the work were to improve reservoir management and maximize oil recoveries by understanding and quantifying reservoir uncertainty as well as improving the capabilities of DOE's BOAST 98 software by incorporating a probabilistic module in the simulator. In this model, probability distributions can be assigned to unknown input parameters such as permeability, porosity, etc. Options have also been added to the input file to be able to vary relative permeability curves as well as well spacing. Hundreds of simulations can then automatically be run to explore the many combinations of uncertain reservoir parameters across their spectrum of uncertainty. Output data such as oil rate and water rate can then be plotted. When historical data are available, they can be uploaded and a least-square error-function run between the simulation data and the history data. The set of input parameters leading to the best match is thus determined. Sensitivity charts (Tornado plots) that rank the uncertain parameters according to the impact they have on the outputs can also be generated.
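
    A conceptual sketch of the Monte Carlo wrapper is shown below: uncertain reservoir parameters are sampled, a simulator is run for each realisation, and the inputs are ranked by their impact on the output. The run_boast98 function is a hypothetical stand-in for the real simulator, and the distributions are illustrative.

      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(seed=0)
      n_runs = 200

      def run_boast98(permeability_md, porosity):
          # Placeholder response standing in for a full reservoir simulation run
          return 1000.0 * np.log(permeability_md) * porosity   # "cumulative oil" proxy

      permeability = rng.lognormal(np.log(50.0), 0.6, n_runs)   # md
      porosity = rng.uniform(0.15, 0.30, n_runs)

      oil = np.array([run_boast98(k, phi) for k, phi in zip(permeability, porosity)])

      # Tornado-style ranking via rank correlation of each input with the output
      print("permeability:", round(spearmanr(permeability, oil)[0], 2),
            "porosity:", round(spearmanr(porosity, oil)[0], 2))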

  9. Probabilistic analysis of maintenance and operation of artificial recharge ponds

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Barahona-Palomo, Marco; Bolster, Diogo; Fernàndez-Garcia, Daniel; Sanchez-Vila, Xavier; Tartakovsky, Daniel M.

    2012-02-01

    Aquifer artificial recharge from surface infiltration ponds is often conducted to replenish depleted aquifers in arid and semi-arid zones. Physical and bio-geochemical clogging decreases the host soil's infiltration capacity, which has to be restored with periodic maintenance activities. We develop a probabilistic modeling framework that quantifies the risk of a pond's infiltration capacity falling below its target value due to soil heterogeneity and clogging. This framework can act as a tool to aid managers in optimally selecting and designing maintenance strategies. Our model enables one to account for a variety of maintenance strategies that target different clogging mechanisms. The framework is applied to an existing pond in Barcelona, Spain as well as to several synthetic infiltration ponds with varying statistical distributions of initial infiltration capacity. We find that physical clogging mechanisms induce the greatest uncertainty and that maintenance targeted at these can yield optimal results. However, considering the fundamental role of the spatial variability in the initial properties, we conclude that an adequate initial characterization of the surface infiltration ponds is crucial to determining the degree of uncertainty of different maintenance solutions and thus to making cost-effective and reliable decisions.

  10. Upward Flammability Testing: A Probabilistic Measurement

    NASA Technical Reports Server (NTRS)

    Davis, Samuel E.; Engel, Carl D.; Richardson, Erin R.

    2003-01-01

    Examination of NASA-STD-6001 Test 1 data suggests that the burn length outcome for a given environment has a large statistical variation from run to run. Large data sets show that burn length data form cumulative probability distribution curves, which characterize a material's tendency to burn in a specific environment, suggesting that the current practice of testing three samples at specific conditions is inadequate. Sufficient testing can establish material characteristic probability curves to provide the probability that a material will sustain a burn length of at least 15.24 cm (6.0 in.) or will sustain burning until all material is consumed. A simple pass/fail criterion may not be possible or practical. Future application of flammability data for some material classes may require the engineer to assess risk based on the probability of an occurrence and the probable outcome with different materials as characterized with cumulative burn length distributions for specific use conditions.

  11. Risk-Based Sampling: I Don't Want to Weight in Vain.

    PubMed

    Powell, Mark R

    2015-12-01

    Recently, there has been considerable interest in developing risk-based sampling for food safety and animal and plant health for efficient allocation of inspection and surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of estimation error, leading to false "optimal" portfolios and unstable asset weights. In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers.
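
    The estimation-error issue can be explored with a toy simulation that compares equal allocation of inspections against allocation proportional to noisy risk estimates; producer risks, sample sizes, and the allocation rule are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(seed=0)

      true_risk = np.array([0.02, 0.03, 0.04, 0.05])   # true violation rates per producer
      n_total = 400                                     # total inspections available
      reps = 5_000

      detected_equal, detected_weighted = [], []
      for _ in range(reps):
          # Risk estimated from a short, noisy history of 50 prior lots per producer
          estimate = rng.binomial(50, true_risk) / 50.0
          weights = estimate / estimate.sum() if estimate.sum() > 0 else np.full(4, 0.25)

          alloc_weighted = np.round(weights * n_total).astype(int)
          alloc_equal = np.full(4, n_total // 4)

          detected_weighted.append(rng.binomial(alloc_weighted, true_risk).sum())
          detected_equal.append(rng.binomial(alloc_equal, true_risk).sum())

      # Compare mean detections and their variability under the two allocation rules
      print("equal:", np.mean(detected_equal), np.std(detected_equal))
      print("weighted:", np.mean(detected_weighted), np.std(detected_weighted))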

  12. A risk-based approach to cost-benefit analysis of software safety activities

    SciTech Connect

    Fortier, S.C.; Michael, J.B.

    1993-05-01

    Assumptions about the economics of making a system safe are usually not explicitly stated in industrial and software models of safety-critical systems. These assumptions span a wide spectrum of economic tradeoffs with respect to resources expended to make a system safe. The missing component in these models that is necessary for capturing the effect of economic tradeoffs is risk. A qualitative risk-based software safety model is proposed that combines features of industrial and software systems safety models. The risk-based model provides decision makers with a basis for performing cost-benefit analyses of software safety-related activities.

  13. A risk-based approach to cost-benefit analysis of software safety activities

    SciTech Connect

    Fortier, S.C.; Michael, J.B.

    1993-01-01

    Assumptions about the economics of making a system safe are usually not explicitly stated in industrial and software models of safety-critical systems. These assumptions span a wide spectrum of economic tradeoffs with respect to resources expended to make a system safe. The missing component in these models that is necessary for capturing the effect of economic tradeoffs is risk. A qualitative risk-based software safety model is proposed that combines features of industrial and software systems safety models. The risk-based model provides decision makers with a basis for performing cost-benefit analyses of software safety-related activities.

  14. Predicting rib fracture risk with whole-body finite element models: development and preliminary evaluation of a probabilistic analytical framework.

    PubMed

    Forman, Jason L; Kent, Richard W; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122

  15. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
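
    The combination of local fracture probabilities into whole-ribcage risk, as described above, can be sketched as follows; the per-site probabilities and the use of three or more fractures as a proxy for AIS3+ are assumptions for illustration.

      from itertools import product
      import numpy as np

      p_local = [0.05, 0.10, 0.20, 0.15]        # illustrative fracture probabilities at four rib sites

      # Probability of at least one fracture anywhere (assuming independence)
      p_any = 1.0 - np.prod([1.0 - p for p in p_local])

      # Full distribution of the number of fractures (Poisson binomial by enumeration)
      counts = np.zeros(len(p_local) + 1)
      for outcome in product([0, 1], repeat=len(p_local)):
          prob = np.prod([p if o else 1.0 - p for p, o in zip(p_local, outcome)])
          counts[sum(outcome)] += prob

      p_three_plus = counts[3:].sum()            # proxy for AIS3+ (three or more fractures)
      print(f"P(any fracture) = {p_any:.3f}, P(3+ fractures) = {p_three_plus:.4f}")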

  16. Probabilistic structural analysis of adaptive/smart/intelligent space structures

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated for adaptive/smart/intelligent behavior. For each behavior, the scatter (ranges) in buckling loads, vibration frequencies, and member axial forces are probabilistically determined. Sensitivities associated with uncertainties in the structure, material and load variables that describe the truss are determined for different probabilities. The relative magnitude for these sensitivities are used to identify significant truss variables that control/classify its behavior to respond as an adaptive/smart/intelligent structure. Results show that the probabilistic buckling loads and vibration frequencies increase for each truss classification, with a substantial increase for intelligent trusses. Similarly, the probabilistic member axial forces reduce for adaptive and intelligent trusses and increase for smart trusses.

  17. Probabilistic liquefaction triggering based on the cone penetration test

    USGS Publications Warehouse

    Moss, R.E.S.; Seed, R.B.; Kayen, R.E.; Stewart, J.P.; Tokimatsu, K.

    2005-01-01

    Performance-based earthquake engineering requires a probabilistic treatment of potential failure modes in order to accurately quantify the overall stability of the system. This paper is a summary of the application portions of the probabilistic liquefaction triggering correlations recently proposed by Moss and co-workers. To enable probabilistic treatment of liquefaction triggering, the variables comprising the seismic load and the liquefaction resistance were treated as inherently uncertain. Supporting data from an extensive Cone Penetration Test (CPT)-based liquefaction case history database were used to develop a probabilistic correlation. The methods used to measure the uncertainty of the load and resistance variables, how the interactions of these variables were treated using Bayesian updating, and how reliability analysis was applied to produce curves of equal probability of liquefaction are presented. The normalization for effective overburden stress, the magnitude-correlated duration weighting factor, and the non-linear shear mass participation factor used are also discussed.

  18. EFFECTS OF CORRELATED PROBABILISTIC EXPOSURE MODEL INPUTS ON SIMULATED RESULTS

    EPA Science Inventory

    In recent years, more probabilistic models have been developed to quantify aggregate human exposures to environmental pollutants. The impact of correlation among inputs in these models is an important issue, which has not been resolved. Obtaining correlated data and implementi...

  19. Probabilistic structural analysis of adaptive/smart/intelligent space structures

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1992-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated for adaptive/smart/intelligent behavior. For each behavior, the scatter (ranges) in buckling loads, vibration frequencies, and member axial forces are probabilistically determined. Sensitivities associated with uncertainties in the structure, material and load variables that describe the truss are determined for different probabilities. The relative magnitude for these sensitivities are used to identify significant truss variables that control/classify its behavior to respond as an adaptive/smart/intelligent structure. Results show that the probabilistic buckling loads and vibration frequencies increase for each truss classification, with a substantial increase for intelligent trusses. Similarly, the probabilistic member axial forces reduce for adaptive and intelligent trusses and increase for smart trusses.

  20. A Markov Chain Approach to Probabilistic Swarm Guidance

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, which ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
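
    A minimal sketch of the Markov chain idea is shown below: every agent independently moves between bins according to the same transition matrix, constructed here with a simple Metropolis-Hastings rule (an assumed, illustrative choice, not the paper's synthesis method) so that its stationary distribution equals the desired swarm density.

      import numpy as np

      rng = np.random.default_rng(seed=0)

      desired = np.array([0.5, 0.3, 0.2])        # target density over three bins
      n = desired.size

      # Metropolis-Hastings construction: uniform proposal over the other bins,
      # acceptance min(1, desired[j] / desired[i]); stationary distribution is `desired`
      M = np.zeros((n, n))
      for i in range(n):
          for j in range(n):
              if i != j:
                  M[i, j] = min(1.0, desired[j] / desired[i]) / (n - 1)
          M[i, i] = 1.0 - M[i].sum()

      agents = rng.integers(0, n, size=2_000)    # arbitrary initial placement
      for _ in range(40):                        # each agent decides independently at every step
          agents = np.array([rng.choice(n, p=M[a]) for a in agents])

      print(np.bincount(agents, minlength=n) / agents.size)   # approaches `desired`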

  1. Approaches to implementing deterministic models in a probabilistic framework

    SciTech Connect

    Talbott, D.V.

    1995-04-01

    The increasing use of results from probabilistic risk assessments in the decision-making process makes it ever more important to eliminate simplifications in probabilistic models that might lead to conservative results. One area in which conservative simplifications are often made is modeling the physical interactions that occur during the progression of an accident sequence. This paper demonstrates and compares different approaches for incorporating deterministic models of physical parameters into probabilistic models; parameter range binning, response curves, and integral deterministic models. An example that combines all three approaches in a probabilistic model for the handling of an energetic material (i.e. high explosive, rocket propellant,...) is then presented using a directed graph model.

  2. A PROBABILISTIC MODELING FRAMEWORK FOR PREDICTING POPULATION EXPOSURES TO BENZENE

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is modifying their probabilistic Stochastic Human Exposure Dose Simulation (SHEDS) model to assess aggregate exposures to air toxics. Air toxics include urban Hazardous Air Pollutants (HAPS) such as benzene from mobile sources, part...

  3. Probabilistic protocols in quantum information science: Use and abuse

    NASA Astrophysics Data System (ADS)

    Caves, Carlton

    2014-03-01

    Protocols in quantum information science often succeed with less than unit probability, but nonetheless perform useful tasks because success occurs often enough to make tolerable the overhead from having to perform the protocol several times. Any probabilistic protocol must be analyzed from the perspective of the resources required to make the protocol succeed. I present results from analyses of two probabilistic protocols: (i) nondeterministic (or immaculate) linear amplification, in which an input coherent state is amplified some of the time to a larger-amplitude coherent state, and (ii) probabilistic quantum metrology, in which one attempts to improve estimation of a parameter (or parameters) by post-selecting on a particular outcome. The analysis indicates that there is little to be gained from probabilistic protocols in these two situations.

  4. Remote Sensing Classification Uncertainty: Validating Probabilistic Pixel Level Classification

    NASA Astrophysics Data System (ADS)

    Vrettas, Michail; Cornford, Dan; Bastin, Lucy; Pons, Xavier; Sevillano, Eva; Moré, Gerard; Serra, Pere; Ninyerola, Miquel

    2013-04-01

    There already exists an extensive literature on classification of remotely sensed imagery, and indeed on classification more widely, covering a wide range of probabilistic and non-probabilistic methodologies. Although many probabilistic classification methodologies produce posterior class probabilities per pixel (observation), these are often not communicated at the pixel level, and typically not validated at the pixel level. Most often the probabilistic classification is converted into a hard classification (of the most probable class) and the accuracy of the resulting classification is reported in terms of a global confusion matrix, or some score derived from it. For applications where classification accuracy is spatially variable, and where pixel-level estimates of uncertainty can be meaningfully exploited in workflows that propagate uncertainty, validating and communicating the pixel-level uncertainty opens opportunities for more refined and accountable modelling. In this work we describe our recent application and validation of a range of probabilistic classifiers. Using a multi-temporal Landsat data set of the Ebro Delta in Catalonia, which has been carefully radiometrically and geometrically corrected, we present a range of Bayesian classifiers, from simple Bayesian linear discriminant analysis to a complex variational Gaussian process based classifier. Field-study-derived labelled data, classified into 8 classes that primarily capture land use and the degree of flooding in what is a rice-growing region, are used to train the pixel-level classifiers. Our focus is not so much on classification accuracy as on the validation of the probabilistic classification made by all methods. We present a range of validation plots and scores, many of which are used for probabilistic weather forecast verification but are new to remote sensing classification, including of course the standard measures of misclassification, but also
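
    A minimal sketch of the kind of pixel-level validation discussed above, using two tools borrowed from probabilistic forecast verification: the multi-class Brier score and a reliability (calibration) table. The synthetic posterior probabilities stand in for any classifier's per-pixel output; nothing here reproduces the Ebro Delta study itself.

      import numpy as np

      rng = np.random.default_rng(3)
      n_pix, n_cls = 10_000, 8

      # Synthetic per-pixel posterior class probabilities and "field study" labels
      logits = rng.normal(size=(n_pix, n_cls))
      probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
      labels = np.array([rng.choice(n_cls, p=p) for p in probs])   # drawn from probs, so calibrated

      onehot = np.eye(n_cls)[labels]
      brier = np.mean(np.sum((probs - onehot) ** 2, axis=1))
      print("multi-class Brier score:", round(brier, 3))

      # Reliability table: predicted probability of the modal class vs. observed accuracy
      p_max = probs.max(axis=1)
      hit = probs.argmax(axis=1) == labels
      for lo in np.arange(0.1, 1.0, 0.1):
          m = (p_max >= lo) & (p_max < lo + 0.1)
          if m.any():
              print(f"forecast {lo:.1f}-{lo + 0.1:.1f}: observed accuracy {hit[m].mean():.2f} (n={m.sum()})")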

  5. Integration of Evidence Base into a Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization, and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from engineers, flight surgeons, and the astronauts themselves. When inflight data is unavailable, cohort studies, other models, and Bayesian analyses are used, supplemented on occasion by subject matter experts' input. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1 to 5, with 1 being the highest. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  6. Probabilistic Seismic Hazard assessment in Albania

    NASA Astrophysics Data System (ADS)

    Muco, B.; Kiratzi, A.; Sulstarova, E.; Kociu, S.; Peci, V.; Scordilis, E.

    2002-12-01

    Albania is one of the countries with the highest seismicity in Europe. The history of instrumental monitoring of seismicity in this country began in 1968 with the installation of the first seismographic station in Tirana, and became more effective after the Albanian Seismological Network began operating in 1976. There is rich evidence that over the last two thousand years Albania has been hit by many disastrous earthquakes; the highest estimated magnitude is 7.2. After the end of the Communist era and the opening of the country, a construction boom started in Albania and continues even now, making it all the more indispensable to produce accurate seismic hazard maps to prevent damage from probable future earthquakes. Some efforts in seismic hazard assessment have already been made (Sulstarova et al., 1980; Kociu, 2000; Muco et al., 2002). In this approach, the probabilistic technique has been used in a joint effort between the Seismological Institute of Tirana, Albania, and the Department of Geophysics of Aristotle University of Thessaloniki, Greece, within the framework of the NATO SfP project "SeisAlbania". The earthquake catalogue adopted was specifically compiled for this seismic hazard analysis and contains 530 events with magnitude M > 4.5 from the year 58 up to 2000. We divided the country into 8 seismotectonic zones, assigning to each the most representative fault characteristics. The computer code used for the hazard calculation was OHAZ, developed by the Geophysical Survey of Slovenia, and the attenuation models used were those of Ambraseys et al. (1996), Sabetta and Pugliese (1996), and Margaris et al. (2001). The hazard maps are obtained for return periods of 100, 475, 2375, and 4746 years, for rock soil conditions. Analyzing the map of PGA values for a return period of 475 years, five zones with different PGA ranges can be distinguished: 1) the zone with PGA of 0.20-0.24 g, 1.8 percent of Albanian territory; 2) the zone with PGA of 0.16-0.20 g, 22.6 percent of Albanian territory; 3) the
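
    The probabilistic technique referred to above reduces, for a single source zone, to integrating a ground-motion exceedance probability over the zone's magnitude recurrence and converting the annual rate to a return period. The toy calculation below uses a truncated Gutenberg-Richter recurrence and a generic lognormal attenuation relation; it is not the OHAZ code nor any of the attenuation models cited.

      import numpy as np
      from scipy.stats import norm

      # Toy single-zone PSHA: annual rate of exceeding a PGA level
      nu = 0.2                      # annual rate of M >= 4.5 events in the zone (assumed)
      b_val = 1.0                   # Gutenberg-Richter b-value (assumed)
      m_min, m_max = 4.5, 7.2
      r_km = 30.0                   # single representative source-to-site distance
      beta = b_val * np.log(10.0)

      mags = np.linspace(m_min, m_max, 200)
      dm = mags[1] - mags[0]
      # truncated-exponential magnitude density
      f_m = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))

      def gmpe(m, r):
          """Illustrative attenuation relation: median ln PGA [g] and log-std."""
          return -4.0 + 1.0 * m - 1.2 * np.log(r + 10.0), 0.6

      mu, sig = gmpe(mags, r_km)
      for a in (0.08, 0.16, 0.24, 0.32):
          p_exc = 1.0 - norm.cdf((np.log(a) - mu) / sig)    # P(PGA > a | m, r)
          lam = nu * np.sum(p_exc * f_m) * dm               # annual exceedance rate
          print(f"PGA {a:4.2f} g: annual rate {lam:.2e}, "
                f"50-yr exceedance prob {1.0 - np.exp(-lam * 50.0):.3f}")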

  7. Probabilistic seismic demand analysis of nonlinear structures

    NASA Astrophysics Data System (ADS)

    Shome, Nilesh

    Recent earthquakes in California have initiated improvements in current design philosophy, and at present the civil engineering community is working towards development of performance-based earthquake engineering of structures. The objective of this study is to develop efficient but accurate procedures for probabilistic analysis of the nonlinear seismic behavior of structures. The proposed procedures support the near-term development of seismic building assessments, which require an estimate of seismic demand at a given intensity level. We also develop procedures to estimate the probability of exceedance of any specified nonlinear response level due to future ground motions at a specific site. This is referred to as Probabilistic Seismic Demand Analysis (PSDA). The latter procedure prepares the way for the next stage of seismic assessment, which considers the uncertainties in nonlinear response and capacity. The proposed procedures require structure-specific nonlinear analyses for a relatively small set of recorded accelerograms and (site-specific or USGS-map-like) seismic hazard analyses. We have addressed some of the important issues of nonlinear seismic demand analysis, such as the selection of records for structural analysis, the number of records to be used, and the scaling of records. Initially these issues are studied through nonlinear analysis of structures for a number of magnitude-distance bins of records. Subsequently we introduce regression analysis of response results against spectral acceleration, magnitude, duration, etc., which helps to resolve these issues more systematically. We illustrate the demand-hazard calculations through two major example problems: a 5-story and a 20-story SMRF building. Several simple but quite accurate closed-form solutions have also been proposed to expedite the demand-hazard calculations. We find that vector-valued (e.g., 2-D) PSDA estimates demand hazard more accurately. This procedure, however, requires information about 2
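
    The regression step described above is typically a log-log fit of peak nonlinear response against spectral acceleration; combined with a hazard curve for Sa, it yields the annual rate of exceeding a given demand level. The sketch below uses synthetic analysis results and an assumed power-law Sa hazard curve, not the SMRF buildings of the study.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(4)

      # Synthetic nonlinear-analysis results: peak drift ratio vs. spectral acceleration Sa [g]
      sa = rng.uniform(0.1, 1.5, size=40)
      drift = 0.01 * sa**0.9 * rng.lognormal(0.0, 0.3, size=40)   # "true" a=0.01, b=0.9, beta=0.3

      # Regression: ln(drift) = ln(a) + b * ln(Sa), with residual dispersion beta
      b_hat, ln_a_hat = np.polyfit(np.log(sa), np.log(drift), 1)
      beta_hat = (np.log(drift) - (ln_a_hat + b_hat * np.log(sa))).std(ddof=2)
      print(f"a = {np.exp(ln_a_hat):.4f}, b = {b_hat:.2f}, dispersion = {beta_hat:.2f}")

      # Demand hazard: convolve demand fragility with an assumed Sa hazard curve k0 * Sa**-k
      k0, k = 2e-3, 2.5
      sa_grid = np.linspace(0.05, 3.0, 300)
      rate_density = np.gradient(-(k0 * sa_grid**-k), sa_grid)   # -d(lambda)/d(Sa) > 0
      d_target = 0.02                                            # 2% drift ratio
      p_exc = 1.0 - norm.cdf((np.log(d_target) - (ln_a_hat + b_hat * np.log(sa_grid))) / beta_hat)
      lam_drift = np.sum(p_exc * rate_density) * (sa_grid[1] - sa_grid[0])
      print(f"annual rate of drift ratio > {d_target:.0%}: {lam_drift:.2e}")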

  8. Lawrence Livermore National Laboratory Probabilistic Seismic Hazard Codes Validation

    SciTech Connect

    Savy, J B

    2003-02-08

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. LLNL has been developing the methodology and codes in support of the Nuclear Regulatory Commission (NRC) needs for reviews of site licensing of nuclear power plants since 1978. A number of existing computer codes have been validated, yet they can still lead to ranges of hazard estimates in some cases. Until now, the seismic hazard community had not agreed on any specific method for the evaluation of these codes. The Earthquake Engineering Research Institute (EERI) and the Pacific Earthquake Engineering Research (PEER) center organized an exercise in testing existing codes with the aim of developing a series of standard tests that future developers could use to evaluate and calibrate their own codes. Seven code developers participated in the exercise on a voluntary basis. Lawrence Livermore National Laboratory participated with some support from the NRC. The final product of the study will include a series of criteria for judging the validity of the results provided by a computer code. This EERI/PEER project was first planned to be completed by June of 2003. As the group neared completion of the tests, the managing team decided that new tests were necessary. As a result, the present report documents only the work performed to this point. It demonstrates that the computer codes developed by LLNL perform all calculations correctly and as intended. Differences exist between the results of the codes tested that are attributed to a series of assumptions, on parameters and models, that the developers had to make. The managing team is planning a new series of tests to help in reaching a consensus on these assumptions.

  9. Probabilistic constitutive relationships for cyclic material strength models

    NASA Technical Reports Server (NTRS)

    Boyce, L.; Chamis, C. C.

    1988-01-01

    A methodology is developed that provides a probabilistic treatment for the lifetime of structural components of aerospace propulsion systems subjected to fatigue. Material strength degradation models, based on primitive variables, include both a fatigue strength reduction model and a fatigue crack growth model. Probabilistic analysis is based on simulation, and both maximum entropy and maximum penalized likelihood methods are used for the generation of probability density functions. The resulting constitutive relationships are included in several computer programs.
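
    A hedged sketch of the simulation-based treatment described above: a simple log-linear fatigue strength reduction model whose primitive variables are treated as random and propagated by Monte Carlo to a distribution of cycles to failure. The model form and distributions are illustrative and are not the constitutive relationships developed in the record.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      # Primitive variables (illustrative distributions)
      S0 = rng.normal(900.0, 45.0, n)         # static strength [MPa]
      bexp = rng.normal(0.08, 0.01, n)        # fatigue strength reduction exponent
      sigma_a = rng.normal(350.0, 20.0, n)    # applied cyclic stress amplitude [MPa]

      # Simple strength degradation model: S(N) = S0 * (1 - bexp * log10(N))
      # Failure occurs when S(N) drops to the applied stress, giving cycles to failure
      logN_f = (1.0 - sigma_a / S0) / bexp
      N_f = 10.0 ** np.clip(logN_f, 0.0, 12.0)

      for p in (0.01, 0.10, 0.50):
          print(f"{p:>4.0%} lifetime quantile: {np.quantile(N_f, p):,.0f} cycles")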

  10. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified that is, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.
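
    The argument that Bayesian statistics suits the analysis of rare events can be made concrete with a standard conjugate example (not taken from the report): a gamma prior on a Poisson failure rate updated with sparse operating experience still yields a usable posterior even when zero events have been observed.

      from scipy.stats import gamma

      # Prior belief about a rare failure rate [1/yr]: gamma(alpha, beta) with mean alpha/beta
      alpha0, beta0 = 0.5, 100.0      # weakly informative prior, mean 5e-3 per year (assumed)

      # Observed operating experience: k failures in T component-years
      k, T = 0, 250.0

      # Conjugate update: posterior is gamma(alpha0 + k, beta0 + T)
      alpha1, beta1 = alpha0 + k, beta0 + T
      post = gamma(a=alpha1, scale=1.0 / beta1)

      print(f"posterior mean rate  : {post.mean():.2e} per year")
      print(f"90% credible interval: {post.ppf(0.05):.2e} - {post.ppf(0.95):.2e}")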

  11. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
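
    A minimal sketch of the perturbation idea in the abstract: each node in a region of interest carries a displacement vector, and a geometric realization is produced by scaling those vectors with a random draw. The tiny mesh, vectors, and distribution below are illustrative stand-ins; the patented mean-value-coordinate computation of the vectors is not reproduced.

      import numpy as np

      rng = np.random.default_rng(6)

      # Nominal 2-D node coordinates of a small region of interest (illustrative)
      nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])

      # One displacement vector per node (here, simple outward directions)
      disp = np.array([[-0.7, -0.7], [0.7, -0.7], [0.7, 0.7], [-0.7, 0.7], [0.0, 0.0]])

      def perturbed_geometry(scale):
          """Nominal geometry plus the displacement vectors scaled by one random draw."""
          return nodes + scale * disp

      # Sample geometric realizations for a probabilistic finite element study
      for s in rng.normal(loc=0.0, scale=0.02, size=3):
          print(np.round(perturbed_geometry(s), 4))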

  12. Probabilistic Simulation of Multi-Scale Composite Behavior

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to uncertainties in the constituent (fiber and matrix) properties, in the fabrication process, and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented in the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data on composite laminate behavior at all scales fall within the scatter predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness, a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.
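
    A toy version of the micro-to-macro propagation described above: uncertain fiber and matrix properties pushed through a rule-of-mixtures micromechanics step by Monte Carlo to obtain the scatter in a ply-level stiffness. This sketches the general idea only and is not the PICAN or IPACS formulation.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 50_000

      # Constituent (primitive) variables with assumed scatter
      E_f = rng.normal(230e9, 230e9 * 0.05, n)    # fiber modulus [Pa]
      E_m = rng.normal(3.5e9, 3.5e9 * 0.07, n)    # matrix modulus [Pa]
      V_f = rng.normal(0.60, 0.02, n)             # fiber volume fraction (fabrication scatter)

      # Micromechanics (rule of mixtures) -> ply-level longitudinal modulus
      E_11 = V_f * E_f + (1.0 - V_f) * E_m

      print(f"mean E11  : {E_11.mean() / 1e9:6.1f} GPa")
      print(f"1% / 99%  : {np.quantile(E_11, 0.01) / 1e9:6.1f} / {np.quantile(E_11, 0.99) / 1e9:6.1f} GPa")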

  13. Different developmental patterns of simple deductive and probabilistic inferential reasoning.

    PubMed

    Markovits, Henry; Thompson, Valerie

    2008-09-01

    In three studies, we examined simple counterexample-based and probabilistic reasoning in children 6, 7, and 9 years of age. In the first study, participants were asked to make conditional (if-then) inferences under both categorical (certain or uncertain) and probabilistic instructions. Results showed that 6-year-olds respond to both forms of inference in similar ways, but whereas probabilistic conditional inferences showed little development over this period, categorical inferences clearly improved between 6 and 7 years of age. An analysis of the children's justifications indicated that performance under categorical instructions was strongly related to counterexample generation at all ages, whereas this was true only for the younger children for inferences under probabilistic instructions. These findings were replicated in a second study, using problems that referred to concrete stimuli with varying probabilities of inference. A third study tested the hypothesis that children confused probability judgments with judgments of confidence and demonstrated a clear dissociation between these two constructs. Overall, these results show that children are capable of accurate conditional inferences under probabilistic instructions at a very early age and that the differentiation between categorical and probabilistic conditional reasoning is clear by at least 9 years of age. These results are globally consistent with dual-process theories but suggest some difficulties for the way that the analytic-heuristic distinction underlying these theories has been conceptualized. PMID:18927025

  14. Addressing the Hard Factors for Command File Errors by Probabilistic Reasoning

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Bryant, Larry

    2014-01-01

    Command File Errors (CFEs) are managed using standard risk management approaches at the Jet Propulsion Laboratory. Over the last few years, more emphasis has been placed on the collection, organization, and analysis of these errors for the purpose of reducing CFE rates. More recently, probabilistic modeling techniques have been used for more in-depth analysis of the perceived error rates of the DAWN mission and for managing the soft factors in the upcoming phases of the mission. We broadly classify the factors that can lead to CFEs as soft factors, which relate to the cognition of the operators, and hard factors, which relate to the Mission System, composed of the hardware, software, and procedures used for the generation, verification and validation, and execution of commands. The focus of this paper is to use probabilistic models that represent multiple missions at JPL to determine the root causes and sensitivities of the various components of the mission system and to develop recommendations and techniques for addressing them. The customization of these multi-mission models to a sample interplanetary spacecraft is done for this purpose.

  15. Probabilistic earthquake hazard assessment for Peninsular India

    NASA Astrophysics Data System (ADS)

    Ashish; Lindholm, C.; Parvez, I. A.; Kühn, D.

    2016-04-01

    In this paper, a new probabilistic seismic hazard assessment (PSHA) is presented for Peninsular India. The PSHA has been performed using three different recurrence models: a classical seismic zonation model, a fault model, and a grid model. The grid model is based on a non-parameterized recurrence model using an adaptation of the kernel-based method, which has not been applied to this region before. The results obtained from the three models have been combined in a logic tree structure in order to investigate the impact of different weights of the models. Three suitable attenuation relations have been considered in terms of spectral acceleration for the stable continental crust as well as for the active crust within the Gujarat region. While Peninsular India has experienced large earthquakes, e.g., Latur and Jabalpur, it represents in general a stable continental region with little earthquake activity, as also confirmed by our hazard results. On the other hand, our study demonstrates that both the Gujarat and Koyna regions are exposed to a high seismic hazard. The peak ground acceleration for 10% exceedance in 50 years observed in Koyna is 0.4 g, and in the Kutch region of Gujarat up to 0.3 g. With respect to spectral acceleration at 1 Hz, estimated ground motion amplitudes are higher in Gujarat than in the Koyna region due to the higher frequency of occurrence of larger earthquakes. We discuss the higher PGA levels for Koyna compared to Gujarat and do not accept them uncritically.
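
    The logic-tree step mentioned above amounts to a weighted average of the exceedance-rate curves produced by the alternative recurrence models, followed by conversion to a probability of exceedance over the exposure time. The curves and branch weights below are made-up numbers used only to show the arithmetic.

      import numpy as np

      pga = np.array([0.05, 0.1, 0.2, 0.3, 0.4])       # PGA levels [g]

      # Annual exceedance rates from three alternative recurrence models (illustrative numbers)
      rate_zonation = np.array([2.0e-2, 8.0e-3, 2.0e-3, 6.0e-4, 2.0e-4])
      rate_fault    = np.array([1.5e-2, 7.0e-3, 2.5e-3, 9.0e-4, 3.0e-4])
      rate_grid     = np.array([2.5e-2, 9.0e-3, 1.8e-3, 5.0e-4, 1.5e-4])

      weights = {"zonation": 0.4, "fault": 0.3, "grid": 0.3}    # assumed branch weights
      combined = (weights["zonation"] * rate_zonation
                  + weights["fault"] * rate_fault
                  + weights["grid"] * rate_grid)

      # Translate to probability of exceedance in 50 years (Poisson assumption)
      p50 = 1.0 - np.exp(-combined * 50.0)
      for a, lam, p in zip(pga, combined, p50):
          print(f"PGA {a:4.2f} g: annual rate {lam:.1e}, 50-yr exceedance {p:5.1%}")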

  16. Astrobiological Complexity with Probabilistic Cellular Automata

    NASA Astrophysics Data System (ADS)

    Vukotić, Branislav; Ćirković, Milan M.

    2012-08-01

    The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, but it has so far been quantitatively modeled only rarely and in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automaton construct, this approach enables a quick analysis of the large and ambiguous space of input parameters. We perform a simple clustering analysis of typical astrobiological histories with a "Copernican" choice of input parameters and discuss the relevant boundary conditions of practical importance for planning and guiding empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how the numerical results could offer a cautious rationale for the continuation of practical SETI searches.
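
    A compact sketch of the PCA machinery described above: each site of the Galactic Habitable Zone holds a discrete astrobiological state and, at every step, transitions according to a probability matrix. The states and matrix below are illustrative stand-ins for the paper's input parameters, and the neighbor coupling a full PCA would include is omitted for brevity.

      import numpy as np

      rng = np.random.default_rng(8)

      # States: 0 = no life, 1 = simple life, 2 = complex life, 3 = technological civilization
      P = np.array([[0.995, 0.005, 0.000, 0.000],    # row i = transition pmf out of state i
                    [0.001, 0.989, 0.010, 0.000],
                    [0.001, 0.000, 0.989, 0.010],
                    [0.002, 0.000, 0.000, 0.998]])

      grid = np.zeros((100, 100), dtype=int)         # Galactic Habitable Zone as a 2-D lattice
      grid[50, 50] = 1                               # a single seeded site

      for _ in range(1000):
          u = rng.random(grid.shape)
          cdf = np.cumsum(P[grid], axis=-1)          # per-site transition CDF, shape (100, 100, 4)
          grid = (u[..., None] > cdf).sum(axis=-1)   # inverse-CDF sampling of the next state

      print("sites per state after 1000 steps:", np.bincount(grid.ravel(), minlength=4))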

  17. Simplified probabilistic risk assessment in fuel reprocessing

    SciTech Connect

    Solbrig, C.W.

    1993-03-01

    An evaluation was made to determine whether a backup mass-tracking computer would significantly reduce the probability of criticality in the fuel reprocessing of the Integral Fast Reactor. Tradeoff studies such as this must often be made, and they would greatly benefit from a Probabilistic Risk Assessment (PRA). The major benefits of a complete PRA can often be obtained with a Simplified Probabilistic Risk Assessment (SPRA). An SPRA was performed by selecting a representative fuel reprocessing operation (moving a piece of fuel) for analysis. It showed that the benefit of adding parallel computers was small compared to the benefit that could be obtained by adding parallelism to two computer input steps and two of the weighing operations. The probability of an incorrect material move with the basic process is estimated to be 4 out of 100 moves; the actual probability values are considered accurate to within an order of magnitude. The most useful result of developing the fault trees accrues from the ability to determine where significant improvements in the process can be made. By including the above-mentioned parallelism, the erroneous move rate can be reduced to 1 out of 1000.
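
    The effect quoted above (roughly 4 incorrect moves per 100 reduced to about 1 per 1000 by adding parallelism) is ordinary fault-tree arithmetic: an OR gate over serial error sources is replaced by AND gates over redundant checks. The basic-event probabilities below are illustrative values chosen only to reproduce that order of magnitude, not the report's numbers.

      def or_gate(*p):
          """Probability that at least one independent basic event occurs."""
          q = 1.0
          for pi in p:
              q *= (1.0 - pi)
          return 1.0 - q

      def and_gate(*p):
          """Probability that all independent events occur (all redundant checks fail)."""
          q = 1.0
          for pi in p:
              q *= pi
          return q

      # Basic events per fuel move (illustrative values)
      p_entry, p_weigh, p_other = 0.01, 0.01, 0.001

      serial = or_gate(p_entry, p_entry, p_weigh, p_weigh, p_other)
      print(f"single-path process: ~{serial:.3f} incorrect moves per move")

      # Duplicate the two entry steps and the two weighings (parallel, independent checks)
      parallel = or_gate(and_gate(p_entry, p_entry), and_gate(p_entry, p_entry),
                         and_gate(p_weigh, p_weigh), and_gate(p_weigh, p_weigh), p_other)
      print(f"with parallelism   : ~{parallel:.4f} incorrect moves per move")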

  20. Disease-Specific Probabilistic Brain Atlases

    PubMed Central

    Thompson, Paul; Mega, Michael S.; Toga, Arthur W.

    2009-01-01

    Atlases of the human brain, in health and disease, provide a comprehensive framework for understanding brain structure and function. The complexity and variability of brain structure, especially in the gyral patterns of the human cortex, present challenges in creating standardized brain atlases that reflect the anatomy of a population. This paper introduces the concept of a population-based, disease-specific brain atlas that can reflect the unique anatomy and physiology of a particular clinical subpopulation. Based on well-characterized patient groups, disease-specific atlases contain thousands of structure models, composite maps, average templates, and visualizations of structural variability, asymmetry and group-specific differences. They correlate the structural, metabolic, molecular and histologic hallmarks of the disease. Rather than simply fusing information from multiple subjects and sources, new mathematical strategies are introduced to resolve group-specific features not apparent in individual scans. High-dimensional elastic mappings, based on covariant partial differential equations, are developed to encode patterns of cortical variation. In the resulting brain atlas, disease-specific features and regional asymmetries emerge that are not apparent in individual anatomies. The resulting probabilistic atlas can identify patterns of altered structure and function, and can guide algorithms for knowledge-based image analysis, automated image labeling, tissue classification, data mining and functional image analysis. PMID:19424457
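
    The population-based, probabilistic side of the atlas described above can be reduced, once subjects are registered to a common space, to tabulating the per-voxel frequency of each anatomical label across the group. The snippet below assumes already-aligned label volumes; the high-dimensional elastic registration itself is well beyond a sketch.

      import numpy as np

      rng = np.random.default_rng(9)
      n_subjects, shape, n_labels = 25, (32, 32, 32), 4

      # Stand-in for registered, labelled scans: one integer label per voxel per subject
      segmentations = rng.integers(0, n_labels, size=(n_subjects, *shape))

      # Probabilistic atlas: per-voxel relative frequency of each label across the group
      atlas = np.stack([(segmentations == k).mean(axis=0) for k in range(n_labels)], axis=-1)

      # Example query: probability of label 2 at a given voxel, and the modal label map
      voxel = (16, 16, 16)
      print("P(label = 2 | voxel):", atlas[voxel][2])
      print("modal label at voxel:", atlas.argmax(axis=-1)[voxel])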