Science.gov

Sample records for probabilistic risk-based management

  1. Incorporating linguistic, probabilistic, and possibilistic information in a risk-based approach for ranking contaminated sites.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng

    2010-10-01

    Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot take full advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods, for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be represented, according to its nature, as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain. The propagation of hybrid uncertainties is then carried out in the same domain. This methodology uses the original site information directly wherever possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
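
    The transformation step described above relies on a 2-tuple linguistic representation. The sketch below is a minimal, illustrative version of that idea (in the spirit of the Herrera-Martinez 2-tuple model), not the paper's actual implementation; the label set, scaling range, and example inputs are assumptions.

        # Minimal sketch of a 2-tuple linguistic representation (illustrative only).
        # Assumes a label set S = {s_0, ..., s_g}; numeric inputs are first rescaled
        # to [0, g] and then expressed as (label, symbolic translation alpha).

        LABELS = ["very low", "low", "medium", "high", "very high"]  # s_0 .. s_4, g = 4
        G = len(LABELS) - 1

        def to_two_tuple(value, lo, hi):
            """Map a crisp value on [lo, hi] to a 2-tuple (label, alpha), alpha in [-0.5, 0.5)."""
            beta = (value - lo) / (hi - lo) * G          # rescale to [0, G]
            i = min(max(int(round(beta)), 0), G)         # nearest label index, clamped
            return LABELS[i], beta - i

        def interval_to_two_tuple(low, high, lo, hi):
            """Crude treatment of an interval input: use its midpoint (one of several options)."""
            return to_two_tuple((low + high) / 2.0, lo, hi)

        if __name__ == "__main__":
            # Hypothetical criterion score scaled over 0-10 (placeholder values).
            print(to_two_tuple(7.2, 0.0, 10.0))
            print(interval_to_two_tuple(2.0, 4.0, 0.0, 10.0))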

  2. Risk-Based Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2002-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decision makers to determine the feasibility and return-on-investment of a new aircraft engine. In this work, an alternative approach based on the probabilistic method was described for a comprehensive assessment of an aeropropulsion system. The statistical approach quantifies the design uncertainties inherent in a new aeropropulsion system and their influences on engine performance. Because of this, it enhances the reliability of a system assessment. A technical assessment of a wave-rotor-enhanced gas turbine engine was performed to demonstrate the methodology. The assessment used probability distributions to account for the uncertainties that occur in component efficiencies and flows and in mechanical design variables. The approach taken in this effort was to integrate the thermodynamic cycle analysis embedded in the computer code NEPP (NASA Engine Performance Program) and the engine weight analysis embedded in the computer code WATE (Weight Analysis of Turbine
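
    The NEPP and WATE codes referenced above are not reproduced here; the sketch below only illustrates the probabilistic idea the abstract describes: assign probability distributions to uncertain component efficiencies and propagate them by Monte Carlo through a performance function, reporting distributions rather than a single point design. The distributions and the toy performance surrogate are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000

        # Illustrative uncertainty assumptions on component efficiencies (not NEPP/WATE inputs).
        eta_comp = rng.normal(0.88, 0.01, N)    # compressor efficiency
        eta_turb = rng.normal(0.90, 0.01, N)    # turbine efficiency
        burner_dp = rng.uniform(0.03, 0.05, N)  # burner pressure loss fraction

        # Stand-in performance metric: a toy surrogate for specific fuel consumption (lower is better).
        sfc = 0.50 * (1.0 / (eta_comp * eta_turb)) * (1.0 + burner_dp)

        # Probabilistic assessment outputs: distribution summaries instead of a single point design.
        print("mean SFC        :", sfc.mean())
        print("95th percentile :", np.percentile(sfc, 95))
        print("P(SFC > 0.66)   :", (sfc > 0.66).mean())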

  3. Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris

    2014-08-01

    We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
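
    A minimal sketch of the risk metric described above: simulate many synthetic supply and demand traces over a planning horizon, count the frequency of shortage years in each trace, and estimate the probability that this frequency exceeds the planned Level of Service. The supply/demand model, trend rates, and threshold below are illustrative assumptions, not the London case-study values.

        import numpy as np

        rng = np.random.default_rng(1)
        N_TRACES, N_YEARS = 1000, 25             # synthetic climate traces x planning horizon (assumed)
        PLANNED_LOS = 0.10                       # planned shortage frequency: at most 1 year in 10 (assumed)

        # Stand-in for the weather-generator / water-resource model chain:
        # annual supply and demand in Ml/d, with a drift in supply and 1% per annum demand growth.
        years = np.arange(N_YEARS)
        supply = rng.normal(2100 - 4.0 * years, 150, size=(N_TRACES, N_YEARS))
        demand = 1900 * (1.01 ** years) + rng.normal(0, 50, size=(N_TRACES, N_YEARS))

        shortage_freq = (supply < demand).mean(axis=1)        # frequency of shortage years per trace
        risk_metric = (shortage_freq > PLANNED_LOS).mean()    # P(exceeding the planned Level of Service)
        print("P(shortage frequency exceeds planned LoS) =", risk_metric)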

  4. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    SciTech Connect

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
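
    The sketch below illustrates, in highly simplified form, the two ingredients the abstract describes: Monte Carlo propagation of uncertain route-specific parameters to a flux distribution, and a regression-based sensitivity ranking (standardized regression coefficients stand in here for the paper's stepwise linear regression). The route permeabilities, their distributions, and the steady-state flux surrogate are assumptions, not the PAPA model.

        import numpy as np

        rng = np.random.default_rng(2)
        N = 50_000

        # Illustrative (assumed) lognormal uncertainty on route-specific permeability coefficients
        # [cm/h] and on the applied concentration [mg/cm^3].
        kp_sc   = rng.lognormal(np.log(1e-3), 0.7, N)   # intercellular stratum corneum route
        kp_duct = rng.lognormal(np.log(3e-4), 0.9, N)   # sweat-duct route
        kp_foll = rng.lognormal(np.log(5e-4), 0.9, N)   # hair-follicle route
        conc    = rng.lognormal(np.log(10.0), 0.3, N)

        flux = (kp_sc + kp_duct + kp_foll) * conc       # steady-state flux surrogate [mg/cm^2/h]

        # Crude global sensitivity: standardized regression coefficients on log-transformed inputs,
        # standing in for the stepwise linear regression used in the paper.
        X = np.column_stack([np.log(kp_sc), np.log(kp_duct), np.log(kp_foll), np.log(conc)])
        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        lf = np.log(flux)
        y = (lf - lf.mean()) / lf.std()
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        for name, c in zip(["kp_sc", "kp_duct", "kp_foll", "conc"], coef):
            print(f"{name:8s} standardized coefficient: {c:+.2f}")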

  5. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    SciTech Connect

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

    This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The specific region within the scope of this study is the Fayetteville Shale Play. This is an unconventional, tight-formation natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play extends from approximately Fort Smith, AR, east to Little Rock, AR, and is roughly 50 miles wide from north to south. Initial estimates put this field almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of a massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of a proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  6. Microbial quality of reclaimed water for urban reuses: Probabilistic risk-based investigation and recommendations.

    PubMed

    Chhipi-Shrestha, Gyan; Hewage, Kasun; Sadiq, Rehan

    2017-01-15

    Although Canada has abundant freshwater resources, many cities still experience seasonal water shortage. Supply-side and demand-side management is a core strategy to address this water shortage. Under this strategy, reclaimed water, which the Canadian public is willing to use for non-potable purposes, is an option. However, no universal guidelines exist for reclaimed water use. Despite the federal government's long-term goal to develop guidelines for many water reuse applications, guidelines have only been prescribed for reclaimed water use in toilet and urinal flushing in Canada. At the provincial level, British Columbia (BC) has promulgated guidelines for wide applications of reclaimed water, but only at broad class levels. This research has investigated and proposed probabilistic risk-based recommended values for the microbial quality of reclaimed water in various non-potable urban reuses. The health risk was estimated by using quantitative microbial risk assessment. Two-dimensional Monte Carlo simulations were used in the analysis to include variability and uncertainty in input data. The proposed recommended values are based on the indicator organism E. coli. The required treatment levels for reuse were also estimated. In addition, the recommended values were successfully applied to three wastewater treatment effluents in the Okanagan Valley, BC, Canada. The health risks associated with other bacterial pathogens (Campylobacter jejuni and Salmonella spp.), viruses (adenovirus, norovirus, and rotavirus), and protozoa (Cryptosporidium parvum and Giardia spp.) were also estimated. The estimated risks indicate the effectiveness of the E. coli-based water quality recommended values. Sensitivity analysis shows the pathogenic E. coli ratio and morbidity are the most sensitive input parameters for all water reuses. The proposed recommended values could be further improved by using national or regional data on water exposures, disease burden per case, and the susceptibility
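
    A minimal sketch of a two-dimensional (nested) Monte Carlo quantitative microbial risk assessment of the kind described above: an outer loop samples uncertain parameters, an inner loop samples exposure variability, and an exponential dose-response model converts dose to infection risk. All distributions, the dose-response parameter range, and the exposure scenario are illustrative assumptions, not the study's fitted inputs.

        import numpy as np

        rng = np.random.default_rng(3)
        N_UNC, N_VAR = 200, 2000        # outer (uncertainty) and inner (variability) sample sizes

        annual_risks = np.empty(N_UNC)
        for i in range(N_UNC):
            # Outer loop: epistemic uncertainty (assumed ranges, not the study's fitted values).
            r = rng.uniform(0.1, 0.6)                     # exponential dose-response parameter
            frac_path = rng.uniform(0.01, 0.10)           # pathogenic fraction of E. coli

            # Inner loop: variability across exposure events (e.g., accidental ingestion during irrigation).
            e_coli = rng.lognormal(np.log(100.0), 1.0, N_VAR)   # E. coli per 100 mL of reclaimed water
            volume = rng.lognormal(np.log(1e-3), 0.5, N_VAR)    # ingested volume per event [100 mL units]
            dose = e_coli * frac_path * volume
            p_event = (1.0 - np.exp(-r * dose)).mean()          # exponential dose-response, averaged over events
            annual_risks[i] = 1.0 - (1.0 - p_event) ** 365      # assume one exposure event per day

        print("median annual infection risk :", np.median(annual_risks))
        print("95th percentile (uncertainty):", np.percentile(annual_risks, 95))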

  7. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  8. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    NASA Astrophysics Data System (ADS)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs and the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.
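
    The game's economics can be illustrated with a standard cost-loss model (an assumption here, not the actual game rules): a rational player's willingness-to-pay per round is bounded by the reduction in expected expense that the forecast allows relative to acting on climatology alone.

        import numpy as np

        rng = np.random.default_rng(4)
        N = 100_000
        C, L = 10.0, 100.0              # protection cost and flood loss (assumed game-like payoffs)

        p_true = rng.uniform(0.0, 0.3, N)                 # flood probability in each round
        flood = rng.random(N) < p_true

        # No forecast: climatological strategy, protect in every round only if the mean risk justifies it.
        climo = p_true.mean()
        expense_no_fx = np.where(flood, L, 0.0) if climo * L < C else np.full(N, C)

        # Perfectly reliable probabilistic forecast: protect whenever p * L > C.
        protect = p_true * L > C
        expense_fx = np.where(protect, C, np.where(flood, L, 0.0))

        value_per_round = expense_no_fx.mean() - expense_fx.mean()
        print("rational willingness-to-pay per round <=", round(value_per_round, 2))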

  9. Risk based management of piping systems

    SciTech Connect

    Conley, M.J.; Aller, J.E.; Tallin, A.; Weber, B.J.

    1996-07-01

    The API Piping Inspection Code is the first such Code to require classification of piping based on the consequences of failure, and to use this classification to influence inspection activity. Since this Code was published, progress has been made in the development of tools to improve on this approach by determining not only the consequences of failure, but also the likelihood of failure. "Risk" is defined as the product of the consequence and the likelihood. Measuring risk provides the means to formally manage risk by matching the inspection effort (costs) to the benefits of reduced risk. Using such a cost/benefit analysis allows the optimization of inspection budgets while meeting societal demands for reduction of the risk associated with process plant piping. This paper presents an overview of the tools developed to measure risk, and the methods to determine the effects of past and future inspections on the level of risk. The methodology is being developed as an industry-sponsored project under the direction of an API committee. The intent is to develop an API Recommended Practice that will be linked to In-Service Inspection Standards and the emerging Fitness for Service procedures. Actual studies using a similar approach have shown that a very high percentage of the risk due to piping in an operating facility is associated with relatively few pieces of piping. This permits inspection efforts to be focused on those piping systems that will result in the greatest risk reduction.
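
    A minimal sketch of the risk definition used above (risk = consequence x likelihood) and of the observation that a small fraction of piping segments typically carries most of the total risk. The per-segment likelihoods and consequences are randomly generated placeholders, not data from an actual facility.

        import numpy as np

        rng = np.random.default_rng(5)
        n_segments = 200

        # Assumed inputs for illustration: annual likelihood of failure and consequence per segment.
        likelihood = rng.lognormal(np.log(1e-4), 1.5, n_segments)     # failures per year
        consequence = rng.lognormal(np.log(5e5), 1.0, n_segments)     # cost per failure [$]

        risk = likelihood * consequence                               # risk = consequence x likelihood
        order = np.argsort(risk)[::-1]                                # rank segments by risk
        top10 = risk[order[: n_segments // 10]].sum() / risk.sum()
        print(f"share of total risk carried by the top 10% of segments: {top10:.0%}")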

  10. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  11. A Practical Probabilistic Graphical Modeling Tool for Weighing Ecological Risk-Based Evidence

    EPA Science Inventory

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for e...

  12. Risk-based underground pipeline safety management considering corrosion effect.

    PubMed

    Shin, Seolin; Lee, Gunhak; Ahmed, Usama; Lee, Yongkyu; Na, Jonggeol; Han, Chonghun

    2017-08-18

    Due to long-term usage and irregular maintenance for corrosion checks, catastrophic accidents in underground pipelines have been increasing. In this study, a new safety management methodology for underground pipelines, risk-based pipeline management, is introduced that reflects corrosion effects. First, the principle of risk-based pipeline management is presented and compared with the original method, a qualitative measure. It is distinguished from the qualitative measure by reflecting societal risk and corrosion in the safety management of underground pipelines. It is then applied to an existing underground propylene pipeline in the Ulsan Industrial Complex, South Korea. The consequence analysis is based on real information, and the frequency analysis reflects the degree of corrosion. For calculation of the corrosion rate, direct current voltage gradient (DCVG) and close interval potential survey (CIPS) measurements are conducted. As a result of applying risk-based pipeline management, the risk integral is reduced by 56.8% compared to the qualitative measure. Finally, sensitivity analysis is conducted on variables that affect the risk of the pipeline. This study contributes a quantitative measure to pipeline management and increases pipeline safety. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Risk-based management of solid tumors in children.

    PubMed

    Grosfeld, J L

    2000-11-01

    During the past 25 years significant improvements in survival (56% to 75%) have been observed for children with malignant solid tumors. Multidisciplinary cooperative studies using combined therapy (surgery, chemotherapy, and irradiation) have played a major role. This report describes how recognition of biologic and genetic factors has permitted risk categorization and resulted in new treatment protocols that individualize care. Genetic alterations and biologic factors concerning the multiple endocrine neoplasia syndromes, Wilms' tumor, and neuroblastoma are described. Using these data, new treatment protocols are designed according to whether a patient is categorized as having a low-, intermediate-, or high-risk tumor, which determines the intensity and type of treatment required. Identification of biologic markers and specific gene alterations may be critical in establishing the behavior of tumors (low- versus high-risk). Risk-based management permits individualized care for each patient, maximizes survival, minimizes morbidity, and improves the quality of life.

  14. Risk-based principles for defining and managing water security

    PubMed Central

    Hall, Jim; Borgomeo, Edoardo

    2013-01-01

    The concept of water security implies concern about potentially harmful states of coupled human and natural water systems. Those harmful states may be associated with water scarcity (for humans and/or the environment), floods or harmful water quality. The theories and practices of risk analysis and risk management have been developed and elaborated to deal with the uncertain occurrence of harmful events. Yet despite their widespread application in public policy, theories and practices of risk management have well-known limitations, particularly in the context of severe uncertainties and contested values. Here, we seek to explore the boundaries of applicability of risk-based principles as a means of formalizing discussion of water security. Not only do risk concepts have normative appeal, but they also provide an explicit means of addressing the variability that is intrinsic to hydrological, ecological and socio-economic systems. We illustrate the nature of these interconnections with a simulation study, which demonstrates how water resources planning could take more explicit account of epistemic uncertainties, tolerability of risk and the trade-offs in risk among different actors. PMID:24080616

  15. Uncertainty in environmental risk assessment: implications for risk-based management of river basins.

    PubMed

    Ragas, Ad M J; Huijbregts, Mark A J; Henning-de Jong, Irmgard; Leuven, Rob S E W

    2009-01-01

    Environmental risk assessment is typically uncertain due to different perceptions of the risk problem and limited knowledge about the physical, chemical, and biological processes underlying the risk. The present paper provides a systematic overview of the implications of different types of uncertainty for risk management, with a focus on risk-based management of river basins. Three different types of uncertainty are distinguished: 1) problem definition uncertainty, 2) true uncertainty, and 3) variability. Methods to quantify and describe these types of uncertainty are discussed and illustrated in 4 case studies. The case studies demonstrate that explicit regulation of uncertainty can improve risk management (e.g., by identification of the most effective risk reduction measures, optimization of the use of resources, and improvement of the decision-making process). It is concluded that the involvement of nongovernmental actors as prescribed by the European Union Water Framework Directive (WFD) provides challenging opportunities to address problem definition uncertainty and those forms of true uncertainty that are difficult to quantify. However, the WFD guidelines for derivation and application of environmental quality standards could be improved by the introduction of a probabilistic approach to deal with true uncertainty and a better scientific basis for regulation of variability.

  16. COMMUNICATING PROBABILISTIC RISK OUTCOMES TO RISK MANAGERS

    EPA Science Inventory

    Increasingly, risk assessors are moving away from simple deterministic assessments to probabilistic approaches that explicitly incorporate ecological variability, measurement imprecision, and lack of knowledge (collectively termed "uncertainty"). While the new methods provide an...

  17. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Treesearch

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  18. Probabilistic thinking and death anxiety: a terror management based study.

    PubMed

    Hayslip, Bert; Schuler, Eric R; Page, Kyle S; Carver, Kellye S

    2014-01-01

    Terror Management Theory has been utilized to understand how death can change behavioral outcomes and social dynamics. One area that is not well researched is why individuals willingly engage in risky behavior that could accelerate their mortality. One method of distancing a potential life threatening outcome when engaging in risky behaviors is through stacking probability in favor of the event not occurring, termed probabilistic thinking. The present study examines the creation and psychometric properties of the Probabilistic Thinking scale in a sample of young, middle aged, and older adults (n = 472). The scale demonstrated adequate internal consistency reliability for each of the four subscales, excellent overall internal consistency, and good construct validity regarding relationships with measures of death anxiety. Reliable age and gender effects in probabilistic thinking were also observed. The relationship of probabilistic thinking as part of a cultural buffer against death anxiety is discussed, as well as its implications for Terror Management research.

  19. Towards risk-based drought management in the Netherlands: quantifying the welfare effects of water shortage

    NASA Astrophysics Data System (ADS)

    van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein

    2016-04-01

    It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained into integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available. Furthermore, for some
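
    The probabilistic core of such a drought risk analysis can be illustrated by combining event damages for a range of return periods into an expected annual damage per sector (integration of damage over annual exceedance probability). The return periods and welfare-loss figures below are placeholders, not results from the Dutch study.

        import numpy as np

        # Assumed drought events by return period [years] and welfare loss [million EUR] per sector;
        # placeholder figures only.
        return_periods = np.array([2, 10, 50, 200])
        damage = {
            "agriculture": np.array([5.0, 40.0, 120.0, 300.0]),
            "shipping":    np.array([1.0, 15.0,  60.0, 150.0]),
            "nature":      np.array([0.5,  5.0,  25.0,  80.0]),
        }

        exceed_prob = 1.0 / return_periods           # annual exceedance probability of each event
        for sector, d in damage.items():
            # Expected annual damage: trapezoidal integration of damage over exceedance probability
            # (probabilities sorted in increasing order).
            p, y = exceed_prob[::-1], d[::-1]
            ead = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(p))
            print(f"{sector:12s} expected annual damage ~ {ead:6.1f} MEUR/yr")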

  20. Probabilistic economic frameworks for disaster risk management

    NASA Astrophysics Data System (ADS)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    Starting from the general concept of risk, we set up an economic analysis framework for Disaster Risk Management (DRM) investment. It builds on uncertainty management techniques, notably Monte Carlo simulations, and includes both risk and performance metrics adapted to recurring issues in disaster risk management as entertained by governments and international organisations. This type of framework proves to be enlightening in several regards, and is thought to ease the promotion of DRM projects as "investments" rather than "costs to be borne" and to allow for meaningful comparison between DRM and other sectors. We then look at the specificities of disaster risk investments of medium to large scales through this framework, where some "invariants" can be identified, notably: (i) it makes more sense to perform the analysis over long-term horizons (space and time scales are somewhat linked); (ii) profiling the fluctuations of the gains and losses of DRM investments over long periods requires the ability to handle possibly highly volatile variables; (iii) complexity increases with scale, which results in a higher sensitivity of the analytic framework on the results; (iv) as the perimeter of analysis (time-, theme- and space-wise) is widened, intrinsic parameters of the project tend to weigh lighter. This puts DRM in a very different perspective from traditional modelling, which usually builds on more intrinsic features of the disaster as it relates to the scientific knowledge about hazard(s). As models hardly accommodate such complexity or "data entropy" (they require highly structured inputs), there is a need for a complementary approach to understand risk at the global scale. The proposed framework suggests opting for flexible ad hoc modelling of specific issues consistent with one's objective, risk and performance metrics. Such tailored solutions are strongly context-dependent (time and budget, sensitivity of the studied variable in the economic framework) and can

  1. Rational risk-based decision support for drinking water well managers by optimized monitoring designs

    NASA Astrophysics Data System (ADS)

    Enzenhöfer, R.; Geiges, A.; Nowak, W.

    2011-12-01

    Advection-based well-head protection zones are commonly used to manage the contamination risk of drinking water wells. Considering the insufficient knowledge about hazards and transport properties within the catchment, current Water Safety Plans recommend that catchment managers and stakeholders know, control and monitor all possible hazards within the catchments and perform rational risk-based decisions. Our goal is to supply catchment managers with the required probabilistic risk information, and to generate tools that allow for optimal and rational allocation of resources between improved monitoring versus extended safety margins and risk mitigation measures. To support risk managers with the indispensable information, we address the epistemic uncertainty of advective-dispersive solute transport and well vulnerability (Enzenhoefer et al., 2011) within a stochastic simulation framework. Our framework can separate between uncertainty of contaminant location and actual dilution of peak concentrations by resolving heterogeneity with high-resolution Monte-Carlo simulation. To keep computational costs low, we solve the reverse temporal moment transport equation. Only in post-processing do we recover the time-dependent solute breakthrough curves and the deduced well vulnerability criteria from temporal moments by non-linear optimization. Our first step towards optimal risk management is optimal positioning of sampling locations and optimal choice of data types to best reduce the epistemic prediction uncertainty for well-head delineation, using the cross-bred Likelihood Uncertainty Estimator (CLUE, Leube et al., 2011) for optimal sampling design. Better monitoring leads to more reliable and realistic protection zones and thus helps catchment managers to better justify smaller, yet conservative safety margins. In order to allow an optimal choice in sampling strategies, we compare the trade-off in monitoring versus the delineation costs by accounting for ill

  2. Method for Water Management Considering Long-term Probabilistic Forecasts

    NASA Astrophysics Data System (ADS)

    Hwang, J.; Kang, J.; Suh, A. S.

    2015-12-01

    This research is aimed at predicting the monthly inflow of the Andong-dam basin in South Korea using long-term probabilistic forecasts, in order to apply long-term forecasts to water management. Forecasted cumulative distribution functions (CDFs) of monthly precipitation are plotted by combining the range of monthly precipitation, based on a proper probability density function (PDF) fitted to past data, with the probabilistic forecasts in each category. Ensembles of inflow are estimated by entering the generated ensembles of precipitation based on the CDFs into the 'abcd' water budget model. The bias and RMSE between the averages in past data and observed inflow are compared with those of the forecasted ensembles. In our results, the bias and RMSE of average precipitation in the forecasted ensemble are larger than in past data, whereas the average inflow in the forecasted ensemble is smaller than in past data. This result could serve as reference material for applying long-term forecasts to water management, given the limited number of forecasted data available for verification and the differences between the Andong-dam basin and the forecasted regions. This research is significant in that it suggests a method of applying probabilistic information in climate variables from long-term forecasts to water management in Korea. Original data of a climate model that produces long-term probabilistic forecasts should be verified directly as input data of a water budget model in the future, so that a more scientific response in water management against the uncertainty of climate change can be reached.
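
    A minimal sketch of the verification step described above: fit a simple distribution to past monthly precipitation, draw a forecast ensemble from it, and compare bias and RMSE of the ensemble against those of the past data. The gamma fit, the placeholder record, and the 51-member ensemble size are assumptions; the category-weighted CDFs and the 'abcd' water budget model are not reproduced.

        import numpy as np

        rng = np.random.default_rng(7)

        # Assumed record of past July precipitation [mm] for the basin (placeholder values).
        past = np.array([310., 280., 420., 150., 390., 260., 330., 210., 480., 300.])
        observed = 265.0                                   # "observed" value for the verification month

        # Fit a gamma distribution by the method of moments and draw a forecast ensemble from it.
        mean, var = past.mean(), past.var(ddof=1)
        shape, scale = mean**2 / var, var / mean
        ensemble = rng.gamma(shape, scale, size=51)

        for name, values in [("past data", past), ("forecast ensemble", ensemble)]:
            bias = values.mean() - observed
            rmse = np.sqrt(np.mean((values - observed) ** 2))
            print(f"{name:17s}  bias = {bias:6.1f} mm   RMSE = {rmse:6.1f} mm")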

  3. Verification of recursive probabilistic integration (RPI) method for fatigue life management using non-destructive inspections

    NASA Astrophysics Data System (ADS)

    Chen, Tzikang J.; Shiao, Michael

    2016-04-01

    This paper verified a generic and efficient assessment concept for probabilistic fatigue life management. The concept is developed based on an integration of damage tolerance methodology, simulation methods, and a probabilistic algorithm, RPI (recursive probability integration), considering maintenance for damage tolerance and risk-based fatigue life management. RPI is an efficient semi-analytical probabilistic method for risk assessment subject to various uncertainties, such as the variability in material properties including crack growth rate, initial flaw size, repair quality, random-process modeling of flight loads for failure analysis, and inspection reliability represented by probability of detection (POD). In addition, unlike traditional Monte Carlo simulation (MCS), which requires a rerun of MCS when the maintenance plan is changed, RPI can repeatedly use a small set of baseline random crack growth histories, excluding maintenance-related parameters, from a single MCS for various maintenance plans. In order to fully appreciate the RPI method, a verification procedure was performed. In this study, MC simulations on the order of several hundred billion runs were conducted for various flight conditions, material properties, inspection schedules, POD and repair/replacement strategies. Since MC simulation is time-consuming, the simulations were conducted in parallel on DoD High Performance Computers (HPC) using a specialized random number generator for parallel computing. The study has shown that the RPI method is several orders of magnitude more efficient than traditional Monte Carlo simulation.

  4. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  5. Risk-Based Models for Managing Data Privacy in Healthcare

    ERIC Educational Resources Information Center

    AL Faresi, Ahmed

    2011-01-01

    Current research in health care lacks a systematic investigation to identify and classify various sources of threats to information privacy when sharing health data. Identifying and classifying such threats would enable the development of effective information security risk monitoring and management policies. In this research I put the first step…

  6. Managing long-term polycyclic aromatic hydrocarbon contaminated soils: a risk-based approach.

    PubMed

    Duan, Luchun; Naidu, Ravi; Thavamani, Palanisami; Meaklim, Jean; Megharaj, Mallavarapu

    2015-06-01

    Polycyclic aromatic hydrocarbons (PAHs) are a family of contaminants that consist of two or more aromatic rings fused together. Soils contaminated with PAHs pose significant risk to human and ecological health. Over the last 50 years, significant research has been directed towards the cleanup of PAH-contaminated soils to background levels. However, this has achieved only limited success, especially with high molecular weight compounds. Notably, during the last 5-10 years, the approach to remediating PAH-contaminated soils has changed considerably. A risk-based prioritization of remediation interventions has become a valuable step in the management of contaminated sites. The hydrophobicity of PAHs means that their phase distribution in soil is strongly influenced by factors such as soil properties and ageing of PAHs within the soil. A risk-based approach recognizes that exposure and environmental effects of PAHs are not directly related to the commonly measured total chemical concentration. Thus, a bioavailability-based assessment combining chemical analysis with toxicological assays and non-exhaustive extraction techniques would serve as a valuable tool in a risk-based approach to remediation of PAH-contaminated soils. In this paper, the fate and availability of PAHs in contaminated soils and their relevance to risk-based management of long-term contaminated soils are reviewed. This review may serve as guidance for the use of site-specific risk-based management methods.

  7. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  8. Introduction to Decision Support Systems for Risk Based Management of Contaminated Sites

    EPA Science Inventory

    A book on Decision Support Systems for Risk-based Management of contaminated sites is appealing for two reasons. First, it addresses the problem of contaminated sites, which has worldwide importance. Second, it presents Decision Support Systems (DSSs), which are powerful comput...

  9. Managing wildfire events: risk-based decision making among a group of federal fire managers.

    PubMed

    Wilson, Robyn S; Winter, Patricia L; Maguire, Lynn A; Ascher, Timothy

    2011-05-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206 individuals in the USDA Forest Service with authority to choose how to manage a wildfire event (i.e., line officers and incident command personnel). The results indicate that the subjects exhibited loss aversion, choosing the safe option more often when the consequences of the choice were framed as potential gains, but this tendency was less pronounced among those with risk seeking attitudes. The subjects also exhibited discounting, choosing to minimize short-term over long-term risk due to a belief that future risk could be controlled, but this tendency was less pronounced among those with more experience. Finally, the subjects, in particular those with more experience, demonstrated a status quo bias, choosing suppression more often when their reported status quo was suppression. The results of this study point to a need to carefully construct the decision process to ensure that the uncertainty and conflicting objectives inherent in wildfire management do not result in the overuse of common heuristics. Individual attitudes toward risk or an agency culture of risk aversion may counterbalance such heuristics, whereas increased experience may lead to overconfident intuitive judgments and a failure to incorporate new and relevant information into the decision. © 2010 Society for Risk Analysis.

  10. A risk-based framework for water resource management under changing water availability, policy options, and irrigation expansion

    NASA Astrophysics Data System (ADS)

    Hassanzadeh, Elmira; Elshorbagy, Amin; Wheater, Howard; Gober, Patricia

    2016-08-01

    resemble nonlinear functions of changes in individual drivers. The proposed risk-based framework can be linked to any water resource system assessment scheme to quantify the risk in system performance under changing conditions, with the larger goal of proposing alternative policy options to address future uncertainties and management concerns.

  11. Probabilistic spill occurrence simulation for chemical spills management.

    PubMed

    Cao, Weihua; Li, James; Joksimovic, Darko; Yuan, Arnold; Banting, Doug

    2013-11-15

    Inland chemical spills pose a great threat to water quality worldwide. A sophisticated probabilistic spill-event model that characterizes temporal and spatial randomness and quantifies statistical uncertainty due to limited spill data is a major component of spill management and associated decision making. This paper presents a MATLAB-based Monte Carlo simulation (MMCS) model for simulating the probabilistic quantifiable occurrences of inland chemical spills by time, magnitude, and location based on North America Industry Classification System codes. The model's aleatory and epistemic uncertainties were quantified through an integrated bootstrap resampling technique. Benzene spills in the St. Clair River area of concern were used as a case to demonstrate the model by simulating spill occurrences, occurrence time, and mass expected for a 10-year period. Uncertainty analysis indicates that simulated spill characteristics can be described by lognormal distributions with positive skewness. The simulated spill time series will enable a quantitative risk analysis for water quality impairments due to the spills. The MMCS model can also help governments to evaluate their priority list of spilled chemicals.
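
    The MMCS model itself is not reproduced here; the sketch below shows the generic pattern the abstract describes: spill occurrences simulated as a Poisson process, spill magnitudes drawn from a lognormal distribution fitted to the historical record, and epistemic uncertainty captured by bootstrap resampling of that limited record. The spill record and rates are invented placeholders.

        import numpy as np

        rng = np.random.default_rng(8)

        # Assumed historical benzene-spill record (kg per spill) over 12 years -- placeholder data.
        spill_masses = np.array([12.0, 3.5, 150.0, 0.8, 40.0, 7.2, 25.0, 1.1])
        record_years = 12.0

        def simulate_decade(masses, lam):
            """One 10-year realization: Poisson number of spills, lognormal magnitudes fitted to data."""
            n = rng.poisson(lam * 10)
            mu, sigma = np.log(masses).mean(), np.log(masses).std(ddof=1)
            return rng.lognormal(mu, sigma, n).sum()

        # Epistemic uncertainty via bootstrap: refit the model to resampled records, then simulate.
        totals = []
        for _ in range(2000):
            boot = rng.choice(spill_masses, size=len(spill_masses), replace=True)
            boot_rate = rng.poisson(len(spill_masses)) / record_years   # resampled spill count (crude)
            totals.append(simulate_decade(boot, boot_rate))
        totals = np.asarray(totals)
        print("expected 10-yr spilled mass [kg]:", totals.mean())
        print("90% interval:", np.percentile(totals, [5, 95]))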

  12. Dynamic Resource Management in Clouds: A Probabilistic Approach

    NASA Astrophysics Data System (ADS)

    Gonçalves, Paulo; Roy, Shubhabrata; Begin, Thomas; Loiseau, Patrick

    Dynamic resource management has become an active area of research in the Cloud Computing paradigm. Cost of resources varies significantly depending on configuration for using them. Hence efficient management of resources is of prime interest to both Cloud Providers and Cloud Users. In this work we suggest a probabilistic resource provisioning approach that can be exploited as the input of a dynamic resource management scheme. Using a Video on Demand use case to justify our claims, we propose an analytical model inspired from standard models developed for epidemiology spreading, to represent sudden and intense workload variations. We show that the resulting model verifies a Large Deviation Principle that statistically characterizes extreme rare events, such as the ones produced by “buzz/flash crowd effects” that may cause workload overflow in the VoD context. This analysis provides valuable insight on expectable abnormal behaviors of systems. We exploit the information obtained using the Large Deviation Principle for the proposed Video on Demand use-case for defining policies (Service Level Agreements). We believe these policies for elastic resource provisioning and usage may be of some interest to all stakeholders in the emerging context of cloud networking.

  13. The role of risk-based prioritization in total quality management

    SciTech Connect

    Bennett, C.T.

    1994-10-01

    The climate in which government managers must make decisions grows more complex and uncertain. All stakeholders - the public, industry, and Congress - are demanding greater consciousness, responsibility, and accountability of programs and their budgets. Yet, managerial decisions have become multifaceted, involve greater risk, and operate over much longer time periods. Over the last four or five decades, as policy analysis and decisions became more complex, scientists from psychology, operations research, systems science, and economics have developed a more or less coherent process called decision analysis to aid program management. The process of decision analysis - a systems theoretic approach - provides the backdrop for this paper. The Laboratory Integrated Prioritization System (LIPS) has been developed as a systems analytic and risk-based prioritization tool to aid the management of the Tri-Labs' (Lawrence Livermore, Los Alamos, and Sandia) operating resources. Preliminary analyses of the effects of LIPS have confirmed the practical benefits of decision and systems sciences - the systematic, quantitative reduction in uncertainty. To date, the use of LIPS - and, hence, its value - has been restricted to resource allocation within the Tri-Labs' operations budgets. This report extends the role of risk-based prioritization to the support of DOE Total Quality Management (TQM) programs. Furthermore, this paper argues for the requirement to institutionalize an evolutionary, decision theoretic approach to the policy analysis of the Department of Energy's Program Budget.

  14. Development of risk-based air quality management strategies under impacts of climate change.

    PubMed

    Liao, Kuo-Jen; Amar, Praveen; Tagaris, Efthimios; Russell, Armistead G

    2012-05-01

    Climate change is forecast to adversely affect air quality through perturbations in meteorological conditions, photochemical reactions, and precursor emissions. To protect the environment and human health from air pollution, there is an increasing recognition of the necessity of developing effective air quality management strategies under the impacts of climate change. This paper presents a framework for developing risk-based air quality management strategies that can help policy makers improve their decision-making processes in response to current and future climate change about 30-50 years from now. Development of air quality management strategies under the impacts of climate change is fundamentally a risk assessment and risk management process involving four steps: (1) assessment of the impacts of climate change and associated uncertainties; (2) determination of air quality targets; (3) selection of potential air quality management options; and (4) identification of preferred air quality management strategies that minimize control costs, maximize benefits, or limit the adverse effects of climate change on air quality when considering the scarcity of resources. The main challenge relates to the level of uncertainty associated with climate change forecasts and advancements in future control measures, since these will significantly affect the risk assessment results and the development of effective air quality management plans. The concept presented in this paper can help decision makers make appropriate responses to climate change, since it provides an integrated approach for climate risk assessment and management when developing air quality management strategies. Development of climate-responsive air quality management strategies is fundamentally a risk assessment and risk management process. The risk assessment process includes quantification of climate change impacts on air quality and associated uncertainties. Risk management for air quality under the impacts of

  15. Developing risk-based screening guidelines for dioxin management at a Melbourne sewage treatment plant.

    PubMed

    Gorman, J; Mival, K; Wright, J; Howell, M

    2003-01-01

    Dioxin is a generic term used to refer to the congeners of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs). The principal source of dioxin production is generally thought to be unintended by-products of waste incineration, but dioxins are also formed naturally by volcanic activity and forest fires (WHO, 1998). Estimates of dioxin emissions in Australia suggest that approximately 75% of the total PCDD and PCDF emissions derive from prescribed burning and wild bushfires. Currently, no screening guidelines for dioxins within soils are available in Australia. This paper presents the general approach and results of a human health risk-based assessment performed by URS Australia in 2001 to develop site-specific reference criteria for remediation of a former sewage plant in Melbourne. Risk-based soil remediation concentrations for dioxins at the sewage treatment plant site were developed using tolerable daily intake values of 4, 2 and 1 pg/kg/day. The potentially significant pathways and processes for exposure to dioxins were identified, and risk-based soil concentrations were derived in accordance with the general method framework presented in the National Environmental Protection Measure (Assessment of Site Contamination). The derived dioxin reference criteria were used to develop an effective risk management program focussed on those conditions that present the greatest contribution to the overall risk to human health.
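
    A generic back-calculation of a risk-based soil criterion from a tolerable daily intake, of the kind outlined above. Every parameter value below (body weight, soil ingestion rate, bioavailability, background fraction) is an illustrative assumption, not a value from the URS assessment or the NEPM.

        # Generic back-calculation of a risk-based soil criterion from a tolerable daily intake (TDI).
        # All parameter values are illustrative assumptions, not the Melbourne assessment's inputs.

        TDI = 2.0             # pg TEQ per kg body weight per day
        BODY_WEIGHT = 15.0    # kg, young child assumed as the most exposed receptor
        SOIL_INGESTION = 0.1  # g soil per day
        BIOAVAILABILITY = 0.5
        BACKGROUND_FRACTION = 0.5   # fraction of the TDI already taken up by background (dietary) exposure

        # Allowable intake from soil [pg TEQ/day], then the corresponding soil concentration [pg TEQ/g].
        allowable_soil_intake = TDI * BODY_WEIGHT * (1.0 - BACKGROUND_FRACTION)
        soil_criterion = allowable_soil_intake / (SOIL_INGESTION * BIOAVAILABILITY)
        print(f"risk-based soil criterion ~ {soil_criterion:.0f} pg TEQ/g (= ng TEQ/kg)")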

  16. Seasonal Water Resources Management and Probabilistic Operations Forecast in the San Juan Basin

    NASA Astrophysics Data System (ADS)

    Daugherty, L.; Zagona, E. A.; Rajagopalan, B.; Grantz, K.; Miller, W. P.; Werner, K.

    2013-12-01

    within the NWS Community Hydrologic Prediction System (CHPS) to produce an ensemble streamflow forecast. The ensemble traces are used to drive the MTOM with the initial conditions of the water resources system and the operating rules, to provide ensembles of water resources management and operation metrics. We applied this integrated approach to forecasting in the San Juan River Basin (SJRB) using a portion of the Colorado River MTOM. The management objectives in the basin include water supply for irrigation, tribal water rights, environmental flows, and flood control. The spring streamflow ensembles were issued at four different lead times, on the first of each month from January to April, and were incorporated into the MTOM for the period 2002-2010. Ensembles of operational performance metrics for the SJRB, such as Navajo Reservoir releases, end-of-water-year storage, environmental flows, and water supply for irrigation, were computed and their skill evaluated against variables obtained in a baseline simulation using historical streamflow. Preliminary results indicate that the probabilistic forecasts thus obtained may show increased skill, especially at long lead times (e.g., forecasts issued on January 1 and February 1). The probabilistic information on water management variables quantifies the risk of system vulnerabilities and thus enables efficient, risk-based planning and operations.

  17. Risk-based requirements management framework with applications to assurance cases

    NASA Astrophysics Data System (ADS)

    Feng, D.; Eyster, C.

    The current regulatory approach for assuring device safety primarily focuses on compliance with prescriptive safety regulations and relevant safety standards. This approach, however, does not always lead to a safe system design even though safety regulations and standards have been met. In the medical device industry, several high-profile recalls involving infusion pumps have prompted the regulatory agency to reconsider how device safety should be managed, reviewed and approved. An assurance case has been cited as a promising tool to address this growing concern. Assurance cases have been used in safety-critical systems for some time. Most assurance cases in the literature today, if not all, are developed in an ad hoc fashion, independent from risk management and requirement development. An assurance case is a resource-intensive endeavor that requires additional effort and documentation from equipment manufacturers. Without a well-organized requirements infrastructure in place, such "additional effort" can be substantial, to the point where the cost of adoption outweighs the benefit of adoption. In this paper, the authors present a Risk-Based Requirements and Assurance Management (RBRAM) methodology. The RBRAM is an elaborate framework that combines Risk-Based Requirements Management (RBRM) with assurance case methods. Such an integrated framework can help manufacturers leverage an existing risk management process to present a comprehensive assurance case with minimal additional effort, while providing a supplementary means to reexamine the integrity of the system design in terms of the mission objective. Although the example used is from the medical industry, the authors believe that the RBRAM methodology underlines the fundamental principles of risk management, and offers a simple, yet effective framework applicable to the aerospace industry, and perhaps to any industry.

  18. Achievements of risk-based produced water management on the Norwegian continental shelf (2002-2008).

    PubMed

    Smit, Mathijs G D; Frost, Tone K; Johnsen, Ståle

    2011-10-01

    In 1996, the Norwegian government issued a White Paper requiring the Norwegian oil industry to reach the goal of "zero discharge" for the marine environment by 2005. To achieve this goal, the Norwegian oil and gas industry initiated the Zero Discharge Programme for discharges of produced formation water from the hydrocarbon-containing reservoir, in close communication with regulators. The environmental impact factor (EIF), a risk-based management tool, was developed by the industry to quantify and document the environmental risks from produced water discharges. The EIF represents a volume of recipient water containing concentrations of one or more substances to a level exceeding a generic threshold for ecotoxicological effects. In addition, this tool facilitates the identification and selection of cost-effective risk mitigation measures. The EIF tool has been used by all operators on the Norwegian continental shelf since 2002 to report progress toward the goal of "zero discharge," interpreted as "zero harmful discharges," to the regulators. Even though produced water volumes have increased by approximately 30% between 2002 and 2008 on the Norwegian continental shelf, the total environmental risk from produced water discharges expressed by the summed EIF for all installations has been reduced by approximately 55%. The total amount of oil discharged to the sea has been reduced by 18% over the period 2000 to 2006. The experience from the Zero Discharge Programme shows that a risk-based approach is an excellent working tool to reduce discharges of potential harmful substances from offshore oil and gas installations.
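
    The EIF is computed in practice with a dedicated produced-water fate model; the sketch below only illustrates the metric itself on a toy dilution field: sum the PEC/PNEC risk ratios of the discharged substances in each water volume element and count the volume where the sum exceeds 1. The concentrations, PNECs, dilution law, and cell volume are all assumptions.

        import numpy as np

        # Toy dilution field around a produced-water outfall: concentration falls off with distance.
        # The real EIF uses a 3-D fate and dilution model; this only illustrates the metric.
        CELL_VOLUME_M3 = 1e5                      # water volume represented by each grid cell (assumed)
        distance = np.linspace(50, 5000, 200)     # m from the discharge point
        dilution = distance / 50.0                # crude dilution assumption: 1:(d/50)

        # Discharge concentrations [ug/L] and PNEC thresholds [ug/L] for two illustrative substance groups.
        discharge = {"dispersed oil": 15.0, "alkylphenols": 0.5}
        pnec      = {"dispersed oil": 70.4, "alkylphenols": 0.36}

        risk_ratio = sum((c0 / dilution) / pnec[name] for name, c0 in discharge.items())
        exceed_cells = np.count_nonzero(risk_ratio > 1.0)
        print("cells exceeding sum(PEC/PNEC) = 1:", exceed_cells)
        print(f"EIF-like impacted volume: {CELL_VOLUME_M3 * exceed_cells:.2e} m^3")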

  19. MO-E-9A-01: Risk Based Quality Management: TG100 In Action

    SciTech Connect

    Huq, M; Palta, J; Dunscombe, P; Thomadsen, B

    2014-06-15

    One of the goals of quality management in radiation therapy is to gain high confidence that patients will receive the prescribed treatment correctly. To accomplish this goal, professional societies such as the American Association of Physicists in Medicine (AAPM) have published many quality assurance (QA), quality control (QC), and quality management (QM) guidance documents. In general, the recommendations provided in these documents have emphasized performing device-specific QA at the expense of process flow and protection of the patient against catastrophic errors. Analyses of radiation therapy incidents find that they are most often caused by flaws in the overall therapy process, from initial consult through final treatment, rather than by isolated hardware or computer failures detectable by traditional physics QA. This challenge is shared by many intrinsically hazardous industries. Risk assessment tools and analysis techniques have been developed to define, identify, and eliminate known and/or potential failures, problems, or errors from a system, process and/or service before they reach the customer. These include, but are not limited to, process mapping, failure modes and effects analysis (FMEA), fault tree analysis (FTA), and establishment of a quality management program that best avoids the faults and risks that have been identified in the overall process. These tools can be easily adapted to radiation therapy practices because of their simplicity and effectiveness in providing efficient ways to enhance the safety and quality of treatment processes. Task Group 100 (TG100) of the AAPM has developed a risk-based quality management program that uses these tools. This session will be devoted to a discussion of these tools and how they can be used in a given radiotherapy clinic to develop a risk-based QM program. Learning Objectives: Learn how to design a process map for a radiotherapy process. Learn how to perform a FMEA analysis for a given process. Learn what
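
    A minimal FMEA-style ranking of the kind TG100 recommends: each failure mode is scored for occurrence (O), severity (S), and lack of detectability (D), and the risk priority number RPN = O x S x D is used to prioritize quality management effort. The failure modes and scores below are invented for illustration.

        # FMEA-style ranking: 1-10 scores for occurrence (O), severity (S), and lack of
        # detectability (D); RPN = O * S * D. Failure modes and scores are invented examples.

        failure_modes = [
            ("wrong CT dataset imported",            2, 9, 6),
            ("contour drawn on wrong structure",     4, 7, 5),
            ("MU calculation uses stale plan",       3, 8, 7),
            ("couch shift entered with wrong sign",  5, 6, 4),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
        for name, o, s, d in ranked:
            print(f"RPN {o * s * d:4d}  (O={o}, S={s}, D={d})  {name}")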

  20. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    NASA Astrophysics Data System (ADS)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of the subsurface water resources is a key requirement in this human-nature interaction system for evaluating the tradeoffs between irrigation for agriculture and the ecosystems' low-flow requirements. This work presents an optimization scheme built upon systems reliability-based design optimization (SRBDO) analysis, which accounts for prescribed probabilistic constraint evaluation. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO was developed in the aerospace industry and has been extensively applied in the field of structural engineering, but has seen only limited application in hydrology. SRBDO uses probability theory to model uncertainty and to determine the probability of failure by solving a nonlinear mathematical programming problem. Furthermore, the reliability-based design optimization provides complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both random variables and probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable on the total
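
    A hedged sketch of a probabilistic (chance) constraint of the kind used in reliability-based design optimization: a pumping rate is acceptable only if the Monte Carlo estimate of the probability that baseflow drops below an ecological minimum stays under a target failure probability. The linear drawdown model and all parameter values are illustrative assumptions, not the authors' model.

    ```python
    import random

    def failure_probability(pumping_rate, n=20000, q_min=0.5):
        """Monte Carlo estimate of P(baseflow < ecological minimum)."""
        failures = 0
        for _ in range(n):
            recharge = random.gauss(2.0, 0.4)     # uncertain natural baseflow (m^3/s)
            leakage = random.gauss(0.3, 0.1)      # uncertain stream depletion coefficient
            if recharge - leakage * pumping_rate < q_min:
                failures += 1
        return failures / n

    # keep the largest pumping rate whose failure probability stays below, say, 5%
    for rate in (2.0, 4.0, 6.0):
        print(rate, failure_probability(rate))
    ```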

  1. Emerging contaminants in the environment: Risk-based analysis for better management.

    PubMed

    Naidu, Ravi; Arias Espana, Victor Andres; Liu, Yanju; Jit, Joytishna

    2016-07-01

    Emerging contaminants (ECs) are chemicals of synthetic origin, or deriving from a natural source, that have recently been discovered and for which environmental or public health risks are yet to be established. This is due to limited available information on their interactions with, and toxicological impacts on, receptors. Several types of ECs exist, such as antibiotics, pesticides, pharmaceuticals, personal care products, effluents, certain naturally occurring contaminants and, more recently, nanomaterials. ECs may derive from a known source, for example released directly to the aquatic environment from direct discharges such as those from wastewater treatment plants. Although in most instances the direct source cannot be identified, ECs have been detected in virtually every country's natural environment and as a consequence they represent a global problem. There is very limited information on the fate and transport of ECs in the environment and on their toxicological impact. This lack of information can be attributed to limited financial resources and to the lack of analytical techniques for detecting their effects on ecosystems and human health, whether on their own or as mixtures. We do not know how ECs interact with each other or with other contaminants. This paper presents an overview of existing knowledge on ECs, their fate and transport, and a risk-based analysis for EC management together with complementary strategies.

  2. Risk-based Inspection Scheduling Planning for Intelligent Agent in the Autonomous Fault Management

    SciTech Connect

    Hari Nugroho, Djoko; Sudarno

    2010-06-22

    This paper develops an autonomous fault management scheme focusing on inspection scheduling planning, implemented for an advanced small nuclear reactor without on-site refuelling so as to assure safety without human intervention. The inspection schedule was planned optimally with a risk-based approach balancing two important constraints related to the risk of action planning, namely failure probability and shortest path. Performance was demonstrated using a computer simulation of the DURESS component locations and failure probabilities. It could be concluded that the first component to be inspected was flow sensor FB2, which had the largest comparison value (0.104233) among all components. The remaining components would then be visited in the order FB1, FA2, FA1, FB, FA, VB, pump B, VA, pump A, VB2, VB1, VA2, VA1, reservoir 2, reservoir 1, FR2, and FR1. The movement route plan could then be transferred to the robot arm acting as the intelligent agent.
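
    A hedged reading of the ranking idea, with all component data invented for illustration: each component gets a priority score that rewards high failure probability and penalizes travel distance, and the inspection route visits components in descending score order.

    ```python
    components = {
        "FB2": {"p_fail": 0.20, "distance": 3.0},
        "FB1": {"p_fail": 0.15, "distance": 3.5},
        "VA":  {"p_fail": 0.05, "distance": 6.0},
    }

    def priority(c):
        # larger failure probability and shorter path -> higher inspection priority
        return c["p_fail"] / c["distance"]

    route = sorted(components, key=lambda name: priority(components[name]), reverse=True)
    print(route)  # e.g. ['FB2', 'FB1', 'VA']
    ```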

  3. Health Risk-Based Assessment and Management of Heavy Metals-Contaminated Soil Sites in Taiwan

    PubMed Central

    Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang

    2010-01-01

    Risk-based assessment is a way to evaluate the potential hazards of contaminated sites and is based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying the exposure of the receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses a single target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated, and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of possible remediation methods that achieve rapid risk reduction is permissible and of increasing interest. This paper discusses soil remediation techniques previously applied to different sites in Taiwan and also clarifies the differences in risk assessment results obtained before and after soil remediation when applying different risk assessment models. This paper also includes several case studies on: (1) food safety risk assessment for brown rice grown on a HM-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied to HM-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan. PMID:21139851

  4. Data Supply Chain Management: Supply Chain Management for Incentive and Risk-based Assured Information Sharing

    DTIC Science & Technology

    2011-12-01

    situation, the end customer needs the data product either for sale or to help him make decisions. Another definition for SCM provided by the APICS...the analogy to data supply chain management. In both cases, the end customer needs a product. The product has to be assembled from multiple components...data source should we get the data? Is it A or B? Production decisions are important for data supply chain. When a customer needs a data product

  5. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    USGS Publications Warehouse

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyere, Cindy L.; Newlon, Karen R.

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.

  6. A Risk-Based Approach to Evaluating Wildlife Demographics for Management in a Changing Climate: A Case Study of the Lewis's Woodpecker

    NASA Astrophysics Data System (ADS)

    Towler, Erin; Saab, Victoria A.; Sojda, Richard S.; Dickinson, Katherine; Bruyère, Cindy L.; Newlon, Karen R.

    2012-12-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a species' response to climate into conservation decisions. This is shown by taking advantage of species' response (i.e., impact) models that have been developed for a well-studied bird species of conservation concern. Specifically, we examine the current and potential impact of climate on nest survival of the Lewis's Woodpecker (Melanerpes lewis) in two different habitats. To address climate uncertainty, climate scenarios are developed by manipulating historical weather observations to create ensembles (i.e., multiple sequences of daily weather) that reflect historical variability and potential climate change. These ensembles allow for a probabilistic evaluation of the risk posed to Lewis's Woodpecker nest survival and are used in two demographic analyses. First, the relative value of each habitat is compared in terms of nest survival, and second, the likelihood of exceeding a critical population threshold is examined. By embedding the analyses in a risk framework, we show how management choices can be made to be commensurate with a defined level of acceptable risk. The results can be used to inform habitat prioritization and are discussed in the context of an economic framework for evaluating trade-offs between management alternatives.
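
    A minimal sketch of the probabilistic evaluation described in these two records: run a (placeholder) nest-survival response over an ensemble of weather scenarios and estimate the probability that survival falls below a critical threshold. The survival function and all parameters are assumptions, not the authors' fitted model.

    ```python
    import random

    def nest_survival(mean_temp):
        # hypothetical response: survival declines with warmer breeding-season temperatures
        return max(0.0, min(1.0, 0.9 - 0.05 * (mean_temp - 25.0)))

    def exceedance_probability(ensemble_temps, threshold=0.75):
        """Fraction of ensemble members in which survival falls below the threshold."""
        below = sum(1 for t in ensemble_temps if nest_survival(t) < threshold)
        return below / len(ensemble_temps)

    ensemble = [random.gauss(27.0, 2.0) for _ in range(1000)]   # synthetic climate ensemble
    print(exceedance_probability(ensemble))
    ```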

  7. National Drought Policy: Shifting the Paradigm from Crisis to Risk-based Management

    NASA Astrophysics Data System (ADS)

    Wilhite, D. A.; Sivakumar, M. K.; Stefanski, R.

    2011-12-01

    Drought is a normal part of climate for virtually all of the world's climatic regimes. To better address the risks associated with this hazard and societal vulnerability, there must be a dramatic paradigm shift in our approach to drought management in the coming decade in the light of the increasing frequency of droughts and projections of increased severity and duration of these events in the future for many regions, especially in the developing world. Addressing this challenge will require an improved awareness of drought as a natural hazard, the establishment of integrated drought monitoring and early warning systems, a higher level of preparedness that fully incorporates risk-based management, and the adoption of national drought policies that are directed at increasing the coping capacity and resilience of populations to future drought episodes. The World Meteorological Organization (WMO), in partnership with other United Nations' agencies, the National Drought Mitigation Center at the University of Nebraska, NOAA, the U.S. Department of Agriculture, and other partners, is currently launching a program to organize a High Level Meeting on National Drought Policy (HMNDP) in March 2013 to encourage the development of national drought policies through the development of a compendium of key policy elements. The key objectives of a national drought policy are to: (1) encourage vulnerable economic sectors and population groups to adopt self-reliant measures that promote risk management; (2) promote sustainable use of the agricultural and natural resource base; and (3) facilitate early recovery from drought through actions consistent with national drought policy objectives. The key elements of a drought policy framework are policy and governance, including political will; addressing risk and improving early warnings, including vulnerability analysis, impact assessment, and communication; mitigation and preparedness, including the application of effective and

  8. Risk-based water resources planning: Coupling water allocation and water quality management under extreme droughts

    NASA Astrophysics Data System (ADS)

    Mortazavi-Naeini, M.; Bussi, G.; Hall, J. W.; Whitehead, P. G.

    2016-12-01

    The main aim of water companies is to have a reliable and safe water supply system. To fulfil this duty, water companies have to consider both water quality and quantity issues and challenges. Climate change and population growth will have an impact on water resources both in terms of available water and river water quality. Traditionally, a distinct separation between water quality and abstraction has existed. However, water quality can be a bottleneck in a system, since water treatment works can only treat water if it meets certain standards. For instance, high turbidity and large phytoplankton content can sharply increase the cost of treatment or even make river water unfit for human consumption. It is vital for water companies to be able to characterise the quantity and quality of water under extreme weather events and to consider the occurrence of periods when water abstraction has to cease due to water quality constraints. This gives them the opportunity to decide on water resource planning and on potential changes to reduce the risk of system failure. We present a risk-based approach for incorporating extreme events, based on future climate change scenarios from a large ensemble of climate model realisations, into an integrated water resources model through the combined use of a water allocation model (WATHNET) and a water quality model (INCA). The annual frequency of imposed restrictions on demand is used as the measure of reliability. We tested our approach on the Thames region, in the UK, with 100 extreme events. The results show an increase in the frequency of imposed restrictions when water quality constraints are considered, which indicates the importance of considering water quality issues in drought management plans.
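
    A minimal sketch of the reliability metric described above, assuming synthetic stand-ins for the WATHNET/INCA outputs: count the annual frequency of demand restrictions across an ensemble of simulated years, with and without water-quality-driven abstraction shutdowns.

    ```python
    def restriction_frequency(years, restricted_flags):
        """restricted_flags: one bool per simulated year, True if any restriction was imposed."""
        return sum(restricted_flags) / years

    quantity_only = [False, False, True, False, True] * 20   # 100 synthetic years
    with_quality  = [False, True,  True, False, True] * 20   # extra quality-driven shutdowns

    print(restriction_frequency(100, quantity_only))   # 0.4
    print(restriction_frequency(100, with_quality))    # 0.6 -> risk looks higher
    ```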

  9. Communicating uncertainty: managing the inherent probabilistic character of hazard estimates

    NASA Astrophysics Data System (ADS)

    Albarello, Dario

    2013-04-01

    Science is much more about fixing the limits of our knowledge of possible occurrences than about identifying any "truth". This is particularly true when scientific statements concern the prediction of natural phenomena largely exceeding the laboratory scale, as in the case of seismogenesis. In these cases, many scenarios for future occurrences are possible (plausible), and the contribution of scientific knowledge (based on the available understanding of the underlying processes or on phenomenological studies) mainly consists in attributing to each scenario a different level of likelihood (probability). In other terms, scientific predictions in the field of geosciences (hazard assessment) are inherently probabilistic. Despite this, however, many scientists (seismologists, etc.), in communicating their position in public debates, tend to stress the "truth" of their statements against the fanciful character of pseudo-scientific assertions: the stronger the opposition between science and pseudo-science, the more hidden the probabilistic character of scientific statements becomes. The problem arises when this kind of "probabilistic" knowledge becomes the basis of political action (e.g., to impose expensive forms of risk-reducing activities): in these cases the lack of any definitive "truth" requires a direct assumption of responsibility by the relevant decision maker (be it the single citizen or the legitimate expression of a larger community) to choose among several possibilities (characterized by different levels of likelihood). In many cases this can be uncomfortable, and the temptation to delegate the responsibility for these decisions to the scientific counterpart is strong. This "transfer" from the genuinely political field to an improper scientific context is also facilitated by the lack of a diffuse culture of "probability" outside the scientific community (and in many cases inside it as well). This is partially the effect of the generalized adoption (by media and scientific

  10. A probabilistic risk management based process for planning and management of technology development

    NASA Astrophysics Data System (ADS)

    Largent, Matthew Clinton

    In the current environment of limited research funding and evolving aerospace needs and requirements, the development of new technology is a critical process. Technologies are designed to meet specific system performance needs, but must be developed in order to reduce uncertainty associated with meeting the needs, as well as uncertainty regarding additional effects that the technology will have on the system. The development project will have risk associated with meeting budget and schedule requirements, and with the completion of the development project plan. Existing methods for technology development fall short of quantifying all areas of risk and uncertainty, and do not provide a method for linking the reduction of performance uncertainty with the management of cost, time, and project risk. This thesis introduces the Technology Development Planning and Management (TDPM) process, a structured process using probabilistic methods and risk management concepts to assist in the planning and management of technology development projects. The TDPM process focuses on planning activities to reduce the areas of performance uncertainty that have the largest effects on system level goals. The cost and schedule uncertainty and project risk associated with the project plan are quantified in order to allow informed management of the project plan and eventual development project. TDPM was implemented for two technology development examples. The first example focused on the implementation of the process for a simple technology development project, showcasing the ability to plan for uncertainty reduction, demonstrate the resulting effects on the system level, and still manage the project cost and schedule risk. The second example was performed by an experienced technology development manager, who implemented TDPM on the hypothetical development of a technology currently being studied. Through the examples, the TDPM process was shown to be a valid and useful tool that advances the

  11. The future of monitoring in clinical research - a holistic approach: linking risk-based monitoring with quality management principles.

    PubMed

    Ansmann, Eva B; Hecht, Arthur; Henn, Doris K; Leptien, Sabine; Stelzer, Hans Günther

    2013-01-01

    For several years, risk-based monitoring has been the new "magic bullet" for improvement in clinical research. Many authors in clinical research, ranging from industry and academia to authorities, are keen on demonstrating better monitoring efficiency by reducing monitoring visits, monitoring time on site, monitoring costs and so on, always arguing with the use of risk-based monitoring principles. Mostly forgotten is the fact that the use of risk-based monitoring is only adequate if all mandatory prerequisites at the site and for the monitor and the sponsor are fulfilled. Based on the relevant chapter in ICH GCP (International Conference on Harmonisation of technical requirements for registration of pharmaceuticals for human use - Good Clinical Practice), this publication takes a holistic approach by identifying and describing the requirements for future monitoring and for the use of risk-based monitoring. As the authors are operational managers as well as QA (Quality Assurance) experts, both aspects are represented, to arrive at efficient and high-quality ways of future monitoring according to ICH GCP.

  12. The future of monitoring in clinical research – a holistic approach: Linking risk-based monitoring with quality management principles

    PubMed Central

    Ansmann, Eva B.; Hecht, Arthur; Henn, Doris K.; Leptien, Sabine; Stelzer, Hans Günther

    2013-01-01

    For several years, risk-based monitoring has been the new “magic bullet” for improvement in clinical research. Many authors in clinical research, ranging from industry and academia to authorities, are keen on demonstrating better monitoring efficiency by reducing monitoring visits, monitoring time on site, monitoring costs and so on, always arguing with the use of risk-based monitoring principles. Mostly forgotten is the fact that the use of risk-based monitoring is only adequate if all mandatory prerequisites at the site and for the monitor and the sponsor are fulfilled. Based on the relevant chapter in ICH GCP (International Conference on Harmonisation of technical requirements for registration of pharmaceuticals for human use – Good Clinical Practice), this publication takes a holistic approach by identifying and describing the requirements for future monitoring and for the use of risk-based monitoring. As the authors are operational managers as well as QA (Quality Assurance) experts, both aspects are represented, to arrive at efficient and high-quality ways of future monitoring according to ICH GCP. PMID:23382708

  13. Risk-based Spacecraft Fire Safety Experiments

    NASA Technical Reports Server (NTRS)

    Apostolakis, G.; Catton, I.; Issacci, F.; Paulos, T.; Jones, S.; Paxton, K.; Paul, M.

    1992-01-01

    Viewgraphs on risk-based spacecraft fire safety experiments are presented. Spacecraft fire risk can never be reduced to a zero probability. Probabilistic risk assessment is a tool to reduce risk to an acceptable level.

  14. How to quantify sustainable development: a risk-based approach to water quality management.

    PubMed

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has been challenged in terms of quantification. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability of water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators, including reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a subroutine estimating the risk-based indicators via the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model via Least Cost and Uniform Treatment approaches in order to find the optimal level of pollutant control costs given a desired reliability value, addressing DO at two different target levels. The risk-based approach developed herein, particularly via the FORM technique, appears to be an efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, while simultaneously requiring technical advances along with changes in current attitudes toward environmental protection.

  15. How to Quantify Sustainable Development: A Risk-Based Approach to Water Quality Management

    NASA Astrophysics Data System (ADS)

    Sarang, Amin; Vahedi, Arman; Shamsai, Abolfazl

    2008-02-01

    Since the term was coined in the Brundtland report in 1987, the issue of sustainable development has been challenged in terms of quantification. Different policy options may lend themselves more or less to the underlying principles of sustainability, but no analytical tools are available for a more in-depth assessment of the degree of sustainability. Overall, there are two major schools of thought employing the sustainability concept in managerial decisions: those of measuring and those of monitoring. Measurement of relative sustainability is the key issue in bridging the gap between the theory and practice of sustainability of water resources systems. The objective of this study is to develop a practical tool for quantifying and assessing the degree of relative sustainability of water quality systems based on risk-based indicators, including reliability, resilience, and vulnerability. Current work on the Karoun River, the largest river in Iran, has included the development of an integrated model consisting of two main parts: a water quality simulation subroutine to evaluate the Dissolved Oxygen-Biological Oxygen Demand (DO-BOD) response, and a subroutine estimating the risk-based indicators via the First Order Reliability Method (FORM) and Monte Carlo Simulation (MCS). We also developed a simple waste load allocation model via Least Cost and Uniform Treatment approaches in order to find the optimal level of pollutant control costs given a desired reliability value, addressing DO at two different target levels. The risk-based approach developed herein, particularly via the FORM technique, appears to be an efficient tool for estimating relative sustainability. Moreover, our results for the Karoun system indicate that significant changes in sustainability values are possible by dedicating money to treatment and strict pollution controls, while simultaneously requiring technical advances along with changes in current attitudes toward environmental protection.
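
    The three risk-based indicators named in these records can be illustrated Hashimoto-style on a simulated dissolved-oxygen series against a target; the DO values and the 5 mg/L target below are placeholders, not the Karoun model output.

    ```python
    def risk_indicators(series, target):
        failures = [x < target for x in series]
        reliability = 1 - sum(failures) / len(series)
        # resilience: probability of recovering in the next step, given failure now
        recoveries = sum(1 for i in range(len(series) - 1) if failures[i] and not failures[i + 1])
        resilience = recoveries / sum(failures) if any(failures) else 1.0
        # vulnerability: average magnitude of the deficit during failure
        deficits = [target - x for x, f in zip(series, failures) if f]
        vulnerability = sum(deficits) / len(deficits) if deficits else 0.0
        return reliability, resilience, vulnerability

    do_series = [6.1, 5.4, 4.2, 4.8, 5.6, 3.9, 4.1, 5.2]   # mg/L, synthetic
    print(risk_indicators(do_series, target=5.0))           # (0.5, 0.5, 0.75)
    ```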

  16. A new approach to risk evaluation and management: risk-based, precaution-based, and discourse-based strategies.

    PubMed

    Klinke, Andreas; Renn, Ortwin

    2002-12-01

    Our concept of nine risk evaluation criteria, six risk classes, a decision tree, and three management categories was developed to improve the effectiveness, efficiency, and political feasibility of risk management procedures. The main task of risk evaluation and management is to develop adequate tools for dealing with the problems of complexity, uncertainty, and ambiguity. Based on the characteristics of different risk types and these three major problems, we distinguished three types of management: risk-based, precaution-based, and discourse-based strategies. The risk-based strategy is the common solution to risk problems. Once the probabilities and their corresponding damage potentials are calculated, risk managers are required to set priorities according to the severity of the risk, which may be operationalized as a linear combination of damage and probability or as a weighted combination thereof. Within our new risk classification, the two central components have been augmented with other physical and social criteria that still demand risk-based strategies as long as uncertainty is low and ambiguity absent. Risk-based strategies are the best solutions to problems of complexity and to some components of uncertainty, for example, variation among individuals. If the two most important risk criteria, probability of occurrence and extent of damage, are relatively well known and little uncertainty is left, the traditional risk-based approach seems reasonable. If uncertainty plays a large role, in particular indeterminacy or lack of knowledge, the risk-based approach becomes counterproductive. Judging the relative severity of risks on the basis of uncertain parameters does not make much sense. Under these circumstances, management strategies belonging to the precautionary management style are required. The precautionary approach has been the basis for much of the European environmental and health protection legislation and regulation. Our own approach to risk management

  17. Mobile human network management and recommendation by probabilistic social mining.

    PubMed

    Min, Jun-Ki; Cho, Sung-Bae

    2011-06-01

    Recently, inferring or sharing of mobile contexts has been actively investigated as cell phones have become more than a communication device. However, most studies have focused on utilizing these contexts in social network services, while the means of mining or managing the human network itself have barely been considered. In this paper, the SmartPhonebook, which mines users' social connections to manage their relationships by reasoning about social and personal contexts, is presented. It works like an artificial assistant which recommends candidate callees whom the users would probably like to contact in a certain situation. Moreover, it visualizes their social contexts, such as closeness and relationships with others, in order to let the users know their social situation. The proposed method infers the social contexts based on contact patterns, while it extracts personal contexts such as the users' emotional states and behaviors from the mobile logs. Here, Bayesian networks are exploited to handle the uncertainties in the mobile environment. The proposed system has been implemented with the MS Windows Mobile 2003 SE Platform on a Samsung SPH-M4650 smartphone and has been tested on real-world data. The experimental results showed that the system provides an efficient and informative way for mobile social networking.

  18. Resolution of Probabilistic Weather Forecasts with Application in Disease Management.

    PubMed

    Hughes, G; McRoberts, N; Burnett, F J

    2017-02-01

    Predictive systems in disease management often incorporate weather data among the disease risk factors, and sometimes this comes in the form of forecast weather data rather than observed weather data. In such cases, it is useful to have an evaluation of the operational weather forecast, in addition to the evaluation of the disease forecasts provided by the predictive system. Typically, weather forecasts and disease forecasts are evaluated using different methodologies. However, the information theoretic quantity expected mutual information provides a basis for evaluating both kinds of forecast. Expected mutual information is an appropriate metric for the average performance of a predictive system over a set of forecasts. Both relative entropy (a divergence, measuring information gain) and specific information (an entropy difference, measuring change in uncertainty) provide a basis for the assessment of individual forecasts.
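
    A small numerical illustration of the information-theoretic quantities named above, for a binary forecast summarised by a 2x2 joint probability table of forecast versus outcome; the table values are invented.

    ```python
    import math

    joint = {("warn", "event"): 0.15, ("warn", "no event"): 0.10,
             ("clear", "event"): 0.05, ("clear", "no event"): 0.70}

    p_forecast, p_outcome = {}, {}
    for (f, o), p in joint.items():
        p_forecast[f] = p_forecast.get(f, 0) + p
        p_outcome[o] = p_outcome.get(o, 0) + p

    # expected mutual information: average information a forecast carries about the outcome
    mi = sum(p * math.log2(p / (p_forecast[f] * p_outcome[o]))
             for (f, o), p in joint.items() if p > 0)
    print(round(mi, 4), "bits")   # about 0.21 bits for this table
    ```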

  19. Application of risk-based multiple criteria decision analysis for selection of the best agricultural scenario for effective watershed management.

    PubMed

    Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz

    2016-03-01

    Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measure; the most desirable scenario for the entire watershed is then selected based on the combined goodness measures. Our final results illustrate the type of operator and the risk attitudes needed to satisfy the relevant criteria across the sub-basins, and how they ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA, to evaluate various BMP scenarios and determine the best solution for both the stakeholders and overall stream health. Copyright © 2015 Elsevier Ltd. All rights reserved.
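
    A tiny illustration of the two aggregation operators contrasted above: simple additive weighting (SAW, risk-neutral) versus ordered weighted averaging (OWA), where the weights are applied to the sorted criterion scores so the decision-maker's risk attitude matters. Scores and weights are made up.

    ```python
    scores  = [0.8, 0.4, 0.6]          # one BMP scenario scored on three criteria
    weights = [0.5, 0.3, 0.2]          # criterion importances (SAW) / position weights (OWA)

    # SAW: weights attach to fixed criteria, regardless of how well the scenario scores
    saw = sum(w * s for w, s in zip(weights, scores))

    # OWA: weights attach to rank positions of the sorted scores; here more weight falls on
    # the best-performing criteria (optimistic attitude); reversing the weights would be pessimistic
    owa = sum(w * s for w, s in zip(weights, sorted(scores, reverse=True)))

    print(round(saw, 3), round(owa, 3))   # 0.64 vs. 0.66
    ```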

  20. FlySec: a risk-based airport security management system based on security as a service concept

    NASA Astrophysics Data System (ADS)

    Kyriazanos, Dimitris M.; Segou, Olga E.; Zalonis, Andreas; Thomopoulos, Stelios C. A.

    2016-05-01

    Complementing the ACI/IATA efforts, the FLYSEC European H2020 Research and Innovation project (http://www.fly-sec.eu/) aims to develop and demonstrate an innovative, integrated and end-to-end airport security process for passengers, enabling a guided and streamlined procedure from the landside to airside and into the boarding gates, and offering an operationally validated innovative concept for end-to-end aviation security. Through a well-structured work plan, the FLYSEC ambition translates into: (i) innovative processes facilitating risk-based screening; (ii) deployment and integration of new technologies and repurposing of existing solutions towards a risk-based security paradigm shift; (iii) improvement of passenger facilitation and customer service, bringing security as a real service in the airport of tomorrow; (iv) achievement of measurable throughput improvement and a whole new level of Quality of Service; and (v) validation of the results through advanced "in-vitro" simulation and "in-vivo" pilots. On the technical side, FLYSEC achieves its ambitious goals by integrating new technologies for video surveillance, intelligent remote image processing and biometrics, combined with big data analysis, open-source intelligence and crowdsourcing. Repurposing existing technologies is also among the FLYSEC objectives, such as mobile application technologies for improved passenger experience and positive boarding applications (i.e. services to facilitate boarding and landside/airside wayfinding) as well as RFID for carry-on luggage tracking and quick unattended luggage handling. In this paper, the authors describe the risk-based airport security management system which powers FLYSEC intelligence and serves as the backend on top of which FLYSEC's front-end technologies reside for security services management and behaviour and risk analysis.

  1. A Risk-based, Practice-centered Approach to Project Management for HPCMP CREATE

    DTIC Science & Technology

    2015-10-05

    deploying multi-physics High Performance Computing (HPC) software applications for engineers to design and make accurate predictions of the performance...successful software project management practices for the development of multi-physics, HPC engineering software . Based on lessons-learned from the HPC...manage the highly-distributed CREATE program. HPCMP CREATE, HPC, Software Risk Management, Acquisition Engineering, Digital Engineering UU UU UU UU 18

  2. A risk-based approach to sanitary sewer pipe asset management.

    PubMed

    Baah, Kelly; Dubey, Brajesh; Harvey, Richard; McBean, Edward

    2015-02-01

    Wastewater collection systems are an important component of the proper management of wastewater to prevent the environmental and human health implications of mismanaging anthropogenic waste. Due to aging and inadequate asset management practices, the wastewater collection assets of many cities around the globe are in a state of rapid decline and in need of urgent attention. Risk management is a tool which can help prioritize resources to better manage and rehabilitate wastewater collection systems. In this study, a risk matrix and a weighted-sum multi-criteria decision matrix are used to assess the consequence and risk of sewer pipe failure for a mid-sized city, using ArcGIS. The methodology shows that six percent of the uninspected sewer pipe assets of the case study have a high consequence of failure, while four percent of the assets have a high risk of failure and hence are priorities for inspection. A map incorporating the risk and consequence of sewer pipe failure is developed to facilitate future planning, rehabilitation and maintenance programs. The consequence-of-failure assessment also includes a novel failure impact factor which captures the effect of structurally defective stormwater pipes on the failure assessment. The methodology recommended in this study can serve as a basis for future planning and decision making and has the potential to be universally applied by municipal sewer pipe asset managers globally to effectively manage the sanitary sewer pipe infrastructure within their jurisdiction. Copyright © 2014 Elsevier B.V. All rights reserved.
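
    A sketch of the risk-matrix idea described above, with pipe data, factor weights, and bin boundaries chosen purely for illustration: each pipe gets a likelihood category and a weighted-sum consequence score, and their product is binned into low/medium/high risk.

    ```python
    def consequence_score(pipe, weights={"diameter": 0.4, "depth": 0.3, "land_use": 0.3}):
        # each consequence factor is assumed to be pre-scored on a 1-5 scale
        return sum(weights[k] * pipe[k] for k in weights)

    def risk(pipe):
        score = pipe["likelihood"] * consequence_score(pipe)   # likelihood also on a 1-5 scale
        if score >= 15:
            return "high"
        return "medium" if score >= 8 else "low"

    pipe = {"likelihood": 4, "diameter": 5, "depth": 3, "land_use": 4}
    print(round(consequence_score(pipe), 2), risk(pipe))       # 4.1 -> 16.4 -> 'high'
    ```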

  3. Risk based bridge data collection and asset management and the role of structural health monitoring

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr; Bush, Simon; Henning, Theunis; McCarten, Peter

    2011-04-01

    Bridges are critical to the operation and functionality of whole road networks. It is therefore essential that specific data are collected regarding bridge asset condition and performance, as this allows proactive management of the assets and associated risks and more accurate short- and long-term financial planning. This paper proposes and discusses a strategy for the collection of data on bridge condition and performance. Recognizing that risk management is the primary driver of asset management, the proposed strategy prioritizes bridges for levels of data collection: core, intermediate and advanced. Individual bridges are seen as parts of wider networks, and bridge risk and criticality assessment emphasizes the risk of bridge failure or underperformance in the network context. The paper demonstrates how more reliable and detailed data can assist in managing network and bridge risks and provides a rationale for applying higher data collection levels to bridges characterized by higher risk and criticality. As bridge risk and/or criticality increases, planned and proactive integration of structural health monitoring (SHM) data into asset management is outlined. An example of bridge prioritization for data collection using several bridges taken from a national highway network is provided using an existing risk and criticality scoring methodology. The paper concludes with a discussion of the role of SHM in data collection for bridge asset management and where SHM can make the largest impact.

  4. Quantitative risk-based approach for improving water quality management in mining.

    PubMed

    Liu, Wenying; Moran, Chris J; Vink, Sue

    2011-09-01

    The potential environmental threats posed by freshwater withdrawal and mine water discharge are some of the main drivers for the mining industry to improve water management. The use of multiple sources of water supply and the introduction of water reuse into the mine site water system have been part of the operating philosophies employed by the mining industry to realize these improvements. However, a barrier to the implementation of such good water management practices is the concomitant water quality variation and the resulting impacts on the efficiency of mineral separation processes, as well as an increased environmental consequence of noncompliant discharge events. There is an increasing appreciation that conservative water management practices, production efficiency, and environmental consequences are intimately linked through the site water system. It is therefore essential to consider water management decisions and their impacts as an integrated system as opposed to dealing with each impact separately. This paper proposes an approach that could assist mine sites to manage water quality issues in a systematic manner at the system level. This approach can quantitatively forecast the risk related to water quality and evaluate the effectiveness of management strategies in mitigating that risk by quantifying the implications for production and hence economic viability.

  5. Residual hydrophobic organic contaminants in soil: Are they a barrier to risk-based approaches for managing contaminated land?

    PubMed

    Umeh, Anthony C; Duan, Luchun; Naidu, Ravi; Semple, Kirk T

    2017-01-01

    Risk-based approaches to managing contaminated land, rather than approaches based on complete contaminant removal, have gained acceptance as they are likely to be more feasible and cost effective. Risk-based approaches aim to minimise the risks of human exposure to a specified contaminant. However, adopting a risk-based approach over alternative, overly conservative approaches requires that the associated uncertainties in decision making are understood and minimised. Irrespective of the nature of the contaminants, a critical uncertainty is whether there are potential risks associated with exposure of humans and other ecological receptors to the residual contaminant fractions in soil, and how these should be considered in the risk assessment process. This review, focusing on hydrophobic organic contaminants (HOCs), especially polycyclic aromatic hydrocarbons (PAHs), suggests that there is significant uncertainty about the residual fractions of contaminants from a risk perspective. This is because very few studies have focused on understanding the desorption behaviour of HOCs, with few or no studies considering the influence of exposure-specific factors. In particular, it is not clear whether exposure of soil-associated HOCs to gastrointestinal fluids and enzyme processes releases bound residues. Although in vitro models have been used to predict PAH bioaccessibility, and chemical extractions have been used to determine residual fractions in various soils, there are still doubts about what is actually being measured. Therefore it is not certain which bioaccessibility method currently represents the best choice, or provides the best estimate, of in vivo PAH bioavailability. It is suggested that the fate and behaviour of HOCs be investigated in a wide range of soils and under exposure-specific scenarios. Exposure-specific scenarios are important for validation purposes, which may be useful for the development of standardised methods and procedures for HOC

  6. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
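
    The sigma-metric mentioned above is commonly computed from the allowable total error (TEa), the observed bias, and the imprecision (CV), all expressed in percent; the example numbers below are illustrative, not from the paper.

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        # sigma = (TEa - |bias|) / CV, evaluated at a medical decision concentration
        return (tea_pct - abs(bias_pct)) / cv_pct

    # e.g. an assay with TEa = 6%, bias = 1%, CV = 1.2%  ->  sigma of about 4.2
    print(round(sigma_metric(6.0, 1.0, 1.2), 2))
    ```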

  7. Use of probabilistic risk assessment (PRA) in expert systems to advise nuclear plant operators and managers

    SciTech Connect

    Uhrig, R.E.

    1988-01-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts or on knowledge that has been codified in publications, books, or regulations to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. 5 refs., 1 fig., 2 tabs.

  8. A Risk-based Assessment And Management Framework For Multipollutant Air Quality.

    PubMed

    Frey, H Christopher; Hubbell, Bryan

    2009-06-01

    The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution and that the success of these decisions should be measured by how well they achieved this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management

  9. A Risk-based Assessment And Management Framework For Multipollutant Air Quality

    PubMed Central

    Frey, H. Christopher; Hubbell, Bryan

    2010-01-01

    The National Research Council recommended both a risk- and performance-based multipollutant approach to air quality management. Specifically, management decisions should be based on minimizing the exposure to, and risk of adverse effects from, multiple sources of air pollution and that the success of these decisions should be measured by how well they achieved this objective. We briefly describe risk analysis and its application within the current approach to air quality management. Recommendations are made as to how current practice could evolve to support a fully risk- and performance-based multipollutant air quality management system. The ability to implement a risk assessment framework in a credible and policy-relevant manner depends on the availability of component models and data which are scientifically sound and developed with an understanding of their application in integrated assessments. The same can be said about accountability assessments used to evaluate the outcomes of decisions made using such frameworks. The existing risk analysis framework, although typically applied to individual pollutants, is conceptually well suited for analyzing multipollutant management actions. Many elements of this framework, such as emissions and air quality modeling, already exist with multipollutant characteristics. However, the framework needs to be supported with information on exposure and concentration response relationships that result from multipollutant health studies. Because the causal chain that links management actions to emission reductions, air quality improvements, exposure reductions and health outcomes is parallel between prospective risk analyses and retrospective accountability assessments, both types of assessment should be placed within a single framework with common metrics and indicators where possible. Improvements in risk reductions can be obtained by adopting a multipollutant risk analysis framework within the current air quality management

  10. Management of groundwater in farmed pond area using risk-based regulation.

    PubMed

    Huang, Jun-Ying; Liao, Chiao-Miao; Lin, Kao-Hung; Lee, Cheng-Haw

    2014-09-01

    Blackfoot disease (BFD) occurred widely in the Yichu, Hsuehchia, Putai, and Peimen townships of the Chia-Nan District of Taiwan in the early days. These four townships are the main fishpond cultivation districts in Taiwan. Groundwater became the main water supply because of the shortage of surface water. Overpumping of groundwater may not only result in land subsidence and seawater intrusion but may also harm human health through bioaccumulation of arsenic (As) from groundwater via the food chain. This research uses sequential indicator simulation (SIS) to characterize the spatial arsenic distribution in groundwater in the four townships. Risk assessment is applied to explore the dilution ratio (DR) of groundwater utilization, defined as the ratio of the volume of groundwater used to that of pond water, for fish farming within a range of target cancer risk (TR), especially on the order of 10^-4 to 10^-6. Our study results reveal that the 50th percentile of groundwater DRs can serve as a regulation standard for fish farm groundwater management at a TR of 10^-6. For a TR of 5 × 10^-6, we suggest using the 75th percentile of DR for groundwater management. For a TR of 10^-5, we suggest using the 95th percentile of DR for groundwater management in fish farm areas. For TRs exceeding 5 × 10^-5, we do not suggest establishing groundwater management standards under these risk levels. Based on the research results, we suggest that establishing a TR of 10^-5 and using the 95th percentile of DR is best for groundwater management in fish farm areas.
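
    A hedged sketch of the regulation logic described above: from a sample of site-specific dilution ratios (DR), pick the percentile used as the management standard for a given target cancer risk (TR), following the TR-to-percentile mapping suggested in the abstract. The DR sample is synthetic.

    ```python
    def dr_standard(dr_samples, target_risk):
        mapping = {1e-6: 50, 5e-6: 75, 1e-5: 95}        # TR -> percentile, per the study
        if target_risk not in mapping:
            raise ValueError("no DR standard suggested for this target risk")
        pct = mapping[target_risk]
        ordered = sorted(dr_samples)
        return ordered[int(pct / 100 * (len(ordered) - 1))]   # crude empirical percentile

    samples = [0.05, 0.08, 0.10, 0.12, 0.15, 0.20, 0.25, 0.30, 0.40, 0.60]
    print(dr_standard(samples, 1e-6))    # median DR as the standard for TR = 10^-6
    ```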

  11. Assistance to the states with risk based data management. Quarterly technical progress report, April 1--June 30, 1995

    SciTech Connect

    Paque, M.J.

    1995-07-28

    The tasks of this project are to: (1) complete implementation of a Risk Based Data Management System (RBDMS) in the states of Alaska, Mississippi, Montana, and Nebraska; and (2) conduct Area of Review (AOR) workshops in the states of California, Oklahoma, Kansas, and Texas. The RBDMS was designed to be a comprehensive database with the ability to expand into multiple areas, including oil and gas production. The database includes comprehensive well information for both producing and injection wells. It includes automated features for performing functions related to AOR analyses, environmental risk analyses, well evaluation, permit evaluation, compliance monitoring, operator bonding assessments, operational monitoring and tracking, and more. This quarterly report describes the status of development of the RBDMS project for both stated tasks and proposes further steps in its implementation.

  12. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    NASA Technical Reports Server (NTRS)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  13. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
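
    A sketch of the MPN idea described above: the MPN is the concentration that maximizes the likelihood of the observed positive/negative tube pattern in a serial dilution, where a tube turns positive with probability 1 - exp(-c * v). The dilution scheme and tube counts below are invented for illustration.

    ```python
    import math

    def log_likelihood(conc, tubes):
        """tubes: list of (sample_volume_ml, n_tubes, n_positive)."""
        ll = 0.0
        for volume, n, pos in tubes:
            p = 1.0 - math.exp(-conc * volume)          # P(a tube turns positive)
            ll += pos * math.log(max(p, 1e-12)) + (n - pos) * math.log(max(1.0 - p, 1e-12))
        return ll

    def mpn(tubes):
        # crude grid search over concentrations (organisms per mL)
        grid = [10 ** (k / 100) for k in range(-300, 301)]
        return max(grid, key=lambda c: log_likelihood(c, tubes))

    # 3 tubes each at 10, 1 and 0.1 mL of sample; 3, 2 and 0 tubes positive
    print(round(mpn([(10, 3, 3), (1, 3, 2), (0.1, 3, 0)]), 2))   # roughly 0.9 per mL
    ```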

  14. A risk-based approach to management of leachables utilizing statistical analysis of extractables.

    PubMed

    Stults, Cheryl L M; Mikl, Jaromir; Whelehan, Oliver; Morrical, Bradley; Duffield, William; Nagao, Lee M

    2015-04-01

    To incorporate quality by design concepts into the management of leachables, an emphasis is often put on understanding the extractable profile for the materials of construction for manufacturing disposables, container-closure, or delivery systems. Component manufacturing processes may also impact the extractable profile. An approach was developed to (1) identify critical components that may be sources of leachables, (2) enable an understanding of manufacturing process factors that affect extractable profiles, (3) determine if quantitative models can be developed that predict the effect of those key factors, and (4) evaluate the practical impact of the key factors on the product. A risk evaluation for an inhalation product identified injection molding as a key process. Designed experiments were performed to evaluate the impact of molding process parameters on the extractable profile from an ABS inhaler component. Statistical analysis of the resulting GC chromatographic profiles identified processing factors that were correlated with peak levels in the extractable profiles. The combination of statistically significant molding process parameters was different for different types of extractable compounds. ANOVA models were used to obtain optimal process settings and predict extractable levels for a selected number of compounds. The proposed paradigm may be applied to evaluate the impact of material composition and processing parameters on extractable profiles and utilized to manage product leachables early in the development process and throughout the product lifecycle.

  15. Application of risk-based assessment and management to riverbank filtration sites in India.

    PubMed

    Bartak, Rico; Page, Declan; Sandhu, Cornelius; Grischek, Thomas; Saini, Bharti; Mehrotra, Indu; Jain, Chakresh K; Ghosh, Narayan C

    2015-03-01

    This is the first reported study of a riverbank filtration (RBF) scheme to be assessed following the Australian Guidelines for Managed Aquifer Recharge. A comprehensive staged approach to assess the risks from 12 hazards to human health and the environment has been undertaken. The highest risks from untreated groundwater and Ganga River water were associated with pathogens, turbidity, iron, manganese, total dissolved solids and total hardness. Recovered water meets the guideline values for inorganic chemicals and salinity but frequently exceeds limits for thermotolerant coliforms. A quantitative microbial risk assessment undertaken on the water recovered from the aquifer indicated that the residual risks of 0.00165 disability-adjusted life years (DALYs) posed by the reference bacterium Escherichia coli O157:H7 were below the national diarrhoeal incidence of 0.027 DALYs and meet the health target in this study of 0.005 DALYs per person per year, which corresponds to the World Health Organization (WHO) regional diarrhoeal incidence in South-East Asia. The monsoon season was a major contributor to the calculated burden of disease, and final DALYs were strongly dependent on the pathogen removal capabilities of RBF and disinfection. Finally, a water safety plan was developed with potential risk management procedures to minimize residual risks related to pathogens.
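
    The DALY comparison above follows the generic QMRA chain of dose, infection, illness and burden. The Monte Carlo sketch below illustrates that chain only; every parameter value (source concentration, log removals, dose-response coefficient, illness probability, DALYs per case) is purely illustrative and is not taken from the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000                                   # Monte Carlo draws

        # Illustrative inputs: source-water pathogen concentration, log-removal by
        # riverbank filtration plus disinfection, and 1 L/day unboiled consumption.
        conc = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n)   # organisms/L
        log_removal = rng.uniform(3.0, 5.0, size=n)                  # RBF + disinfection
        dose = conc * 10.0 ** (-log_removal) * 1.0                   # organisms/day

        r = 0.00156                                   # exponential dose-response (illustrative)
        p_inf_day = 1.0 - np.exp(-r * dose)
        p_inf_year = 1.0 - (1.0 - p_inf_day) ** 365
        p_ill_given_inf = 0.25                        # illustrative
        daly_per_case = 0.05                          # illustrative burden per illness
        daly_pppy = p_inf_year * p_ill_given_inf * daly_per_case

        print(f"mean burden: {daly_pppy.mean():.4f} DALYs per person-year (target 0.005)")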

  16. A Framework for Probabilistic Evaluation of Interval Management Tolerance in the Terminal Radar Control Area

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Neogi, Natasha

    2012-01-01

    Projections of future traffic in the national airspace show that most of the hub airports and their attendant airspace will need to undergo significant redevelopment and redesign in order to accommodate any significant increase in traffic volume. Even though closely spaced parallel approaches increase throughput into a given airport, controller workload in oversubscribed metroplexes is further taxed by these approaches that require stringent monitoring in a saturated environment. The interval management (IM) concept in the TRACON area is designed to shift some of the operational burden from the control tower to the flight deck, placing the flight crew in charge of implementing the required speed changes to maintain a relative spacing interval. The interval management tolerance is a measure of the allowable deviation from the desired spacing interval for the IM aircraft (and its target aircraft). For this complex task, Formal Methods can help to ensure better design and system implementation. In this paper, we propose a probabilistic framework to quantify the uncertainty and performance associated with the major components of the IM tolerance. The analytical basis for this framework may be used to formalize both correctness and probabilistic system safety claims in a modular fashion at the algorithmic level in a way compatible with several Formal Methods tools.

  17. A Quantitative, Risk-Based Approach to the Management of Neonatal Early-Onset Sepsis.

    PubMed

    Kuzniewicz, Michael W; Puopolo, Karen M; Fischer, Allen; Walsh, Eileen M; Li, Sherian; Newman, Thomas B; Kipnis, Patricia; Escobar, Gabriel J

    2017-04-01

    Current algorithms for management of neonatal early-onset sepsis (EOS) result in medical intervention for large numbers of uninfected infants. We developed multivariable prediction models for estimating the risk of EOS among late preterm and term infants based on objective data available at birth and the newborn's clinical status. To examine the effect of neonatal EOS risk prediction models on sepsis evaluations and antibiotic use and assess their safety in a large integrated health care system. The study cohort includes 204 485 infants born at 35 weeks' gestation or later at a Kaiser Permanente Northern California hospital from January 1, 2010, through December 31, 2015. The study compared 3 periods when EOS management was based on (1) national recommended guidelines (baseline period [January 1, 2010, through November 30, 2012]), (2) multivariable estimates of sepsis risk at birth (learning period [December 1, 2012, through June 30, 2014]), and (3) the multivariable risk estimate combined with the infant's clinical condition in the first 24 hours after birth (EOS calculator period [July 1, 2014, through December 31, 2015]). The primary outcome was antibiotic administration in the first 24 hours. Secondary outcomes included blood culture use, antibiotic administration between 24 and 72 hours, clinical outcomes, and readmissions for EOS. The study cohort included 204 485 infants born at 35 weeks' gestation or later: 95 343 in the baseline period (mean [SD] age, 39.4 [1.3] weeks; 46 651 male [51.0%]; 37 007 white, non-Hispanic [38.8%]), 52 881 in the learning period (mean [SD] age, 39.3 [1.3] weeks; 27 067 male [51.2%]; 20 175 white, non-Hispanic [38.2%]), and 56 261 in the EOS calculator period (mean [SD] age, 39.4 [1.3] weeks; 28 575 male [50.8%]; 20 484 white, non-Hispanic [36.4%]). In a comparison of the baseline period with the EOS calculator period, blood culture use decreased from 14.5% to 4.9% (adjusted difference, -7.7%; 95% CI, -13.1% to -2
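
    The combination of a model-based risk at birth with the infant's subsequent clinical condition can be read as a Bayesian update of the birth risk by a likelihood ratio for the observed clinical state. The sketch below illustrates that arithmetic only; the risk value and likelihood ratio are hypothetical, not the study's fitted model or published calculator coefficients.

        def eos_risk_per_1000(risk_at_birth_per_1000, clinical_lr):
            # Combine a risk estimate at birth with a likelihood ratio for the
            # infant's clinical condition in the first hours of life (illustrative
            # Bayesian update, not the published calculator).
            prior_odds = risk_at_birth_per_1000 / (1000.0 - risk_at_birth_per_1000)
            posterior_odds = prior_odds * clinical_lr
            return 1000.0 * posterior_odds / (1.0 + posterior_odds)

        # e.g. risk at birth of 0.6 per 1000 live births, infant remains well
        # (hypothetical likelihood ratio of 0.4 for a well-appearing infant)
        print(round(eos_risk_per_1000(0.6, 0.4), 3))   # lower posterior risk per 1000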

  18. A Risk-Based Approach to Manage Nutrient Contamination From Household Wastewater

    NASA Astrophysics Data System (ADS)

    Gold, A. J.; Sims, J. T.

    2001-05-01

    Nutrients originating from decentralized wastewater treatment systems (DWTS) can pose a risk to human and ecosystem health. Assessing the likelihood and magnitude of this risk is a formidable and complex challenge. However, a properly constructed risk assessment is essential if we are to design and implement practices for DWTS that minimize the impacts of nutrients on our environment. To do this successfully, we must carefully consider: (i) the specific risks posed by nutrients emitted by DWTS and the sensitivity of humans and ecosystems to these risks; (ii) the pathways by which nutrients move from DWTS to the sectors of the environment where the risk will occur (most often ground and surface waters); (iii) the micro and macro-scale processes that affect the transport and transformations of nutrients once they are emitted from the DWTS and how this in turn affects risk; and (iv) the effects of current or alternative DWTS design and management practices on nutrient transport and subsequent risks to humans and ecosystems. In this paper we examine the risks of nutrients from DWTS to human and ecosystem health at both the micro- and macro-level spatial scales. We focus primarily on the factors that control the movement of N and P from DWTS to ground and surface waters and the research needs related to controlling nonpoint source nutrient pollution from DWTS. At the micro-scale the exposure pathways include the system and the immediate surroundings, i.e., the subsurface environment near the DWTS. The exposed individual or ecosystem at the micro-scale can be a household well, lake, stream or estuary that borders an individual wastewater treatment system. At the macro-level our focus is at the aquifer and watershed scale and the risks posed to downstream ecosystems and water users by nonpoint source pollution of these waters by nutrients from DWTS. We analyze what is known about the effectiveness of current designs at mitigating these risks and our ability to predict

  19. Management of the Area 5 Radioactive Waste Management Site using Decision-based, Probabilistic Performance Assessment Modeling

    SciTech Connect

    Carilli, J.; Crowe, B.; Black, P.; Tauxe, J.; Stockton, T.; Catlett, K.; Yucel, V.

    2003-02-27

    Low-level radioactive waste from cleanup activities at the Nevada Test Site and from multiple sites across the U.S. Department of Energy (DOE) complex is disposed at two active Radioactive Waste Management Sites (RWMS) on the Nevada Test Site. These facilities, which are managed by the DOE National Nuclear Security Administration Nevada Site Office, were recently designated as one of two regional disposal centers and yearly volumes of disposed waste now exceed 50,000 m3 (> 2 million ft3). To safely and cost-effectively manage the disposal facilities, the Waste Management Division of Environmental Management has implemented decision-based management practices using flexible and problem-oriented probabilistic performance assessment modeling. Deterministic performance assessments and composite analyses were completed originally for the Area 5 and Area 3 RWMSs located in, respectively, Frenchman Flat and Yucca Flat on the Nevada Test Site. These documents provide the technical bases for issuance of disposal authorization statements for continuing operation of the disposal facilities. Both facilities are now in a maintenance phase that requires testing of conceptual models, reduction of uncertainty, and site monitoring all leading to eventual closure of the facilities and transition to long-term stewardship.

  20. Towards risk-based drought management in the Netherlands: making water supply levels transparent to water users

    NASA Astrophysics Data System (ADS)

    Ter Maat, Judith; Mens, Marjolein; Van Vuren, Saskia; Van der Vat, Marnix

    2016-04-01

    Within the project Improving Predictions and Management of Hydrological Extremes (IMPREX), running from 2016 to 2019, a consortium of the Dutch research institute Deltares and the Dutch water management consultant HKV will design and build a tool to support quantitative risk-informed decision-making for fresh water management in the Netherlands, in particular the decision on water supply service levels. The research will be conducted in collaboration with the Dutch Ministry for Infrastructure and Environment, the Freshwater Supply Programme Office, the Dutch governmental organisation responsible for water management (Rijkswaterstaat), the Foundation for Applied Water Research (STOWA, the knowledge centre of the water boards) and a number of water boards. In the session we will present the conceptual framework for a risk-based approach to water shortage management and share thoughts on how the proposed tool can be applied in the Dutch water management context.

  1. Evaluating the impacts of agricultural land management practices on water resources: A probabilistic hydrologic modeling approach.

    PubMed

    Prada, A F; Chu, M L; Guzman, J A; Moriasi, D N

    2017-02-24

    Evaluating the effectiveness of agricultural land management practices in minimizing environmental impacts using models is challenged by the presence of inherent uncertainties during the model development stage. One issue faced during the model development stage is the uncertainty involved in model parameterization. Using a single optimized set of parameters (one snapshot) to represent baseline conditions of the system limits the applicability and robustness of the model to properly represent future or alternative scenarios. The objective of this study was to develop a framework that facilitates model parameter selection while evaluating uncertainty to assess the impacts of land management practices at the watershed scale. The model framework was applied to the Lake Creek watershed located in southwestern Oklahoma, USA. A two-step probabilistic approach was implemented to parameterize the Agricultural Policy/Environmental eXtender (APEX) model using global uncertainty and sensitivity analysis to estimate the full spectrum of total monthly water yield (WYLD) and total monthly nitrogen loads (N) in the watershed under different land management practices. Twenty-seven models were found to represent the baseline scenario, in which uncertainty of up to 29% in WYLD and up to 400% in N is plausible. Converting the land cover fully to pasture produced the largest decrease in N, of up to 30%, whereas converting fully to winter wheat cover could increase N by up to 11%. The methodology developed in this study was able to quantify the full spectrum of system responses, the uncertainty associated with them, and the most important parameters that drive their variability. Results from this study can be used to develop strategic decisions on the risks and tradeoffs associated with different management alternatives that aim to increase productivity while also minimizing their environmental impacts.
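
    The two-step idea of retaining many acceptable parameter sets rather than one optimum, and then propagating all of them through alternative scenarios, can be sketched generically as below. The toy model, acceptance threshold and scenario are illustrative stand-ins; APEX itself and the study's parameters are not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)

        def toy_watershed_model(params, rain):
            # Stand-in for APEX: monthly water yield from two parameters
            # (a runoff-coefficient-like and a storage-like parameter).
            coeff, storage = params
            return np.maximum(rain - storage, 0.0) * coeff

        rain = rng.gamma(2.0, 40.0, size=120)          # 10 years of monthly rainfall (mm)
        observed = toy_watershed_model((0.45, 20.0), rain) + rng.normal(0.0, 5.0, 120)

        def nse(sim, obs):
            # Nash-Sutcliffe efficiency used here as the acceptance criterion
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        # Step 1: global sampling; keep every parameter set that fits acceptably
        samples = rng.uniform([0.1, 0.0], [0.9, 60.0], size=(5000, 2))
        behavioral = [p for p in samples if nse(toy_watershed_model(p, rain), observed) > 0.7]

        # Step 2: propagate all retained sets through an alternative scenario so the
        # full spectrum of responses (and its uncertainty) is reported, not one value
        scenario = rain * 0.9
        yields = np.array([toy_watershed_model(p, scenario).sum() for p in behavioral])
        print(len(behavioral), np.percentile(yields, [5, 50, 95]))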

  2. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
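
    The diagnostic step described above amounts to posterior inference over hidden fault modes given observed monitor outputs. A tiny enumeration-based sketch of that inference is given below; the network structure, priors and conditional probabilities are made up for illustration and are not the ISWHM models (which are compiled into arithmetic circuits for real-time use).

        from itertools import product

        # Two hidden fault modes and two software-sensor alarms (illustrative CPTs)
        P_sw = {True: 0.01, False: 0.99}          # prior on a software fault
        P_hw = {True: 0.02, False: 0.98}          # prior on a hardware fault

        def p_timing_alarm(sw, hw):               # P(timing monitor fires | faults)
            return 0.95 if sw else (0.30 if hw else 0.01)

        def p_range_alarm(sw, hw):                # P(range-check monitor fires | faults)
            return 0.20 if sw else (0.90 if hw else 0.02)

        evidence = {"timing": True, "range": False}

        def joint(sw, hw):
            p = P_sw[sw] * P_hw[hw]
            p *= p_timing_alarm(sw, hw) if evidence["timing"] else 1 - p_timing_alarm(sw, hw)
            p *= p_range_alarm(sw, hw) if evidence["range"] else 1 - p_range_alarm(sw, hw)
            return p

        z = sum(joint(sw, hw) for sw, hw in product((True, False), repeat=2))
        p_sw_fault = sum(joint(True, hw) for hw in (True, False)) / z
        print(f"P(software fault | alarms) = {p_sw_fault:.3f}")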

  3. An ontology-based nurse call management system (oNCS) with probabilistic priority assessment

    PubMed Central

    2011-01-01

    Background The current, place-oriented nurse call systems are very static. A patient can only make calls with a button which is fixed to a wall of a room. Moreover, the system does not take into account various factors specific to a situation. In the future, there will be an evolution to a mobile button for each patient so that they can walk around freely and still make calls. The system would become person-oriented and the available context information should be taken into account to assign the correct nurse to a call. The aim of this research is (1) the design of a software platform that supports the transition to mobile and wireless nurse call buttons in hospitals and residential care and (2) the design of a sophisticated nurse call algorithm. This algorithm dynamically adapts to the situation at hand by taking the profile information of staff members and patients into account. Additionally, the priority of a call probabilistically depends on the risk factors, assigned to a patient. Methods The ontology-based Nurse Call System (oNCS) was developed as an extension of a Context-Aware Service Platform. An ontology is used to manage the profile information. Rules implement the novel nurse call algorithm that takes all this information into account. Probabilistic reasoning algorithms are designed to determine the priority of a call based on the risk factors of the patient. Results The oNCS system is evaluated through a prototype implementation and simulations, based on a detailed dataset obtained from Ghent University Hospital. The arrival times of nurses at the location of a call, the workload distribution of calls amongst nurses and the assignment of priorities to calls are compared for the oNCS system and the current, place-oriented nurse call system. Additionally, the performance of the system is discussed. Conclusions The execution time of the nurse call algorithm is on average 50.333 ms. Moreover, the oNCS system significantly improves the assignment of nurses

  4. Geodata-based probabilistic risk assessment and management of pesticides in Germany: a conceptual framework.

    PubMed

    Schulz, Ralf; Stehle, Sebastian; Elsaesser, David; Matezki, Steffen; Müller, Alexandra; Neumann, Michael; Ohliger, Renja; Wogram, Jörn; Zenker, Katharina

    2009-01-01

    The procedure for the risk assessment of pesticides in Germany is currently being developed further, from a deterministic approach to a geodata-based probabilistic risk assessment (GeoPRA) approach. As the initial step, the exposure assessment for spray drift in permanent crops, such as vineyards, fruit orchards, and hops, is considered. In our concept, geoinformation tools are used to predict distribution functions for exposure concentrations based mainly on spatial information regarding the neighbourhood of crops and surface waters. A total of 23 factors affecting drift into surface waters were assessed, and suggestions for their inclusion in the approach were developed. The main objectives are to base the exposure estimation on a realistic representation of local landscape characteristics and on empirical results for the impact of each feature on the drift deposition. A framework for the identification of high-risk sites (active management areas [AMAs]) based on protection goals and ecological considerations was developed in order to implement suitable risk mitigation measures. The inclusion of active mitigation measures at sites with identified and verified risk is considered a central and important part of the overall assessment strategy. The suggested GeoPRA procedure itself comprises the following 4 steps, including elements of the extensive preliminary work conducted so far: 1) nationwide risk assessment, preferably based only on geodata-based factors; 2) identification of AMAs, including the spatial extension of contamination, the level of contamination, and the tolerable effect levels; 3) refined exposure assessment, using aerial photographs and field surveys; and 4) mitigation measures, with a focus on landscape-level active mitigation measures leading to effective risk reductions. The suggested GeoPRA procedure offers the possibility to actively involve the farming community in the process of pesticide management. Overall, the new procedure will aim at

  5. Developing a risk-based trading scheme for cattle in England: farmer perspectives on managing trading risk for bovine tuberculosis.

    PubMed

    Little, R; Wheeler, K; Edge, S

    2017-02-11

    This paper examines farmer attitudes towards the development of a voluntary risk-based trading scheme for cattle in England as a risk mitigation measure for bovine tuberculosis (bTB). The research reported here was commissioned to gather evidence on the type of scheme that would have a good chance of success in improving the information farmers receive about the bTB risk of cattle they buy. Telephone interviews were conducted with a stratified random sample of 203 cattle farmers in England, splitting the interviews equally between respondents in the high-risk area and low-risk area for bTB. Supplementary interviews and focus groups with farmers were also carried out across the risk areas. Results suggest a greater enthusiasm for a risk-based trading scheme in low-risk areas compared with high-risk areas and among members of breed societies and cattle health schemes. Third-party certification of herds by private vets or the Animal and Plant Health Agency were regarded as the most credible source, with farmer self-certification being favoured by sellers, but being regarded as least credible by buyers. Understanding farmers' attitudes towards voluntary risk-based trading is important to gauge likely uptake, understand preferences for information provision and to assist in monitoring, evaluating and refining the scheme once established.

  6. Transient flow conditions in probabilistic wellhead protection: importance and ways to manage spatial and temporal uncertainty in capture zone delineation

    NASA Astrophysics Data System (ADS)

    Enzenhoefer, R.; Rodriguez-Pretelin, A.; Nowak, W.

    2012-12-01

    "From an engineering standpoint, the quantification of uncertainty is extremely important not only because it allows estimating risk but mostly because it allows taking optimal decisions in an uncertain framework" (Renard, 2007). The most common way to account for uncertainty in the field of subsurface hydrology and wellhead protection is to randomize spatial parameters, e.g. the log-hydraulic conductivity or porosity. This enables water managers to take robust decisions in delineating wellhead protection zones with rationally chosen safety margins in the spirit of probabilistic risk management. Probabilistic wellhead protection zones are commonly based on steady-state flow fields. However, several past studies showed that transient flow conditions may substantially influence the shape and extent of catchments. Therefore, we believe they should be accounted for in the probabilistic assessment and in the delineation process. The aim of our work is to show the significance of flow transients and to investigate the interplay between spatial uncertainty and flow transients in wellhead protection zone delineation. To this end, we advance our concept of probabilistic capture zone delineation (Enzenhoefer et al., 2012) that works with capture probabilities and other probabilistic criteria for delineation. The extended framework is able to evaluate the time fraction that any point on a map falls within a capture zone. In short, we separate capture probabilities into spatial/statistical and time-related frequencies. This will provide water managers additional information on how to manage a well catchment in the light of possible hazard conditions close to the capture boundary under uncertain and time-variable flow conditions. In order to save computational costs, we take advantage of super-positioned flow components with time-variable coefficients. We assume an instantaneous development of steady-state flow conditions after each temporal change in driving forces, following

  7. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.

  8. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    NASA Astrophysics Data System (ADS)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally-variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely-sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies, the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, probabilistic livelihood benefits of direct human uses far exceed benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrate the
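
    Combining simulated floods of different frequencies with a damage (or benefit) function leads to an expected annual value by integrating over the exceedance-probability axis. The sketch below shows that standard integration step with illustrative numbers; the study's damage and benefit functions are not reproduced.

        import numpy as np

        # Illustrative per-event damages over the analysed recurrence intervals
        T = np.array([1.3, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])         # years
        damage = np.array([0.5, 1.0, 3.0, 6.0, 10.0, 14.0, 20.0])      # monetary units/event

        p = 1.0 / T                                  # annual exceedance probability
        order = np.argsort(p)                        # integrate from rare to frequent events
        # Expected annual damage: area under the damage-exceedance-probability curve;
        # the same integration applied to a benefit function gives the probabilistic benefit.
        ead = np.sum(0.5 * (damage[order][1:] + damage[order][:-1]) * np.diff(p[order]))
        print(f"expected annual damage ~ {ead:.2f}")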

  9. Probabilistic floodplain hazard mapping: managing uncertainty by using a bivariate approach for flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Candela, Angela; Tito Aronica, Giuseppe

    2014-05-01

    Floods are a global problem and are considered the most frequent natural disaster world-wide. Many studies show that the severity and frequency of floods have increased in recent years and underline the difficulty of separating the effects of natural climatic change from human influences such as land management practices, urbanization, etc. Flood risk analysis and assessment is required to provide information on current or future flood hazard and risks in order to accomplish flood risk mitigation, and to propose, evaluate and select measures to reduce it. Both components of risk can be mapped individually and are affected by multiple uncertainties, as is the joint estimate of flood risk. Major sources of uncertainty include the statistical analysis of extreme events, definition of the hydrological input, representation of channel and floodplain topography, and the choice of effective hydraulic roughness coefficients. The classical procedure for estimating flood discharge for a chosen probability of exceedance is to use a rainfall-runoff model, associating with the resulting flood the same return period as the original rainfall, in accordance with the iso-frequency criterion. Alternatively, a flood frequency analysis is applied to a given record of discharge data, but again the same probability is associated with the flood discharges and the respective risk. Moreover, since flood peaks and corresponding flood volumes are variables of the same phenomenon, they should be directly correlated and, consequently, multivariate statistical analyses must be applied. This study presents an innovative approach to obtaining flood hazard maps in which the hydrological input (synthetic flood design event) to a 2D hydraulic model is defined by generating flood peak discharges and volumes from: a) a classical univariate approach, and b) a bivariate statistical analysis through the use of copulas. The univariate approach considers flood hydrograph generation by an indirect approach (rainfall-runoff transformation using input rainfall
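
    Jointly generating flood peaks and volumes through a copula separates the marginal distributions from their dependence structure. A minimal sketch of that idea is given below using a Gaussian copula with Gumbel marginals; the copula family, marginals and correlation are illustrative and need not match those fitted in the study.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)
        n = 10_000

        # Illustrative marginal distributions for flood peak (m3/s) and volume (Mm3)
        peak_marg = stats.gumbel_r(loc=300.0, scale=80.0)
        vol_marg = stats.gumbel_r(loc=25.0, scale=8.0)

        # Dependence between peak and volume through a Gaussian copula
        rho = 0.7
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
        u = stats.norm.cdf(z)                        # correlated uniforms on [0, 1]
        peaks = peak_marg.ppf(u[:, 0])
        volumes = vol_marg.ppf(u[:, 1])

        # Jointly generated peak/volume pairs can then feed synthetic design
        # hydrographs for the 2D hydraulic model.
        print(np.corrcoef(peaks, volumes)[0, 1], np.quantile(peaks, 0.99))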

  10. Undiscovered locatable mineral resources in the Bay Resource Management Plan Area, Southwestern Alaska: A probabilistic assessment

    USGS Publications Warehouse

    Schmidt, J.M.; Light, T.D.; Drew, L.J.; Wilson, Frederic H.; Miller, M.L.; Saltus, R.W.

    2007-01-01

    The Bay Resource Management Plan (RMP) area in southwestern Alaska, north and northeast of Bristol Bay contains significant potential for undiscovered locatable mineral resources of base and precious metals, in addition to metallic mineral deposits that are already known. A quantitative probabilistic assessment has identified 24 tracts of land that are permissive for 17 mineral deposit model types likely to be explored for within the next 15 years in this region. Commodities we discuss in this report that have potential to occur in the Bay RMP area are Ag, Au, Cr, Cu, Fe, Hg, Mo, Pb, Sn, W, Zn, and platinum-group elements. Geoscience data for the region are sufficient to make quantitative estimates of the number of undiscovered deposits only for porphyry copper, epithermal vein, copper skarn, iron skarn, hot-spring mercury, placer gold, and placer platinum-deposit models. A description of a group of shallow- to intermediate-level intrusion-related gold deposits is combined with grade and tonnage data from 13 deposits of this type to provide a quantitative estimate of undiscovered deposits of this new type. We estimate that significant resources of Ag, Au, Cu, Fe, Hg, Mo, Pb, and Pt occur in the Bay Resource Management Plan area in these deposit types. At the 10th percentile probability level, the Bay RMP area is estimated to contain 10,067 metric tons silver, 1,485 metric tons gold, 12.66 million metric tons copper, 560 million metric tons iron, 8,100 metric tons mercury, 500,000 metric tons molybdenum, 150 metric tons lead, and 17 metric tons of platinum in undiscovered deposits of the eight quantified deposit types. At the 90th percentile probability level, the Bay RMP area is estimated to contain 89 metric tons silver, 14 metric tons gold, 911,215 metric tons copper, 330,000 metric tons iron, 1 metric ton mercury, 8,600 metric tons molybdenum and 1 metric ton platinum in undiscovered deposits of the eight deposit types. Other commodities, which may occur in the

  11. Research and development supporting risk-based wildfire effects prediction for fuels and fire management: Status and needs

    Treesearch

    Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague

    2013-01-01

    Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...

  12. The Effect of Forest Management Strategy on Carbon Storage and Revenue in Western Washington: A Probabilistic Simulation of Tradeoffs.

    PubMed

    Fischer, Paul W; Cullen, Alison C; Ettl, Gregory J

    2017-01-01

    The objectives of this study are to understand tradeoffs between forest carbon and timber values, and evaluate the impact of uncertainty in improved forest management (IFM) carbon offset projects to improve forest management decisions. The study uses probabilistic simulation of uncertainty in financial risk for three management scenarios (clearcutting in 45- and 65-year rotations and no harvest) under three carbon price schemes (historic voluntary market prices, cap and trade, and carbon prices set to equal net present value (NPV) from timber-oriented management). Uncertainty is modeled for value and amount of carbon credits and wood products, the accuracy of forest growth model forecasts, and four other variables relevant to American Carbon Registry methodology. Calculations use forest inventory data from a 1,740 ha forest in western Washington State, using the Forest Vegetation Simulator (FVS) growth model. Sensitivity analysis shows that FVS model uncertainty contributes more than 70% to overall NPV variance, followed in importance by variability in inventory sample (3-14%), and short-term prices for timber products (8%), while variability in carbon credit price has little influence (1.1%). At regional average land-holding costs, a no-harvest management scenario would become revenue-positive at a carbon credit break-point price of $14.17/Mg carbon dioxide equivalent (CO2 e). IFM carbon projects are associated with a greater chance of both large payouts and large losses to landowners. These results inform policymakers and forest owners of the carbon credit price necessary for IFM approaches to equal or better the business-as-usual strategy, while highlighting the magnitude of financial risk and reward through probabilistic simulation. © 2016 Society for Risk Analysis.
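
    The break-even carbon price is the price at which the expected net present value of the no-harvest (carbon) scenario matches the timber-oriented scenario. The Monte Carlo sketch below illustrates only that comparison; the per-hectare values, uncertainties and costs are illustrative, not the study's FVS-based inputs, so the resulting price differs from the reported $14.17/Mg CO2e.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 20_000

        # Illustrative per-hectare inputs
        timber_npv = rng.normal(2500.0, 600.0, n)        # $/ha, harvest-oriented management
        carbon_credits = rng.normal(180.0, 40.0, n)      # Mg CO2e/ha creditable if unharvested
        holding_cost_npv = 800.0                         # $/ha land-holding costs

        # Scan carbon prices and find the one that closes the expected NPV gap
        prices = np.linspace(0.0, 60.0, 601)
        gap = np.array([np.mean(carbon_credits * p - holding_cost_npv - timber_npv)
                        for p in prices])
        breakeven = prices[np.argmin(np.abs(gap))]
        print(f"break-even carbon price ~ ${breakeven:.2f}/Mg CO2e")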

  13. Routine culture-based screening versus risk-based management for the prevention of early-onset group B streptococcus disease in the neonate: a systematic review.

    PubMed

    Kurz, Ella; Davis, Deborah

    2015-04-17

    Early-onset group B streptococcus disease, recognized as the most common cause of early onset neonatal sepsis in developed countries, is transmitted vertically from the group B streptococcus carrier mother to the neonate in the peripartum. Accordingly, early-onset group B streptococcus disease is prevented by halting the transmission of the microorganism from the mother to the infant. Two main methods, routine culture-based screening and risk-based management, may be used in the identification of mothers requiring intrapartum antibiotic prophylaxis in labor. While there are advantages and disadvantages to each, there is limited high level evidence available as to which method is superior. To identify the effectiveness of risk-based management versus routine culture-based screening in the prevention of early-onset group B streptococcus disease in the neonate. This review considered studies which treated pregnant women with intrapartum antibiotic prophylaxis following risk- and culture-based protocols for the prevention of early-onset group B streptococcus disease in the neonate. Types of intervention: This review considered studies that evaluated risk-based management against routine culture-based screening for the prevention of early-onset group B streptococcus disease in the neonate. Types of studies: This review looked for highest evidence available which in this case consisted of one quasi experimental study and eight comparative cohort studies with historical or concurrent control groups. Types of outcomes: Incidence of early-onset group B streptococcus disease in neonates as measured by positive group B streptococcus culture from an otherwise sterile site. Secondary outcomes include neonatal death due to group B streptococcus sepsis and percentage of women who received intrapartum antibiotic prophylaxis. A multi-step search strategy was used to find studies which were limited to the English language and published between January 2000 and June 2013. The quality

  14. A family of analytical probabilistic models for urban stormwater management planning

    SciTech Connect

    Papa, F.; Adams, B.J.; Guo, Y.

    1998-07-01

    This paper presents the synthesis of over fifteen years of research on the topic of analytical probabilistic models, as an alternative approach to continuous simulation, that have been derived for the performance analysis of urban runoff quantity and quality control systems. These models overcome the limitations imposed by single event modeling through the use of long term rainfall records and are significantly more computationally efficient and less cumbersome than other methods of continuous analysis. These attributes promote the comprehensive analysis of drainage system design alternatives at the screening and planning levels.
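
    The flavour of such analytical probabilistic models can be conveyed with a closed-form expression derived under simple assumptions: exponentially distributed rainfall-event depths and event runoff equal to a runoff coefficient times the depth in excess of depression storage. The sketch below uses that form for a storage-spill return period; it is an illustration under those stated assumptions, not the exact expressions of the cited models.

        import math

        def storage_spill_return_period(storage_mm, runoff_coeff, depression_mm,
                                        mean_event_depth_mm, events_per_year):
            # Event depth v ~ exponential with rate zeta = 1/mean; event runoff is
            # phi * (v - Sd) for v > Sd. The per-event probability that runoff
            # exceeds a storage volume S is then exp(-zeta * (Sd + S/phi)).
            zeta = 1.0 / mean_event_depth_mm
            p_exceed = math.exp(-zeta * (depression_mm + storage_mm / runoff_coeff))
            return 1.0 / (events_per_year * p_exceed)

        # Illustrative catchment: 2 mm depression storage, runoff coefficient 0.5,
        # mean event depth 8 mm, 70 events per year, 15 mm of storage provided
        print(round(storage_spill_return_period(15.0, 0.5, 2.0, 8.0, 70), 2), "years")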

  15. Risk-Based Approach for Microbiological Food Safety Management in the Dairy Industry: The Case of Listeria monocytogenes in Soft Cheese Made from Pasteurized Milk.

    PubMed

    Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez

    2014-01-01

    According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, the use of quantitative microbiological risk assessment is an appealing approach to link new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show practically how quantitative risk assessment could be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese or the process environment is simulated, over time, space, and between products, accounting for the impact of management options, such as hygienic operations and sampling plans. A sensitivity analysis of the model will help prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures.
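
    The core of such a farm-to-fork model is growth of the pathogen between a contamination event and consumption, followed by a dose-response step. The Monte Carlo sketch below shows only that skeleton; the initial contamination, growth rates, storage times, ceiling and dose-response coefficient are all illustrative values, not the article's model inputs.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000

        # Illustrative contamination and cold-chain growth
        c0 = rng.lognormal(np.log(0.01), 1.0, n)                 # CFU/g after a contamination event
        growth_rate = rng.normal(0.10, 0.03, n).clip(min=0.0)    # log10 CFU/g per day in storage
        storage_days = rng.uniform(5.0, 30.0, n)
        c_consumption = np.minimum(c0 * 10.0 ** (growth_rate * storage_days), 10.0 ** 7.5)

        serving_g = 25.0
        dose = c_consumption * serving_g
        r = 1.0e-12                                              # illustrative exponential dose-response
        risk_per_serving = 1.0 - np.exp(-r * dose)
        print(f"mean risk per serving: {risk_per_serving.mean():.2e}")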

  16. A risk-based decision tool for the management of organic waste in agriculture and farming activities (FARMERS).

    PubMed

    Río, Miguel; Franco-Uría, Amaya; Abad, Emilio; Roca, Enrique

    2011-01-30

    Currently, specific management guidelines must be implemented to guarantee the safe reuse of organic waste in agriculture. With that aim, this work focused on the development of a decision support tool for the safe and sustainable management of cattle manure as fertiliser in pastureland, to control and limit metal accumulation in soil and to reduce metal biotransfer from soil to other compartments. The system was developed on the basis of an environmental risk assessment multi-compartmental model. In contrast to other management tools, a long-term dynamic modelling approach was selected considering the persistence of metals in the environment. A detailed description of the underlying flow equations, which account for distribution, human exposure and risk characterisation of metals in the assessed scenario, was presented, as well as model parameterization. The tool was implemented in Visual C++ and is structured around a database, where all required data are stored, the risk assessment model, and a GIS module for the visualization of the scenario characteristics and the results obtained (risk indexes). The decision support system allows choosing among three estimation options, depending on the needs of the user, which provide information to both farmers and policy makers. The first option is useful for evaluating the adequacy of the current management practices of the different farms, and the remaining ones provide information on the measures that can be taken to carry out a fertilising plan without exceeding risk to human health. Among other results, maximum values of application rates of manure, maximum permissible metal content of manure and maximum application times in a particular scenario can be estimated by this system. To illustrate tool application, a real case study with data corresponding to different farms of a milk production cooperative was presented.

  17. Groundwater contamination from waste management sites: The interaction between risk-based engineering design and regulatory policy: 1. Methodology

    NASA Astrophysics Data System (ADS)

    Massmann, Joel; Freeze, R. Allan

    1987-02-01

    This paper puts in place a risk-cost-benefit analysis for waste management facilities that explicitly recognizes the adversarial relationship that exists in a regulated market economy between the owner/operator of a waste management facility and the government regulatory agency under whose terms the facility must be licensed. The risk-cost-benefit analysis is set up from the perspective of the owner/operator. It can be used directly by the owner/operator to assess alternative design strategies. It can also be used by the regulatory agency to assess alternative regulatory policy, but only in an indirect manner, by examining the response of an owner/operator to the stimuli of various policies. The objective function is couched in terms of a discounted stream of benefits, costs, and risks over an engineering time horizon. Benefits are in the form of revenues for services provided; costs are those of construction and operation of the facility. Risk is defined as the cost associated with the probability of failure, with failure defined as the occurrence of a groundwater contamination event that violates the licensing requirements established for the facility. Failure requires a breach of the containment structure and contaminant migration through the hydrogeological environment to a compliance surface. The probability of failure can be estimated on the basis of reliability theory for the breach of containment and with a Monte-Carlo finite-element simulation for the advective contaminant transport. In the hydrogeological environment the hydraulic conductivity values are defined stochastically. The probability of failure is reduced by the presence of a monitoring network operated by the owner/operator and located between the source and the regulatory compliance surface. The level of reduction in the probability of failure depends on the probability of detection of the monitoring network, which can be calculated from the stochastic contaminant transport simulations. While
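
    The objective function described above reduces to a discounted sum of benefits minus costs minus risk, with risk in each year equal to the probability of failure times its cost. The sketch below renders that structure with illustrative numbers; it omits the monitoring-network effect on the failure probability and is not the paper's calibrated model.

        def npv_objective(benefits, costs, p_fail, cost_fail, discount_rate):
            # Discounted stream of benefits minus costs minus risk, where risk in
            # year t is the probability of a contamination failure times its cost.
            total = 0.0
            for t, (b, c, pf, cf) in enumerate(zip(benefits, costs, p_fail, cost_fail), start=1):
                total += (b - c - pf * cf) / (1.0 + discount_rate) ** t
            return total

        # 5-year horizon with illustrative annual figures (monetary units in millions)
        print(npv_objective(benefits=[4, 4, 4, 4, 4], costs=[1.5, 1, 1, 1, 1],
                            p_fail=[0.01, 0.01, 0.02, 0.02, 0.03],
                            cost_fail=[30, 30, 30, 30, 30], discount_rate=0.08))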

  18. Network analysis of swine shipments in Ontario, Canada, to support disease spread modelling and risk-based disease management.

    PubMed

    Dorjee, S; Revie, C W; Poljak, Z; McNab, W B; Sanchez, J

    2013-10-01

    Understanding contact networks is important for modelling and managing the spread and control of communicable diseases in populations. This study characterizes the swine shipment network of a multi-site production system in southwestern Ontario, Canada. Data were extracted from a company's database listing swine shipments among 251 swine farms, including 20 sow, 69 nursery and 162 finishing farms, for the 2-year period of 2006 to 2007. Several network metrics were generated. The number of shipments per week between pairs of farms ranged from 1 to 6. The medians (and ranges) of out-degree were: sow 6 (1-21), nursery 8 (0-25), and finishing 0 (0-4), over the entire 2-year study period. Corresponding estimates for in-degree of nursery and finishing farms were 3 (0-9) and 3 (0-12) respectively. Outgoing and incoming infection chains (OIC and IIC) were also measured. The medians (ranges) of the monthly OIC and IIC were 0 (0-8) and 0 (0-6), respectively, with very similar measures observed for 2-week intervals. Nursery farms exhibited high measures of centrality. This indicates that they pose greater risks of disease spread in the network. Therefore, they should be given a high priority for disease prevention and control measures affecting all age groups alike. The network demonstrated scale-free and small-world topologies as observed in other livestock shipment studies. This heterogeneity in contacts among farm types and network topologies should be incorporated in simulation models to improve their validity. In conclusion, this study provided useful epidemiological information and parameters for the control and modelling of disease spread among swine farms, for the first time from Ontario, Canada. Copyright © 2013 Elsevier B.V. All rights reserved.
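
    The degree and reachability metrics reported above can be computed directly from a directed shipment graph, as sketched below on a toy network; the Ontario data are not reproduced, and the static reachability shown here is only a stand-in for the study's time-respecting infection chains.

        import networkx as nx

        # Toy shipment network (farm -> farm)
        edges = [("sow1", "nursery1"), ("sow1", "nursery2"),
                 ("nursery1", "finish1"), ("nursery1", "finish2"),
                 ("nursery2", "finish3")]
        G = nx.DiGraph(edges)

        out_degree = dict(G.out_degree())          # distinct farms shipped to
        in_degree = dict(G.in_degree())            # distinct farms received from
        # Static stand-in for the outgoing infection chain: farms reachable
        # downstream of each farm (the study's OIC/IIC also respect shipment timing).
        oic = {farm: len(nx.descendants(G, farm)) for farm in G}
        print(out_degree, in_degree, oic, sep="\n")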

  19. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its

  20. Integration of fuzzy analytic hierarchy process and probabilistic dynamic programming in formulating an optimal fleet management model

    NASA Astrophysics Data System (ADS)

    Teoh, Lay Eng; Khoo, Hooi Ling

    2013-09-01

    This study deals with two major aspects of airlines, i.e. supply and demand management. The aspect of supply focuses on the mathematical formulation of an optimal fleet management model to maximize the operational profit of the airlines, while the aspect of demand focuses on the incorporation of mode choice modeling as part of the developed model. The proposed methodology is outlined in two stages: the Fuzzy Analytic Hierarchy Process is first adopted to capture mode choice modeling in order to quantify the probability of probable phenomena (for the aircraft acquisition/leasing decision). Then, an optimization model is developed as a probabilistic dynamic programming model to determine the optimal number and types of aircraft to be acquired and/or leased in order to meet stochastic demand during the planning horizon. The findings of an illustrative case study show that the proposed methodology is viable. The results demonstrate that the incorporation of mode choice modeling could affect the operational profit and fleet management decision of the airlines to varying degrees.

  1. Effectiveness of chemical amendments for stabilisation of lead and antimony in risk-based land management of soils of shooting ranges.

    PubMed

    Sanderson, Peter; Naidu, Ravi; Bolan, Nanthi

    2015-06-01

    This study aims to examine the effectiveness of amendments for risk-based land management of shooting range soils and to explore how that effectiveness varies across sites with differing soil physicochemical parameters. A series of amendments with differing mechanisms for stabilisation were applied to four shooting range soils and aged for 1 year. Chemical stabilisation was monitored by pore water extraction, toxicity characteristic leaching procedure (TCLP) and the physiologically based extraction test (PBET) over 1 year. The performance of amendments when applied in conditions reflecting field application did not match the performance in the batch studies. Pore water-extractable metals were not greatly affected by amendment addition. TCLP-extractable Pb was reduced significantly by amendments, particularly lime and magnesium oxide. Antimony leaching was reduced by red mud but mobilised by some of the other amendments. Bioaccessible Pb measured by PBET shows that bioaccessible Pb increased with time after an initial decrease due to the presence of metallic fragments in the soil. Amendments were able to reduce bioaccessible Pb by up to 50%. Bioaccessible Sb was not readily reduced by soil amendments. Soil amendments were not equally effective across the four soils.

  2. Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMIT) after wildfires

    Treesearch

    P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner

    2011-01-01

    The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...

  3. Assessing risks to multiple resources affected by wildfire and forest management using an integrated probabilistic framework

    Treesearch

    Steven P. Norman; Danny C. Lee; Sandra Jacobson; Christine Damiani

    2010-01-01

    The tradeoffs that surround forest management are inherently complex, often involving multiple temporal and spatial scales. For example, conflicts may result when fuel treatments are designed to mediate long-term fuel hazards, but activities could impair sensitive aquatic habitat or degrade wildlife habitat in the short term. This complexity makes it hard for managers...

  4. State Assistance with Risk-Based Data Management: Inventory and needs assessment of 25 state Class II Underground Injection Control programs. Phase 1

    SciTech Connect

    Not Available

    1992-07-01

    As discussed in Section I of the attached report, state agencies must decide where to direct their limited resources in an effort to make optimum use of their available manpower and address those areas that pose the greatest risk to valuable drinking water sources. The Underground Injection Practices Research Foundation (UIPRF) proposed a risk-based data management system (RBDMS) to provide states with the information they need to effectively utilize staff resources, provide dependable documentation to justify program planning, and enhance environmental protection capabilities. The UIPRF structured its approach regarding environmental risk management to include data and information from production, injection, and inactive wells in its RBDMS project. Data from each of these well types are critical to the complete statistical evaluation of environmental risk and selected automated functions. This comprehensive approach allows state Underground Injection Control (UIC) programs to effectively evaluate the risk of contaminating underground sources of drinking water, while alleviating the additional work and associated problems that often arise when separate data bases are used. CH2M Hill and Digital Design Group, through a DOE grant to the UIPRF, completed an inventory and needs assessment of 25 state Class II UIC programs. The states selected for participation by the UIPRF were generally chosen based on interest and whether an active Class II injection well program was in place. The inventory and needs assessment provided an effective means of collecting and analyzing the interest, commitment, design requirements, utilization, and potential benefits of implementing a RBDMS in individual state UIC programs. Personal contacts were made with representatives from each state to discuss the applicability of a RBDMS in their respective state.

  5. A cost-effectiveness analysis of a proactive management strategy for the Sprint Fidelis recall: a probabilistic decision analysis model.

    PubMed

    Bashir, Jamil; Cowan, Simone; Raymakers, Adam; Yamashita, Michael; Danter, Matthew; Krahn, Andrew; Lynd, Larry D

    2013-12-01

    The management of the recall is complicated by the competing risks of lead failure and complications that can occur with lead revision. Many of these patients are currently undergoing an elective generator change--an ideal time to consider lead revision. To determine the cost-effectiveness of a proactive management strategy for the Sprint Fidelis recall. We obtained detailed clinical outcomes and costing data from a retrospective analysis of 341 patients who received the Sprint Fidelis lead in British Columbia, where patients younger than 60 years were offered lead extraction when undergoing generator replacement. These population-based data were used to construct and populate a probabilistic Markov model in which a proactive management strategy was compared to a conservative strategy to determine the incremental cost per lead failure avoided. In our population, elective lead revisions were half the cost of emergent revisions and had a lower complication rate. In the model, the incremental cost-effectiveness ratio of proactive lead revision versus a recommended monitoring strategy was $12,779 per lead failure avoided. The proactive strategy resulted in 21 fewer failures per 100 patients treated and reduced the chance of an additional complication from an unexpected surgery. Cost-effectiveness analysis suggests that prospective lead revision should be considered when patients with a Sprint Fidelis lead present for pulse generator change. Elective revision of the lead is justified even when 25% of the population is operated on per year, and in some scenarios, it is both less costly and provides a better outcome. © 2013 Heart Rhythm Society Published by Heart Rhythm Society All rights reserved.
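
    The reported figure of $12,779 per lead failure avoided is an incremental cost-effectiveness ratio: the extra cost of the proactive strategy divided by the extra lead failures it avoids. The sketch below shows only that arithmetic; the per-patient costs and failure probabilities are illustrative, not the Markov model's inputs.

        def icer(cost_new, effect_new, cost_old, effect_old):
            # Incremental cost-effectiveness ratio: extra cost per extra unit of
            # effect, here per lead failure avoided.
            return (cost_new - cost_old) / (effect_new - effect_old)

        # Illustrative per-patient numbers: proactive revision avoids 0.21 failures
        # per patient at roughly $2,680 of additional cost.
        print(round(icer(cost_new=11_180, effect_new=0.26, cost_old=8_500, effect_old=0.05)))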

  6. Probabilistic Risk-Based Approach to Aeropropulsion System Assessment Developed

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2001-01-01

    In an era of shrinking development budgets and resources, where there is also an emphasis on reducing the product development cycle, the role of system assessment, performed in the early stages of an engine development program, becomes very critical to the successful development of new aeropropulsion systems. A reliable system assessment not only helps to identify the best propulsion system concept among several candidates, it can also identify which technologies are worth pursuing. This is particularly important for advanced aeropropulsion technology development programs, which require an enormous amount of resources. In the current practice of deterministic, or point-design, approaches, the uncertainties of design variables are either unaccounted for or accounted for by safety factors. This could often result in an assessment with unknown and unquantifiable reliability. Consequently, it would fail to provide additional insight into the risks associated with the new technologies, which are often needed by decisionmakers to determine the feasibility and return-on-investment of a new aircraft engine.

  7. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  9. SU-E-T-128: Applying Failure Modes and Effects Analysis to a Risk-Based Quality Management for Stereotactic Radiosurgery in Brazil

    SciTech Connect

    Teixeira, F; Almeida, C de; Huq, M

    2015-06-15

    Purpose: The goal of the present work was to evaluate the process maps for stereotactic radiosurgery (SRS) treatment at three radiotherapy centers in Brazil and apply the FMEA technique to evaluate similarities and differences, if any, of the hazards and risks associated with these processes. Methods: A team, consisting of professionals from different disciplines and involved in the SRS treatment, was formed at each center. Each team was responsible for the development of the process map and performance of FMEA and FTA. A facilitator knowledgeable in these techniques led the work at each center. The TG100 recommended scales were used for the evaluation of hazard and severity for each step of the major process “treatment planning”. Results: The hazard index given by the Risk Priority Number (RPN) is found to range from 4 to 270 for various processes, and the severity (S) index is found to range from 1 to 10. RPN values > 100 and severity values ≥ 7 were chosen to flag safety improvement interventions. The number of steps with RPN ≥ 100 was found to be 6, 59, and 45 for the three centers. The corresponding values for S ≥ 7 are 24, 21, and 25, respectively. The ranges of RPN and S values for each center belong to different process steps and failure modes. Conclusion: These results show that the interventions needed to improve safety are different for each center and are associated with the skill level of the professional team as well as the technology used to provide radiosurgery treatment. The present study will very likely be a model for the implementation of a risk-based prospective quality management program for SRS treatment in Brazil, where currently there are 28 radiotherapy centers performing SRS. A complete FMEA for SRS for these three radiotherapy centers is currently under development.
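
    The hazard ranking used above multiplies three ordinal scores on the TG100 1-10 scales: occurrence (O), severity (S), and lack of detectability (D), giving RPN = O x S x D. A minimal Python sketch of the flagging rule described in the abstract (RPN > 100 or S >= 7), with made-up failure modes rather than the centers' actual process steps:

        # Each failure mode gets occurrence (O), severity (S), and detectability (D)
        # scores on 1-10 scales; the Risk Priority Number is RPN = O * S * D.
        failure_modes = {
            "wrong CT dataset imported":    (3, 9, 4),
            "contour transfer error":       (4, 7, 5),
            "minor documentation omission": (2, 2, 2),
        }

        for name, (o, s, d) in failure_modes.items():
            rpn = o * s * d
            flagged = rpn > 100 or s >= 7   # thresholds used to flag safety interventions
            print(f"{name:30s} RPN={rpn:3d} S={s} flag={'yes' if flagged else 'no'}")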

  10. Probabilistic load simulation: Code development status

    NASA Astrophysics Data System (ADS)

    Newell, J. F.; Ho, H.

    1991-05-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.
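
    The abstract does not reproduce the CLS load models themselves, but the core idea of a probabilistic load simulation module can be illustrated with a simple Monte Carlo sketch: individual load components are sampled from assumed distributions and combined into a spectrum of composite loads. All distributions below are hypothetical placeholders, not values from the CLS knowledge base:

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000                                  # Monte Carlo samples

        # Hypothetical normalized load components (placeholders, not CLS data)
        steady_thrust = rng.normal(1.00, 0.05, n)
        thermal_load  = rng.lognormal(mean=-0.1, sigma=0.2, size=n)
        vibration     = rng.gumbel(loc=0.2, scale=0.05, size=n)

        composite = steady_thrust + thermal_load + vibration
        print("mean composite load:", round(float(composite.mean()), 3))
        print("99.9th percentile  :", round(float(np.percentile(composite, 99.9)), 3))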

  11. Probabilistic load simulation: Code development status

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H.

    1991-01-01

    The objective of the Composite Load Spectra (CLS) project is to develop generic load models to simulate the composite load spectra that are induced in space propulsion system components. The probabilistic loads thus generated are part of the probabilistic design analysis (PDA) of a space propulsion system that also includes probabilistic structural analyses, reliability, and risk evaluations. Probabilistic load simulation for space propulsion systems demands sophisticated probabilistic methodology and requires large amounts of load information and engineering data. The CLS approach is to implement a knowledge based system coupled with a probabilistic load simulation module. The knowledge base manages and furnishes load information and expertise and sets up the simulation runs. The load simulation module performs the numerical computation to generate the probabilistic loads with load information supplied from the CLS knowledge base.

  12. The value of myocardial perfusion scintigraphy in the diagnosis and management of angina and myocardial infarction: a probabilistic economic analysis.

    PubMed

    Hernández, Rodolfo; Vale, Luke

    2007-01-01

    Coronary heart disease (CHD) is the most common cause of death in the United Kingdom, accounting for more than 120,000 deaths in 2001, among the highest rates in the world. This study reports an economic evaluation of single photon emission computed tomography myocardial perfusion scintigraphy (SPECT) for the diagnosis and management of coronary artery disease (CAD). Strategies involving SPECT with and without stress electrocardiography (ECG) and coronary angiography (CA) were compared to diagnostic strategies not involving SPECT. The diagnosis decision was modeled with a decision tree model and long-term costs and consequences using a Markov model. Data to populate the models were obtained from a series of systematic reviews. Unlike earlier evaluations, a probabilistic analysis was included to assess the statistical imprecision of the results. The results are presented in terms of incremental cost per quality-adjusted life year (QALY). At prevalence levels of CAD of 10.5%, SPECT-based strategies are cost-effective; ECG-CA is highly unlikely to be optimal. At a ceiling ratio of £20,000 per QALY, SPECT-CA has a 90% likelihood of being optimal. Beyond this threshold, this strategy becomes less likely to be cost-effective. At more than £75,000 per QALY, coronary angiography is most likely to be optimal. For higher levels of prevalence (around 50%) and more than a £10,000 per QALY threshold, coronary angiography is the optimal decision. SPECT-based strategies are likely to be cost-effective when risk of CAD is modest (10.5%). Sensitivity analyses show these strategies dominated non-SPECT-based strategies for risk of CAD up to 4%. At higher levels of prevalence, invasive strategies may become worthwhile. Finally, sensitivity analyses show stress echocardiography as a potentially cost-effective option, and further research to assess the relative cost-effectiveness of echocardiography should also be performed.
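
    The probabilistic statements quoted above (for example, a 90% likelihood that SPECT-CA is optimal at a ceiling ratio of £20,000 per QALY) follow the usual net-monetary-benefit logic: for each simulated draw of costs and QALYs, the strategy with the highest NMB = lambda x QALYs - cost is counted as optimal, and the counts become probabilities. A minimal sketch with invented cost and QALY distributions, not the study's model inputs:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 10_000
        lam = 20_000                     # ceiling ratio, GBP per QALY

        # Hypothetical (cost, QALY) distributions per strategy
        strategies = {
            "SPECT-CA": (rng.normal(2300, 200, n), rng.normal(9.10, 0.05, n)),
            "ECG-CA":   (rng.normal(2600, 250, n), rng.normal(9.05, 0.05, n)),
            "CA only":  (rng.normal(2900, 300, n), rng.normal(9.12, 0.06, n)),
        }

        nmb = {name: lam * q - c for name, (c, q) in strategies.items()}
        best = np.vstack(list(nmb.values())).argmax(axis=0)
        for i, name in enumerate(nmb):
            print(f"P({name} optimal at £{lam:,}/QALY) = {(best == i).mean():.2f}")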

  13. The future of host cell protein (HCP) identification during process development and manufacturing linked to a risk-based management for their control.

    PubMed

    Bracewell, Daniel G; Francis, Richard; Smales, C Mark

    2015-09-01

    The use of biological systems to synthesize complex therapeutic products has been a remarkable success. However, during product development, great attention must be devoted to defining acceptable levels of impurities that derive from that biological system, heading this list are host cell proteins (HCPs). Recent advances in proteomic analytics have shown how diverse this class of impurities is; as such knowledge and capability grows inevitable questions have arisen about how thorough current approaches to measuring HCPs are. The fundamental issue is how to adequately measure (and in turn monitor and control) such a large number of protein species (potentially thousands of components) to ensure safe and efficacious products. A rather elegant solution is to use an immunoassay (enzyme-linked immunosorbent assay [ELISA]) based on polyclonal antibodies raised to the host cell (biological system) used to synthesize a particular therapeutic product. However, the measurement is entirely dependent on the antibody serum used, which dictates the sensitivity of the assay and the degree of coverage of the HCP spectrum. It provides one summed analog value for HCP amount; a positive if all HCP components can be considered equal, a negative in the more likely event one associates greater risk with certain components of the HCP proteome. In a thorough risk-based approach, one would wish to be able to account for this. These issues have led to the investigation of orthogonal analytical methods; most prominently mass spectrometry. These techniques can potentially both identify and quantify HCPs. The ability to measure and monitor thousands of proteins proportionally increases the amount of data acquired. Significant benefits exist if the information can be used to determine critical HCPs and thereby create an improved basis for risk management. We describe a nascent approach to risk assessment of HCPs based upon such data, drawing attention to timeliness in relation to biosimilar

  14. A generic probabilistic framework for structural health prognostics and uncertainty management

    NASA Astrophysics Data System (ADS)

    Wang, Pingfeng; Youn, Byeng D.; Hu, Chao

    2012-04-01

    Structural health prognostics can be broadly applied to various engineered artifacts in an engineered system. However, techniques and methodologies for health prognostics become application-specific. This study thus aims at formulating a generic framework of structural health prognostics, which is composed of four core elements: (i) a generic health index system with synthesized health index (SHI), (ii) a generic offline learning scheme using the sparse Bayes learning (SBL) technique, (iii) a generic online prediction scheme using the similarity-based interpolation (SBI), and (iv) an uncertainty propagation map for the prognostic uncertainty management. The SHI enables the use of heterogeneous sensory signals; the sparseness feature employing only a few neighboring kernel functions enables the real-time prediction of remaining useful lives (RULs) regardless of data size; the SBI predicts the RULs with the background health knowledge obtained under uncertain manufacturing and operation conditions; and the uncertainty propagation map enables the predicted RULs to be loaded with their statistical characteristics. The proposed generic framework of structural health prognostics is thus applicable to different engineered systems and its effectiveness is demonstrated with two case studies.
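
    The online prediction step mentioned above (similarity-based interpolation) estimates the remaining useful life (RUL) of a unit by weighting the known RULs of previously observed units according to how closely their degradation trajectories match the unit in service. A toy Python sketch of that idea; the distance measure, kernel width, and data are placeholders, not the paper's synthesized health index or sparse Bayes kernels:

        import numpy as np

        # Offline units: degradation-feature histories and their known RULs (invented)
        train_histories = np.array([[0.10, 0.30, 0.60],
                                    [0.10, 0.20, 0.40],
                                    [0.20, 0.50, 0.90]])
        train_ruls = np.array([120.0, 200.0, 60.0])    # hours

        online_history = np.array([0.10, 0.25, 0.50])  # unit currently in service

        # Similarity weights from Euclidean distance between trajectories
        dist = np.linalg.norm(train_histories - online_history, axis=1)
        weights = np.exp(-dist**2 / 0.01)
        weights /= weights.sum()

        rul_estimate = float(weights @ train_ruls)
        print(f"similarity-weighted RUL estimate: {rul_estimate:.1f} h")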

  15. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    PubMed

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
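
    The stochastic cost-benefit frameworks described above ultimately compare a distribution of discounted benefits (avoided disaster losses) against intervention costs, rather than single point estimates. A minimal Monte Carlo sketch of a benefit-cost ratio under uncertain avoided losses; all figures are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 50_000
        discount, horizon = 0.05, 20                       # discount rate, years
        annuity = (1 - (1 + discount) ** -horizon) / discount

        cost = 1.0e6                                       # up-front DRM investment
        avoided_loss_per_year = rng.lognormal(mean=np.log(9e4), sigma=0.6, size=n)

        bcr = avoided_loss_per_year * annuity / cost       # benefit-cost ratio samples
        print("mean BCR  :", round(float(bcr.mean()), 2))
        print("P(BCR > 1):", round(float((bcr > 1).mean()), 2))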

  16. A risk-based approach to evaluating wildlife demographics for management in a changing climate: A case study of the Lewis's Woodpecker

    Treesearch

    Erin Towler; Victoria A. Saab; Richard S. Sojda; Katherine Dickinson; Cindy L. Bruyere; Karen R. Newlon

    2012-01-01

    Given the projected threat that climate change poses to biodiversity, the need for proactive response efforts is clear. However, integrating uncertain climate change information into conservation planning is challenging, and more explicit guidance is needed. To this end, this article provides a specific example of how a risk-based approach can be used to incorporate a...

  17. Cost-Risk Trade-off of Solar Radiation Management and Mitigation under Probabilistic Information on Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann

    2017-04-01

    In principle, solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However, we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C-temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), which is a decision analytic framework that makes a trade-off between the expected welfare-loss from climate policy costs and the climate risks from transgressing a climate target. Here, in both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA for the sake of including not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for the following three valuation scenarios: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. For now, the Giorgi regions are treated with equal weight. We find that for regionally differentiated precipitation targets, the usage of SRM will be comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number reduces by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it only saves 70% to 75% of the welfare-loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in the precipitation-risk-only and both-risks scenarios. It remains to be shown how the inclusion of further risks or different regional weights would

  18. Probabilistic Plan Management

    DTIC Science & Technology

    2009-11-17

    [Table-of-contents excerpt only: Strategies; Experimental Results; Comparison of Strengthening Strategies; Effects of Global Strengthening; the baseline strengthening strategy explores the full search space of the different orderings of backfill, swapping, and pruning steps.]

  19. Development and use of risk-based inspection guides

    SciTech Connect

    Taylor, J.H.; Fresco, A.; Higgins, J.; Usher, J.; Long, S.M.

    1989-06-01

    Risk-based system inspection guides, for nuclear power plants which have been subjected to a probabilistic risk assessment (PRA), have been developed to provide guidance to NRC inspectors in prioritizing their inspection activities. Systems are prioritized, and then dominant component failure modes and human errors within those systems are identified for the above-stated purposes. Examples of applications to specific types of NRC inspection activities are also presented. Thus, the report provides guidance for both the development and use of risk-based system inspection guides. Work is proceeding to develop a methodology for risk-based guidance for nuclear power plants not subject to a PRA. 18 refs., 1 fig.

  20. A risk-based focused decision-management approach for justifying characterization of Hanford tank waste. June 1996, Revision 1; April 1997, Revision 2

    SciTech Connect

    Colson, S.D.; Gephart, R.E.; Hunter, V.L.; Janata, J.; Morgan, L.G.

    1997-12-31

    This report describes a disciplined, risk-based decision-making approach for determining characterization needs and resolving safety issues during the storage and remediation of radioactive waste stored in Hanford tanks. The strategy recommended uses interactive problem evaluation and decision analysis methods commonly used in industry to solve problems under conditions of uncertainty (i.e., lack of perfect knowledge). It acknowledges that problem resolution comes through both the application of high-quality science and human decisions based upon preferences and sometimes hard-to-compare choices. It recognizes that to firmly resolve a safety problem, the controlling waste characteristics and chemical phenomena must be measurable or estimated to an acceptable level of confidence tailored to the decision being made.

  1. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    Title 12 (Banks and Banking), Vol. 7, revised as of 2011-01-01: Federal Housing Finance Board; Federal Home Loan Bank Risk Management and Capital Standards; Federal Home Loan Bank Capital Requirements; § 932.3 Risk-based capital requirement. (a...

  2. 12 CFR 932.3 - Risk-based capital requirement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 12 (Banks and Banking), Vol. 7, revised as of 2010-01-01: Federal Housing Finance Board; Federal Home Loan Bank Risk Management and Capital Standards; Federal Home Loan Bank Capital Requirements; § 932.3 Risk-based capital requirement. (a...

  3. DEVELOPMENT OF PROTOCOLS AND DECISION SUPPORT TOOLS FOR ASSESSING WATERSHED SYSTEM ASSIMILATIVE CAPACITY (SAC), IN SUPPORT OF RISK-BASED ECOSYSTEM MANAGEMENT/RESTORATION PRACTICES

    EPA Science Inventory

    The National Risk Management Research Laboratory (NRMRL) has instituted a program for Risk Management Research for Ecosystem Restoration in Watersheds. This program is one component of the Office of Research and Development Ecosystem Protection Research Program. As part of this...

  4. On the use of hierarchical probabilistic models for characterizing and managing uncertainty in risk/safety assessment.

    PubMed

    Kodell, Ralph L; Chen, James J

    2007-04-01

    A general probabilistically-based approach is proposed for both cancer and noncancer risk/safety assessments. The familiar framework of the original ADI/RfD formulation is used, substituting in the numerator a benchmark dose derived from a hierarchical pharmacokinetic/pharmacodynamic model and in the denominator a unitary uncertainty factor derived from a hierarchical animal/average human/sensitive human model. The empirical probability distributions of the numerator and denominator can be combined to produce an empirical human-equivalent distribution for an animal-derived benchmark dose in external-exposure units.
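
    In the probabilistic reformulation described above, both the benchmark dose in the numerator and the unitary uncertainty factor in the denominator carry empirical distributions, so the human-equivalent value is itself a distribution obtained by dividing random draws. A minimal sketch with placeholder lognormal distributions standing in for the hierarchical model outputs:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Placeholder distributions; the paper derives these from hierarchical
        # PK/PD and animal / average-human / sensitive-human models.
        bmd = rng.lognormal(mean=np.log(10.0), sigma=0.3, size=n)   # benchmark dose, mg/kg-day
        uf  = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)   # unitary uncertainty factor

        hed = bmd / uf                       # human-equivalent dose distribution
        print("median HED        :", round(float(np.median(hed)), 3))
        print("5th percentile HED:", round(float(np.percentile(hed, 5)), 3))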

  5. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  6. Improving nutrient management practices in agriculture: The role of risk-based beliefs in understanding farmers' attitudes toward taking additional action

    NASA Astrophysics Data System (ADS)

    Wilson, Robyn S.; Howard, Gregory; Burnett, Elizabeth A.

    2014-08-01

    A recent increase in the amount of dissolved reactive phosphorus (DRP) entering the western Lake Erie basin is likely due to increased spring storm events in combination with issues related to fertilizer application and timing. These factors in combination with warmer lake temperatures have amplified the spread of toxic algal blooms. We assessed the attitudes of farmers in northwest Ohio toward taking at least one additional action to reduce nutrient loss on their farm. Specifically, we (1) identified to what extent farm and farmer characteristics (e.g., age, gross farm sales) as well as risk-based beliefs (e.g., efficacy, risk perception) influenced attitudes, and (2) assessed how these characteristics and beliefs differ in their predictive ability based on unobservable latent classes of farmers. Risk perception, or a belief that negative impacts to profit and water quality from nutrient loss were likely, was the most consistent predictor of farmer attitudes. Response efficacy, or a belief that taking action on one's farm made a difference, was found to significantly influence attitudes, although this belief was particularly salient for the minority class of farmers who were older and more motivated by profit. Communication efforts should focus on the negative impacts of nutrient loss to both the farm (i.e., profit) and the natural environment (i.e., water quality) to raise individual perceived risk among the majority, while the minority need higher perceived efficacy or more specific information about the economic effectiveness of particular recommended practices.

  7. A general risk-based adaptive management scheme incorporating the Bayesian Network Relative Risk Model with the South River, Virginia, as case study.

    PubMed

    Landis, Wayne G; Markiewicz, April J; Ayre, Kim K; Johns, Annie F; Harris, Meagan J; Stinson, Jonah M; Summers, Heather M

    2017-01-01

    Adaptive management has been presented as a method for the remediation, restoration, and protection of ecological systems. Recent reviews have found that the implementation of adaptive management has been unsuccessful in many instances. We present a modification of the model first formulated by Wyant and colleagues that puts ecological risk assessment into a central role in the adaptive management process. This construction has 3 overarching segments. Public engagement and governance determine the goals of society by identifying endpoints and specifying constraints such as costs. The research, engineering, risk assessment, and management section contains the decision loop estimating risk, evaluating options, specifying the monitoring program, and incorporating the data to re-evaluate risk. The 3rd component is the recognition that risk and public engagement can be altered by various externalities such as climate change, economics, technological developments, and population growth. We use the South River, Virginia, USA, study area and our previous research to illustrate each of these components. In our example, we use the Bayesian Network Relative Risk Model to estimate risks, evaluate remediation options, and provide lists of monitoring priorities. The research, engineering, risk assessment, and management loop also provides a structure in which data and the records of what worked and what did not, the learning process, can be stored. The learning process is a central part of adaptive management. We conclude that risk assessment can and should become an integral part of the adaptive management process. Integr Environ Assess Manag 2017;13:115-126. © 2016 SETAC.

  8. SCAP: a new methodology for safety management based on feedback from credible accident-probabilistic fault tree analysis system.

    PubMed

    Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A

    2001-10-12

    As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent of reduction of risk by each successive safety measure. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.

  9. Corrosion risk assessment and risk based inspection for sweet oil and gas corrosion -- Practical experience

    SciTech Connect

    Pursell, M.J.; Sehnan, C.; Naen, M.F.

    1999-11-01

    Successful and cost effective Corrosion Risk Assessment depends on a sensible use of prediction methods and good understanding of process factors. Both are discussed with examples. Practical semi-probabilistic Risk Based Inspection planning methods that measure risk directly as cost and personnel hazard are compared with traditional methods and discussed.

  10. Risk-Based Treatment Targets for Onsite Non-Potable Water Reuse

    EPA Science Inventory

    This presentation presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Micr...

  11. Staged decision making based on probabilistic forecasting

    NASA Astrophysics Data System (ADS)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. A technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning will be issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it bases decisions only on economic values and is relatively static (no reasoning, just a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because there are no other methods known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with these situations and responses were analysed, and potentially applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the final decision to implement is made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
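
    The cost-loss rule referred to above issues a warning (or triggers a mitigation measure) whenever the cost of responding, C, divided by the loss that the response would avert, L, is no greater than the forecast probability p of the flood event. A minimal sketch with invented numbers:

        def issue_warning(cost: float, avoidable_loss: float, prob_flood: float) -> bool:
            """Cost-loss rule: act when C / L <= p, the forecast event probability."""
            return cost / avoidable_loss <= prob_flood

        # Invented example: responding costs 20 k, averts 250 k of damage,
        # and the probabilistic forecast gives a 12% chance of flooding.
        print(issue_warning(20_000, 250_000, 0.12))   # True, since 0.08 <= 0.12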

  12. Overview of the co-ordinated risk-based approach to science and management response and recovery for the 2012 eruptions of Tongariro volcano, New Zealand

    NASA Astrophysics Data System (ADS)

    Jolly, G. E.; Keys, H. J. R.; Procter, J. N.; Deligne, N. I.

    2014-10-01

    Tongariro volcano, New Zealand, lies wholly within the Tongariro National Park (TNP), one of New Zealand's major tourist destinations. Two small eruptions of the Te Maari vents on the northern flanks of Tongariro on 6 August 2012 and 21 November 2012 each produced a small ash cloud to < 8 km height accompanied by pyroclastic density currents and ballistic projectiles. The most popular day hike in New Zealand, the Tongariro Alpine Crossing (TAC), runs within 2 km of the Te Maari vents. The larger of the two eruptions (6 August 2012) severely impacted the TAC and resulted in its closure, impacting the local economy and potentially influencing national tourism. In this paper, we document the science and risk management response to the eruption, and detail how quantitative risk assessments were applied in a rapidly evolving situation to inform robust decision-making for when the TAC would be re-opened. The volcanologist and risk manager partnership highlights the value of open communication between scientists and stakeholders during a response to, and subsequent recovery from, a volcanic eruption.

  13. Risk-based decisionmaking (Panel)

    SciTech Connect

    Smith, T.H.

    1995-12-31

    By means of a panel discussion and extensive audience interaction, explore the current challenges and progress to date in applying risk considerations to decisionmaking related to low-level waste. This topic is especially timely because of the proposed legislation pertaining to risk-based decisionmaking and because of the increased emphasis placed on radiological performance assessments of low-level waste disposal.

  14. A fractional-factorial probabilistic-possibilistic optimization framework for planning water resources management systems with multi-level parametric interactions.

    PubMed

    Wang, S; Huang, G H; Zhou, Y

    2016-05-01

    In this study, a multi-level factorial-vertex fuzzy-stochastic programming (MFFP) approach is developed for optimization of water resources systems under probabilistic and possibilistic uncertainties. MFFP is capable of tackling fuzzy parameters at various combinations of α-cut levels, reflecting distinct attitudes of decision makers towards fuzzy parameters in the fuzzy discretization process based on the α-cut concept. The potential interactions among fuzzy parameters can be explored through a multi-level factorial analysis. A water resources management problem with fuzzy and random features is used to demonstrate the applicability of the proposed methodology. The results indicate that useful solutions can be obtained for the optimal allocation of water resources under fuzziness and randomness. They can help decision makers to identify desired water allocation schemes with maximized total net benefits. A variety of decision alternatives can also be generated under different scenarios of water management policies. The findings from the factorial experiment reveal the interactions among design factors (fuzzy parameters) and their curvature effects on the total net benefit, which are helpful in uncovering the valuable information hidden beneath the parameter interactions affecting system performance. A comparison between MFFP and the vertex method is also conducted to demonstrate the merits of the proposed methodology.

  15. A probabilistic approach for a cost-benefit analysis of oil spill management under uncertainty: A Bayesian network model for the Gulf of Finland.

    PubMed

    Helle, Inari; Ahtiainen, Heini; Luoma, Emilia; Hänninen, Maria; Kuikka, Sakari

    2015-08-01

    Large-scale oil accidents can inflict substantial costs to the society, as they typically result in expensive oil combating and waste treatment operations and have negative impacts on recreational and environmental values. Cost-benefit analysis (CBA) offers a way to assess the economic efficiency of management measures capable of mitigating the adverse effects. However, the irregular occurrence of spills combined with uncertainties related to the possible effects makes the analysis a challenging task. We develop a probabilistic modeling approach for a CBA of oil spill management and apply it in the Gulf of Finland, the Baltic Sea. The model has a causal structure, and it covers a large number of factors relevant to the realistic description of oil spills, as well as the costs of oil combating operations at open sea, shoreline clean-up, and waste treatment activities. Further, to describe the effects on environmental benefits, we use data from a contingent valuation survey. The results encourage seeking for cost-effective preventive measures, and emphasize the importance of the inclusion of the costs related to waste treatment and environmental values in the analysis. Although the model is developed for a specific area, the methodology is applicable also to other areas facing the risk of oil spills as well as to other fields that need to cope with the challenging combination of low probabilities, high losses and major uncertainties.

  16. Learning Probabilistic Logic Models from Probabilistic Examples

    PubMed Central

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2009-01-01

    Abstract We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples. PMID:19888348

  17. Towards risk-based management of critical infrastructures : enabling insights and analysis methodologies from a focused study of the bulk power grid.

    SciTech Connect

    Richardson, Bryan T.; LaViolette, Randall A.; Cook, Benjamin Koger

    2008-02-01

    This report summarizes research on a holistic analysis framework to assess and manage risks in complex infrastructures, with a specific focus on the bulk electric power grid (grid). A comprehensive model of the grid is described that can approximate the coupled dynamics of its physical, control, and market components. New realism is achieved in a power simulator extended to include relevant control features such as relays. The simulator was applied to understand failure mechanisms in the grid. Results suggest that the implementation of simple controls might significantly alter the distribution of cascade failures in power systems. The absence of cascade failures in our results raises questions about the underlying failure mechanisms responsible for widespread outages, and specifically whether these outages are due to a system effect or large-scale component degradation. Finally, a new agent-based market model for bilateral trades in the short-term bulk power market is presented and compared against industry observations.

  18. A Multi-Disciplinary Management of Flooding Risk Based on the Use of Rainfall Data, Historical Impacts Databases and Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Renard, F.; Alonso, L.; Soto, D.

    2014-12-01

    Greater Lyon (1.3 million inhabitants, 650 km²), France, is subjected to recurring floods, with numerous consequences. From the perspective of prevention and management of this risk, the local authorities, in partnership with multidisciplinary researchers, have developed since 1988 a database built by the field teams, which specifically identifies all floods (places, dates, impacts, damage, etc.). First, this historical database is compared to two other databases, those of the emergency services and the local newspaper, by georeferencing these events using a GIS. It turns out that the historical database is more complete and precise, but the contribution of the other two databases is not negligible and is a useful complement to the knowledge of impacts. Thanks to the dense rain measurement network (30 rain gauges), the flood information is then compared to the distribution of rainfall for each episode (interpolation by ordinary kriging). The results are satisfactory and validate the accuracy of the information contained in the database, as well as the accuracy of the rainfall measurements. Thereafter, the number of floods in the study area is compared with rainfall characteristics (intensity, duration, and height of precipitated water). No clear relationship appears between the number of floods and rainfall characteristics, because of the diversity of land uses, their permeability, and the types of local sewer network and urban water management. Finally, floods observed in the database are compared spatially, using a GIS, with flooding simulated by the sewer network model (using the Canoe software). A strong spatial similarity between floods observed in the field and simulated floods is found in the majority of cases, despite the limitations of each tool. These encouraging results confirm the accuracy of the database and the reliability of the simulation software, and offer many operational perspectives to better understand floods and learn to cope with flood risk.

  19. Software for Probabilistic Risk Reduction

    NASA Technical Reports Server (NTRS)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  20. Assessment of environmental risks from toxic and nontoxic stressors; a proposed concept for a risk-based management tool for offshore drilling discharges.

    PubMed

    Smit, Mathijs G D; Jak, Robbert G; Rye, Henrik; Frost, Tone Karin; Singsaas, Ivar; Karman, Chris C

    2008-04-01

    In order to improve the ecological status of aquatic systems, both toxic (e.g., chemical) and nontoxic stressors (e.g., suspended particles) should be evaluated. This paper describes an approach to environmental risk assessment of drilling discharges to the sea. These discharges might lead to concentrations of toxic compounds and suspended clay particles in the water compartment and concentrations of toxic compounds, burial of biota, change in sediment structure, and oxygen depletion in marine sediments. The main challenges were to apply existing protocols for environmental risk assessment to nontoxic stressors and to combine risks arising from exposure to these stressors with risk from chemical exposure. The defined approach is based on species sensitivity distributions (SSDs). In addition, precautionary principles from the EU-Technical Guidance Document were incorporated to assure that the method is acceptable in a regulatory context. For all stressors a protocol was defined to construct an SSD for no observed effect concentrations (or levels; NOEC(L)-SSD) to allow for the calculation of the potentially affected fraction of species from predicted exposures. Depending on the availability of data, a NOEC-SSD for toxicants can either be directly based on available NOECs or constructed from the predicted no effect concentration and the variation in sensitivity among species. For nontoxic stressors a NOEL-SSD can be extrapolated from an SSD based on effect or field data. Potentially affected fractions of species at predicted exposures are combined into an overall risk estimate. The developed approach facilitates environmental management of drilling discharges and can be applied to define risk-mitigating measures for both toxic and nontoxic stress.
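
    A species sensitivity distribution of the kind described above is commonly fitted as a log-normal curve over species no-observed-effect concentrations; the potentially affected fraction (PAF) of species at a predicted exposure is then the cumulative distribution evaluated at that exposure. A minimal sketch with invented toxicity values (scipy assumed available):

        import numpy as np
        from scipy import stats

        # Hypothetical chronic NOECs for different species (mg/L)
        noecs = np.array([0.8, 1.5, 2.2, 3.0, 5.5, 9.0, 14.0, 20.0])

        # Fit a log-normal SSD: log10(NOEC) ~ Normal(mu, sigma)
        mu = np.log10(noecs).mean()
        sigma = np.log10(noecs).std(ddof=1)

        def paf(exposure_mg_per_l: float) -> float:
            """Potentially affected fraction of species at a given exposure."""
            return float(stats.norm.cdf(np.log10(exposure_mg_per_l), loc=mu, scale=sigma))

        print(f"PAF at 1.0 mg/L: {paf(1.0):.2f}")
        print(f"PAF at 5.0 mg/L: {paf(5.0):.2f}")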

  1. The impact of FNAC in the management of salivary gland lesions: Institutional experiences leading to a risk-based classification scheme.

    PubMed

    Rossi, Esther Diana; Wong, Lawrence Q; Bizzarro, Tommaso; Petrone, Gianluigi; Mule, Antonio; Fadda, Guido; Baloch, Zubair M

    2016-06-01

    Fine-needle aspiration cytology (FNAC) has proven its value as an essential step in the diagnosis of salivary gland lesions. Although the majority of salivary gland lesions, especially those that are common and benign, can be diagnosed with ease on FNAC, limited cellularity and morphologic lesion heterogeneity can pose diagnostic challenges and lead to false-positive and false-negative diagnoses. This study presents the institutional experience of FNAC of salivary gland lesions from 2 academic centers. A retrospective analysis was conducted on 1729 salivary gland FNAC specimens that were diagnosed over an 8-year period from January 2008 to March 2015. All samples were processed either with liquid-based cytology alone or in combination with air-dried, Diff-Quik-stained or alcohol-fixed, Papanicolaou-stained smears. Surgical excision was performed in 709 of 1749 FNACs (41%) that were diagnosed as nondiagnostic/inadequate (n = 29), benign (n = 111), neoplasm (n = 453), atypical (n = 15), suspicious for malignancy (n = 28), and malignant (n = 73). The overall concordance between cytologic and histologic diagnoses was 92.2%, with 91.8% concordance in the benign category and 89.5% concordance in cases diagnosed as suspicious for malignancy and malignant. The most frequent benign and malignant lesions were pleomorphic adenoma and squamous cell carcinoma, respectively. There were 46 false-negative and 13 false-positive results, leading to an overall specificity of 97.6% and diagnostic accuracy of 91.3%. FNAC is a reliable diagnostic modality for the diagnosis and management of salivary gland lesions based on its high specificity and diagnostic accuracy. Cancer Cytopathol 2016;124:388-96. © 2016 American Cancer Society.

  2. A Framework for an Adaptive Seasonal Water Resources Management Tool: Developing Regional Water Shortage Triggers Incorporating Probabilistic Supply and Demand Outlooks

    NASA Astrophysics Data System (ADS)

    Wang, H.; Asefa, T.; Bracciano, D.; Adams, A.

    2016-12-01

    Defining appropriate water shortage triggers is important as water managers use them to initiate specific mitigation strategies, e.g., supply increases or demand reduction, to guide through water shortage situations. Such triggers used by most water utilities often recognize past hydrologic conditions and current system state, e.g., reservoir storage. Seasonal supply and demand outlooks are, however, rarely used in defining water shortage triggers, primarily because the skill of seasonal forecasts of both streamflow and expected demand is limited. With recent advancements in seasonal flow forecasting from hydrologic models and/or advanced statistical models, such limitations should no longer be a barrier to their application in water resources management. This work demonstrates an application of a three-month-ahead, forward-looking water shortage trigger development scheme for Tampa Bay Water, a utility that has three different supply sources to meet demand. A stochastic flow generation model that maintains spatial and temporal correlation at multiple streamflow gauges was used to generate three-month-ahead streamflow forecasts. A system operation model that optimizes source allocation from surface water harvesting, groundwater production, and a desalination plant was used to minimize operation costs, while meeting a probabilistic regional demand that is imposed on the system. Forecasted median reservoir elevation at the end of the three-month horizon was used to define different water shortage phases. Different adaptation management actions were implemented as the system moves to a severe water shortage phase. A retrospective analysis using observed flow and demand in the decade 2005-2015 was used to demonstrate the approach as well as the benefits of implementing shortage mitigation actions. Other potential uses of such water shortage trigger development are also discussed.

  3. Risk based management of invading plant disease

    USDA-ARS?s Scientific Manuscript database

    Effective control of new and emerging plant disease remains a key challenge. Attempts to eradicate pathogens often involve removal of all plants within a fixed distance of detected infected hosts, targeting asymptomatic infection. Here we develop and test potentially more efficient, epidemiologicall...

  4. Risk Based Security Management at Research Reactors

    SciTech Connect

    Ek, David R.

    2015-09-01

    This presentation provides a background of what led to the international emphasis on nuclear security and describes how nuclear security is effectively implemented so as to preserve the societal benefits of nuclear and radioactive materials.

  5. Enhancing the effectiveness of IST through risk-based techniques

    SciTech Connect

    Floyd, S.D.

    1996-12-01

    Current IST requirements were developed mainly through deterministic-based methods. While this approach has resulted in an adequate level of safety and reliability for pumps and valves, insights from probabilistic safety assessments suggest a better safety focus can be achieved at lower costs. That is, some high safety impact pumps and valves are currently not tested under the IST program and should be added, while low safety impact valves could be tested at significantly greater intervals than allowed by the current IST program. The nuclear utility industry, through the Nuclear Energy Institute (NEI), has developed a draft guideline for applying risk-based techniques to focus testing on those pumps and valves with a high safety impact while reducing test frequencies on low safety impact pumps and valves. The guideline is being validated through an industry pilot application program that is being reviewed by the U.S. Nuclear Regulatory Commission. NEI and the ASME maintain a dialogue on the two groups' activities related to risk-based IST. The presenter will provide an overview of the NEI guideline, discuss the methodological approach for applying risk-based technology to IST and provide the status of the industry pilot plant effort.

  6. Probabilistic Structural Analysis Program

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  7. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower

  8. The Evidence for a Risk-Based Approach to Australian Higher Education Regulation and Quality Assurance

    ERIC Educational Resources Information Center

    Edwards, Fleur

    2012-01-01

    This paper explores the nascent field of risk management in higher education, which is of particular relevance in Australia currently, as the Commonwealth Government implements its plans for a risk-based approach to higher education regulation and quality assurance. The literature outlines the concept of risk management and risk-based approaches…

  10. Risk Based Inspection Pilot Study of Ignalina Nuclear Power Plant,Unit 2

    SciTech Connect

    Brickstad, Bjorn; Letzter, Adam; Klimasauskas, Arturas; Alzbutas, Robertas; Nedzinskas, Linas; Kopustinskas, Vytis

    2002-07-01

    A project with the acronym IRBIS (Ignalina Risk Based Inspection pilot Study) has been performed with the objective to perform a quantitative risk analysis of a total of 1240 stainless steel welds in Ignalina Nuclear Power Plant, unit 2 (INPP-2). The damage mechanism is IGSCC and the failure probabilities are quantified by using probabilistic fracture mechanics. The conditional core damage probabilities are taken from the plant PSA. (authors)

  11. Risk-based zoning for urbanizing floodplains.

    PubMed

    Porse, Erik

    2014-01-01

    Urban floodplain development brings economic benefits and enhanced flood risks. Rapidly growing cities must often balance the economic benefits and increased risks of floodplain settlement. Planning can provide multiple flood mitigation and environmental benefits by combining traditional structural measures such as levees, increasingly popular landscape and design features (green infrastructure), and non-structural measures such as zoning. Flexibility in both structural and non-structural options, including zoning procedures, can reduce flood risks. This paper presents a linear programming formulation to assess cost-effective urban floodplain development decisions that consider benefits and costs of development along with expected flood damages. It uses a probabilistic approach to identify combinations of land-use allocations (residential and commercial development, flood channels, distributed runoff management) and zoning regulations (development zones in channel) to maximize benefits. The model is applied to a floodplain planning analysis for an urbanizing region in the Baja Sur peninsula of Mexico. The analysis demonstrates how (1) economic benefits drive floodplain development, (2) flexible zoning can improve economic returns, and (3) cities can use landscapes, enhanced by technology and design, to manage floods. The framework can incorporate additional green infrastructure benefits, and bridges typical disciplinary gaps for planning and engineering.
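
    The paper's formulation is a linear program that trades development benefits against expected flood damages. A minimal sketch of that kind of formulation, using scipy.optimize.linprog, is shown below; the decision variables, coefficients, and constraints are illustrative assumptions, not the model from the paper.

      from scipy.optimize import linprog

      # Decision variables: hectares of [residential, commercial, flood channel, green runoff mgmt]
      # Assumed net benefit per hectare (development benefit minus expected annual flood damage);
      # the flood channel has a net cost but relaxes the damage-exposure constraint below.
      net_benefit = [120.0, 200.0, -5.0, 10.0]
      c = [-b for b in net_benefit]          # linprog minimizes, so negate to maximize

      # Constraints: total floodplain area, and runoff/damage exposure that channels and
      # green infrastructure must offset (all coefficients invented for illustration)
      A_ub = [
          [1.0, 1.0, 1.0, 1.0],
          [0.4, 0.6, -3.0, -1.0],
      ]
      b_ub = [500.0, 0.0]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 4, method="highs")
      print("Allocation (ha):", res.x, " maximum net benefit:", -res.fun)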

  12. Risk based ASME Code requirements

    SciTech Connect

    Gore, B.F.; Vo, T.V.; Balkey, K.R.

    1992-09-01

    The objective of this ASME Research Task Force is to develop and to apply a methodology for incorporating quantitative risk analysis techniques into the definition of in-service inspection (ISI) programs for a wide range of industrial applications. An additional objective, directed towards the field of nuclear power generation, is ultimately to develop a recommendation for comprehensive revisions to the ISI requirements of Section XI of the ASME Boiler and Pressure Vessel Code. This will require development of a firm technical basis for such requirements, which does not presently exist. Several years of additional research will be required before this can be accomplished. A general methodology suitable for application to any industry has been defined and published. It has recently been refined and further developed during application to the field of nuclear power generation. In the nuclear application, probabilistic risk assessment (PRA) techniques and information have been incorporated. With additional analysis, PRA information is used to determine the consequence of a component rupture (increased reactor core damage probability). A procedure has also been recommended for using the resulting quantified risk estimates to determine target component rupture probability values to be maintained by inspection activities. Structural risk and reliability analysis (SRRA) calculations are then used to determine characteristics which an inspection strategy must possess in order to maintain component rupture probabilities below target values. The methodology, results of example applications, and plans for future work are discussed.
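
    The recommended procedure backs a target component rupture probability out of an allowable risk contribution and the consequence of rupture. A toy version of that allocation is shown below; the numbers are placeholders, not values from the ASME work.

      # Illustrative numbers only
      allowable_cdf_increase = 1.0e-6   # allowed core damage frequency contribution, per year
      ccdp_given_rupture = 5.0e-4       # conditional core damage probability for this component

      # Rupture frequency the inspection program must keep the component below
      target_rupture_freq = allowable_cdf_increase / ccdp_given_rupture
      print(f"Target rupture frequency <= {target_rupture_freq:.1e} per year")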

  13. PROBABILISTIC INFORMATION INTEGRATION TECHNOLOGY

    SciTech Connect

    J. BOOKER; M. MEYER; ET AL

    2001-02-01

    The Statistical Sciences Group at Los Alamos has successfully developed a structured, probabilistic, quantitative approach for the evaluation of system performance based on multiple information sources, called Information Integration Technology (IIT). The technology integrates diverse types and sources of data and information (both quantitative and qualitative), and their associated uncertainties, to develop distributions for performance metrics, such as reliability. Applications include predicting complex system performance, where test data are lacking or expensive to obtain, through the integration of expert judgment, historical data, computer/simulation model predictions, and any relevant test/experimental data. The technology is particularly well suited for tracking estimated system performance for systems under change (e.g., development, aging), and can be used at any time during product development, including concept and early design phases, prior to prototyping, testing, or production, and before costly design decisions are made. Techniques from various disciplines (e.g., state-of-the-art expert elicitation, statistical and reliability analysis, design engineering, physics modeling, and knowledge management) are merged and modified to develop formal methods for the data/information integration. This technology, known as PREDICT (Performance and Reliability Evaluation with Diverse Information Combination and Tracking), won a 1999 R&D 100 Award (Meyer, Booker, Bement, Kerscher, 1999). Specifically, the PREDICT application is a formal, multidisciplinary process for estimating the performance of a product when test data are sparse or nonexistent. The acronym indicates the purpose of the methodology: to evaluate the performance or reliability of a product/system by combining all available (often diverse) sources of information and then tracking that performance as the product undergoes changes.

  14. Probabilistic numerics and uncertainty in computations

    PubMed Central

    Hennig, Philipp; Osborne, Michael A.; Girolami, Mark

    2015-01-01

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations. PMID:26346321

  15. Probabilistic numerics and uncertainty in computations.

    PubMed

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  16. Probabilistic boundary element method

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Raveendra, S. T.

    1989-01-01

    The purpose of the Probabilistic Structural Analysis Method (PSAM) project is to develop structural analysis capabilities for the design analysis of advanced space propulsion system hardware. The boundary element method (BEM) is used as the basis of the Probabilistic Advanced Analysis Methods (PADAM), which is discussed. The probabilistic BEM code (PBEM) is used to obtain the structural response and sensitivity results for a set of random variables. As such, PBEM performs analogously to other structural analysis codes, such as finite elements, in the PSAM system. For linear problems, unlike the finite element method (FEM), the BEM governing equations are written at the boundary of the body only; thus, the method eliminates the need to model the volume of the body. However, for general body force problems, a direct condensation of the governing equations to the boundary of the body is not possible, and therefore volume modeling is generally required.

  17. Formalizing Probabilistic Safety Claims

    NASA Technical Reports Server (NTRS)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  18. Probabilistic Composite Design

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Probabilistic composite design is described in terms of a computational simulation. This simulation tracks probabilistically the composite design evolution from constituent materials and fabrication process, through composite mechanics and structural components. Comparisons with experimental data are provided to illustrate selection of probabilistic design allowables, test methods/specimen guidelines, and identification of in situ versus pristine strength. For example, results show that: in situ fiber tensile strength is 90% of its pristine strength; flat-wise long-tapered specimens are most suitable for setting ply tensile strength allowables; a composite radome can be designed with a reliability of 0.999999; and laminate fatigue exhibits wide-spread scatter at 90% cyclic-stress to static-strength ratios.

  19. Probabilistic Causation without Probability.

    ERIC Educational Resources Information Center

    Holland, Paul W.

    The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…

  20. Probabilistic composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.
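
    As a rough illustration of this kind of Monte Carlo micromechanics, the sketch below propagates assumed scatter in constituent moduli and fiber volume ratio through a rule-of-mixtures estimate of the longitudinal ply modulus; the distributions and values are invented and are not the inputs used in the cited study.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 50_000

      # Assumed distributions for a graphite/epoxy ply (illustrative values, GPa)
      E_fiber  = rng.normal(230.0, 12.0, n)   # fiber longitudinal modulus
      E_matrix = rng.normal(3.5, 0.25, n)     # matrix modulus
      vf       = rng.normal(0.60, 0.03, n)    # fiber volume ratio

      # Rule-of-mixtures longitudinal ply modulus, one value per Monte Carlo trial
      E11 = vf * E_fiber + (1.0 - vf) * E_matrix

      print(f"E11 mean = {E11.mean():.1f} GPa, std = {E11.std():.2f} GPa")
      print("5th/95th percentiles:", np.percentile(E11, [5, 95]))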

  1. Probabilistic simple sticker systems

    NASA Astrophysics Data System (ADS)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality," Acta Informatica, vol. 35, pp. 401-420, 1998. A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  3. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  4. A Probabilistic Model for Propagating Ungauged Basin Runoff Prediction Variability and Uncertainty Into Estuarine Water Quality Dynamics and Water Quality-Based Management Decisions

    NASA Astrophysics Data System (ADS)

    Anderson, R.; Gronewold, A.; Alameddine, I.; Reckhow, K.

    2008-12-01

    … probabilistic modeling software program Analytica. This approach not only reflects uncertainty in parameter estimates but, by modeling the predicted daily runoff rate as a random variable, propagates that variability into the tidal prism model as well. The tidal prism model has the advantage of having only one hydrodynamic calibration parameter, the tidal exchange ratio (the ratio between the volume of water returning to an estuary on an incoming tide and the volume of water which exited the estuary on the previous outgoing tide). We estimate the tidal exchange ratio by calibrating the tidal prism model to salinity data using a Bayesian Markov chain Monte Carlo (MCMC) procedure and, as with other parameters, encode it as a random variable in the comprehensive model. We compare our results to those of a purely deterministic model, and find that intrinsic sources of variability in ungauged basin runoff predictions, when ignored, lead to pollutant concentration forecasts with unnecessarily large prediction intervals, and to potentially over-conservative management decisions. By demonstrating an innovative approach to capturing and explicitly acknowledging uncertainty in runoff model parameter estimates, our modeling approach serves as an ideal building block for future comprehensive model-based pollutant mitigation planning efforts in ungauged coastal watersheds, including those implemented through the US Environmental Protection Agency total maximum daily load program.

  5. Risk based inspection for atmospheric storage tank

    NASA Astrophysics Data System (ADS)

    Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is an attack that occurs on a metallic material as a result of its reaction with the environment. It causes atmospheric storage tank leakage, material loss, environmental pollution, and equipment failure, affects the age of process equipment, and finally results in financial damage. Corrosion risk measurement becomes a vital part of asset management at the plant for operating any aging asset. This paper provides six case studies dealing with high speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries was discussed prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risk associated with corrosion on the equipment, in terms of its likelihood and its consequences, was discussed. The corrosion risk analysis outcome was used to formulate a Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on equipment in the 'High Risk' and 'Medium Risk' categories and less on the 'Low Risk' shell. Risk categories of the evaluated equipment were illustrated through the case study analysis outcome.
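
    A semi-quantitative RBI assessment of this type ranks equipment by combining likelihood and consequence ratings into risk categories. The snippet below shows one possible scoring scheme; the rating scales, thresholds, and example tank parts are invented for illustration and do not reproduce the API 581 procedure.

      # Toy semi-quantitative risk ranking (scales and thresholds are assumptions)
      CONSEQUENCE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

      def risk_category(likelihood: int, consequence: str) -> str:
          score = likelihood * CONSEQUENCE[consequence]   # likelihood rated 1 (remote) to 5 (very high)
          if score >= 15:
              return "High"
          if score >= 8:
              return "Medium"
          return "Low"

      # Hypothetical tank parts with assumed likelihood/consequence ratings
      parts = {"shell course 1": (4, "D"), "bottom plate": (5, "C"), "roof": (2, "B"),
               "nozzle N1": (3, "C"), "shell course 2": (2, "C"), "annular ring": (4, "E")}
      for name, (lk, cons) in parts.items():
          print(f"{name:15s} -> {risk_category(lk, cons)}")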

  6. Evaluating physicians' probabilistic judgments.

    PubMed

    Poses, R M; Cebul, R D; Centor, R M

    1988-01-01

    Physicians increasingly are challenged to make probabilistic judgments quantitatively. Their ability to make such judgments may be directly linked to the quality of care they provide. Many methods are available to evaluate these judgments. Graphic means of assessment include the calibration curve, covariance graph, and receiver operating characteristic (ROC) curve. Statistical tools can measure the significance of departures from ideal calibration and measure the area under the ROC curve. Modeling the calibration curve using linear or logistic regression provides another method to assess probabilistic judgments, although these may be limited by failure of the data to meet the model's assumptions. Scoring rules provide indices of overall judgmental performance, although their reliability is difficult to gauge for small sample sizes. Decompositions of scoring rules separate judgmental performance into functional components. The authors provide preliminary guidelines for choosing methods for specific research in this area.
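
    Two of the quantitative tools mentioned, scoring rules and calibration curves, are straightforward to compute. The sketch below implements the Brier score and a simple calibration table for a small set of hypothetical probability judgments; the data are made up.

      import numpy as np

      def brier_score(p, y):
          """Mean squared error between forecast probabilities p and 0/1 outcomes y."""
          p, y = np.asarray(p, float), np.asarray(y, float)
          return np.mean((p - y) ** 2)

      def calibration_table(p, y, bins=4):
          """Mean forecast vs. observed frequency in equal-width probability bins."""
          p, y = np.asarray(p, float), np.asarray(y, float)
          edges = np.linspace(0.0, 1.0, bins + 1)
          rows = []
          for lo, hi in zip(edges[:-1], edges[1:]):
              m = (p >= lo) & ((p < hi) if hi < 1.0 else (p <= hi))
              if m.any():
                  rows.append((lo, hi, p[m].mean(), y[m].mean(), int(m.sum())))
          return rows

      # Hypothetical physician probability estimates and actual 0/1 outcomes
      p = [0.1, 0.3, 0.8, 0.6, 0.9, 0.2, 0.7, 0.4]
      y = [0, 0, 1, 1, 1, 0, 1, 0]
      print("Brier score:", brier_score(p, y))
      for lo, hi, mean_p, obs, n in calibration_table(p, y):
          print(f"[{lo:.2f}, {hi:.2f}): forecast {mean_p:.2f}, observed {obs:.2f} (n={n})")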

  7. Probabilistic authenticated quantum dialogue

    NASA Astrophysics Data System (ADS)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one-step quantum communication and one-step classical communication. However, in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue and each dialogue requires multiple quantum as well as classical communicational steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  8. Probabilistic Fatigue: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2002-01-01

    Fatigue is a primary consideration in the design of aerospace structures for long term durability and reliability. There are several types of fatigue that must be considered in the design, including low cycle, high cycle, and combined fatigue for different cyclic loading conditions (for example, mechanical, thermal, erosion, etc.). The traditional approach to evaluating fatigue has been to conduct many tests in the various service-environment conditions that the component will be subjected to in a specific design. This approach is reasonable and robust for that specific design; however, it is time consuming, costly, and in general needs to be repeated for designs in different operating conditions. Recent research has demonstrated that fatigue of structural components/structures can be evaluated by computational simulation based on a novel paradigm. The main features of this novel paradigm are progressive telescoping scale mechanics, progressive scale substructuring, and progressive structural fracture, encompassed with probabilistic simulation. The generic features of this approach are to probabilistically telescope local material point damage all the way up to the structural component and to probabilistically decompose structural loads and boundary conditions all the way down to the material point. Additional features include a multifactor interaction model that probabilistically describes the evolution of material properties and any changes due to various cyclic loads and other mutually interacting effects. The objective of the proposed paper is to describe this novel paradigm of computational simulation and present typical fatigue results for structural components. Additionally, the advantages, versatility, and inclusiveness of computational simulation versus testing are discussed. Guidelines for complementing simulated results with strategic testing are outlined. Typical results are shown for computational simulation of fatigue in metallic composite structures to demonstrate the…

  9. Geothermal probabilistic cost study

    NASA Technical Reports Server (NTRS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-01-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  10. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment and a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.

  11. Geothermal probabilistic cost study

    NASA Astrophysics Data System (ADS)

    Orren, L. H.; Ziman, G. M.; Jones, S. C.; Lee, T. K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model was used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk that can shift the risk among different agents were analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance were examined.

  12. Probabilistic liver atlas construction.

    PubMed

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations involving only the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  13. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the … based analysis of scenario 2 would likely determine that the hazard of death or injury to any single person is low due to the separation distance …

  14. Risk-based inservice testing program modifications at Palo Verde nuclear generating station

    SciTech Connect

    Knauf, S.; Lindenlaub, B.; Linthicum, R.

    1996-12-01

    Arizona Public Service Company (APS) is investigating changes to the Palo Verde Inservice Testing (IST) Program that are intended to result in the reduction of the required test frequency for various valves in the American Society of Mechanical Engineers (ASME) Section XI IST program. The analytical techniques employed to select candidate valves and to demonstrate that these frequency reductions are acceptable are risk based. The results of the Palo Verde probabilistic risk assessment (PRA), updated in June 1994, and the risk significance determination performed as part of the implementation efforts for 10 CFR 50.65 (the maintenance rule) were used to select candidate valves for extended test intervals. Additional component level evaluations were conducted by an 'expert panel.' The decision to pursue these changes was facilitated by the ASME Risk-Based Inservice Testing Research Task Force, in which Palo Verde is participating as a pilot plant. The NRC's increasing acceptance of cost beneficial licensing actions and risk-based submittals also provided incentive to seek these changes. Arizona Public Service is pursuing the risk-based IST program modification in order to reduce the unnecessary regulatory burden of the IST program through qualitative and quantitative analysis consistent with maintaining a high level of plant safety. The objectives of this project at Palo Verde are as follows: (1) Apply risk-based technologies to IST components to determine their risk significance (i.e., high or low). (2) Apply a combination of deterministic and risk-based methods to determine appropriate testing requirements for IST components, including improvement of testing methods and frequency intervals for high-risk-significant components. (3) Apply risk-based technologies to high-risk-significant components identified by the 'expert panel' and outside of the IST program to determine whether additional testing requirements are appropriate.

  15. Risk based microbiological criteria for Campylobacter in broiler meat in the European Union.

    PubMed

    Nauta, Maarten J; Sanaa, Moez; Havelaar, Arie H

    2012-09-03

    Quantitative microbiological risk assessment (QMRA) allows evaluating the public health impact of food safety targets to support the control of foodborne pathogens. We estimate the risk reduction of setting microbiological criteria (MCs) for Campylobacter on broiler meat in 25 European countries, applying quantitative data from the 2008 EU baseline survey. We demonstrate that risk based MCs can be derived without explicit consideration of Food Safety Objectives or Performance Objectives. Published QMRA models for the consumer phase and dose response provide a relation between Campylobacter concentration on skin samples and the attending probability of illness for the consumer. Probabilistic modelling is used to evaluate a set of potential MCs. We present the percentage of batches not complying with the potential criteria, in relation to the risk reduction attending totally efficient treatment of these batches. We find different risk estimates and different impacts of MCs in different countries, which offers a practical and flexible tool for risk managers to select the most appropriate MC by weighing the costs (i.e. non-compliant batches) and the benefits (i.e. reduction in public health risk). Our analyses show that the estimated percentage of batches not complying with the MC is better correlated with the risk estimate than surrogate risk measures like the flock prevalence or the arithmetic mean concentration of bacteria on carcasses, and would therefore be a good measure for the risk of Campylobacter on broiler meat in a particular country. Two uncertain parameters in the model are the ratio of within- and between-flock variances in concentrations, and the transition factor of skin sample concentrations to concentrations on the meat. Sensitivity analyses show that these parameters have a considerable effect on our results, but the impact of their uncertainty is small compared to that of the parameters defining the Microbiological Criterion and the concentration
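
    The core computation is probabilistic: given between-batch and within-batch variability of concentrations, estimate the share of batches that would fail a candidate microbiological criterion. A simplified Monte Carlo sketch is shown below; the variability parameters and the criterion itself are assumed values, not those from the EU baseline survey or the cited QMRA models.

      import numpy as np

      rng = np.random.default_rng(42)

      # Assumed variability of Campylobacter skin-sample concentrations (log10 CFU/g)
      between_batch_mean, between_batch_sd = 1.5, 0.8   # batch-mean log10 concentration
      within_batch_sd = 0.5                             # sample-to-sample scatter within a batch

      # Candidate criterion: n samples per batch, none may exceed m (log10 CFU/g)
      n_samples, m_limit = 5, 3.0

      n_batches = 20_000
      batch_means = rng.normal(between_batch_mean, between_batch_sd, n_batches)
      samples = rng.normal(batch_means[:, None], within_batch_sd, (n_batches, n_samples))
      non_compliant = np.mean((samples > m_limit).any(axis=1))

      print(f"Estimated share of batches failing this criterion: {non_compliant:.1%}")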

  16. Incorporating psychological influences in probabilistic cost analysis

    SciTech Connect

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world, "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns, while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy, including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual projects often exceed their initial cost estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability-of-success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for allocating baseline budgets and contingencies. Given the…
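
    A compact way to see the MAIMS effect is to compare a conventional cost roll-up with one in which each element's spend is at least its allocated budget. The sketch below does this with three-parameter Weibull cost elements made dependent through a Gaussian copula; every parameter value is invented, and the copula is simply one convenient way to induce correlation, not necessarily the paper's procedure.

      import numpy as np
      from scipy.stats import norm, weibull_min

      rng = np.random.default_rng(7)
      n = 20_000

      # Three-parameter Weibull cost elements (shape, location, scale) -- assumed values
      elements = [(2.0, 8.0, 4.0), (1.8, 5.0, 3.0), (2.5, 12.0, 6.0)]
      budgets = np.array([12.0, 8.0, 18.0])          # allocated baseline budgets per element

      # Correlated sampling via a Gaussian copula with a common correlation of 0.5
      k = len(elements)
      corr = np.full((k, k), 0.5)
      np.fill_diagonal(corr, 1.0)
      u = norm.cdf(rng.multivariate_normal(np.zeros(k), corr, size=n))
      costs = np.column_stack([weibull_min.ppf(u[:, i], c, loc=loc, scale=s)
                               for i, (c, loc, s) in enumerate(elements)])

      ideal_total = costs.sum(axis=1)                          # underruns offset overruns
      maims_total = np.maximum(costs, budgets).sum(axis=1)     # money allocated is money spent

      total_budget = budgets.sum()
      print(f"P(project within budget), ideal roll-up: {np.mean(ideal_total <= total_budget):.2f}")
      print(f"P(project within budget), MAIMS roll-up: {np.mean(maims_total <= total_budget):.2f}")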

  17. Risk-based enteric pathogen reduction targets for non-potable and direct potable use of roof runoff, stormwater, and greywater

    EPA Science Inventory

    This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was used…

  18. Risk-based decisionmaking in the DOE: Challenges and status

    SciTech Connect

    Henry, C.J.; Alchowiak, J.; Moses, M.

    1995-12-31

    The primary mission of the Environmental Management Program is to protect human health and the environment, the first goal of which must be to address urgent risks and threats. Another is to provide for a safe workplace. Without credible risk assessments and good risk management practices, the central environmental goals cannot be met. Principles for risk analysis, which include principles for risk assessment, management, communication, and priority setting, were adopted. As recommended, Environmental Management is using risk-based decision making in its budget process and in the implementation of its program. The challenges presented in using a risk-based decision-making process are to integrate risk assessment methods and cultural and social values so as to produce meaningful priorities. The different laws and regulations governing the Department define risk differently in implementing activities to protect human health and the environment; therefore, assumptions and judgments in risk analysis vary. Currently, the Environmental Management Program is developing and improving a framework to incorporate risk into the budget process and to link the budget, compliance requirements, and risk reduction/pollution prevention activities.

  19. PCAT: Probabilistic Cataloger

    NASA Astrophysics Data System (ADS)

    Daylan, Tansu; Portillo, K. N. Stephen; Finkbeiner, Douglas P.

    2017-05-01

    PCAT (Probabilistic Cataloger) samples from the posterior distribution of a metamodel, i.e., union of models with different dimensionality, to compare the models. This is achieved via transdimensional proposals such as births, deaths, splits and merges in addition to the within-model proposals. This method avoids noisy estimates of the Bayesian evidence that may not reliably distinguish models when sampling from the posterior probability distribution of each model. The code has been applied in two different subfields of astronomy: high energy photometry, where transdimensional elements are gamma-ray point sources; and strong lensing, where light-deflecting dark matter subhalos take the role of transdimensional elements.

  20. An Approach to Risk-Based Design Incorporating Damage Tolerance Analyses

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Glaessgen, Edward H.; Sleight, David W.

    2002-01-01

    Incorporating risk-based design as an integral part of spacecraft development is becoming more and more common. Assessment of uncertainties associated with design parameters and environmental aspects such as loading provides increased knowledge of the design and its performance. Results of such studies can contribute to mitigating risk through a system-level assessment. Understanding the risk of an event occurring, the probability of its occurrence, and the consequences of its occurrence can lead to robust, reliable designs. This paper describes an approach to risk-based structural design incorporating damage-tolerance analysis. The application of this approach to a candidate Earth-entry vehicle is described. The emphasis of the paper is on describing an approach for establishing damage-tolerant structural response inputs to a system-level probabilistic risk assessment.

  1. Quantum probabilistic logic programming

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum-mechanics-based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite-valued ones, to express probability distributions and statistical correlations, a powerful feature for capturing relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation of their probabilities. We discuss several examples that combine statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.

  2. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, which connect the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
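
    A small simulation makes the absorbing behavior concrete. The sketch below runs a synchronous probabilistic automaton on a ring, in which each cell becomes 1 with probability equal to the fraction of ones in its three-cell neighborhood, and estimates how often the chain ends up in the all-ones configuration; the particular rule and the sizes are chosen only for illustration, not taken from the article.

      import numpy as np

      rng = np.random.default_rng(3)

      def step(config):
          """Synchronous probabilistic update on a ring: each cell becomes 1 with
          probability equal to the fraction of ones in its 3-cell neighborhood."""
          p_one = (np.roll(config, 1) + config + np.roll(config, -1)) / 3.0
          return (rng.random(config.size) < p_one).astype(int)

      def run(initial, max_steps=20_000):
          config = initial.copy()
          for _ in range(max_steps):
              if config.all() or not config.any():   # absorbed at all ones or all zeros
                  break
              config = step(config)
          return int(config.all())

      # Start every run from the same configuration with 25% ones on a 16-cell ring
      initial = np.zeros(16, dtype=int)
      initial[:4] = 1
      wins = sum(run(initial) for _ in range(500))
      print(f"Fraction of runs absorbed in the all-ones configuration: {wins / 500:.2f}")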

  3. Probabilistic Climate Forecasting

    NASA Astrophysics Data System (ADS)

    Allen, M. R.; Aina, T.; Bannerman, S.; Christensen, C.; Collins, M.; Dzbor, M.; Faull, N.; Folgate, V.; Frame, D.; Gault, R.; Kettleborough, J.; Knight, S.; Martin, A.; McPherson, E.; Simpson, A.; Spicer, B.; Stainforth, D.; Piani, C.

    2003-12-01

    As a European record-breaking summer draws to an end, climate 'stakeholders' are actively planning for the future, presenting the climate research community with a new challenge. Today's coastal and water-supply engineers do not need 'projections' of how the climate might respond to rising levels of greenhouse gases, no matter how detailed and realistic. Rather, they need to know what changes can be ruled out at a given level of confidence. This is probabilistic climate forecasting. The correct procedure for probabilistic climate forecasting begins with a perturbation analysis of the model to identify consistent relationships between observable quantities and forecast variables of interest (this is referred to as 'mapping the response manifold'). The resulting ensemble is weighted to accurately represent both current knowledge and uncertainty in observations and then used to infer future climate change. Mapping the response manifold in a full-scale, non-linear climate model is a formidable challenge, well beyond the capabilities of conventional supercomputing resources. Today the only adequate resource of this scale is the joint idle processing capacity of the home and desktop computers of the general public: this is the climateprediction.net approach.

  4. Probabilistic population aging

    PubMed Central

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675
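
    The difference between the conventional and prospective measures can be shown with a small calculation: the conventional old-age dependency ratio uses a fixed chronological threshold, whereas the prospective version moves the threshold to the age at which remaining life expectancy falls below 15 years. All figures below are invented for illustration; the coarse ten-year age groups are indexed by their midpoints.

      # Population (millions) and remaining life expectancy (years) by age-group midpoint
      population = {20: 10.0, 30: 10.5, 40: 10.0, 50: 9.5, 60: 9.0, 70: 7.0, 80: 4.0, 90: 1.0}
      remaining_life_expectancy = {20: 62, 30: 52, 40: 43, 50: 34, 60: 25, 70: 17, 80: 10, 90: 5}

      # Conventional old-age dependency ratio: population 65+ per person aged 20-64
      old = sum(n for age, n in population.items() if age >= 65)
      working = sum(n for age, n in population.items() if 20 <= age < 65)
      oadr = old / working

      # Prospective version: "old age" starts where remaining life expectancy drops below 15 years
      threshold = min(a for a, e in remaining_life_expectancy.items() if e < 15)
      p_old = sum(n for age, n in population.items() if age >= threshold)
      p_working = sum(n for age, n in population.items() if 20 <= age < threshold)
      poadr = p_old / p_working

      print(f"OADR = {oadr:.2f}, prospective OADR = {poadr:.2f} (prospective old-age threshold: {threshold})")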

  5. Topics in Probabilistic Judgment Aggregation

    ERIC Educational Resources Information Center

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  6. Passage Retrieval: A Probabilistic Technique.

    ERIC Educational Resources Information Center

    Melucci, Massimo

    1998-01-01

    Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…

  8. Time Analysis for Probabilistic Workflows

    SciTech Connect

    Czejdo, Bogdan; Ferragut, Erik M

    2012-01-01

    There are many theoretical and practical results in the area of workflow modeling, especially when more formal workflows are used. In this paper we focus on probabilistic workflows. We present algorithms for time computations in probabilistic workflows. With activity times modeled more precisely, we can improve work cooperation and analyses of cooperation, including simulation and visualization.
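
    As a sketch of the kind of time computation described, the example below estimates the completion-time distribution of a toy workflow with one probabilistic branch by Monte Carlo sampling of activity durations; the workflow structure and the triangular durations are assumptions, not the authors' algorithm.

      import numpy as np

      rng = np.random.default_rng(5)

      def simulate_once():
          """One pass through a toy workflow: A, then B (70%) or a rework path C (30%), then D."""
          t = rng.triangular(1.0, 2.0, 4.0)          # activity A
          if rng.random() < 0.7:
              t += rng.triangular(2.0, 3.0, 6.0)     # branch: activity B
          else:
              t += rng.triangular(4.0, 6.0, 10.0)    # branch: activity C (rework path)
          t += rng.triangular(1.0, 1.5, 3.0)         # activity D
          return t

      times = np.array([simulate_once() for _ in range(50_000)])
      print(f"Expected completion time: {times.mean():.2f}")
      print(f"90th percentile: {np.percentile(times, 90):.2f}")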

  9. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  11. Tools for Risk-Based UXO Remediation

    DTIC Science & Technology

    2014-01-01

    we (i) performed a probabilistic risk assessment using polarizabilities and ground truth information from Camp San Luis Obispo, Camp Butner, and … actual depth distribution of the UXO recovered at San Luis Obispo and results of the synthetic seed study, we conclude that all of the UXO, at least … same detection scheme, for burial depths of up to 0.77 m. Thus, the detection process applied to ESTCP's Classification Study at San Luis Obispo, CA …

  12. Risk-based SMA for Cubesats

    NASA Technical Reports Server (NTRS)

    Leitner, Jesse

    2016-01-01

    This presentation conveys an approach for risk-based safety and mission assurance applied to cubesats. This presentation accompanies a NASA Goddard standard in development that provides guidance for building a mission success plan for cubesats based on the risk tolerance and resources available.

  13. Fews-Risk: A step towards risk-based flood forecasting

    NASA Astrophysics Data System (ADS)

    Bachmann, Daniel; Eilander, Dirk; de Leeuw, Annemargreet; Diermanse, Ferdinand; Weerts, Albrecht; de Bruijn, Karin; Beckers, Joost; Boelee, Leonore; Brown, Emma; Hazlewood, Caroline

    2015-04-01

    Operational flood prediction and the assessment of flood risk are important components of flood management. Currently, the model-based prediction of discharge and/or water level in a river is common practice for operational flood forecasting. Based on the prediction of these values, decisions about specific emergency measures are made within operational flood management. However, the information provided for decision support is restricted to pure hydrological or hydraulic aspects of a flood. Information about weak sections within the flood defences, flood prone areas, and assets at risk in the protected areas is rarely used in a model-based flood forecasting system. This information is often available for strategic planning, but is not in an appropriate format for operational purposes. The idea of FEWS-Risk is the extension of existing flood forecasting systems with elements of strategic flood risk analysis, such as probabilistic failure analysis, two dimensional flood spreading simulation and the analysis of flood impacts and consequences. Thus, additional information is provided to the decision makers, such as: • Location, timing and probability of failure of defined sections of the flood defence line; • Flood spreading, extent and hydraulic values in the hinterland caused by an overflow or a breach flow; • Impacts and consequences in case of flooding in the protected areas, such as injuries or casualties and/or damages to critical infrastructure or economy. In contrast with purely hydraulic-based operational information, these additional data focus upon decision support for answering crucial questions within an operational flood forecasting framework, such as: • Where should I reinforce my flood defence system? • What type of action can I take to mend a weak spot in my flood defences? • What are the consequences of a breach? • Which areas should I evacuate first? This presentation outlines the additional required workflows towards risk-based flood…

  14. Probabilistic retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  15. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.

  16. Probabilistic Mesomechanical Fatigue Model

    NASA Technical Reports Server (NTRS)

    Tryon, Robert G.

    1997-01-01

    A probabilistic mesomechanical fatigue life model is proposed to link the microstructural material heterogeneities to the statistical scatter in the macrostructural response. The macrostructure is modeled as an ensemble of microelements. Cracks nucleate within the microelements and grow from the microelements to final fracture. Variations of the microelement properties are defined using statistical parameters. A micromechanical slip band decohesion model is used to determine the crack nucleation life and size. A crack tip opening displacement model is used to determine the small crack growth life and size. The Paris law is used to determine the long crack growth life. The models are combined in a Monte Carlo simulation to determine the statistical distribution of total fatigue life for the macrostructure. The modeled response is compared to trends in experimental observations from the literature.
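
    The model structure, a nucleation stage followed by crack growth integrated to failure, can be mimicked in a few lines. The Monte Carlo sketch below combines an assumed nucleation-life distribution with a closed-form Paris-law integration from a sampled initial crack size to a critical size; all constants are illustrative, and the small-crack stage is folded into the nucleation term for brevity.

      import numpy as np

      rng = np.random.default_rng(11)
      n = 20_000

      # Assumed loading and Paris-law constants (illustrative, not calibrated to any material)
      delta_sigma = 200.0        # stress range, MPa
      Y = 1.12                   # geometry factor
      C, m = 1.0e-11, 3.0        # da/dN = C * (dK)^m, with dK in MPa*sqrt(m) and a in m
      a_crit = 5.0e-3            # critical crack length, m

      # Microstructure-driven scatter: nucleation life (cycles) and nucleated crack size (m)
      N_nucleation = rng.lognormal(np.log(2.0e4), 0.6, n)
      a0 = rng.lognormal(np.log(50.0e-6), 0.4, n)

      # Closed-form integration of the Paris law from a0 to a_crit (valid for m != 2)
      k = C * (delta_sigma * Y * np.sqrt(np.pi)) ** m
      e = 1.0 - m / 2.0
      N_growth = (a_crit ** e - a0 ** e) / (k * e)

      N_total = N_nucleation + N_growth
      print(f"Median life: {np.median(N_total):.3g} cycles")
      print(f"1st percentile (design-relevant tail): {np.percentile(N_total, 1):.3g} cycles")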

  17. Novel probabilistic neuroclassifier

    NASA Astrophysics Data System (ADS)

    Hong, Jiang; Serpen, Gursel

    2003-09-01

    A novel probabilistic potential function neural network classifier algorithm to deal with classes which are multi-modally distributed and formed from sets of disjoint pattern clusters is proposed in this paper. The proposed classifier has a number of desirable properties which distinguish it from other neural network classifiers. A complete description of the algorithm in terms of its architecture and the pseudocode is presented. Simulation analysis of the newly proposed neuro-classifier algorithm on a set of benchmark problems is presented. Benchmark problems tested include IRIS, Sonar, Vowel Recognition, Two-Spiral, Wisconsin Breast Cancer, Cleveland Heart Disease and Thyroid Gland Disease. Simulation results indicate that the proposed neuro-classifier performs consistently better for a subset of problems for which other neural classifiers perform relatively poorly.

  18. Probabilistic, meso-scale flood loss modelling

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
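
    A minimal sketch of how a bagging-decision-tree loss model yields a probability distribution of loss rather than a point estimate: train a bagged ensemble and treat the per-tree predictions as samples. The synthetic predictors below are placeholders, not the BT-FLEMO variables, and scikit-learn's BaggingRegressor merely stands in for the authors' implementation.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)

# Hypothetical training data: predictors could stand for water depth,
# building value, precaution indicator, etc.; target is relative loss.
X = rng.uniform(0, 1, size=(500, 3))
y = 0.6 * X[:, 0] + 0.2 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 500)

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100,
                         random_state=0).fit(X, y)

# Instead of a single point estimate, collect the prediction of every tree.
x_new = np.array([[0.8, 0.5, 0.4]])
per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
print("loss estimate: mean %.3f, 5-95%% range %.3f-%.3f"
      % (per_tree.mean(), *np.percentile(per_tree, [5, 95])))
```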

  19. Probabilistic brains: knowns and unknowns

    PubMed Central

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  20. Probabilistic Design of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2006-01-01

    A formal procedure for the probabilistic design evaluation of a composite structure is described. The uncertainties in all aspects of a composite structure (constituent material properties, fabrication variables, structural geometry, and service environments, etc.), which result in the uncertain behavior in the composite structural responses, are included in the evaluation. The probabilistic evaluation consists of: (1) design criteria, (2) modeling of composite structures and uncertainties, (3) simulation methods, and (4) the decision-making process. A sample case is presented to illustrate the formal procedure and to demonstrate that composite structural designs can be probabilistically evaluated with accuracy and efficiency.

  1. Risk-based remediation of polluted sites: A critical perspective.

    PubMed

    Kuppusamy, Saranya; Venkateswarlu, Kadiyala; Megharaj, Mallavarapu; Mayilswami, Srinithi; Lee, Yong Bok

    2017-11-01

    Sites contaminated with chemical pollutants represent a growing challenge, and remediation of such lands is of international concern. Risk-based land management (RBLM) is an emerging approach that integrates risk assessment practices with more traditional site-specific investigations and remediation activities. Developing countries are yet to adopt RBLM strategies for remediation. RBLM is considered to be practical, scientifically defensible and cost-efficient. However, it is inherently limited by: firstly, the accuracy of the risk assessment models used; secondly, the ramifications of the fact that it is more likely to leave contamination in place; and thirdly, the uncertainties involved and the reliance on the total concentrations of all contaminants in soils, which overestimate the potential risks from exposure to the contaminants. Consideration of contaminant bioavailability as the underlying basis for risk assessment and for setting remediation goals of those contaminated lands that pose a risk to environmental and human health may lead to the development of a more sophisticated risk-based approach. However, employing the bioavailability concept in RBLM has not been extensively studied and/or legalized. This review highlights the extent of global land contamination, and the concept of risk-based assessment and management of contaminated sites, including its advantages and disadvantages. Furthermore, the concept of a bioavailability-based RBLM strategy is proposed, and the challenges of RBLM and the priority areas for future research are summarized. Thus, the present review may help achieve a better understanding and successful implementation of a sustainable bioavailability-based RBLM strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Common Difficulties with Probabilistic Reasoning.

    ERIC Educational Resources Information Center

    Hope, Jack A.; Kelly, Ivan W.

    1983-01-01

    Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)

  3. Probabilistic Open Set Recognition

    NASA Astrophysics Data System (ADS)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds that existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with the closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
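
    As a rough illustration of the EVT ingredient only (not the actual PI-SVM or W-SVM algorithms): fit a Weibull distribution to the tail of positive-class decision scores nearest the decision boundary and use its CDF as an unnormalized probability of class inclusion, rejecting low-probability inputs as unknown. The score distribution, tail size, and shift are assumptions.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(4)

# Hypothetical decision scores of a classifier on known-class positives;
# EVT suggests the tail nearest the boundary is Weibull-like.
pos_scores = rng.normal(2.0, 0.5, 300)
tail = np.sort(pos_scores)[:30]          # scores closest to the boundary

# Fit a Weibull to the shifted lower tail; its CDF serves as a probability
# of class inclusion for a new score.
shift = tail.min() - 1e-6
c, loc, scale = weibull_min.fit(tail - shift, floc=0.0)

def p_inclusion(score):
    return weibull_min.cdf(score - shift, c, loc=loc, scale=scale)

for s in (0.5, 1.5, 2.5):
    print(f"score {s:.1f} -> P(known class) ~ {p_inclusion(s):.2f}")
# Scores well below the modelled tail get probabilities near 0 and can be
# rejected as unknown; scores above the tail approach 1.
```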

  4. Use of Geologic and Paleoflood Information for INL Probabilistic Flood Hazard Decisions

    NASA Astrophysics Data System (ADS)

    Ostenaa, D.; O'Connell, D.; Creed, B.

    2009-05-01

    The Big Lost River is a western U.S., closed basin stream which flows through and terminates on the Idaho National Laboratory. Historic flows are highly regulated, and peak flows decline downstream through natural and anthropogenic influences. Glaciated headwater regions were the source of Pleistocene outburst floods which traversed the site. A wide range of DOE facilities (including a nuclear research reactor) require flood stage estimates for flow exceedance probabilities over a range from 1/100/yr to 1/100,000/yr per DOE risk-based standards. These risk management objectives required the integration of geologic and geomorphic paleoflood data into Bayesian nonparametric flood frequency analyses that incorporated measurement uncertainties in gaged, historical, and paleoflood discharges and non-exceedance bounds to produce fully probabilistic flood frequency estimates for annual exceedance probabilities of specific discharges of interest. Two-dimensional hydraulic flow modeling with scenarios for varied hydraulic parameters, infiltration, and culvert blockages on the site was conducted for a range of discharges from 13-700 m3/s. High-resolution topographic grids and two-dimensional flow modeling allowed detailed evaluation of the potential impacts of numerous secondary channels and flow paths resulting from flooding in extreme events. These results were used to construct stage probability curves for 15 key locations on the site consistent with DOE standards. These probability curves resulted from the systematic inclusion of contributions of uncertainty from flood sources, hydraulic modeling, and flood-frequency analyses. These products also provided a basis to develop weights for logic tree branches associated with infiltration and culvert performance scenarios to produce probabilistic inundation maps. The flood evaluation process was structured using Senior Seismic Hazard Analysis Committee processes (NRC-NUREG/CR-6372) concepts, evaluating and integrating the

  5. A probabilistic prediction network for hydrological drought identification and environmental flow assessment

    NASA Astrophysics Data System (ADS)

    Liu, Zhiyong; Törnros, Tobias; Menzel, Lucas

    2016-08-01

    A general probabilistic prediction network is proposed for hydrological drought examination and environmental flow assessment. This network consists of three major components. First, we present the joint streamflow drought indicator (JSDI) to describe the hydrological dryness/wetness conditions. The JSDI is established based on a high-dimensional multivariate probabilistic model. In the second part, a drought-based environmental flow assessment method is introduced, which provides dynamic risk-based information about how much flow (the environmental flow target) is required for drought recovery and its likelihood under different hydrological drought initial situations. The final part involves estimating the conditional probability of achieving the required environmental flow under different precipitation scenarios according to the joint dependence structure between streamflow and precipitation. Three watersheds from different countries (Germany, China, and the United States) with varying sizes from small to large were used to examine the usefulness of this network. The results show that the JSDI can provide an assessment of overall hydrological dryness/wetness conditions and performs well in identifying both drought onset and persistence. This network also allows quantitative prediction of targeted environmental flow required for hydrological drought recovery and estimation of the corresponding likelihood. Moreover, the results confirm that the general network can estimate the conditional probability associated with the required flow under different precipitation scenarios. The presented methodology offers a promising tool for water supply planning and management and for drought-based environmental flow assessment. The network has no restrictions that would prevent it from being applied to other basins worldwide.

  6. Probabilistic Risk Assessment: A Bibliography

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Probabilistic risk analysis is an integration of failure modes and effects analysis (FMEA), fault tree analysis and other techniques to assess the potential for failure and to find ways to reduce risk. This bibliography references 160 documents in the NASA STI Database that contain the major concepts (probabilistic risk assessment, risk and probability theory) in the basic index or major subject terms. An abstract is included with most citations, followed by the applicable subject terms.

  7. Risk based tiered approach (RBTASM) for pollution prevention.

    PubMed

    Elves, R G; Sweeney, L M; Tomljanovic, C

    1997-11-01

    Effective management of human health and ecological hazards in the manufacturing and maintenance environment can be achieved by focusing on the risks associated with these operations. The NDCEE Industrial Health Risk Assessment (IHRA) Program is developing a comprehensive approach to risk analysis applied to existing processes and used to evaluate alternatives. The IHRA Risk-Based Tiered Approach (RBTASM) builds on the American Society for Testing and Materials (ASTM) Risk-Based Corrective Action (RBCA) effort to remediate underground storage tanks. Using readily available information, a semi-quantitative ranking of alternatives based on environmental, safety, and occupational health criteria was produced. A Rapid Screening Assessment of alternative corrosion protection products was performed on behalf of the Joint Group on Acquisition Pollution Prevention (JG-APP). Using the RBTASM in pollution prevention alternative selection required higher-tiered analysis and more detailed assessment of human health risks under site-specific conditions. This example illustrates the RBTASM for an organic finishing line using three different products (one conventional spray and two alternative powder coats). The human health risk information developed using the RBTASM is considered along with product performance, regulatory, and cost information by risk managers when downselecting alternatives for implementation or further analysis.

  8. Risk based analysis: A rational approach to site cleanup

    SciTech Connect

    Arulanatham, R.; So, E.

    1994-12-31

    Soil and groundwater pollution in urban areas often can pose a threat to human health, water quality, or both. Cleanup of such soil and groundwater can be a very lengthy process and requires significant economic resources. The cleanup levels or requirements set by one agency sometimes do not match those required by another agency, especially those for soil pollution. The involvement of several agencies at different times during the reclamation process has often diminished the cost-effectiveness of the reclamation efforts. In an attempt to minimize this kind of problem (which both authors have experienced), the staff of the Alameda County Department of Environmental Health and the Regional Water Quality Control Board, San Francisco Bay Region, have jointly developed workable guidelines to assist the responsible parties in deriving target cleanup goals that are protective of both human health (or other ecological receptors) and water quality. The following is a 6-step summary of the methodology to assist the responsible parties in properly managing their pollution problem. These guidelines include: (1) site characterization; (2) initial risk-based screening of contaminants; (3) derivation of health and/or ecological risk-based cleanup goals; (4) derivation of groundwater quality-based cleanup goals; (5) site cleanup goals and site remediation; and (6) risk management decisions.

  9. Probabilistic theories with purification

    SciTech Connect

    Chiribella, Giulio; D'Ariano, Giacomo Mauro; Perinotti, Paolo

    2010-06-15

    We investigate general probabilistic theories in which every mixed state has a purification, unique up to reversible channels on the purifying system. We show that the purification principle is equivalent to the existence of a reversible realization of every physical process, that is, to the fact that every physical process can be regarded as arising from a reversible interaction of the system with an environment, which is eventually discarded. From the purification principle we also construct an isomorphism between transformations and bipartite states that possesses all structural properties of the Choi-Jamiolkowski isomorphism in quantum theory. Such an isomorphism allows one to prove most of the basic features of quantum theory, like, e.g., existence of pure bipartite states giving perfect correlations in independent experiments, no information without disturbance, no joint discrimination of all pure states, no cloning, teleportation, no programming, no bit commitment, complementarity between correctable channels and deletion channels, characterization of entanglement-breaking channels as measure-and-prepare channels, and others, without resorting to the mathematical framework of Hilbert spaces.

  10. Interval probabilistic neural network.

    PubMed

    Kowalski, Piotr A; Kulczycki, Piotr

    2017-01-01

    Automated classification systems have allowed for the rapid development of exploratory data analysis. Such systems increase the independence of human intervention in obtaining the analysis results, especially when inaccurate information is under consideration. The aim of this paper is to present a novel neural network approach for classifying interval information. The presented methodology is a generalization of the probabilistic neural network for interval data processing. The simple structure of this neural classification algorithm makes it applicable for research purposes. The procedure is based on the Bayes approach, ensuring minimal potential losses arising from classification errors. In this article, the topological structure of the network and the learning process are described in detail. Of note, the correctness of the procedure proposed here has been verified by way of numerical tests. These tests include both synthetic data and benchmark instances. The results of numerical verification, carried out for different shapes of data sets, as well as a comparative analysis with other methods of similar conditioning, have validated both the concept presented here and its positive features.

  11. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive

    NASA Astrophysics Data System (ADS)

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  12. Probabilistic Evaluation of Ecological and Economic Objectives of River Basin Management Reveals a Potential Flaw in the Goal Setting of the EU Water Framework Directive.

    PubMed

    Hjerppe, Turo; Taskinen, Antti; Kotamäki, Niina; Malve, Olli; Kettunen, Juhani

    2017-04-01

    The biological status of European lakes has not improved as expected despite up-to-date legislation and ecological standards. As a result, the realism of objectives and the attainment of related ecological standards are under doubt. This paper gets to the bottom of a river basin management plan of a eutrophic lake in Finland and presents the ecological and economic impacts of environmental and societal drivers and planned management measures. For these purposes, we performed a Monte Carlo simulation of a diffuse nutrient load, lake water quality and cost-benefit models. Simulations were integrated into a Bayesian influence diagram that revealed the basic uncertainties. It turned out that the attainment of good ecological status as qualified in the Water Framework Directive of the European Union is unlikely within given socio-economic constraints. Therefore, management objectives and ecological and economic standards need to be reassessed and reset to provide a realistic goal setting for management. More effort should be put into the evaluation of the total monetary benefits and on the monitoring of lake phosphorus balances to reduce the uncertainties, and the resulting margin of safety and costs and risks of planned management measures.

  13. Probabilistic Tsunami Hazard Analysis - Results for the Western United States

    NASA Astrophysics Data System (ADS)

    Thio, H.; Polet, J.; Somerville, P.

    2007-12-01

    We have developed a series of probabilistic tsunami hazard maps for the coasts of western North America based on fault source characterizations of the circum-Pacific subduction zones as well as local offshore faults. The maps show the probabilistic offshore exceedance waveheights at 72, 475, 975 and 2475 year return periods, which are the return periods typically used in Probabilistic Seismic Hazard Analysis (PSHA). Our method follows similar lines as PSHA, which has become a standard practice in the evaluation and mitigation of seismic hazard, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities, variability and uncertainties of seismic activity into a manageable set of ground motion parameters greatly facilitates the planning and design of effective seismic-resistant buildings and infrastructure. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can rapidly synthesize tsunami waveforms for any slip distribution on those faults by summing the individual weighted subfault tsunami waveforms. This Green's function summation provides accurate estimates of tsunami height for probabilistic calculations, where one typically integrates over thousands of earthquake scenarios. We have carried out tsunami hazard calculations for western North America and Hawaii based on a comprehensive source model around the Pacific Ocean including both subduction zone sources as well as local offshore faults. We will present the tsunami hazard maps and discuss how these results are used for probabilistic inundation mapping, including a follow-up inundation study of the San Francisco Bay area that is based on disaggregation results of the
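
    The Green's function summation can be sketched as a weighted sum of stored unit-slip waveforms followed by exceedance counting over many scenarios. The arrays below are synthetic stand-ins for the precomputed subfault waveforms, slip models, and scenario rates that the study derives from earthquake source characterizations.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stored unit-slip tsunami waveforms at one coastal point, one per subfault:
# shape (n_subfaults, n_time_samples). Synthetic stand-ins for precomputed
# Green's functions.
n_sub, n_t = 20, 600
unit_waveforms = rng.normal(0.0, 0.02, (n_sub, n_t)).cumsum(axis=1)

def scenario_waveform(slip):
    """Weight the unit-slip Green's functions by the scenario slip
    distribution and sum them to synthesize the scenario waveform."""
    return slip @ unit_waveforms  # (n_sub,) @ (n_sub, n_t) -> (n_t,)

# Exceedance counting over many scenarios, each with an assumed annual rate.
n_scen = 1_000
rates = np.full(n_scen, 1e-4)                     # events per year (assumed)
slips = rng.lognormal(0.0, 0.5, (n_scen, n_sub))  # slip per subfault (m)

peaks = np.array([scenario_waveform(s).max() for s in slips])
threshold = 1.0                                   # offshore wave height (m)
annual_rate = rates[peaks > threshold].sum()
print(f"annual rate of exceeding {threshold} m at this site: {annual_rate:.2e}")
```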

  14. Risk-based regulation: A utility's perspective

    SciTech Connect

    Chapman, J.R.

    1993-01-01

    Yankee Atomic Electric Company (YAEC) has supported the operation of several plants under the premise that regulations and corresponding implementation strategies are intended to be "risk based." During the past 15 yr, these efforts have changed from essentially qualitative to a blend of qualitative and quantitative. Our observation is that implementation of regulatory requirements has often not addressed the risk significance of the underlying intent of regulations on a proportionate basis. It has caused our resource allocation to be skewed, to the point that our cost-competitiveness has eroded, but more importantly we have missed opportunities for increases in safety.

  15. A Probabilistic Asteroid Impact Risk Model

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
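
    A toy version of the PAIR sampling loop, with made-up impactor distributions and a deliberately crude damage proxy in place of the report's entry, ablation, and ground-damage models, just to show how sampled scenarios aggregate into a distribution of consequences.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Sample uncertain impactor properties (illustrative distributions only).
diameter = rng.lognormal(np.log(60.0), 0.5, n)     # m
density = rng.uniform(1500.0, 3500.0, n)           # kg/m^3
velocity = rng.normal(20_000.0, 4_000.0, n)        # m/s

mass = density * (np.pi / 6.0) * diameter ** 3     # kg
energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15  # megatons of TNT

# Crude consequence proxy: ground damage only above an energy threshold,
# with affected population scaled linearly by energy (hypothetical).
damaging = energy_mt > 5.0
affected = np.where(damaging, 1e4 * energy_mt, 0.0)

print(f"P(damaging | impact) = {damaging.mean():.2f}")
print(f"mean affected population per impact = {affected.mean():,.0f}")
print(f"99th percentile = {np.percentile(affected, 99):,.0f}")
```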

  16. Probabilistic exposure fusion.

    PubMed

    Song, Mingli; Tao, Dacheng; Chen, Chun; Bu, Jiajun; Luo, Jiebo; Zhang, Chengqi

    2012-01-01

    The luminance of a natural scene is often of high dynamic range (HDR). In this paper, we propose a new scheme to handle HDR scenes by integrating locally adaptive scene detail capture and suppressing gradient reversals introduced by the local adaptation. The proposed scheme is novel for capturing an HDR scene by using a standard dynamic range (SDR) device and synthesizing an image suitable for SDR displays. In particular, we use an SDR capture device to record scene details (i.e., the visible contrasts and the scene gradients) in a series of SDR images with different exposure levels. Each SDR image responds to a fraction of the HDR and partially records scene details. With the captured SDR image series, we first calculate the image luminance levels, which maximize the visible contrasts, and then the scene gradients embedded in these images. Next, we synthesize an SDR image by using a probabilistic model that preserves the calculated image luminance levels and suppresses reversals in the image luminance gradients. The synthesized SDR image contains many more scene details than any of the captured SDR images. Moreover, the proposed scheme also functions as the tone mapping of an HDR image to an SDR image, and it is superior to both global and local tone mapping operators. This is because global operators fail to preserve visual details when the contrast ratio of a scene is large, whereas local operators often produce halos in the synthesized SDR image. The proposed scheme does not require any human interaction or parameter tuning for different scenes. Subjective evaluations have shown that it is preferred over a number of existing approaches.

  17. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2015-01-01

    This technical paper documents Kennedy Space Center's Independent Assessment team's work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer (CSO) and GSDO management during key programmatic reviews. The assessments provided the GSDO Program with an analysis of how egress time affects the likelihood of astronaut and worker survival during an emergency. For each assessment, the team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building (VAB). Based on the composite survivability versus time graphs from the first two assessments, there was a soft knee in the Figure of Merit graphs at eight minutes (ten minutes after egress was ordered). Thus, the graphs illustrated to the decision makers that the final emergency egress design selected should have the capability of transporting the flight crew from the top of LC-39B to a safe location in eight minutes or less. Results for the third assessment were dominated by hazards that were classified as instantaneous in nature (e.g. stacking mishaps) and therefore had no effect on survivability vs time to egress the VAB. VAB emergency scenarios that degraded over time (e.g. fire) produced survivability vs time graphs that were in line with aerospace industry norms.

  18. Towards Risk Based Design for NASA's Missions

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Barrientos, Francesca; Meshkat, Leila

    2004-01-01

    This paper describes the concept of Risk Based Design in the context of NASA's low-volume, high-cost missions. The concept of accounting for risk in the design lifecycle has been discussed and proposed under several research topics, including reliability, risk analysis, optimization, uncertainty, decision-based design, and robust design. This work aims to identify and develop methods to enable and automate a means to characterize and optimize risk, and to use risk as a tradeable resource to make robust and reliable decisions, in the context of the uncertain and ambiguous stage of early conceptual design. This paper first presents a survey of the related topics explored in the design research community as they relate to risk based design. Then, a summary of the topics from the NASA-led Risk Colloquium is presented, followed by current efforts within NASA to account for risk in early design. Finally, a list of "risk elements", identified for early-phase conceptual design at NASA, is presented. The purpose is to lay the foundation and develop a roadmap for future work and collaborations for research to eliminate and mitigate these risk elements in early phase design.

  19. Is the basic conditional probabilistic?

    PubMed

    Goodwin, Geoffrey P

    2014-06-01

    Nine experiments examined whether individuals treat the meaning of basic conditional assertions as deterministic or probabilistic. In Experiments 1-4, participants were presented with either probabilistic or deterministic relations, which they had to describe with a conditional. These experiments consistently showed that people tend only to use the basic if p then q construction to describe deterministic relations between antecedent and consequent, whereas they use a probabilistically qualified construction, if p then probably q, to describe probabilistic relations-suggesting that the default interpretation of the conditional is deterministic. Experiments 5 and 6 showed that when directly asked, individuals typically report that conditional assertions admit no exceptions (i.e., they are seen as deterministic). Experiments 7-9 showed that individuals judge the truth of conditional assertions in accordance with this deterministic interpretation. Together, these results pose a challenge to probabilistic accounts of the meaning of conditionals and support mental models, formal rules, and suppositional accounts. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  20. Probabilistic Design and Analysis Framework

    NASA Technical Reports Server (NTRS)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis process that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
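
    A compact sketch of the response surface idea applied at the component level: run a small design of experiments on an expensive deterministic model, fit a quadratic surrogate by least squares, and then perform inexpensive Monte Carlo on the surrogate. The stand-in analysis function, design points, and input distributions are illustrative and do not represent PRODAF or NESSUS/NESTEM.

```python
import numpy as np

rng = np.random.default_rng(7)

# Stand-in for an expensive deterministic analysis (e.g. peak stress as a
# function of a geometric dimension x1 and a load x2); purely illustrative.
def expensive_analysis(x1, x2):
    return 100.0 + 30.0 * x1 + 8.0 * x2 + 5.0 * x1 * x2 + 2.0 * x1 ** 2

# 1. Run the expensive model at a small design of experiments.
X1, X2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = X1.ravel(), X2.ravel()
y = expensive_analysis(x1, x2)

# 2. Fit a quadratic response surface by least squares.
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def surrogate(x1, x2):
    return (coef[0] + coef[1] * x1 + coef[2] * x2 +
            coef[3] * x1 * x2 + coef[4] * x1 ** 2 + coef[5] * x2 ** 2)

# 3. Cheap Monte Carlo on the surrogate to get a response distribution.
s = surrogate(rng.normal(0, 0.3, 50_000), rng.normal(0, 0.3, 50_000))
print(f"P(stress > 120) = {(s > 120).mean():.4f}")
```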

  1. Online dissemination of probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Roulston, M. S.; Smith, L. A.

    2003-04-01

    Probabilistic weather forecasts intrinsically contain a much larger amount of information than traditional deterministic forecasts. This greatly increases their potential value to end-users, but also creates an obstacle to their dissemination. Traditional media, such as TV, radio and newspapers, are not suitable for presenting probabilistic forecasts to a large number of users who need predictions concerning a range of variables at a range of locations. The web has the potential to allow probabilistic forecasts to be communicated to users without having to make tacit assumptions about how their individual utility functions depend on weather variables. Unfortunately, the majority of weather forecasts currently available on the web are little more than online renditions of the type of forecasts found in more traditional media. We present a demonstration of how probabilistic forecasts might be effectively disseminated using the web. The graphical user interface allows users to view ensembles of the weather variables of interest to them without having to summarise the probabilistic information in the ensemble, and thus make implicit assumptions about the user's weather risk exposure. Such a GUI can also be used to view "end-to-end" ensemble forecasts of non-weather, but weather-dependent, variables of direct interest to users (e.g. wind power production).

  2. Risk-based targeting: A new approach in environmental protection

    SciTech Connect

    Fox, C.A.

    1995-12-31

    Risk-based targeting has recently emerged as an effective tool to help prioritize efforts to identify and manage geographic areas, chemicals, facilities, and agricultural activities that cause the most environmental degradation. This paper focuses on how the Environmental Protection Agency (EPA) has recently used risk-based targeting to identify and screen Federal, industrial, commercial and municipal facilities which contribute to probable human health (fish consumption advisories and contaminated fish tissue) and aquatic life (contaminated sediments) impacts. Preliminary results identified several hundred potential contributors of problem chemicals to probable impacts within the same river reach in 1991--93. Analysis by industry sector showed that the majority of the facilities identified were publicly owned treatment works (POTWs), in addition to industry organic and inorganic chemical manufacturers, petroleum refineries, and electric services, coatings, engravings, and allied services, among others. Both compliant and non-compliant potentially contributing facilities were identified to some extent in all EPA regions. Additional results identifying possible linkages of other pollutant sources to probable impacts, as well as estimation of potential exposure of these contaminants to minority and/or poverty populations are also presented. Out of these analyses, a number of short and long-term strategies are being developed that EPA may use to reduce loadings of problem contaminants to impacted waterbodies.

  3. 12 CFR 652.70 - Risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital level. 652.70 Section 652.70... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.70 Risk-based capital level. The risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk. The...

  4. Cost-Effectiveness and Harm-Benefit Analyses of Risk-Based Screening Strategies for Breast Cancer

    PubMed Central

    Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies. PMID:24498285
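
    The incremental cost-effectiveness ratio used to rank strategies is simple to compute; the figures below are hypothetical, not the study's estimates. A strategy with lower cost and higher effect (negative incremental cost, positive incremental effect) dominates its comparator.

```python
def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental cost-effectiveness ratio of strategy B versus A
    (extra cost per extra unit of benefit, e.g. per QALY gained)."""
    return (cost_b - cost_a) / (effect_b - effect_a)

# Hypothetical numbers for a uniform strategy (A) versus a risk-based
# strategy (B), per woman; not taken from the study.
cost_a, qaly_a = 1_000.0, 19.50
cost_b, qaly_b = 910.0, 19.53

print(f"incremental cost:   {cost_b - cost_a:+.0f}")
print(f"incremental effect: {qaly_b - qaly_a:+.3f} QALY")
# A negative ICER with a positive incremental effect means B dominates A.
print(f"ICER: {icer(cost_a, qaly_a, cost_b, qaly_b):,.0f} per QALY")
```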

  5. Cost-effectiveness and harm-benefit analyses of risk-based screening strategies for breast cancer.

    PubMed

    Vilaprinyo, Ester; Forné, Carles; Carles, Misericordia; Sala, Maria; Pla, Roger; Castells, Xavier; Domingo, Laia; Rue, Montserrat

    2014-01-01

    The one-size-fits-all paradigm in organized screening of breast cancer is shifting towards a personalized approach. The present study has two objectives: 1) To perform an economic evaluation and to assess the harm-benefit ratios of screening strategies that vary in their intensity and interval ages based on breast cancer risk; and 2) To estimate the gain in terms of cost and harm reductions using risk-based screening with respect to the usual practice. We used a probabilistic model and input data from Spanish population registries and screening programs, as well as from clinical studies, to estimate the benefit, harm, and costs over time of 2,624 screening strategies, uniform or risk-based. We defined four risk groups, low, moderate-low, moderate-high and high, based on breast density, family history of breast cancer and personal history of breast biopsy. The risk-based strategies were obtained combining the exam periodicity (annual, biennial, triennial and quinquennial), the starting ages (40, 45 and 50 years) and the ending ages (69 and 74 years) in the four risk groups. Incremental cost-effectiveness and harm-benefit ratios were used to select the optimal strategies. Compared to risk-based strategies, the uniform ones result in a much lower benefit for a specific cost. Reductions close to 10% in costs and higher than 20% in false-positive results and overdiagnosed cases were obtained for risk-based strategies. Optimal screening is characterized by quinquennial or triennial periodicities for the low or moderate risk-groups and annual periodicity for the high-risk group. Risk-based strategies can reduce harm and costs. It is necessary to develop accurate measures of individual risk and to work on how to implement risk-based screening strategies.

  6. An expert system for probabilistic description of loads on space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Spencer, B. F., Jr.; Hopkins, D. A.

    1988-01-01

    LDEXPT, an expert system that generates probabilistic characterizations of the loads spectra borne by spacecraft propulsion systems' structural components, is found by recent experience at NASA-Lewis to be useful in the cases of components representative of the Space Shuttle Main Engine's turbopumps and fluid transfer ducting. LDEXPT is composed of a knowledge base management system and a rule base management system. The ANLOAD load-modeling module of LDEXPT encompasses three independent probabilistic analysis techniques.

  7. Vagueness as Probabilistic Linguistic Knowledge

    NASA Astrophysics Data System (ADS)

    Lassiter, Daniel

    Consideration of the metalinguistic effects of utterances involving vague terms has led Barker [1] to treat vagueness using a modified Stalnakerian model of assertion. I present a sorites-like puzzle for factual beliefs in the standard Stalnakerian model [28] and show that it can be resolved by enriching the model to make use of probabilistic belief spaces. An analogous problem arises for metalinguistic information in Barker's model, and I suggest that a similar enrichment is needed here as well. The result is a probabilistic theory of linguistic representation that retains a classical metalanguage but avoids the undesirable divorce between meaning and use inherent in the epistemic theory [34]. I also show that the probabilistic approach provides a plausible account of the sorites paradox and higher-order vagueness and that it fares well empirically and conceptually in comparison to leading competitors.

  8. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1991-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables (primitive variables) that describe the truss. Initially, the truss is deterministically analyzed for member forces, and member(s) in which the axial force exceeds the Euler buckling load are identified. These member(s) are then discretized with several intermediate nodes and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing the buckled member(s) until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.
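
    A minimal sketch of the probabilistic buckling check for a single pin-ended member, assuming illustrative distributions for the primitive variables: sample modulus, section property, length and applied force, compute the Euler buckling load, and estimate the probability that the applied force exceeds it.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 20_000

# Uncertain primitive variables for one truss member (illustrative values):
E = rng.normal(70e9, 70e9 * 0.05, n)      # modulus of elasticity (Pa)
I = rng.normal(2.0e-7, 2.0e-7 * 0.08, n)  # second moment of area (m^4)
L = rng.normal(1.5, 0.01, n)              # member length (m)
P = rng.lognormal(np.log(50e3), 0.10, n)  # applied axial force (N)

# Euler buckling load for a pin-ended member and its probabilistic margin.
P_cr = np.pi ** 2 * E * I / L ** 2
margin = P_cr - P

print(f"P(buckling) = {(margin < 0).mean():.4f}")
print(f"mean margin of safety = {(P_cr / P - 1.0).mean():.2f}")
```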

  9. Probabilistic progressive buckling of trusses

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Chamis, Christos C.

    1994-01-01

    A three-bay, space, cantilever truss is probabilistically evaluated to describe progressive buckling and truss collapse in view of the numerous uncertainties associated with the structural, material, and load variables that describe the truss. Initially, the truss is deterministically analyzed for member forces, and members in which the axial force exceeds the Euler buckling load are identified. These members are then discretized with several intermediate nodes, and a probabilistic buckling analysis is performed on the truss to obtain its probabilistic buckling loads and the respective mode shapes. Furthermore, sensitivities associated with the uncertainties in the primitive variables are investigated, margin of safety values for the truss are determined, and truss end node displacements are noted. These steps are repeated by sequentially removing buckled members until onset of truss collapse is reached. Results show that this procedure yields an optimum truss configuration for a given loading and for a specified reliability.

  10. Probabilistic Declarative Process Mining

    NASA Astrophysics Data System (ADS)

    Bellodi, Elena; Riguzzi, Fabrizio; Lamma, Evelina

    The management of business processes is receiving much attention, since it can support significant efficiency improvements in organizations. One of the most interesting problems is the representation of process models in a language that allows reasoning to be performed on them.

  11. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunamis. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  12. Probabilistic inversion: a preliminary discussion

    NASA Astrophysics Data System (ADS)

    Battista Rossi, Giovanni; Crenna, Francesco

    2015-02-01

    We continue the discussion on the possibility of interpreting probability as a logic that we started at the previous IMEKO TC1-TC7-TC13 Symposium. We show here how a probabilistic logic can be extended to include direct and inverse functions. We also discuss the relationship between this framework and the Bayes-Laplace rule, showing how the latter can be formally interpreted as a probabilistic inversion device. We suggest that these findings open a new perspective in the evaluation of measurement uncertainty.
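
    A small numerical example of the Bayes-Laplace rule acting as a probabilistic inversion device, under assumed ingredients: a flat prior over a discretized measurand and a Gaussian likelihood playing the role of the direct (forward) function, combined to recover a posterior distribution from one observed indication.

```python
import numpy as np

# Discretize the measurand X and assume a flat prior over it.
x = np.linspace(0.0, 10.0, 201)
prior = np.ones_like(x) / x.size

def likelihood(y_obs, x, sigma=0.5):
    """Assumed direct (forward) model: P(indication y | measurand x)."""
    return np.exp(-0.5 * ((y_obs - x) / sigma) ** 2)

# Bayes-Laplace rule as the inversion device: from an observed indication
# back to a distribution over the measurand.
y_obs = 4.2
post = prior * likelihood(y_obs, x)
post /= post.sum()

mean = np.sum(x * post)
std = np.sqrt(np.sum((x - mean) ** 2 * post))
print(f"posterior mean {mean:.2f}, standard uncertainty {std:.2f}")
```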

  13. Radiation risk management at DOE accelerator facilities

    SciTech Connect

    Dyck, O.B. van

    1997-01-01

    The DOE accelerator contractors have been discussing among themselves and with the Department how to improve radiation safety risk management. This activity - how to assure prevention of unplanned high exposures - is separate from normal exposure management, which historically has been quite successful. The ad-hoc Committee on the Accelerator Safety Order and Guidance [CASOG], formed by the Accelerator Section of the HPS, has proposed a risk-based approach, which will be discussed. Concepts involved are risk quantification and comparison (including with non-radiation risk), passive and active (reacting) protection systems, and probabilistic analysis. Different models of risk management will be presented, and the changing regulatory environment will also be discussed.

  14. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation, volume 1

    NASA Technical Reports Server (NTRS)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    This document is the Executive Summary of a technical report on a probabilistic risk assessment (PRA) of the Space Shuttle vehicle performed under the sponsorship of the Office of Space Flight of the US National Aeronautics and Space Administration. It briefly summarizes the methodology and results of the Shuttle PRA. The primary objective of this project was to support management and engineering decision-making with respect to the Shuttle program by producing (1) a quantitative probabilistic risk model of the Space Shuttle during flight, (2) a quantitative assessment of in-flight safety risk, (3) an identification and prioritization of the design and operations that principally contribute to in-flight safety risk, and (4) a mechanism for risk-based evaluation of proposed modifications to the Shuttle System. Secondary objectives were to provide a vehicle for introducing and transferring PRA technology to the NASA community, and to demonstrate the value of PRA by applying it beneficially to a real program of great international importance.

  15. Regional Variability of Stream Responses to Urbanization: Implications for Risk-Based Assessments

    NASA Astrophysics Data System (ADS)

    Bledsoe, B. P.; Dust, D. W.; Hawley, R. J.

    2007-12-01

    Predictive scientific assessments of the geomorphic consequences of urbanization must be calibrated to the regional hydroclimatological, geologic, and historical context in which streams occur. We present examples of context-specific stream responses to hydromodification, and a general framework for risk-based modeling and scientific assessment of hydrologic-geomorphic-ecologic linkages in urbanizing watersheds. The framework involves: 1) a priori stratification of a region's streams based on flow regime, geomorphic context and susceptibility to changes in water, sediment, and wood regimes, 2) field surveys across a gradient of urban influence, 3) coupling long term hydrologic simulation with geomorphic analysis to quantify key hydrogeomorphic metrics, and 4) using probabilistic modeling to identify regional linkages between hydrogeomorphic descriptors and decision endpoints of primary interest to stakeholders and decision-makers.

  16. Probabilistic forecasts based on radar rainfall uncertainty

    NASA Astrophysics Data System (ADS)

    Liguori, S.; Rico-Ramirez, M. A.

    2012-04-01

    gauges location, and then interpolated back onto the radar domain, in order to obtain probabilistic radar rainfall fields in real time. The deterministic nowcasting model integrated in the STEPS system [7-8] has been used for the purpose of propagating the uncertainty and assessing the benefit of implementing the radar ensemble generator for probabilistic rainfall forecasts and ultimately sewer flow predictions. For this purpose, events representative of different types of precipitation (i.e. stratiform/convective) and significant at the urban catchment scale (i.e. in terms of sewer overflow within the urban drainage system) have been selected. As high spatial/temporal resolution is required to the forecasts for their use in urban areas [9-11], the probabilistic nowcasts have been set up to be produced at 1 km resolution and 5 min intervals. The forecasting chain is completed by a hydrodynamic model of the urban drainage network. The aim of this work is to discuss the implementation of this probabilistic system, which takes into account the radar error to characterize the forecast uncertainty, with consequent potential benefits in the management of urban systems. It will also allow a comparison with previous findings related to the analysis of different approaches to uncertainty estimation and quantification in terms of rainfall [12] and flows at the urban scale [13]. Acknowledgements The authors would like to acknowledge the BADC, the UK Met Office and Dr. Alan Seed from the Australian Bureau of Meteorology for providing the radar data and the nowcasting model. The authors acknowledge the support from the Engineering and Physical Sciences Research Council (EPSRC) via grant EP/I012222/1.

  17. Probabilistic Route Selection Algorithm for IP Traceback

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Bin; Jung, Jae-Il

    A DoS (Denial of Service) or DDoS (Distributed DoS) attack is a major threat and one of the most difficult problems to solve among the many kinds of attack. Moreover, it is very difficult to find the real origin of attackers because DoS/DDoS attackers use spoofed IP addresses. To solve this problem, we propose a probabilistic route selection traceback algorithm, namely PRST, to trace the attacker's real origin. This algorithm uses two types of packets: an agent packet and a reply agent packet. The agent packet is used to find the attacker's real origin, and the reply agent packet is used to notify the victim that the agent packet has reached the attacker's edge router. After attacks occur, the victim generates the agent packet and sends it to the victim's edge router. The attacker's edge router, on receiving the agent packet, generates the reply agent packet and sends it to the victim. Both packets are forwarded by routers according to a probabilistic packet forwarding table (PPFT). The PRST algorithm runs on the distributed routers, and the PPFT is stored and managed by the routers. We validate the PRST algorithm using a mathematical approach based on the Poisson distribution.
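
    A toy simulation of the forwarding step, assuming a hypothetical four-router topology and made-up PPFT probabilities; it only illustrates how agent packets, forwarded hop by hop according to the table, trace paths from the victim's edge router back toward the attacker's edge router.

```python
import random

random.seed(0)

# Hypothetical topology: for each router, candidate upstream routers and the
# probability of selecting each (a toy probabilistic packet forwarding table).
ppft = {
    "victim_edge": [("r1", 0.7), ("r2", 0.3)],
    "r1":          [("r3", 0.6), ("attacker_edge", 0.4)],
    "r2":          [("r3", 1.0)],
    "r3":          [("attacker_edge", 1.0)],
}

def send_agent_packet(start="victim_edge", target="attacker_edge", max_hops=10):
    """Forward an agent packet hop by hop, choosing the next router with the
    PPFT probabilities, until the attacker's edge router is reached."""
    node, path = start, [start]
    for _ in range(max_hops):
        if node == target:
            return path
        nxt, probs = zip(*ppft[node])
        node = random.choices(nxt, weights=probs, k=1)[0]
        path.append(node)
    return path

# Repeated probes trace likely routes back to the attacker's edge router.
for _ in range(3):
    print(" -> ".join(send_agent_packet()))
```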

  18. Making Probabilistic Relational Categories Learnable

    ERIC Educational Resources Information Center

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  19. Probabilistic Techniques for Phrase Extraction.

    ERIC Educational Resources Information Center

    Feng, Fangfang; Croft, W. Bruce

    2001-01-01

    This study proposes a probabilistic model for automatically extracting English noun phrases for indexing or information retrieval. The technique is based on a Markov model, whose initial parameters are estimated by a phrase lookup program with a phrase dictionary, then optimized by a set of maximum entropy parameters. (Author/LRW)

  20. Research on probabilistic information processing

    NASA Technical Reports Server (NTRS)

    Edwards, W.

    1973-01-01

    The work accomplished on probabilistic information processing (PIP) is reported. The research proposals and decision analysis are discussed along with the results of research on MSC setting, multiattribute utilities, and Bayesian research. Abstracts of reports concerning the PIP research are included.

  1. On the applicability of probabilistics

    SciTech Connect

    Roth, P.G.

    1996-12-31

    GEAE's traditional lifing approach, based on Low Cycle Fatigue (LCF) curves, is evolving for fracture critical powder metal components by incorporating probabilistic fracture mechanics analysis. Supporting this move is a growing validation database which convincingly demonstrates that probabilistics work given the right inputs. Significant efforts are being made to ensure the right inputs. For example, Heavy Liquid Separation (HLS) analysis has been developed to quantify and control inclusion content (1). Also, an intensive seeded fatigue program providing a model for crack initiation at inclusions is ongoing (2). Despite the optimism and energy, probabilistics are only tools and have limitations. Designing to low failure probabilities helps provide protection, but other strategies are needed to protect against surprises. A low risk design limit derived from a predicted failure distribution can lead to a high risk deployment if there are unaccounted-for deviations from analysis assumptions. Recognized deviations which are statistically quantifiable can be integrated into the probabilistic analysis (an advantage of the approach). When deviations are known to be possible but are not properly describable statistically, it may be more appropriate to maintain the traditional position of conservatively bounding relevant input parameters. Finally, safety factors on analysis results may be called for in cases where there is little experience supporting new design concepts or material applications (where unrecognized deviations might be expected).

  2. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael C.

    1993-01-01

    A methodology and attendant computer code were developed and are used to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, stress concentration factors, displacements, stress/strain, etc., which are the consequences of the inherent uncertainties (scatter) in the primitive (independent random) variables (constituent, ply, laminate, and structural) that describe the composite structures. The computer code is IPACS (Integrated Probabilistic Assessment of Composite Structures). IPACS can simulate both composite mechanics and composite structural behavior. Application to probabilistic composite mechanics is illustrated by its use to evaluate the uncertainties in the major Poisson's ratio and in laminate stiffness and strength. IPACS' application to probabilistic structural analysis is illustrated by its use to evaluate the uncertainties in the buckling of a composite plate, the stress concentration factor in a composite panel, and the vertical displacement and ply stress in a composite aircraft wing segment. IPACS' application to probabilistic design is illustrated by its use to assess the thin composite shell (pipe).

  3. Probabilistic assessment of composite structures

    NASA Technical Reports Server (NTRS)

    Shiao, Michael E.; Abumeri, Galib H.; Chamis, Christos C.

    1993-01-01

    A general computational simulation methodology for an integrated probabilistic assessment of composite structures is discussed and demonstrated using aircraft fuselage (stiffened composite cylindrical shell) structures with rectangular cutouts. The computational simulation was performed for the probabilistic assessment of the structural behavior including buckling loads, vibration frequencies, global displacements, and local stresses. The scatter in the structural response is simulated based on the inherent uncertainties in the primitive (independent random) variables at the fiber matrix constituent, ply, laminate, and structural scales that describe the composite structures. The effect of uncertainties due to fabrication process variables such as fiber volume ratio, void volume ratio, ply orientation, and ply thickness is also included. The methodology has been embedded in the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). In addition to the simulated scatter, the IPACS code also calculates the sensitivity of the composite structural behavior to all the primitive variables that influence the structural behavior. This information is useful for assessing reliability and providing guidance for improvement. The results from the probabilistic assessment for the composite structure with rectangular cutouts indicate that the uncertainty in the longitudinal ply stress is mainly caused by the uncertainty in the laminate thickness, and the large overlap of the scatter in the first four buckling loads implies that the buckling mode shape for a specific buckling load can be either of the four modes.

  4. A Probabilistic Cell Tracking Algorithm

    NASA Astrophysics Data System (ADS)

    Steinacker, Reinhold; Mayer, Dieter; Leiding, Tina; Lexer, Annemarie; Umdasch, Sarah

    2013-04-01

    The research described below was carried out during the EU-Project Lolight - development of a low cost, novel and accurate lightning mapping and thunderstorm (supercell) tracking system. The Project aims to develop a small-scale tracking method to determine and nowcast characteristic trajectories and velocities of convective cells and cell complexes. The results of the algorithm will provide a higher accuracy than current locating systems distributed on a coarse scale. Input data for the developed algorithm are two temporally separated lightning density fields. Additionally, a Monte Carlo method minimizing a cost function is utilized, which leads to a probabilistic forecast for the movement of thunderstorm cells. In the first step the correlation coefficients between the first and the second density field are computed. To do so, the first field is shifted by all shifting vectors which are physically allowed. The maximum length of each vector is determined by the maximum possible speed of thunderstorm cells and the difference in time for both density fields. To eliminate ambiguities in determination of directions and velocities, the so called Random Walker of the Monte Carlo process is used. Using this method a grid point is selected at random. Moreover, one vector out of all predefined shifting vectors is suggested - also at random but with a probability that is related to the correlation coefficient. If this exchange of shifting vectors reduces the cost function, the new direction and velocity are accepted. Otherwise it is discarded. This process is repeated until the change of cost functions falls below a defined threshold. The Monte Carlo run gives information about the percentage of accepted shifting vectors for all grid points. In the course of the forecast, amplifications of cell density are permitted. For this purpose, intensity changes between the investigated areas of both density fields are taken into account. Knowing the direction and speed of thunderstorm
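
    A much-simplified sketch of the random-walker idea described above: candidate shift vectors are proposed at random with probabilities related to the correlation between the two lightning-density fields, and a proposal is accepted only if it reduces a cost function. For brevity a single global shift is estimated here instead of a per-grid-point shift field, and the field sizes, cost function, and acceptance rule are assumptions for illustration.

    ```python
    # Simplified Monte Carlo estimation of a thunderstorm-cell displacement
    # between two lightning-density fields (illustrative; a single global shift
    # instead of the per-grid-point shift field used in the full algorithm).
    import numpy as np

    rng = np.random.default_rng(1)
    ny, nx = 60, 60
    field1 = np.zeros((ny, nx))
    field1[20:30, 15:25] = 1.0                          # synthetic cell in the first field
    true_shift = (4, 7)
    field2 = np.roll(field1, true_shift, axis=(0, 1))   # same cell, displaced

    max_shift = 10                                      # physically allowed displacement (assumed)
    candidates = [(dy, dx) for dy in range(-max_shift, max_shift + 1)
                           for dx in range(-max_shift, max_shift + 1)]

    def cost(shift):
        """Mismatch between field1 shifted by 'shift' and field2."""
        return np.sum((np.roll(field1, shift, axis=(0, 1)) - field2) ** 2)

    # Proposal probabilities ~ correlation between shifted field1 and field2
    corrs = np.array([np.corrcoef(np.roll(field1, s, axis=(0, 1)).ravel(),
                                  field2.ravel())[0, 1] for s in candidates])
    probs = np.clip(corrs, 0, None) + 1e-6
    probs /= probs.sum()

    current = candidates[rng.integers(len(candidates))]
    for _ in range(500):                                # "random walker" iterations
        proposal = candidates[rng.choice(len(candidates), p=probs)]
        if cost(proposal) < cost(current):              # accept only if the cost decreases
            current = proposal

    print("estimated shift:", current, "true shift:", true_shift)
    ```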

  5. A probabilistic Hu-Washizu variational principle

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  6. Risk-based modeling of early warning systems for pollution accidents.

    PubMed

    Grayman, W M; Males, R M

    2002-01-01

    An early warning system is a mechanism for detecting, characterizing and providing notification of a source water contamination event (spill event) in order to mitigate the impact of contamination. Spill events are highly probabilistic occurrences, and major spills, which can have very significant impacts on raw water sources of drinking water, are relatively rare. A systematic method for designing and operating early warning systems that considers the highly variable, probabilistic nature of many aspects of the system is described. The methodology accounts for the probability of spills, behavior of monitoring equipment, variable hydrology, and the probability of obtaining information about spills independent of a monitoring system. Spill Risk, a risk-based model using Monte Carlo simulation techniques, has been developed and its utility has been demonstrated as part of an AWWA Research Foundation sponsored project. The model has been applied to several hypothetical river situations and to an actual section of the Ohio River. Additionally, the model has been systematically applied to a wide range of conditions in order to develop general guidance on design of early warning systems.
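
    An illustrative Monte Carlo fragment, not the Spill Risk model itself: spill occurrence is sampled from a Poisson rate, and warning can come either from the monitoring station or from an independent notification channel. All rates and probabilities are invented.

    ```python
    # Illustrative Monte Carlo of an early warning system (not the Spill Risk model):
    # estimate how often a spill is detected before it reaches a drinking-water intake.
    import numpy as np

    rng = np.random.default_rng(7)
    n_years = 10_000
    spill_rate_per_year = 0.8          # assumed frequency of upstream spills
    p_monitor_detect = 0.85            # assumed probability the monitor catches a given spill
    p_other_notification = 0.30        # assumed chance of learning about the spill some other way

    detected, total = 0, 0
    for _ in range(n_years):
        for _ in range(rng.poisson(spill_rate_per_year)):
            total += 1
            if rng.random() < p_monitor_detect or rng.random() < p_other_notification:
                detected += 1

    print(f"spills: {total}, warned in time: {detected} ({detected / total:.1%})")
    ```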

  7. Arsenic speciation driving risk based corrective action.

    PubMed

    Marlborough, Sidney J; Wilson, Vincent L

    2015-07-01

    The toxicity of arsenic depends on a number of factors including its valence state. The more potent trivalent arsenic [arsenite (As3+)] inhibits a large number of cellular enzymatic pathways involved in energy production, while the less toxic pentavalent arsenic [arsenate (As5+)] interferes with phosphate metabolism, phosphoproteins and ATP formation (uncoupling of oxidative phosphorylation). Environmental risk based corrective action for arsenic contamination utilizes data derived from arsenite studies of toxicity to be conservative. However, depending upon environmental conditions, the arsenate species may predominate substantially, especially in well aerated surface soils. Analyses of soil concentrations of arsenic species at two sites in northeastern Texas historically contaminated with arsenical pesticides yielded mean arsenate concentrations above 90% of total arsenic with the majority of the remainder being the trivalent arsenite species. Ecological risk assessments based on the concentration of the trivalent arsenite species will lead to restrictive remediation requirements that do not adequately reflect the level of risk associated with the predominant species of arsenic found in the soil. Given its greater concentration in soils, the pentavalent arsenate species would be the more appropriate species to monitor during remediation at sites that contain high arsenate to arsenite ratios. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Risk-Based Decision Support of Water Resource Management Alternatives

    DTIC Science & Technology

    2006-12-01

    22.5 kilometers) in Pennsylvania and Maryland, was created in 1928 with the completion of the Conowingo dam. [1] The Conowingo system gradually... The Conowingo Dam is one of four hydroelectric projects on the lower Susquehanna River. All are regulated by the Federal Energy Regulatory Commission... components. In the Conowingo Dam system, the dam itself is a structural component; turbines, flood gates, and related equipment are operating components

  9. Toward a probabilistic definition of seizures.

    PubMed

    Osorio, Ivan; Lyubushin, Alexey; Sornette, Didier

    2011-12-01

    This writing (1) draws attention to the intricacies inherent to the pursuit of a universal seizure definition even when powerful, well-understood signal analysis methods are used to this end; (2) identifies this aim as a multi-objective optimization problem and discusses the advantages and disadvantages of adopting or rejecting a unitary seizure definition; and (3) introduces a probabilistic measure of seizure activity to manage this thorny issue. The challenges posed by the attempt to define seizures unitarily may be partly related to their fractal properties and understood through a simplistic analogy to the so-called "Richardson effect." A revision of the time-honored conceptualization of seizures may be warranted to further advance epileptology. This article is part of a Supplemental Special Issue entitled The Future of Automated Seizure Detection and Prediction. Copyright © 2011 Elsevier Inc. All rights reserved.

  10. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.

  11. Risk-Based Data Management System design specifications and implementation plan for the Alaska Oil and Gas Conservation Commission; the Mississippi State Oil and Gas Board; the Montana Board of Oil and Gas Conservation; and the Nebraska Oil and Gas Conservation Commission

    SciTech Connect

    Not Available

    1993-09-01

    The purpose of this document is to present design specifications and an implementation schedule for the development and implementation of Risk Based Data Management Systems (RBDMSs) in the states of Alaska, Mississippi, Montana, and Nebraska. The document presents detailed design information including a description of the system database structure, data dictionary, data entry and inquiry screen layouts, specifications for standard reports that will be produced by the system, functions and capabilities (including environmental risk analyses), and table relationships for each database table within the system. This design information provides a comprehensive blueprint of the system to be developed and presents the necessary detailed information for system development and implementation. A proposed schedule for development and implementation also is presented. The schedule presents timeframes for the development of system modules, training, implementation, and providing assistance to the states with data conversion from existing systems. However, the schedule will vary depending upon the timing of funding allocations from the United States Department of Energy (DOE) for the development and implementation phase of the project. For planning purposes, the schedule assumes that initiation of the development and implementation phase will commence November 1, 1993, somewhat later than originally anticipated.

  12. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and provides insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
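
    A toy illustration of the core idea, not the authors' circuit model: a bounded synaptic strength updated by reward-dependent plasticity converges toward the posterior probability of reward given the cue. The plasticity rate and task statistics below are assumptions.

    ```python
    # Toy illustration (not the authors' circuit model): a bounded synapse with
    # reward-dependent plasticity converges to P(reward | cue).
    import numpy as np

    rng = np.random.default_rng(3)
    p_reward_given_cue = 0.7      # assumed task statistics
    alpha = 0.05                  # assumed plasticity rate
    c = 0.5                       # synaptic strength, bounded in [0, 1]

    history = []
    for _ in range(2000):
        rewarded = rng.random() < p_reward_given_cue
        if rewarded:
            c += alpha * (1 - c)  # potentiation, bounded above by 1
        else:
            c -= alpha * c        # depression, bounded below by 0
        history.append(c)

    print("synaptic strength after learning:", round(np.mean(history[-500:]), 3),
          "target posterior:", p_reward_given_cue)
    ```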

  13. The probabilistic no miracles argument.

    PubMed

    Sprenger, Jan

    This paper develops a probabilistic reconstruction of the No Miracles Argument (NMA) in the debate between scientific realists and anti-realists. The goal of the paper is to clarify and to sharpen the NMA by means of a probabilistic formalization. In particular, I demonstrate that the persuasive force of the NMA depends on the particular disciplinary context where it is applied, and the stability of theories in that discipline. Assessments and critiques of "the" NMA, without reference to a particular context, are misleading and should be relinquished. This result has repercussions for recent anti-realist arguments, such as the claim that the NMA commits the base rate fallacy (Howson (2000), Magnus and Callender (Philosophy of Science, 71:320-338, 2004)). It also helps to explain the persistent disagreement between realists and anti-realists.

  14. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  15. Risk based treatment selection and optimization of contaminated site remediation

    SciTech Connect

    Heitzer, A.; Scholz, R.W.

    1995-12-31

    During the past few years numerous remediation technologies for the cleanup of contaminated sites have been developed. Because of the associated uncertainties concerning treatment reliability it is important to develop strategies to characterize their risks to achieve the cleanup requirements. For this purpose it is necessary to integrate existing knowledge on treatment efficacy and efficiency into the planning process for the management of contaminated sites. Based on field-scale experience data for the remediation of soils contaminated with petroleum hydrocarbons, two treatment technologies, biological land treatment and physico-chemical soil washing, were analyzed with respect to their general performance risks to achieve given cleanup standards. For a specific contamination scenario, efficient application ranges were identified using the method of linear optimization in combination with sensitivity analysis. Various constraints including cleanup standards, available financial budget, amount of contamination and others were taken into account. While land treatment was found to be most efficient at higher cleanup standards and less contaminated soils, soil washing exhibited better efficiency at lower cleanup standards and more heavily contaminated soils. These results compare favorably with practical experiences and indicate the utility of this approach to support decision making and planning processes for the general management of contaminated sites. In addition, the method allows for the simultaneous integration of various aspects such as risk based characteristics of treatment technologies, cleanup standards and more general ecological and economical remedial action objectives.
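
    A small linear-programming sketch of the treatment-allocation idea: split a contaminated soil volume between land treatment and soil washing at minimum cost while meeting a volume-averaged cleanup standard. All costs, removal efficiencies, and standards are invented for illustration and are not the field-scale values used in the study.

    ```python
    # Illustrative LP for risk-based treatment selection (costs, efficiencies and
    # standards are invented; not the values from the cited field-scale data).
    import numpy as np
    from scipy.optimize import linprog

    total_soil = 100.0                      # tons of contaminated soil
    c0 = 5000.0                             # initial TPH concentration, mg/kg
    standard = 500.0                        # cleanup standard, mg/kg (volume-averaged)

    cost = np.array([60.0, 150.0])          # $/ton: land treatment, soil washing
    residual = c0 * np.array([0.15, 0.05])  # mg/kg left after each treatment

    # minimize cost @ x  subject to  residual @ x <= standard * total_soil,
    #                                x1 + x2 = total_soil,  x >= 0
    res = linprog(c=cost,
                  A_ub=[residual], b_ub=[standard * total_soil],
                  A_eq=[[1.0, 1.0]], b_eq=[total_soil],
                  bounds=[(0, None), (0, None)])

    print("tons to land treatment / soil washing:", res.x.round(1))
    print("minimum cost: $", round(res.fun, 0))
    ```

    Re-running the optimization while varying the standard or the budget constraint is one simple way to reproduce the kind of sensitivity analysis described above.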

  16. Probabilistic risk assessment of HTGRs

    SciTech Connect

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1980-08-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the US Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed.

  17. Probabilistic Simulation for Nanocomposite Characterization

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Coroneos, Rula M.

    2007-01-01

    A unique probabilistic theory is described to predict the properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions.

  18. A Simple Probabilistic Combat Model

    DTIC Science & Technology

    2016-06-13

    model described here, all attrition is modeled probabilistically and it is possible (although unlikely) for the weaker side to be successful. The model... integrals are plotted as a function of the number of waves in the lower right plot. Since we start at a point in the space, there is a clear winner... The previous plots have shown how the probability distribution of red and blue survivors evolves

  19. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
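
    A compact Monte Carlo sketch of the eigenvalue-based instability criterion for a second-order system: sample uncertain damping and cross-coupled stiffness, form the first-order state matrix, and count samples with any eigenvalue in the right half-plane. The rotor parameters and distributions are assumed, and plain Monte Carlo stands in for the fast probability integration and adaptive importance sampling methods developed in the paper.

    ```python
    # Monte Carlo estimate of the probability of rotordynamic instability for a
    # 2-DOF rotor with cross-coupled stiffness (parameters/distributions assumed;
    # plain sampling stands in for fast probability integration).
    import numpy as np

    rng = np.random.default_rng(11)
    m, k = 1.0, 1.0e4                      # mass and direct stiffness (deterministic here)
    n_samples = 20_000
    unstable = 0

    for _ in range(n_samples):
        c = rng.normal(10.0, 2.0)          # uncertain direct damping
        q = rng.normal(900.0, 300.0)       # uncertain cross-coupled stiffness
        M = np.diag([m, m])
        C = np.diag([c, c])
        K = np.array([[k, q], [-q, k]])
        # First-order form: d/dt [x, v] = A [x, v]
        A = np.block([[np.zeros((2, 2)), np.eye(2)],
                      [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
        if np.max(np.linalg.eigvals(A).real) > 0.0:
            unstable += 1

    print("estimated probability of instability:", unstable / n_samples)
    ```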

  20. Applications of Probabilistic Risk Assessment

    SciTech Connect

    Burns, K.J.; Chapman, J.R.; Follen, S.M.; O'Regan, P.J.

    1991-05-01

    This report provides a summary of potential and actual applications of Probabilistic Risk Assessment (PRA) technology and insights. Individual applications are derived from the experiences of a number of US nuclear utilities. This report identifies numerous applications of PRA techniques beyond those typically associated with PRAs. In addition, believing that the future use of PRA techniques should not be limited to those of the past, areas of plant operations, maintenance, and financial resource allocation are discussed. 9 refs., 3 tabs.

  1. Probabilistic tractography using Lasso bootstrap.

    PubMed

    Ye, Chuyang; Prince, Jerry L

    2017-01-01

    Diffusion magnetic resonance imaging (dMRI) can be used for noninvasive imaging of white matter tracts. Using fiber tracking, which propagates fiber streamlines according to fiber orientations (FOs) computed from dMRI, white matter tracts can be reconstructed for investigation of brain diseases and the brain connectome. Because of image noise, probabilistic tractography has been proposed to characterize uncertainties in FO estimation. Bootstrap provides a nonparametric approach to the estimation of FO uncertainties and residual bootstrap has been used for developing probabilistic tractography. However, recently developed models have incorporated sparsity regularization to reduce the required number of gradient directions to resolve crossing FOs, and the residual bootstrap used in previous methods is not applicable to these models. In this work, we propose a probabilistic tractography algorithm named Lasso bootstrap tractography (LBT) for the models that incorporate sparsity. Using a fixed tensor basis and a sparsity assumption, diffusion signals are modeled using a Lasso formulation. With the residuals from the Lasso model, a distribution of diffusion signals is obtained according to a modified Lasso bootstrap strategy. FOs are then estimated from the synthesized diffusion signals by an algorithm that improves FO estimation by enforcing spatial consistency of FOs. Finally, streamline fiber tracking is performed with the computed FOs. The LBT algorithm was evaluated on simulated and real dMRI data both qualitatively and quantitatively. Results demonstrate that LBT outperforms state-of-the-art algorithms. Copyright © 2016 Elsevier B.V. All rights reserved.
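
    A generic residual-bootstrap sketch around a Lasso fit on synthetic data, shown only to illustrate the mechanics: fit, resample the residuals, refit, and use the spread of the refitted coefficients to characterize uncertainty. This is not the dMRI signal model, tensor basis, or modified Lasso bootstrap of the paper.

    ```python
    # Generic Lasso residual-bootstrap sketch on synthetic data (illustrative only;
    # not the dMRI signal model or the modified Lasso bootstrap of the paper).
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(5)
    n_meas, n_basis = 60, 100                   # e.g. gradient directions vs. basis size
    X = rng.standard_normal((n_meas, n_basis))
    beta_true = np.zeros(n_basis)
    beta_true[[3, 47, 81]] = [1.5, -2.0, 1.0]   # sparse "fiber" contributions
    y = X @ beta_true + 0.1 * rng.standard_normal(n_meas)

    base = Lasso(alpha=0.05).fit(X, y)
    residuals = y - base.predict(X)

    n_boot = 200
    coefs = np.empty((n_boot, n_basis))
    for b in range(n_boot):
        y_star = base.predict(X) + rng.choice(residuals, size=n_meas, replace=True)
        coefs[b] = Lasso(alpha=0.05).fit(X, y_star).coef_

    # The spread of the bootstrap fits characterizes uncertainty in each coefficient
    print("bootstrap std of the three active coefficients:",
          coefs[:, [3, 47, 81]].std(axis=0).round(3))
    ```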

  2. Aggregation of Trust for Iterated Belief Revision in Probabilistic Logics

    NASA Astrophysics Data System (ADS)

    Pardo, Pere

    In this paper it is shown how communication about trust in a multi-agent system may be used to endow agents with belief change capabilities, in a probabilistic logical framework. Belief change operators are obtained in an intuitive, principled way using aggregation operators for trust-values. Under additional conditions, such change operators may be proved to be maxichoice. The present approach constitutes a sound method for autonomous uncertainty management in multi-agent systems.

  3. Environmental probabilistic quantitative assessment methodologies

    NASA Astrophysics Data System (ADS)

    Crovelli, Robert A.

    1995-10-01

    Probabilistic methodologies developed originally for one area of application may be applicable in another area. Therefore, it is extremely important to communicate across disciplines. Of course, a physical reinterpretation is necessary and perhaps some modification of the methodology. This seems to be the situation in applying resource assessment methodologies as environmental assessment methodologies. In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. It is ironic that oil as a precious resource in the ground can become a serious pollutant as a spill in the ocean. There are similarities in both situations where the quantity of undiscovered crude oil and natural gas resources, and the quantity of a pollutant or contaminant are to be estimated. Obviously, we are interested in making a quantitative assessment in order to answer the question, "How much material is there?" For situations in which there is a lack of statistical data, risk analysis is used rather than classical statistical analysis. That is, a relatively subjective evaluation is made rather than an evaluation based on random sampling which may be impossible. Hence, probabilistic quantitative assessment methodologies are needed for the risk analysis. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: (1) direct assessment, (2) accumulation size, (3) volumetric yield, and (4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz., TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. TRIAGG

  4. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    NASA Astrophysics Data System (ADS)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the ''risk triangle'' concept and the explicit identification of three risk elements: ''Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure) while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure) In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature, but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.

  5. Is probabilistic bias analysis approximately Bayesian?

    PubMed Central

    MacLehose, Richard F.; Gustafson, Paul

    2011-01-01

    Case-control studies are particularly susceptible to differential exposure misclassification when exposure status is determined following incident case status. Probabilistic bias analysis methods have been developed as ways to adjust standard effect estimates based on the sensitivity and specificity of exposure misclassification. The iterative sampling method advocated in probabilistic bias analysis bears a distinct resemblance to a Bayesian adjustment; however, it is not identical. Furthermore, without a formal theoretical framework (Bayesian or frequentist), the results of a probabilistic bias analysis remain somewhat difficult to interpret. We describe, both theoretically and empirically, the extent to which probabilistic bias analysis can be viewed as approximately Bayesian. While the differences between probabilistic bias analysis and Bayesian approaches to misclassification can be substantial, these situations often involve unrealistic prior specifications and are relatively easy to detect. Outside of these special cases, probabilistic bias analysis and Bayesian approaches to exposure misclassification in case-control studies appear to perform equally well. PMID:22157311
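
    A simple probabilistic bias analysis fragment in the spirit of the methods discussed above: sample sensitivity and specificity of exposure classification from assumed distributions, back-correct the observed 2x2 counts, and collect the distribution of bias-adjusted odds ratios. The study counts and bias-parameter ranges are invented, and only the non-differential case is shown.

    ```python
    # Simple probabilistic bias analysis for exposure misclassification in a
    # case-control study (invented data and bias parameters, non-differential case).
    import numpy as np

    rng = np.random.default_rng(2)
    n_cases, n_controls = 500, 500
    exp_cases_obs, exp_controls_obs = 200, 150       # observed exposed counts

    n_sim = 50_000
    ors = []
    for _ in range(n_sim):
        se = rng.uniform(0.75, 0.95)                 # assumed sensitivity range
        sp = rng.uniform(0.85, 0.99)                 # assumed specificity range
        # Back-correct the observed counts for misclassification
        a1 = (exp_cases_obs - (1 - sp) * n_cases) / (se + sp - 1)
        a0 = (exp_controls_obs - (1 - sp) * n_controls) / (se + sp - 1)
        if 0 < a1 < n_cases and 0 < a0 < n_controls:
            ors.append((a1 / (n_cases - a1)) / (a0 / (n_controls - a0)))

    ors = np.array(ors)
    print("observed OR:", round((200 / 300) / (150 / 350), 2))
    print("bias-adjusted OR median and 95% interval:",
          np.percentile(ors, [50, 2.5, 97.5]).round(2))
    ```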

  6. Probabilistic Aeroelastic Analysis of Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, S. K.; Stefko, G. L.

    2004-01-01

    A probabilistic approach is described for aeroelastic analysis of turbomachinery blade rows. Blade rows with subsonic flow and blade rows with supersonic flow with subsonic leading edge are considered. To demonstrate the probabilistic approach, the flutter frequency, damping and forced response of a blade row representing a compressor geometry is considered. The analysis accounts for uncertainties in structural and aerodynamic design variables. The results are presented in the form of probability density function (PDF) and sensitivity factors. For the subsonic flow cascade, comparisons are also made with different probabilistic distributions, probabilistic methods, and Monte-Carlo simulation. The results show that the probabilistic approach provides a more realistic and systematic way to assess the effect of uncertainties in design variables on the aeroelastic instabilities and response.

  7. Quantitative Integration of NDT with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The superposed probabilistic pattern over the deterministic one is implemented via Monte-Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables a better substantiation of both the component reliability management and the cost-effectiveness of NDI timing.
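
    A Monte Carlo sketch of the idea of coupling a POD curve to probabilistic fracture mechanics: flaws detected by NDI are assumed repaired, and the failure probability of the remaining population is compared with the no-inspection case. A simple K >= K_Ic criterion replaces the full FAD evaluation, and every distribution and POD parameter below is assumed for illustration.

    ```python
    # Monte Carlo sketch of combining NDI (via a POD curve) with probabilistic
    # fracture mechanics; a simple K >= K_Ic criterion replaces the full FAD, and
    # all distributions / POD parameters are assumed for illustration.
    import numpy as np

    rng = np.random.default_rng(9)
    n = 200_000
    a = rng.exponential(scale=1.5e-3, size=n)          # flaw depth [m]
    sigma = rng.normal(250e6, 25e6, size=n)            # applied stress [Pa]
    K_ic = rng.normal(60e6, 5e6, size=n)               # fracture toughness [Pa*sqrt(m)]
    Y = 1.12                                           # geometry factor (assumed)

    K = Y * sigma * np.sqrt(np.pi * a)                 # stress intensity factor
    fails = K >= K_ic

    # POD curve: larger flaws are more likely to be found and repaired before service
    a90 = 2.0e-3                                       # assumed flaw depth with ~90% detection
    pod = 1.0 - np.exp(-2.3 * a / a90)
    detected = rng.random(n) < pod

    p_fail_no_ndi = fails.mean()
    p_fail_with_ndi = (fails & ~detected).mean()       # only undetected flaws remain in service
    print(f"P(failure) without NDI: {p_fail_no_ndi:.2e}")
    print(f"P(failure) with NDI:    {p_fail_with_ndi:.2e}")
    ```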

  8. 6 CFR 27.230 - Risk-based performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Risk-based performance standards. 27.230 Section 27.230 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY CHEMICAL FACILITY ANTI-TERRORISM STANDARDS Chemical Facility Security Program § 27.230 Risk-based performance...

  9. 6 CFR 27.230 - Risk-based performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 6 Domestic Security 1 2014-01-01 2014-01-01 false Risk-based performance standards. 27.230 Section 27.230 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY CHEMICAL FACILITY ANTI-TERRORISM STANDARDS Chemical Facility Security Program § 27.230 Risk-based performance...

  10. Risk-based versus deterministic explosives safety criteria

    SciTech Connect

    Wright, R.E.

    1996-12-01

    The Department of Defense Explosives Safety Board (DDESB) is actively considering ways to apply risk-based approaches in its decision-making processes. As such, an understanding of the impact of converting to risk-based criteria is required. The objectives of this project are to examine the benefits and drawbacks of risk-based criteria and to define the impact of converting from deterministic to risk-based criteria. Conclusions will be couched in terms that allow meaningful comparisons of deterministic and risk-based approaches. To this end, direct comparisons of the consequences and impacts of both deterministic and risk-based criteria at selected military installations are made. Deterministic criteria used in this report are those in DoD 6055.9-STD, "DoD Ammunition and Explosives Safety Standard." Risk-based criteria selected for comparison are those used by the government of Switzerland, "Technical Requirements for the Storage of Ammunition (TLM 75)." The risk-based criteria used in Switzerland were selected because they have been successfully applied for over twenty-five years.

  11. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluations are in terms of probability cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  12. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque, (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables, and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  13. Probabilistic Analysis of Ground-Holding Strategies

    NASA Technical Reports Server (NTRS)

    Sheel, Minakshi

    1997-01-01

    The Ground-Holding Policy Problem (GHPP) has become a matter of great interest in recent years because of the high cost incurred by aircraft suffering from delays. Ground-holding keeps a flight on the ground at the departure airport if it is known it will be unable to land at the arrival airport. The GHPP consists of determining how many flights should be held on the ground before take-off and for how long, in order to minimize the cost of delays. When the uncertainty associated with airport landing capacity is considered, the GHPP becomes complicated. A decision support system that incorporates this uncertainty, solves the GHPP quickly, and gives good results would be of great help to air traffic management. The purpose of this thesis is to modify and analyze a probabilistic ground-holding algorithm by applying it to two common cases of capacity reduction. A graphical user interface was developed and sensitivity analysis was done on the algorithm, in order to see how it may be implemented in practice. The sensitivity analysis showed the algorithm was very sensitive to the number of probabilistic capacity scenarios used and to the cost ratio of air delay to ground delay. The algorithm was not particularly sensitive to the number of periods that the time horizon was divided into. In terms of cost savings, a ground-holding policy was the most beneficial when demand greatly exceeded airport capacity. When compared to other air traffic flow strategies, the ground-holding algorithm performed the best and was the most consistent under various situations. The algorithm can solve large problems quickly and efficiently on a personal computer.
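
    A one-period simplification of the ground-holding idea, with invented numbers and a simple enumeration in place of the thesis's algorithm: given probabilistic landing-capacity scenarios and a cost ratio of air delay to ground delay, choose the number of flights to hold that minimizes expected cost.

    ```python
    # One-period ground-holding sketch (invented numbers; a simplification of the
    # probabilistic ground-holding algorithm discussed above).
    import numpy as np

    demand = 60                                  # flights scheduled to arrive
    scenarios = np.array([60, 45, 30])           # possible landing capacities
    probs = np.array([0.5, 0.3, 0.2])            # their probabilities
    ground_cost, air_cost = 1.0, 3.0             # cost per delayed flight (air delay is costlier)

    def expected_cost(held):
        """Expected cost when 'held' flights are delayed on the ground."""
        airborne = demand - held
        # Flights exceeding the realized capacity must absorb (expensive) air delay
        air_delays = np.maximum(airborne - scenarios, 0)
        return ground_cost * held + air_cost * np.dot(probs, air_delays)

    costs = {h: expected_cost(h) for h in range(demand + 1)}
    best = min(costs, key=costs.get)
    print("hold", best, "flights on the ground; expected cost =", round(costs[best], 1))
    ```

    Changing the air-to-ground cost ratio or the set of capacity scenarios and re-running the enumeration mimics the sensitivity analysis described in the abstract.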

  14. Air Quality Monitoring: Risk-Based Choices

    NASA Technical Reports Server (NTRS)

    James, John T.

    2009-01-01

    Air monitoring is secondary to rigid control of risks to air quality. Air quality monitoring requires us to target the credible residual risks. Constraints on monitoring devices are severe. Must transition from archival to real-time, on-board monitoring. Must provide data to crew in a way that they can interpret findings. Dust management and monitoring may be a major concern for exploration class missions.

  15. Chronic kidney disease: towards a risk-based approach.

    PubMed

    Taal, Maarten W

    2016-12-01

    Chronic kidney disease (CKD) affects 8-16% of adults worldwide and is associated with multiple adverse outcomes. It includes a heterogeneous group of conditions with widely varied associated risks; risk stratification is therefore vital for clinical management. Use of the CKD Epidemiology Collaboration (CKD-EPI) equation to estimate glomerular filtration rate (GFR) instead of the Modification of Diet in Renal Disease (MDRD) equation will reduce, though not eliminate, over-diagnosis of CKD. Cystatin C is recommended as an alternative measure of GFR but is not yet widely used. A new classification system for CKD, which includes GFR and albuminuria, has been endorsed by the National Institute for Health and Care Excellence to aid risk stratification and a recently validated formula, requiring only age, gender, eGFR and albuminuria, is useful to predict risk of end-stage kidney disease (ESKD). A risk-based approach will facilitate appropriate treatment for people at high risk of developing ESKD while sparing the majority, who are at low risk, from unnecessary intervention. © Royal College of Physicians 2016. All rights reserved.

  16. Probabilistic approach to EMP assessment

    SciTech Connect

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program.

  17. Mixed deterministic and probabilistic networks

    PubMed Central

    Dechter, Rina

    2010-01-01

    The paper introduces mixed networks, a new graphical model framework for expressing and reasoning with probabilistic and deterministic information. The motivation to develop mixed networks stems from the desire to fully exploit the deterministic information (constraints) that is often present in graphical models. Several concepts and algorithms specific to belief networks and constraint networks are combined, achieving computational efficiency, semantic coherence and user-interface convenience. We define the semantics and graphical representation of mixed networks, and discuss the two main types of algorithms for processing them: inference-based and search-based. A preliminary experimental evaluation shows the benefits of the new model. PMID:20981243

  18. Probabilistic risk assessment: Number 219

    SciTech Connect

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  19. Probabilistic Simulation for Nanocomposite Fracture

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.

  20. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri, L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in the decision making for the shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? What are the consequences if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the shuttle are reviewed.

  1. 2009 Space Shuttle Probabilistic Risk Assessment Overview

    NASA Technical Reports Server (NTRS)

    Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.

    2010-01-01

    Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance; thus, assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, as well as provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.

  2. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.

  3. Probabilistic analysis of tsunami hazards

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).

  4. Models for Retrieval with Probabilistic Indexing.

    ERIC Educational Resources Information Center

    Fuhr, Norbert

    1989-01-01

    Describes three models for probabilistic indexing, all based on the Darmstadt automatic indexing approach, and presents experimental evaluation results for each. The discussion covers the improved retrieval effectiveness of probabilistic indexing over binary indexing, and suggestions for using this automatic indexing method with free text terms.…

  5. Probabilistic Cue Combination: Less Is More

    ERIC Educational Resources Information Center

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  6. Error Discounting in Probabilistic Category Learning

    ERIC Educational Resources Information Center

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  8. Exploiting Synoptic-Scale Climate Processes to Develop Nonstationary, Probabilistic Flood Hazard Projections

    NASA Astrophysics Data System (ADS)

    Spence, C. M.; Brown, C.; Doss-Gollin, J.

    2016-12-01

    Climate model projections are commonly used for water resources management and planning under nonstationarity, but they do not reliably reproduce intense short-term precipitation and are instead more skilled at broader spatial scales. To provide a credible estimate of flood trend that reflects climate uncertainty, we present a framework that exploits the connections between synoptic-scale oceanic and atmospheric patterns and local-scale flood-producing meteorological events to develop long-term flood hazard projections. We demonstrate the method for the Iowa River, where high flow episodes have been found to correlate with tropical moisture exports that are associated with a pressure dipole across the eastern continental United States. We characterize the relationship between flooding on the Iowa River and this pressure dipole through a nonstationary Pareto-Poisson peaks-over-threshold probability distribution estimated based on the historic record. We then combine the results of a trend analysis of dipole index in the historic record with the results of a trend analysis of the dipole index as simulated by General Circulation Models (GCMs) under climate change conditions through a Bayesian framework. The resulting nonstationary posterior distribution of dipole index, combined with the dipole-conditioned peaks-over-threshold flood frequency model, connects local flood hazard to changes in large-scale atmospheric pressure and circulation patterns that are related to flooding in a process-driven framework. The Iowa River example demonstrates that the resulting nonstationary, probabilistic flood hazard projection may be used to inform risk-based flood adaptation decisions.
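
    A sketch of the occurrence half of a covariate-dependent peaks-over-threshold model: annual exceedance counts are Poisson with a rate that depends log-linearly on a synthetic dipole index, fit by maximizing the Poisson likelihood. The data and link are invented; the GPD magnitude model and the Bayesian combination of historic and GCM trends are omitted.

    ```python
    # Sketch of the occurrence half of a nonstationary peaks-over-threshold model:
    # annual exceedance counts ~ Poisson(lambda), log(lambda) = b0 + b1 * dipole index.
    # Synthetic data; the GPD magnitude model and Bayesian GCM trend step are omitted.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(4)
    years = 80
    dipole = rng.standard_normal(years)                       # synthetic dipole index (standardized)
    true_b0, true_b1 = np.log(1.2), 0.5
    counts = rng.poisson(np.exp(true_b0 + true_b1 * dipole))  # exceedances per year

    def neg_log_lik(params):
        b0, b1 = params
        lam = np.exp(b0 + b1 * dipole)
        return np.sum(lam - counts * np.log(lam))             # Poisson NLL up to a constant

    fit = minimize(neg_log_lik, x0=[0.0, 0.0])
    b0_hat, b1_hat = fit.x
    print("fitted rate model: lambda = exp(%.2f + %.2f * dipole)" % (b0_hat, b1_hat))
    # A positive b1 links a strengthening dipole to more frequent flood peaks.
    ```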

  9. Is probabilistic evidence a source of knowledge?

    PubMed

    Friedman, Ori; Turri, John

    2015-07-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B). Denial of knowledge for beliefs based on probabilistic evidence did not arise because participants viewed such beliefs as unjustified, nor because such beliefs leave open the possibility of error. These findings rule out traditional philosophical accounts for why probabilistic evidence does not produce knowledge. The experiments instead suggest that people deny knowledge because they distrust drawing conclusions about an individual based on reasoning about the population to which it belongs, a tendency previously identified by "judgment and decision making" researchers. Consistent with this, participants were more willing to ascribe knowledge for beliefs based on probabilistic evidence that is specific to a particular case (Experiments 3A and 3B).

  10. Belief Propagation for Probabilistic Slow Feature Analysis

    NASA Astrophysics Data System (ADS)

    Omori, Toshiaki; Sekiguchi, Tomoki; Okada, Masato

    2017-08-01

    Slow feature analysis (SFA) is a time-series analysis method for extracting slowly-varying latent features from multi-dimensional data. A recent study proposed a probabilistic framework of SFA using the Bayesian statistical framework. However, the conventional probabilistic framework of SFA cannot accurately extract the slow feature in noisy environments since its marginal likelihood function was approximately derived under the assumption that there exists no observation noise. In this paper, we propose a probabilistic framework of SFA with a rigorously derived marginal likelihood function. Here, we rigorously derive the marginal likelihood function of the probabilistic framework of SFA by using belief propagation. We show using numerical data that the proposed probabilistic framework of SFA can accurately extract the slow feature and underlying parameters for the latent dynamics simultaneously, even in noisy environments.

  11. Weighing costs and losses: A decision making game using probabilistic forecasts

    NASA Astrophysics Data System (ADS)

    Werner, Micha; Ramos, Maria-Helena; Wetterhall, Frederik; Cranston, Michael; van Andel, Schalk-Jan; Pappenberger, Florian; Verkade, Jan

    2017-04-01

    Probabilistic forecasts are increasingly recognised as an effective and reliable tool to communicate uncertainties. The economic value of probabilistic forecasts has been demonstrated by several authors, showing the benefit of using probabilistic forecasts over deterministic forecasts in several sectors, including flood and drought warning, hydropower, and agriculture. Probabilistic forecasting is also central to the emerging concept of risk-based decision making, and underlies emerging paradigms such as impact-based forecasting. Although the economic value of probabilistic forecasts is easily demonstrated in academic works, its evaluation in practice is more complex. The practical use of probabilistic forecasts requires decision makers to weigh the cost of an appropriate response to a probabilistic warning against the projected loss that would occur if the event forecast becomes reality. In this paper, we present the results of a simple game that aims to explore how decision makers are influenced by the costs required for taking a response and the potential losses they face in case the forecast flood event occurs. Participants play the role of one of three different shop owners. Each type of shop has losses of quite different magnitude, should a flood event occur. The shop owners are presented with several forecasts, each with a probability of a flood event occurring, which would inundate their shop and lead to those losses. In response, they have to decide if they want to do nothing, raise temporary defences, or relocate their inventory. Each action comes at a cost, and the different shop owners therefore have quite different cost/loss ratios. The game was played on four occasions. Players were attendees of the ensemble hydro-meteorological forecasting session of the 2016 EGU Assembly, professionals participating at two other conferences related to hydrometeorology, and a group of students. All audiences were familiar with the principles of forecasting
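
    The decision problem underlying the game is the classical cost/loss trade-off: protective action is worthwhile when the forecast probability times the avoidable loss exceeds the cost of acting. The Python sketch below illustrates this rule; the shop types, costs, and loss values are invented for illustration and are not the figures used in the game.

        # Minimal sketch of cost/loss reasoning with a probabilistic flood forecast.
        # Shop types, costs, and losses below are invented for illustration; they are
        # not the values used in the game described above.
        actions = {
            "do nothing":         {"cost": 0,    "residual_loss_fraction": 1.0},
            "temporary defences": {"cost": 500,  "residual_loss_fraction": 0.3},
            "relocate inventory": {"cost": 2000, "residual_loss_fraction": 0.0},
        }

        # Hypothetical full flood loss per shop type if no action is taken.
        shops = {"florist": 3_000, "electronics": 25_000, "bookshop": 8_000}

        def best_action(p_flood: float, full_loss: float) -> str:
            """Pick the action with the lowest expected cost plus expected residual loss."""
            expected = {
                name: a["cost"] + p_flood * a["residual_loss_fraction"] * full_loss
                for name, a in actions.items()
            }
            return min(expected, key=expected.get)

        for shop, loss in shops.items():
            for p in (0.05, 0.2, 0.6):
                print(f"{shop:12s} p={p:.2f} -> {best_action(p, loss)}")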

  12. Probabilistic Reasoning for Plan Robustness

    NASA Technical Reports Server (NTRS)

    Schaffer, Steve R.; Clement, Bradley J.; Chien, Steve A.

    2005-01-01

    A planning system must reason about the uncertainty of continuous variables in order to accurately project the possible system state over time. A method is devised for directly reasoning about the uncertainty in continuous activity duration and resource usage for planning problems. By representing random variables as parametric distributions, computing projected system state can be simplified in some cases. Common approximation and novel methods are compared for over-constrained and lightly constrained domains. The system compares a few common approximation methods for an iterative repair planner. Results show improvements in robustness over the conventional non-probabilistic representation by reducing the number of constraint violations witnessed by execution. The improvement is more significant for larger problems and problems with higher resource subscription levels but diminishes as the system is allowed to accept higher risk levels.

  14. Probabilistic Survivability Versus Time Modeling

    NASA Technical Reports Server (NTRS)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews. The assessments provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
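
    A minimal sketch of the survivability-versus-time idea is shown below: the probability of survival is the probability that egress completes before the hazard becomes lethal, estimated here by Monte Carlo. The lognormal hazard-onset distribution and the egress times are assumptions for illustration, not values from the GSDO assessments.

        # Minimal sketch of a survivability-versus-time curve: the probability that
        # egress to a safe location completes before a hazard becomes lethal.
        # Both the hazard distribution and the egress times are assumed.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 100_000

        # Time (s) until the hazard scenario becomes non-survivable, assumed lognormal.
        hazard_time = rng.lognormal(mean=np.log(120), sigma=0.5, size=n)

        def survival_probability(egress_time_s: float) -> float:
            """Fraction of Monte Carlo trials in which egress beats the hazard."""
            return float(np.mean(egress_time_s < hazard_time))

        for t in (30, 60, 90, 120, 180):
            print(f"egress time {t:4d} s -> P(survive) = {survival_probability(t):.3f}")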

  15. Probabilistic direct counterfactual quantum communication

    NASA Astrophysics Data System (ADS)

    Zhang, Sheng

    2017-02-01

    It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. First, the transmission time is much longer than that of a classical transmission. Second, the chained-cycle structure makes these protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol could evolve to a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).

  16. Probabilistic cloning of equidistant states

    SciTech Connect

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-08-15

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  17. Probabilistic Fatigue Damage Program (FATIG)

    NASA Technical Reports Server (NTRS)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
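
    The closed-form damage expression referred to above can be reproduced in a few lines. Assuming an S-N curve of the form N(S) = A*S^(-m) and Rayleigh-distributed stress amplitudes with scale equal to the stress rms, the expected Miner damage is D = (n_total/A)*(sqrt(2)*sigma)^m * Gamma(1 + m/2). The sketch below compares this closed form with direct numerical integration of Miner's rule; the S-N parameters and rms value are assumed for illustration and are not FATIG defaults.

        # Minimal sketch of probabilistic fatigue damage for a narrow-band random stress
        # (Rayleigh-distributed amplitudes), comparing numerical integration of Miner's
        # rule with the Gamma-function closed form. The S-N parameters are assumed.
        import numpy as np
        from scipy.integrate import quad
        from scipy.special import gamma
        from scipy.stats import rayleigh

        sigma = 40.0          # stress rms (MPa), assumed
        n_total = 1e7         # total number of stress cycles, assumed
        A, m = 1.0e12, 3.0    # assumed S-N curve: N(S) = A * S**(-m)

        # Integral form of Miner's rule: D = n_total * integral of f(S) / N(S) dS
        integrand = lambda s: rayleigh.pdf(s, scale=sigma) * s**m / A
        damage_numeric = n_total * quad(integrand, 0, np.inf)[0]

        # Closed form: D = (n_total / A) * (sqrt(2)*sigma)**m * Gamma(1 + m/2)
        damage_closed = n_total / A * (np.sqrt(2) * sigma) ** m * gamma(1 + m / 2)

        print(damage_numeric, damage_closed)          # the two estimates should agree closely
        fatigue_life_cycles = n_total / damage_closed  # cycles to reach D = 1
        print("estimated fatigue life (cycles):", fatigue_life_cycles)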

  18. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS

    PubMed Central

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-01-01

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677

  19. Accounting for failure: risk-based regulation and the problems of ensuring healthcare quality in the NHS.

    PubMed

    Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry

    2016-05-18

    In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013-2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes.

  20. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
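
    As an illustration of the kind of parameter-uncertainty propagation these probabilistic modules perform, the sketch below pushes Latin hypercube samples of three uncertain inputs through a simple ingestion-dose expression. The dose equation, distributions, and values are assumptions for illustration only; they are not RESRAD defaults.

        # Minimal sketch of propagating uncertain input parameters through a simple
        # dose model with Latin hypercube sampling. The dose equation and the
        # parameter distributions below are assumed for illustration.
        import numpy as np
        from scipy.stats import qmc, lognorm, uniform

        n = 10_000
        sampler = qmc.LatinHypercube(d=3, seed=42)
        u = sampler.random(n)                                    # uniform [0,1) LHS design

        # Assumed parameter distributions (illustrative only):
        soil_conc = lognorm(s=0.8, scale=50.0).ppf(u[:, 0])      # pCi/g
        ingestion_g_per_day = uniform(50, 100).ppf(u[:, 1])      # 50-150 g/day
        dose_factor = lognorm(s=0.3, scale=1e-5).ppf(u[:, 2])    # mrem per pCi

        dose_mrem_yr = soil_conc * ingestion_g_per_day * 365 * dose_factor
        print("mean dose:", dose_mrem_yr.mean(), "95th percentile:", np.percentile(dose_mrem_yr, 95))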

  1. A Unified Probabilistic Framework for Dose–Response Assessment of Human Health Effects

    PubMed Central

    Slob, Wout

    2015-01-01

    management decisions. Citation Chiu WA, Slob W. 2015. A unified probabilistic framework for dose–response assessment of human health effects. Environ Health Perspect 123:1241–1254; http://dx.doi.org/10.1289/ehp.1409385 PMID:26006063

  2. Risk-Based Comparison of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward

    2013-05-01

    In this paper, we describe an integrated probabilistic risk assessment methodological framework and a decision-support tool suite for implementing systematic comparisons of competing carbon capture technologies. Culminating from a collaborative effort among national laboratories under the Carbon Capture Simulation Initiative (CCSI), the risk assessment framework and the decision-support tool suite encapsulate three interconnected probabilistic modeling and simulation components. The technology readiness level (TRL) assessment component identifies specific scientific and engineering targets required by each readiness level and applies probabilistic estimation techniques to calculate the likelihood of graded as well as nonlinear advancement in technology maturity. The technical risk assessment component focuses on identifying and quantifying risk contributors, especially stochastic distributions for significant risk contributors, performing scenario-based risk analysis, and integrating with carbon capture process model simulations and optimization. The financial risk component estimates the long-term return on investment based on energy retail pricing, production cost, operating and power replacement cost, plant construction and retrofit expenses, and potential tax relief, expressed probabilistically as net present value distributions over various forecast horizons.
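
    A minimal sketch of the financial-risk component's output, a net-present-value (NPV) distribution obtained by Monte Carlo over uncertain cost and revenue inputs, is given below. All distributions and dollar values are assumptions for illustration and are not CCSI inputs.

        # Minimal sketch of a probabilistic net-present-value (NPV) calculation of the
        # kind used for a financial-risk component. All distributions and values are
        # assumed for illustration.
        import numpy as np

        rng = np.random.default_rng(7)
        n_sims, years, discount = 20_000, 20, 0.08

        capital_cost = rng.triangular(400e6, 500e6, 700e6, n_sims)   # retrofit cost ($)
        annual_op_cost = rng.normal(30e6, 5e6, n_sims)               # operating cost ($/yr)
        annual_revenue = rng.normal(80e6, 15e6, n_sims)              # energy revenue ($/yr)

        annuity = sum(1.0 / (1 + discount) ** t for t in range(1, years + 1))
        npv = -capital_cost + (annual_revenue - annual_op_cost) * annuity

        print("median NPV:", np.median(npv))
        print("P(NPV < 0):", float(np.mean(npv < 0)))
        print("5th-95th percentile:", np.percentile(npv, [5, 95]))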

  3. Application impact analysis: a risk-based approach to business continuity and disaster recovery.

    PubMed

    Epstein, Beth; Khan, Dawn Christine

    2014-01-01

    There are many possible disruptions that can occur in business, and business continuity is easy to overlook or under-plan; doing it well requires time, understanding, and careful planning. Business Continuity Management is far more than producing a document and declaring business continuity success. What is the recipe for businesses to achieve continuity management success? Application Impact Analysis (AIA) is a method for understanding an organization's unique business attributes. The AIA cycle involves a risk-based approach to understanding business priority and considering business aspects such as financial, operational, service structure, contractual/legal, and brand impacts. The output of this analysis provides a construct for viewing data, evaluating impact, and delivering results, leading to an approved valuation of Recovery Time Objectives (RTOs).

  4. Probabilistic machine learning and artificial intelligence.

    PubMed

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  5. Probabilistic machine learning and artificial intelligence

    NASA Astrophysics Data System (ADS)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  6. Probabilistic cognition in two indigenous Mayan groups

    PubMed Central

    Fontanari, Laura; Gonzalez, Michel; Vallortigara, Giorgio; Girotto, Vittorio

    2014-01-01

    Is there a sense of chance shared by all individuals, regardless of their schooling or culture? To test whether the ability to make correct probabilistic evaluations depends on educational and cultural guidance, we investigated probabilistic cognition in preliterate and prenumerate Kaqchikel and K’iche’, two indigenous Mayan groups, living in remote areas of Guatemala. Although the tested individuals had no formal education, they performed correctly in tasks in which they had to consider prior and posterior information, proportions and combinations of possibilities. Their performance was indistinguishable from that of Mayan school children and Western controls. Our results provide evidence for the universal nature of probabilistic cognition. PMID:25368160

  7. Probabilistic population projections with migration uncertainty

    PubMed Central

    Azose, Jonathan J.; Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations’ Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated. PMID:27217571

  8. Probabilistic population projections with migration uncertainty.

    PubMed

    Azose, Jonathan J; Ševčíková, Hana; Raftery, Adrian E

    2016-06-07

    We produce probabilistic projections of population for all countries based on probabilistic projections of fertility, mortality, and migration. We compare our projections to those from the United Nations' Probabilistic Population Projections, which uses similar methods for fertility and mortality but deterministic migration projections. We find that uncertainty in migration projection is a substantial contributor to uncertainty in population projections for many countries. Prediction intervals for the populations of Northern America and Europe are over 70% wider, whereas prediction intervals for the populations of Africa, Asia, and the world as a whole are nearly unchanged. Out-of-sample validation shows that the model is reasonably well calibrated.
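
    The effect of migration uncertainty on prediction intervals can be illustrated with a toy simulation: the sketch below adds an AR(1) net-migration rate to an otherwise deterministic growth path and reports the resulting interval. The AR(1) model and all parameter values are assumptions for illustration; this is not the model used in the cited work.

        # Minimal sketch of how stochastic net migration widens population prediction
        # intervals. The AR(1) migration model and all parameters are assumed for
        # illustration; this is not the UN probabilistic projection model.
        import numpy as np

        rng = np.random.default_rng(0)
        n_traj, horizon = 5_000, 35          # trajectories, projection years
        pop0 = 10.0                          # initial population (millions), assumed
        natural_growth = 0.002               # assumed constant natural growth rate

        phi, sigma_eps = 0.8, 0.003          # assumed AR(1) parameters for migration rate
        mig = np.zeros((n_traj, horizon + 1))
        pop = np.full((n_traj, horizon + 1), pop0)

        for t in range(1, horizon + 1):
            mig[:, t] = phi * mig[:, t - 1] + rng.normal(0, sigma_eps, n_traj)
            pop[:, t] = pop[:, t - 1] * (1 + natural_growth + mig[:, t])

        lo, med, hi = np.percentile(pop[:, -1], [10, 50, 90])
        print(f"end-of-horizon population: median {med:.2f}M, 80% interval [{lo:.2f}, {hi:.2f}]M")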

  9. Exploration Health Risks: Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest risk contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest-ranked human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short and long duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of

  11. Risk-based maintenance--techniques and applications.

    PubMed

    Arunraj, N S; Maiti, J

    2007-04-11

    Plant and equipment, however well designed, will not remain safe or reliable if they are not maintained. The general objective of the maintenance process is to use knowledge of failures and accidents to achieve the highest possible safety at the lowest possible cost. The concept of risk-based maintenance was developed to inspect high-risk components with greater frequency and thoroughness, and to maintain them more intensively, in order to meet tolerable risk criteria. Risk-based maintenance methodology provides a tool for maintenance planning and decision making to reduce both the probability of equipment failure and the consequences of failure. In this paper, risk analysis and risk-based maintenance methodologies were identified and classified into suitable classes. The factors affecting the quality of risk analysis were identified and analyzed. The applications, input data, and output data were studied to understand their functioning and efficiency. The review showed that there is no unique way to perform risk analysis and risk-based maintenance. Suitable techniques and methodologies, careful investigation during the risk analysis phase, and detailed and structured results are necessary to make proper risk-based maintenance decisions.
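
    The core ranking step of risk-based maintenance can be sketched very simply: risk is the product of failure probability and failure consequence, and inspection effort is allocated in proportion to that product. In the sketch below the equipment names, probabilities, consequence costs, and the inspection-interval rule are all hypothetical.

        # Minimal sketch of the core risk-based maintenance ranking step:
        # risk = probability of failure x consequence of failure. Equipment names,
        # probabilities, and consequence costs are hypothetical.
        equipment = {
            # name: (annual failure probability, consequence of failure in $)
            "feed pump":      (0.05, 2_000_000),
            "heat exchanger": (0.02, 5_000_000),
            "relief valve":   (0.01, 12_000_000),
            "storage tank":   (0.001, 30_000_000),
        }

        ranked = sorted(equipment.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)

        for name, (p_fail, consequence) in ranked:
            risk = p_fail * consequence
            # crude illustrative policy: inspect high-risk items more frequently
            interval_months = 3 if risk > 100_000 else 12
            print(f"{name:15s} risk=${risk:>10,.0f}/yr  inspect every {interval_months} months")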

  12. A risk-based approach to scheduling audits.

    PubMed

    Rönninger, Stephan; Holmes, Malcolm

    2009-01-01

    The manufacture and supply of pharmaceutical products can be a very complex operation. Companies may purchase a wide variety of materials, from active pharmaceutical ingredients to packaging materials, from "in company" suppliers or from third parties. They may also purchase or contract a number of services, such as analysis, data management, and audit, among others. It is very important that these materials and services are of the requisite quality in order that patient safety and company reputation are adequately protected. Such quality requirements are ongoing throughout the product life cycle. In recent years, assurance of quality has been derived via audit of the supplier or service provider and by using periodic audits, for example, annually or at least once every 5 years. In the past, companies may have used an audit only for what they considered to be "key" materials or services and used testing on receipt, for example, as their quality assurance measure for "less important" supplies. Such approaches changed as a result of pressure from both internal sources and regulators to adopt time-driven audits for all suppliers and service providers. Companies recognised that eventually they would be responsible for the quality of the supplied product or service, and audit, although providing only a "snapshot in time," seemed a convenient way of demonstrating that they were meeting their obligations. Problems, however, still occur with the supplied product or service and will usually be more frequent from certain suppliers. Additionally, some third-party suppliers will no longer accept routine audits from individual companies, as the overall audit load can exceed one external audit per working day. Consequently, a different model is needed for assessing supplier quality. This paper presents a risk-based approach to creating an audit plan and to scheduling the frequency and depth of such audits. The approach is based on the principles and process of the Quality Risk Management

  13. Auxiliary feedwater system risk-based inspection guide for the J. M. Farley Nuclear Power Plant

    SciTech Connect

    Vo, T.V.; Pugh, R.; Gore, B.F.; Harrison, D.G. )

    1990-10-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. J. M. Farley was selected as the second plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the J. M. Farley plant. 23 refs., 1 fig., 1 tab.

  14. Risk-Based Treatment Targets for Onsite Non-Potable Water ...

    EPA Pesticide Factsheets

    This presentation gives risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., municipal wastewater, locally-collected greywater, rainwater, and stormwater). A probabilistic, forward Quantitative Microbial Risk Assessment (QMRA) was used to derive the pathogen log10 reduction targets (LRTs) that corresponded with an infection risk of either 10^-4 per person per year (ppy) or 10^-2 ppy. The QMRA accounted for variation in pathogen concentration and sporadic pathogen occurrence (when data were available) in source waters for reference pathogens Rotavirus, Adenovirus, Norovirus, Campylobacter spp., Salmonella spp., Giardia spp., and Cryptosporidium spp. Non-potable uses included indoor use (for toilet flushing and clothes washing) with accidental ingestion of treated non-potable water (or cross connection with potable water), and unrestricted irrigation for outdoor use. Various exposure scenarios captured the uncertainty from key inputs, i.e., the pathogen concentration in source water; the volume of water ingested; and for the indoor use, the frequency of and the fraction of the population exposed to accidental ingestion. Both potable and non-potable uses required pathogen treatment for the selected waters, and the LRT was generally greater for potable use than non-potable indoor use and unrestricted irrigation. The difference in treatment requirements among source waters was driven by th…
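
    The derivation of a log10 reduction target can be sketched as a forward Monte Carlo: sample the source-water concentration and ingested volume, apply a dose-response model, and find the smallest reduction that keeps the annual infection risk below the benchmark. In the sketch below the concentration distribution, exposure assumptions, and dose-response parameter are illustrative assumptions, not the values used in this work.

        # Minimal sketch of deriving a pathogen log10-reduction target (LRT) by forward
        # QMRA. The source-water concentration, exposure, and dose-response parameters
        # are assumed for illustration and are not the values used in the cited work.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 50_000

        # Source-water pathogen concentration (org/L), assumed lognormal.
        conc = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=n)
        volume_per_event_L = 0.001             # accidental ingestion volume, assumed
        events_per_year = 365                  # exposure events per year, assumed
        r = 0.02                               # exponential dose-response parameter, assumed
        annual_benchmark = 1e-4                # tolerable infection risk per person-year

        def annual_risk(lrt: float) -> float:
            dose = conc * 10.0 ** (-lrt) * volume_per_event_L
            p_event = 1.0 - np.exp(-r * dose)                    # per-event infection risk
            p_year = 1.0 - (1.0 - p_event) ** events_per_year    # annual risk per trial
            return float(np.percentile(p_year, 95))              # protective (95th) percentile

        # Smallest LRT (in 0.5-log steps) that meets the benchmark.
        lrt_target = next(l for l in np.arange(0, 12.5, 0.5) if annual_risk(l) <= annual_benchmark)
        print("log10 reduction target:", lrt_target)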

  15. Auxiliary feedwater system risk-based inspection guide for the Ginna Nuclear Power Plant

    SciTech Connect

    Pugh, R.; Gore, B.F.; Vo, T.V.; Moffitt, N.E. )

    1991-09-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Ginna was selected as the eighth plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Ginna plant. 23 refs., 1 fig., 1 tab.

  16. Auxiliary feedwater system risk-based inspection guide for the Byron and Braidwood nuclear power plants

    SciTech Connect

    Moffitt, N.E.; Gore, B.F.; Vo, T.V. )

    1991-07-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Byron and Braidwood were selected for the fourth study in this program. The product of this effort is a prioritized listing of AFW failures which have occurred at the plants and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Byron/Braidwood plants. 23 refs., 1 fig., 1 tab.

  17. Health economics and outcomes methods in risk-based decision-making for blood safety.

    PubMed

    Custer, Brian; Janssen, Mart P

    2015-08-01

    Analytical methods appropriate for health economic assessments of transfusion safety interventions have not previously been described in ways that facilitate their use. Within the context of risk-based decision-making (RBDM), health economics can be important for optimizing decisions among competing interventions. The objective of this review is to address key considerations and limitations of current methods as they apply to blood safety. Because a voluntary blood supply is an example of a public good, analyses should be conducted from the societal perspective when possible. Two primary study designs are recommended for most blood safety intervention assessments: budget impact analysis (BIA), which measures the cost to implement an intervention both for the blood operator and in a broader context, and cost-utility analysis (CUA), which measures the ratio between costs and health gain achieved, in terms of reduced morbidity and mortality, by use of an intervention. These analyses often have important limitations because data that reflect specific aspects, for example, blood recipient population characteristics or complication rates, are not available. Sensitivity analyses play an important role. The impact of various uncertain factors can be studied conjointly in probabilistic sensitivity analyses. The use of BIA and CUA together provides a comprehensive assessment of the costs and benefits from implementing (or not) specific interventions. RBDM is multifaceted and impacts a broad spectrum of stakeholders. Gathering and analyzing health economic evidence as part of the RBDM process enhances the quality, completeness, and transparency of decision-making.

  18. Auxiliary feedwater system risk-based inspection guide for the South Texas Project nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Nickolaus, J.R.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1993-12-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. South Texas Project was selected as a plant for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by the NRC inspectors in preparation of inspection plans addressing AFW risk important components at the South Texas Project plant.

  19. Auxiliary feedwater system risk-based inspection guide for the McGuire nuclear power plant

    SciTech Connect

    Bumgardner, J.D.; Lloyd, R.C.; Moffitt, N.E.; Gore, B.F.; Vo, T.V.

    1994-05-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. McGuire was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the McGuire plant.

  20. Auxiliary feedwater system risk-based inspection guide for the H. B. Robinson nuclear power plant

    SciTech Connect

    Moffitt, N.E.; Lloyd, R.C.; Gore, B.F.; Vo, T.V.; Garner, L.W.

    1993-08-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. H. B. Robinson was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the H. B. Robinson plant.

  1. Risk-based enteric pathogen reduction targets for non-potable ...

    EPA Pesticide Factsheets

    This paper presents risk-based enteric pathogen log reduction targets for non-potable and potable uses of a variety of alternative source waters (i.e., locally-collected greywater, roof runoff, and stormwater). A probabilistic Quantitative Microbial Risk Assessment (QMRA) was used to derive the pathogen log10 reduction targets (LRTs) that corresponded with an infection risk of either 10^-4 per person per year (ppy) or 10^-2 ppy. The QMRA accounted for variation in pathogen concentration and sporadic pathogen occurrence (when data were available) in source waters for reference pathogens in the genera Rotavirus, Mastadenovirus (human adenoviruses), Norovirus, Campylobacter, Salmonella, Giardia and Cryptosporidium. Non-potable uses included indoor use (for toilet flushing and clothes washing) with occasional accidental ingestion of treated non-potable water (or cross-connection with potable water), and unrestricted irrigation for outdoor use. Various exposure scenarios captured the uncertainty from key inputs, i.e., the pathogen concentration in source water; the volume of water ingested; and for the indoor use, the frequency of and the fraction of the population exposed to accidental ingestion. Both potable and non-potable uses required pathogen treatment for the selected waters and the LRT was generally greater for potable use than non-potable indoor use and unrestricted irrigation. The difference in treatment requirements among source waters was driven by the…

  2. Auxiliary feedwater system risk-based inspection guide for the Point Beach nuclear power plant

    SciTech Connect

    Lloyd, R C; Moffitt, N E; Gore, B F; Vo, T V; Vehec, T A

    1993-02-01

    In a study sponsored by the US Nuclear Regulatory Commission (NRC), Pacific Northwest Laboratory has developed and applied a methodology for deriving plant-specific risk-based inspection guidance for the auxiliary feedwater (AFW) system at pressurized water reactors that have not undergone probabilistic risk assessment (PRA). This methodology uses existing PRA results and plant operating experience information. Existing PRA-based inspection guidance information recently developed for the NRC for various plants was used to identify generic component failure modes. This information was then combined with plant-specific and industry-wide component information and failure data to identify failure modes and failure mechanisms for the AFW system at the selected plants. Point Beach was selected as one of a series of plants for study. The product of this effort is a prioritized listing of AFW failures which have occurred at the plant and at other PWRs. This listing is intended for use by NRC inspectors in the preparation of inspection plans addressing AFW risk-important components at the Point Beach plant.

  3. Probabilistic micromechanics for high-temperature composites

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1993-01-01

    The three-year program of research had the following technical objectives: the development of probabilistic methods for micromechanics-based constitutive and failure models, application of the probabilistic methodology in the evaluation of various composite materials and simulation of expected uncertainties in unidirectional fiber composite properties, and influence of the uncertainties in composite properties on the structural response. The first year of research was devoted to the development of probabilistic methodology for micromechanics models. The second year of research focused on the evaluation of the Chamis-Hopkins constitutive model and Aboudi constitutive model using the methodology developed in the first year of research. The third year of research was devoted to the development of probabilistic finite element analysis procedures for laminated composite plate and shell structures.

  4. Non-unitary probabilistic quantum computing

    NASA Technical Reports Server (NTRS)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  6. Probabilistic cloning of three symmetric states

    SciTech Connect

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-12-15

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  7. Probabilistic regularization in inverse optical imaging.

    PubMed

    De Micheli, E; Viano, G A

    2000-11-01

    The problem of object restoration in the case of spatially incoherent illumination is considered. A regularized solution to the inverse problem is obtained through a probabilistic approach, and a numerical algorithm based on the statistical analysis of the noisy data is presented. Particular emphasis is placed on the question of the positivity constraint, which is incorporated into the probabilistically regularized solution by means of a quadratic programming technique. Numerical examples illustrating the main steps of the algorithm are also given.
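
    A generic stand-in for regularized restoration with a positivity constraint is sketched below: the Tikhonov penalty is folded into an augmented nonnegative least-squares problem. The blur kernel, noise level, and regularization weight are assumptions for illustration; this is a simplified substitute, not the authors' probabilistic algorithm or their quadratic programming formulation.

        # Minimal sketch of regularized object restoration with a positivity constraint.
        # The Tikhonov term is folded into a nonnegative least-squares problem; this is
        # a generic stand-in, not the authors' exact probabilistic algorithm.
        import numpy as np
        from scipy.linalg import toeplitz
        from scipy.optimize import nnls

        rng = np.random.default_rng(5)
        n = 60

        # Assumed 1-D incoherent imaging model: Gaussian blur plus additive noise.
        x_true = np.zeros(n); x_true[20:25] = 1.0; x_true[40] = 2.0
        kernel = np.exp(-0.5 * (np.arange(n) / 3.0) ** 2)
        A = toeplitz(kernel)                       # blur operator
        y = A @ x_true + rng.normal(0, 0.05, n)    # noisy data

        # Solve min ||A x - y||^2 + lam ||x||^2  subject to  x >= 0
        lam = 0.1
        A_aug = np.vstack([A, np.sqrt(lam) * np.eye(n)])
        y_aug = np.concatenate([y, np.zeros(n)])
        x_hat, _ = nnls(A_aug, y_aug)

        print("reconstruction error:", np.linalg.norm(x_hat - x_true))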

  8. Probabilistic Approaches for Evaluating Space Shuttle Risks

    NASA Technical Reports Server (NTRS)

    Vesely, William

    2001-01-01

    The objectives of the Space Shuttle PRA (Probabilistic Risk Assessment) are to: (1) evaluate mission risks; (2) evaluate uncertainties and sensitivities; (3) prioritize contributors; (4) evaluate upgrades; (5) track risks; and (6) provide decision tools. This report discusses the significance of a Space Shuttle PRA and its participants. The elements and type of losses to be included are discussed. The program and probabilistic approaches are then discussed.

  9. Probabilistic/Fracture-Mechanics Model For Service Life

    NASA Technical Reports Server (NTRS)

    Watkins, T., Jr.; Annis, C. G., Jr.

    1991-01-01

    Computer program makes probabilistic estimates of lifetime of engine and components thereof. Developed to fill need for more accurate life-assessment technique that avoids errors in estimated lives and provides for statistical assessment of levels of risk created by engineering decisions in designing system. Implements mathematical model combining techniques of statistics, fatigue, fracture mechanics, nondestructive analysis, life-cycle cost analysis, and management of engine parts. Used to investigate effects of such engine-component life-controlling parameters as return-to-service intervals, stresses, capabilities for nondestructive evaluation, and qualities of materials.

  11. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  12. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
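
    The weakest-link (two-parameter Weibull) failure probability that underlies CARES-type reliability estimates can be written as Pf = 1 - exp[-(sigma/sigma_0)^m]. The sketch below propagates an uncertain applied stress through this expression by Monte Carlo; the Weibull modulus, characteristic strength, and stress distribution are assumed for illustration and are not tied to the software described above.

        # Minimal sketch of the two-parameter Weibull (weakest-link) failure probability
        # for a ceramic component, with uncertain applied stress propagated by
        # Monte Carlo. All parameter values are assumed.
        import numpy as np

        rng = np.random.default_rng(11)

        m = 10.0              # assumed Weibull modulus
        sigma_0 = 500.0       # assumed characteristic strength (MPa) for the stressed volume

        # Uncertain applied stress (MPa) from loads/dimension tolerances, assumed normal.
        applied = rng.normal(300.0, 30.0, size=100_000)

        # Weakest-link failure probability at each sampled stress level.
        p_fail = 1.0 - np.exp(-((np.clip(applied, 0, None) / sigma_0) ** m))

        print("mean failure probability:", p_fail.mean())
        print("95th percentile         :", np.percentile(p_fail, 95))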

  13. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
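
    The planner/scheduler arrangement described above can be illustrated with a toy event time line: occurrence times are sampled, pushed onto a priority queue, and processed in time order. The event names, rates, and mitigation rule below are hypothetical and are not IMM inputs.

        # Minimal sketch of the planner/scheduler idea: medical events are drawn
        # probabilistically, placed on a single mission time line, and processed in
        # time order. Event names and daily rates are hypothetical.
        import heapq
        import random

        random.seed(0)
        mission_days = 180
        event_rates_per_day = {"minor injury": 0.01, "renal stone": 0.001}  # hypothetical

        # Planner: sample event occurrence times and push them onto the time line (queue).
        timeline = []
        for name, rate in event_rates_per_day.items():
            t = 0.0
            while True:
                t += random.expovariate(rate)      # Poisson-process arrival times
                if t > mission_days:
                    break
                heapq.heappush(timeline, (t, name))

        # Scheduler: walk the time line in order and apply simple rules.
        while timeline:
            t, name = heapq.heappop(timeline)
            print(f"day {t:6.1f}: {name} occurs -> apply mitigation rule")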

  14. Dynamical systems probabilistic risk assessment

    SciTech Connect

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
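
    A minimal illustration of how a slowly developing wear-out mechanism makes the risk profile time-dependent: a Weibull hazard with shape parameter greater than one raises a component's yearly failure probability as the plant ages, and with it the probability of a simple two-train top event. All parameters below are hypothetical.

        # Minimal sketch of how a wear-out (aging) mechanism makes a risk profile
        # time-dependent: a Weibull hazard with shape > 1 raises a component's yearly
        # failure probability as the plant ages. All parameters are hypothetical.
        import numpy as np

        shape, scale_years = 2.5, 40.0     # assumed Weibull wear-out parameters

        def yearly_failure_prob(age: float) -> float:
            """P(fail during year [age, age+1] | survived to age) for a Weibull lifetime."""
            surv = lambda t: np.exp(-(t / scale_years) ** shape)
            return 1.0 - surv(age + 1.0) / surv(age)

        for age in (1, 10, 20, 30, 40):
            p_a = yearly_failure_prob(age)           # aging train
            p_b = 0.01                               # redundant train, constant reliability
            top_event = p_a * p_b                    # both trains fail in the same year (independence assumed)
            print(f"year {age:2d}: P(train A fails)={p_a:.4f}, P(top event)={top_event:.2e}")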

  15. Optimal probabilistic dense coding schemes

    NASA Astrophysics Data System (ADS)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

    Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only to qudits (d-level quantum systems with d > 2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.

  16. Probabilistic Elastography: Estimating Lung Elasticity

    PubMed Central

    Risholm, Petter; Ross, James; Washko, George R.; Wells, William M.

    2011-01-01

    We formulate registration-based elastography in a probabilistic framework and apply it to study lung elasticity in the presence of emphysematous and fibrotic tissue. The elasticity calculations are based on a Finite Element discretization of a linear elastic biomechanical model. We marginalize over the boundary conditions (deformation) of the biomechanical model to determine the posterior distribution over elasticity parameters. Image similarity is included in the likelihood, an elastic prior is included to constrain the boundary conditions, while a Markov model is used to spatially smooth the inhomogeneous elasticity. We use a Markov Chain Monte Carlo (MCMC) technique to characterize the posterior distribution over elasticity from which we extract the most probable elasticity as well as the uncertainty of this estimate. Even though registration-based lung elastography with inhomogeneous elasticity is challenging due to the problem's highly underdetermined nature and the sparse image information available in lung CT, we show promising preliminary results on estimating lung elasticity contrast in the presence of emphysematous and fibrotic tissue. PMID:21761697

  17. Probabilistic elastography: estimating lung elasticity.

    PubMed

    Risholm, Petter; Ross, James; Washko, George R; Wells, William M

    2011-01-01

    We formulate registration-based elastography in a probabilistic framework and apply it to study lung elasticity in the presence of emphysematous and fibrotic tissue. The elasticity calculations are based on a Finite Element discretization of a linear elastic biomechanical model. We marginalize over the boundary conditions (deformation) of the biomechanical model to determine the posterior distribution over elasticity parameters. Image similarity is included in the likelihood, an elastic prior is included to constrain the boundary conditions, while a Markov model is used to spatially smooth the inhomogeneous elasticity. We use a Markov Chain Monte Carlo (MCMC) technique to characterize the posterior distribution over elasticity from which we extract the most probable elasticity as well as the uncertainty of this estimate. Even though registration-based lung elastography with inhomogeneous elasticity is challenging due to the problem's highly underdetermined nature and the sparse image information available in lung CT, we show promising preliminary results on estimating lung elasticity contrast in the presence of emphysematous and fibrotic tissue.
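
    The core computation, characterizing a posterior over an elasticity parameter with Markov Chain Monte Carlo, can be illustrated with a one-dimensional toy problem. The sketch below uses a deliberately simple forward model (displacement u = F/E with Gaussian noise) rather than the Finite Element biomechanical model of the paper; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy forward model: displacement u = F / E for applied force F and elasticity E.
F_true, E_true, noise = 10.0, 5.0, 0.05
u_obs = F_true / E_true + rng.normal(0.0, noise, size=20)   # synthetic observations

def log_posterior(E):
    if E <= 0:                                   # positivity constraint on elasticity
        return -np.inf
    resid = u_obs - F_true / E
    return -0.5 * np.sum((resid / noise) ** 2)

# Metropolis-Hastings random walk over the scalar elasticity parameter.
samples, E = [], 1.0
for _ in range(20_000):
    E_prop = E + rng.normal(0.0, 0.2)
    if np.log(rng.random()) < log_posterior(E_prop) - log_posterior(E):
        E = E_prop
    samples.append(E)

post = np.array(samples[5_000:])                 # discard burn-in
print(f"most probable E ~ {np.median(post):.2f}, "
      f"95% CI [{np.percentile(post, 2.5):.2f}, {np.percentile(post, 97.5):.2f}]")
```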

  18. Symbolic representation of probabilistic worlds.

    PubMed

    Feldman, Jacob

    2012-04-01

    Symbolic representation of environmental variables is a ubiquitous and often debated component of cognitive science. Yet notwithstanding centuries of philosophical discussion, the efficacy, scope, and validity of such representation have rarely been given direct consideration from a mathematical point of view. This paper introduces a quantitative measure of the effectiveness of symbolic representation, and develops formal constraints under which such representation is in fact warranted. The effectiveness of symbolic representation hinges on the probabilistic structure of the environment that is to be represented. For arbitrary probability distributions (i.e., environments), symbolic representation is generally not warranted. But in modal environments, defined here as those that consist of mixtures of component distributions that are narrow ("spiky") relative to their spreads, symbolic representation can be shown to represent the environment with a relatively negligible loss of information. Modal environments support propositional forms, logical relations, and other familiar features of symbolic representation. Hence the assumption that our environment is, in fact, modal is a key tacit assumption underlying the use of symbols in cognitive science.

  19. Probabilistic description of traffic flow

    NASA Astrophysics Data System (ADS)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
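
    A minimal sketch of the one-step master-equation picture is shown below: a single car cluster grows and shrinks under attachment and detachment events simulated with the Gillespie algorithm. The transition-rate ansatz used here is a simple placeholder, not the empirically motivated rates discussed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate_cluster(n_cars=100, w_plus=0.5, w_minus=0.4, t_max=1000.0):
    """Gillespie simulation of a one-step growth/shrinkage process for a single car
    cluster (jam) of size n: attachment at a rate that decreases as free cars are
    used up, detachment of the leading car at a constant rate (hypothetical ansatz)."""
    n, t, trace = 0, 0.0, []
    while t < t_max:
        rate_up = w_plus * (n_cars - n) / n_cars       # attachment of a free car
        rate_down = w_minus if n > 0 else 0.0          # detachment from the cluster
        total = rate_up + rate_down
        t += rng.exponential(1.0 / total)
        n += 1 if rng.random() < rate_up / total else -1
        trace.append((t, n))
    return trace

trace = simulate_cluster()
sizes = [n for _, n in trace]
print("mean cluster size:", np.mean(sizes))
```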

  20. Probabilistic modeling of children's handwriting

    NASA Astrophysics Data System (ADS)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and" written in cursive style as well as hand-print were extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining which students may continue to produce letter formations as taught during lessons in school, which students will develop different forms or variations of those letter formations, and the number of different types of letter formations.

  1. Probabilistic risk assessment familiarization training

    SciTech Connect

    Phillabaum, J.L.

    1989-01-01

    Philadelphia Electric Company (PECo) created a Nuclear Group Risk and Reliability Assessment Program Plan in order to focus the utilization of probabilistic risk assessment (PRA) in support of Limerick Generating Station and Peach Bottom Atomic Power Station. The continuation of a PRA program was committed by PECo to the U.S. Nuclear Regulatory Commission (NRC) prior to the issuance of an operating license for Limerick Unit 1. It is believed that increased use of PRA techniques to support activities at Limerick and Peach Bottom will enhance PECo's overall nuclear excellence. Training for familiarization with PRA is designed for attendance once by all nuclear group personnel to understand PRA and its potential effect on their jobs. The training content describes the history of PRA and how it applies to PECo's nuclear activities. Key PRA concepts serve as the foundation for the familiarization training. These key concepts are covered in all classes to facilitate an appreciation of the remaining material, which is tailored to the audience. Some of the concepts covered are comparison of regulatory philosophy to PRA techniques, fundamentals of risk/success, risk equation/risk summation, and fault trees and event trees. Building on the concepts, PRA insights and applications are then described that are tailored to the audience.

  2. Application of the risk-based strategy to the Hanford tank waste organic-nitrate safety issue

    SciTech Connect

    Hunter, V.L.; Colson, S.D.; Ferryman, T.; Gephart, R.E.; Heasler, P.; Scheele, R.D.

    1997-12-01

    This report describes the results from application of the Risk-Based Decision Management Approach for Justifying Characterization of Hanford Tank Waste to the organic-nitrate safety issue in Hanford single-shell tanks (SSTs). Existing chemical and physical models were used, taking advantage of the most current (mid-1997) sampling and analysis data. The purpose of this study is to make specific recommendations for planning characterization to help ensure the safety of each SST as it relates to the organic-nitrate safety issue. An additional objective is to demonstrate the viability of the Risk-Based Strategy for addressing Hanford tank waste safety issues.

  3. A Risk-Based Approach to Test and Evaluation

    DTIC Science & Technology

    2012-05-01

    is it to occur (probability, frequency), and what will be the outcome (consequences)? The SAPHIRE software tool also is introduced as a way to...develop those risk concepts dealing with event trees, fault trees, and desired end states. SAPHIRE is a probabilistic risk, and reliability assessment...software tool. SAPHIRE stands for Systems Analysis Programs for Hands-on Integrated Reliability Evaluations and was developed for the U.S. Nuclear

  4. Development of a risk-based approach to Hanford Site cleanup

    SciTech Connect

    Hesser, W.A.; Daling, P.M.; Baynes, P.A.

    1995-06-01

    In response to a request from Mr. Thomas Grumbly, Assistant Secretary of Energy for Environmental Management, the Hanford Site contractors developed a conceptual set of risk-based cleanup strategies that (1) protect the public, workers, and environment from unacceptable risks; (2) are technically executable; and (3) fit within an expected annual funding profile of 1.05 billion dollars. These strategies were developed because (1) the US Department of Energy and Hanford Site budgets are being reduced, (2) stakeholders are dissatisfied with the perceived rate of cleanup, (3) the US Congress and the US Department of Energy are increasingly focusing on risk and risk-reduction activities, (4) the present strategy is not integrated across the Site and is inconsistent in its treatment of similar hazards, (5) the present cleanup strategy is not cost-effective from a risk-reduction or future land use perspective, and (6) the milestones and activities in the Tri-Party Agreement cannot be achieved with an anticipated funding of 1.05 billion dollars annually. The risk-based strategies described herein were developed through a systems analysis approach that (1) analyzed the cleanup mission; (2) identified cleanup objectives, including risk reduction, land use, and mortgage reduction; (3) analyzed the existing baseline cleanup strategy from a cost and risk perspective; (4) developed alternatives for accomplishing the cleanup mission; (5) compared those alternatives against cleanup objectives; and (6) produced conclusions and recommendations regarding the current strategy and potential risk-based strategies.

  5. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  6. Risk based guideline values and the development of preliminary remediation goals

    SciTech Connect

    Brothers, R.A.; Cox, D.M.; Guty, J.L.; Miller, D.B.; Motheramgari, K.; Stinnette, S.E.

    1995-02-01

    Risk managers at federal facilities often need a risk-based tool to rapidly assess the possible human health risks of large numbers of sites before completing a baseline risk assessment. Risk-based concentrations, based on Preliminary Remediation Goal (PRG) development methodology, can be used as screening guideline values. We have developed a set of guideline values (GVs) for the Mound Facility at Miamisburg, Ohio, that are risk-based, decision-making tools. The GVs are used (with regulatory approval) to rapidly assess the possibility that sites may be considered for "no action" decisions. The GVs are neither PRGs nor final remedial action levels. Development of the GVs on a facilitywide basis incorporated known contaminants of potential concern, physical and chemical characteristics of contaminated media, current and potential future land uses, and exposure pathway assumptions. Because no one site was used in the development process, the GVs can be applied (after consideration of the land use and exposure potential) to any site on the facility. The facilitywide approach will streamline the PRG development process by minimizing the efforts to develop site-specific PRGs for each operable unit at a considerable saving of time and effort.

  7. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    NASA Astrophysics Data System (ADS)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Due to this requisite for reliable forecasts to perform effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
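
    The Poisson-Binomial verification idea can be illustrated directly: given the per-event forecast probabilities, the exact distribution of the number of observed events follows by convolving Bernoulli distributions, and the observed count can then be tested against it. The forecast probabilities and observed count below are hypothetical.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact PMF of the number of successes among independent Bernoulli trials
    with differing probabilities, built up by repeated convolution."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

# Hypothetical forecast probabilities issued for 30 events, and 9 observed occurrences.
forecast_probs = np.linspace(0.05, 0.6, 30)
observed = 9

pmf = poisson_binomial_pmf(forecast_probs)
expected = forecast_probs.sum()
# Two-sided p-value: total probability of counts at least as unlikely as the observed one.
p_value = pmf[pmf <= pmf[observed]].sum()
print(f"expected events = {expected:.1f}, observed = {observed}, p-value = {p_value:.3f}")
```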

  8. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  9. Exploring the uncertainties in cancer risk assessment using the integrated probabilistic risk assessment (IPRA) approach.

    PubMed

    Slob, Wout; Bakker, Martine I; Biesebeek, Jan Dirk Te; Bokkers, Bas G H

    2014-08-01

    Current methods for cancer risk assessment result in single values, without any quantitative information on the uncertainties in these values. Therefore, single risk values could easily be overinterpreted. In this study, we discuss a full probabilistic cancer risk assessment approach in which all the generally recognized uncertainties in both exposure and hazard assessment are quantitatively characterized and probabilistically evaluated, resulting in a confidence interval for the final risk estimate. The methodology is applied to three example chemicals (aflatoxin, N-nitrosodimethylamine, and methyleugenol). These examples illustrate that the uncertainty in a cancer risk estimate may be huge, making single value estimates of cancer risk meaningless. Further, a risk based on linear extrapolation tends to be lower than the upper 95% confidence limit of a probabilistic risk estimate, and in that sense it is not conservative. Our conceptual analysis showed that there are two possible basic approaches for cancer risk assessment, depending on the interpretation of the dose-incidence data measured in animals. However, it remains unclear which of the two interpretations is the more adequate one, adding an additional uncertainty to the already huge confidence intervals for cancer risk estimates. © 2014 Society for Risk Analysis.
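
    A minimal sketch of the probabilistic (as opposed to single-value) risk calculation is given below: uncertainty distributions for exposure and cancer potency are propagated by Monte Carlo and the risk is reported as a confidence interval. The lognormal parameters are hypothetical placeholders, not the IPRA inputs for the three example chemicals.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Hypothetical lognormal uncertainty distributions (not the IPRA inputs themselves):
exposure = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=n)   # mg/kg bw/day
potency = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=n)    # extra risk per mg/kg bw/day

risk = exposure * potency          # probabilistic risk estimate instead of a single value
lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
print(f"cancer risk: median {med:.1e}, 95% CI [{lo:.1e}, {hi:.1e}]")
```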

  10. 77 FR 53059 - Risk-Based Capital Guidelines: Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-30

    ... complexity. The agencies have incorporated a number of changes into the final rule based on feedback received... would generate a risk-based capital requirement for a specific covered position or portfolio of covered positions that is not commensurate with the risks of the covered position or portfolio. In...

  11. 78 FR 43829 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-22

    ... CFR Parts 208 and 225 RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of... Governors of the Federal Reserve System (Board) proposes to revise its market risk capital rule (market risk... Organization for Economic Cooperation and Development (OECD), which are referenced in the Board's market risk...

  12. 78 FR 76521 - Risk-Based Capital Guidelines; Market Risk

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-18

    ... RIN 7100 AD-98 Risk-Based Capital Guidelines; Market Risk AGENCY: Board of Governors of the Federal...) is adopting a final rule that revises its market risk capital rule (market risk rule) to address... Cooperation and Development (OECD), which are referenced in the Board's market risk rule; to clarify the...

  13. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  14. Study of a risk-based piping inspection guideline system.

    PubMed

    Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung

    2007-02-01

    A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts--the building of a risk-based inspection model for piping and the construction of a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model had been built in accordance with international standards and local government regulations, and the rational unified process was applied for reducing the discrepancy in the development of the models. The models had been designed to analyze damage factors, damage models, and potential damage positions of piping in the petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections and, hence, to enable effective prediction of potential piping risks and to enhance the degree of safety in plant operations that the petrochemical industries can be expected to achieve. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risks resulted from a small number of pipelines.

  15. 13 CFR 120.1000 - Risk-Based Lender Oversight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Risk-Based Lender Oversight. 120.1000 Section 120.1000 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Risk... Oversight. SBA supervises, examines, and regulates, and enforces laws against, SBA Supervised Lenders...

  16. Risk-Based Educational Accountability in Dutch Primary Education

    ERIC Educational Resources Information Center

    Timmermans, A. C.; de Wolf, I. F.; Bosker, R. J.; Doolaard, S.

    2015-01-01

    A recent development in educational accountability is a risk-based approach, in which intensity and frequency of school inspections vary across schools to make educational accountability more efficient and effective by enabling inspectorates to focus on organizations at risk. Characteristics relevant in predicting which schools are "at risk…

  17. Risks and Risk-Based Regulation in Higher Education Institutions

    ERIC Educational Resources Information Center

    Huber, Christian

    2009-01-01

    Risk-based regulation is a relatively new mode of governance. Not only does it offer a way of controlling institutions from the outside but it also provides the possibility of making an organisation's achievements visible/visualisable. This paper comments on a list of possible risks that higher education institutions have to face. In a second…

  18. Probabilistic Modeling of Rosette Formation

    PubMed Central

    Long, Mian; Chen, Juan; Jiang, Ning; Selvaraj, Periasamy; McEver, Rodger P.; Zhu, Cheng

    2006-01-01

    Rosetting, or forming a cell aggregate between a single target nucleated cell and a number of red blood cells (RBCs), is a simple assay for cell adhesion mediated by specific receptor-ligand interaction. For example, rosette formation between sheep RBC and human lymphocytes has been used to differentiate T cells from B cells. Rosetting assay is commonly used to determine the interaction of Fcγ receptors (FcγR) expressed on inflammatory cells and IgG coated on RBCs. Despite its wide use in measuring cell adhesion, the biophysical parameters of rosette formation have not been well characterized. Here we developed a probabilistic model to describe the distribution of rosette sizes, which is Poissonian. The average rosette size is predicted to be proportional to the apparent two-dimensional binding affinity of the interacting receptor-ligand pair and their site densities. The model has been supported by experiments of rosettes mediated by four molecular interactions: FcγRIII interacting with IgG, T cell receptor and coreceptor CD8 interacting with antigen peptide presented by major histocompatibility molecule, P-selectin interacting with P-selectin glycoprotein ligand 1 (PSGL-1), and L-selectin interacting with PSGL-1. The latter two are structurally similar and are different from the former two. Fitting the model to data enabled us to evaluate the apparent effective two-dimensional binding affinity of the interacting molecular pairs: 7.19 × 10⁻⁵ μm⁴ for FcγRIII-IgG interaction, 4.66 × 10⁻³ μm⁴ for P-selectin-PSGL-1 interaction, and 0.94 × 10⁻³ μm⁴ for L-selectin-PSGL-1 interaction. These results elucidate the biophysical mechanism of rosette formation and enable it to become a semiquantitative assay that relates the rosette size to the effective affinity for receptor-ligand binding. PMID:16603493
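
    The central relation, that the mean rosette size equals the apparent two-dimensional affinity times the receptor and ligand site densities, invites a simple estimator: divide the observed mean rosette size by the product of the densities. The sketch below generates synthetic Poisson-distributed rosette sizes and recovers the affinity; the site densities are hypothetical, while the affinity value is the P-selectin/PSGL-1 figure quoted above.

```python
import numpy as np

# Model from the abstract: rosette size is Poisson with a mean proportional to the
# apparent 2D affinity (AcKa) and the receptor/ligand site densities (m_r, m_l).
def mean_rosette_size(affinity_um4, m_r, m_l):
    return affinity_um4 * m_r * m_l      # <n> = AcKa * m_r * m_l

def estimate_affinity(observed_sizes, m_r, m_l):
    """Invert the relation: fit AcKa from the sample mean of measured rosette sizes."""
    return np.mean(observed_sizes) / (m_r * m_l)

# Hypothetical site densities in molecules/um^2 and synthetic Poisson data.
rng = np.random.default_rng(4)
m_r, m_l, true_affinity = 10.0, 100.0, 4.66e-3   # affinity ~ P-selectin/PSGL-1 value above
sizes = rng.poisson(mean_rosette_size(true_affinity, m_r, m_l), size=200)
print(f"estimated AcKa = {estimate_affinity(sizes, m_r, m_l):.2e} um^4")
```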

  19. Risk-based prioritization methodology for the classification of groundwater pollution sources.

    PubMed

    Pizzol, Lisa; Zabeo, Alex; Critto, Andrea; Giubilato, Elisa; Marcomini, Antonio

    2015-02-15

    Water management is one of the EU environmental priorities and it is one of the most serious challenges that today's major cities are facing. The main European regulation for the protection of water resources is represented by the Water Framework Directive (WFD) and the Groundwater Directive (2006/118/EC) which require the identification, risk-based ranking and management of sources of pollution and the identification of those contamination sources that threaten the achievement of groundwater's good quality status. The aim of this paper is to present a new risk-based prioritization methodology to support the determination of a management strategy for the achievement of the good quality status of groundwater. The proposed methodology encompasses the following steps: 1) hazard analysis, 2) pathway analysis, 3) receptor vulnerability analysis and 4) relative risk estimation. Moreover, by integrating GIS functionalities and Multi Criteria Decision Analysis (MCDA) techniques, it allows to: i) deal with several sources and multiple impacted receptors within the area of concern; ii) identify different receptors' vulnerability levels according to specific groundwater uses; iii) assess the risks posed by all contamination sources in the area; and iv) provide a risk-based ranking of the contamination sources that can threaten the achievement of the groundwater good quality status. The application of the proposed framework to a well-known industrialized area located in the surroundings of Milan (Italy) is illustrated in order to demonstrate the effectiveness of the proposed framework in supporting the identification of intervention priorities. Among the 32 sources analyzed in the case study, three sources received the highest relevance score, due to the medium-high relative risks estimated for Chromium (VI) and Perchloroethylene. The case study application showed that the developed methodology is flexible and easy to adapt to different contexts, thanks to the possibility to

  20. Probabilistic Flood Mapping using Volunteered Geographical Information

    NASA Astrophysics Data System (ADS)

    Rivera, S. J.; Girons Lopez, M.; Seibert, J.; Minsker, B. S.

    2016-12-01

    Flood extent maps are widely used by decision makers and first responders to provide critical information that prevents economic impacts and the loss of human lives. These maps are usually obtained from sensory data and/or hydrologic models, which often have limited coverage in space and time. Recent developments in social media and communication technology have created a wealth of near-real-time, user-generated content during flood events in many urban areas, such as flooded locations, pictures of flooding extent and height, etc. These data could improve decision-making and response operations as events unfold. However, the integration of these data sources has been limited due to the need for methods that can extract and translate the data into useful information for decision-making. This study presents an approach that uses volunteer geographic information (VGI) and non-traditional data sources (i.e., Twitter, Flicker, YouTube, and 911 and 311 calls) to generate/update the flood extent maps in areas where no models and/or gauge data are operational. The approach combines Web-crawling and computer vision techniques to gather information about the location, extent, and water height of the flood from unstructured textual data, images, and videos. These estimates are then used to provide an updated flood extent map for areas surrounding the geo-coordinate of the VGI through the application of a Hydro Growing Region Algorithm (HGRA). HGRA combines hydrologic and image segmentation concepts to estimate a probabilistic flooding extent along the corresponding creeks. Results obtained for a case study in Austin, TX (i.e., 2015 Memorial Day flood) were comparable to those obtained by a calibrated hydrologic model and had good spatial correlation with flooding extents estimated by the Federal Emergency Management Agency (FEMA).
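
    A rough sketch of the region-growing step is given below: starting from the geo-coordinate of a report with an estimated water height, neighbouring grid cells whose elevation lies below that level are flooded. This is only the image-segmentation half of the idea; the hydrologic constraints of the actual HGRA are not represented, and the elevation grid, seed, and water level are hypothetical.

```python
import numpy as np
from collections import deque

def grow_flood_region(dem, seed, water_level):
    """Simple region-growing flood extent: starting from a geo-located report (seed),
    mark 4-connected cells whose ground elevation lies below the reported water level."""
    flooded = np.zeros(dem.shape, dtype=bool)
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
            continue
        if flooded[r, c] or dem[r, c] > water_level:
            continue
        flooded[r, c] = True
        queue.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return flooded

rng = np.random.default_rng(5)
dem = rng.uniform(0.0, 5.0, size=(50, 50))   # hypothetical elevation grid [m]
dem[25, 25] = 1.0                            # make sure the seeded report sits below water
extent = grow_flood_region(dem, seed=(25, 25), water_level=2.5)
print("flooded cells:", int(extent.sum()))
```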

  1. Development and application of a probabilistic method for wildfire suppression cost modeling

    Treesearch

    Matthew P. Thompson; Jessica R. Haas; Mark A. Finney; David E. Calkin; Michael S. Hand; Mark J. Browne; Martin Halek; Karen C. Short; Isaac C. Grenfell

    2015-01-01

    Wildfire activity and escalating suppression costs continue to threaten the financial health of federal land management agencies. In order to minimize and effectively manage the cost of financial risk, agencies need the ability to quantify that risk. A fundamental aim of this research effort, therefore, is to develop a process for generating risk-based metrics for...

  2. Fuzzy logic and a risk-based graded approach for developing S/RIDs: An introduction

    SciTech Connect

    Wayland, J.R.

    1996-01-01

    A Standards/Requirements Identification Document (S/RID) is the set of expressed performance expectations, or standards, for a facility. Critical to the development of an integrated standards-based management is the identification of a set of necessary and sufficient standards from a selected set of standards/requirements. There is a need for a formal, rigorous selection process for the S/RIDs. This is the first of three reports that develop a fuzzy logic selection process. In this report the fundamentals of fuzzy logic are discussed as they apply to a risk-based graded approach.

  3. A pilot application of risk-based methods to establish in-service inspection priorities for nuclear components at Surry Unit 1 Nuclear Power Station

    SciTech Connect

    Vo, T.; Gore, B.; Simonen, F.; Doctor, S.

    1994-08-01

    As part of the Nondestructive Evaluation Reliability Program sponsored by the US Nuclear Regulatory Commission, the Pacific Northwest Laboratory is developing a method that uses risk-based approaches to establish in-service inspection plans for nuclear power plant components. This method uses probabilistic risk assessment (PRA) results and Failure Modes and Effects Analysis (FMEA) techniques to identify and prioritize the most risk-important systems and components for inspection. The Surry Nuclear Power Station Unit 1 was selected for pilot applications of this method. The specific systems addressed in this report are the reactor pressure vessel, the reactor coolant, the low-pressure injection, and the auxiliary feedwater. The results provide a risk-based ranking of components within these systems and relate the target risk to target failure probability values for individual components. These results will be used to guide the development of improved inspection plans for nuclear power plants. To develop inspection plans, the acceptable level of risk from structural failure for important systems and components will be apportioned as a small fraction (i.e., 5%) of the total PRA-estimated risk for core damage. This process will determine target (acceptable) risk and target failure probability values for individual components. Inspection requirements will be set at levels to assure that acceptable failure probabilities are maintained.
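
    The apportionment step described above is simple arithmetic and can be sketched directly: take a small fraction of the PRA-estimated core damage frequency as the acceptable structural-failure risk, then divide by each component's conditional core damage probability to obtain its target failure probability. All numbers below are hypothetical, not values from the Surry study.

```python
# Apportioning an acceptable structural-failure risk to individual components,
# in the spirit described above (values are hypothetical, not from the Surry study).
total_cdf = 5.0e-5          # PRA-estimated core damage frequency [per reactor-year]
fraction = 0.05             # small fraction of total risk allowed from structural failure
target_risk = fraction * total_cdf

components = {               # conditional core damage probability given component failure
    "RPV_weld":         1.0e-1,
    "RCS_pipe_segment": 5.0e-3,
    "AFW_pipe_segment": 1.0e-4,
}

for name, ccdp in components.items():
    target_failure_prob = target_risk / ccdp      # per-year target failure probability
    print(f"{name}: target failure probability < {target_failure_prob:.1e} /yr")
```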

  4. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  5. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  6. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  7. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  8. 12 CFR 652.65 - Risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk-based capital stress test. 652.65 Section... CORPORATION FUNDING AND FISCAL AFFAIRS Risk-Based Capital Requirements § 652.65 Risk-based capital stress test. You will perform the risk-based capital stress test as described in summary form below and...

  9. Probabilistic Tsunami Hazard Assessment: the Seaside, Oregon Pilot Study

    NASA Astrophysics Data System (ADS)

    Gonzalez, F. I.; Geist, E. L.; Synolakis, C.; Titov, V. V.

    2004-12-01

    A pilot study of Seaside, Oregon is underway, to develop methodologies for probabilistic tsunami hazard assessments that can be incorporated into Flood Insurance Rate Maps (FIRMs) developed by FEMA's National Flood Insurance Program (NFIP). Current NFIP guidelines for tsunami hazard assessment rely on the science, technology and methodologies developed in the 1970s; although generally regarded as groundbreaking and state-of-the-art for its time, this approach is now superseded by modern methods that reflect substantial advances in tsunami research achieved in the last two decades. In particular, post-1990 technical advances include: improvements in tsunami source specification; improved tsunami inundation models; better computational grids by virtue of improved bathymetric and topographic databases; a larger database of long-term paleoseismic and paleotsunami records and short-term, historical earthquake and tsunami records that can be exploited to develop improved probabilistic methodologies; better understanding of earthquake recurrence and probability models. The NOAA-led U.S. National Tsunami Hazard Mitigation Program (NTHMP), in partnership with FEMA, USGS, NSF and Emergency Management and Geotechnical agencies of the five Pacific States, incorporates these advances into site-specific tsunami hazard assessments for coastal communities in Alaska, California, Hawaii, Oregon and Washington. NTHMP hazard assessment efforts currently focus on developing deterministic, "credible worst-case" scenarios that provide valuable guidance for hazard mitigation and emergency management. The NFIP focus, on the other hand, is on actuarial needs that require probabilistic hazard assessments such as those that characterize 100- and 500-year flooding events. There are clearly overlaps in NFIP and NTHMP objectives. NTHMP worst-case scenario assessments that include an estimated probability of occurrence could benefit the NFIP; NFIP probabilistic assessments of 100- and 500-yr

  10. bayesPop: Probabilistic Population Projections

    PubMed Central

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  11. Error Discounting in Probabilistic Category Learning

    PubMed Central

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    Some current theories of probabilistic categorization assume that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report two probabilistic-categorization experiments that investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results are indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning. PMID:21355666

  12. Error discounting in probabilistic category learning.

    PubMed

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R

    2011-05-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error discounting by shifting feedback probabilities to new values after different amounts of training. In both experiments, responding gradually became less responsive to errors, and learning was slowed for some time after the feedback shift. Both results were indicative of error discounting. Quantitative modeling of the data revealed that adding a mechanism for error discounting significantly improved the fits of an exemplar-based and a rule-based associative learning model, as well as of a recency-based model of categorization. We conclude that error discounting is an important component of probabilistic learning.
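
    A toy version of error discounting can be written as a delta-rule learner whose learning rate is attenuated over trials, which makes it slow to re-adapt after the feedback probabilities shift, as in the experiments described. The shift point, probabilities, and discounting schedule below are illustrative only, not the models fitted in the paper.

```python
import numpy as np

rng = np.random.default_rng(6)

# Feedback probability shifts from 0.8 to 0.4 halfway through training.
trials = 400
p_feedback = np.where(np.arange(trials) < trials // 2, 0.8, 0.4)
outcomes = rng.random(trials) < p_feedback

def run_learner(discount):
    """Delta-rule estimate of the outcome probability; the learning rate is
    attenuated over trials when `discount` > 0 (error discounting)."""
    estimate, history = 0.5, []
    for t, outcome in enumerate(outcomes):
        lr = 0.2 / (1.0 + discount * t)
        estimate += lr * (float(outcome) - estimate)
        history.append(estimate)
    return np.array(history)

for discount in (0.0, 0.05):
    est = run_learner(discount)
    print(f"discount={discount}: estimate after the shift = {est[-1]:.2f}")
```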

  13. A probabilistic approach to spectral graph matching.

    PubMed

    Egozi, Amir; Keller, Yosi; Guterman, Hugo

    2013-01-01

    Spectral Matching (SM) is a computationally efficient approach to approximate the solution of pairwise matching problems that are NP-hard. In this paper, we present a probabilistic interpretation of spectral matching schemes and derive a novel Probabilistic Matching (PM) scheme that is shown to outperform previous approaches. We show that spectral matching can be interpreted as a Maximum Likelihood (ML) estimate of the assignment probabilities and that the Graduated Assignment (GA) algorithm can be cast as a Maximum a Posteriori (MAP) estimator. Based on this analysis, we derive a ranking scheme for spectral matchings based on their reliability, and propose a novel iterative probabilistic matching algorithm that relaxes some of the implicit assumptions used in prior works. We experimentally show our approaches to outperform previous schemes when applied to exhaustive synthetic tests as well as the analysis of real image sequences.
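
    The basic spectral matching computation is the leading eigenvector of a non-negative pairwise-affinity matrix, whose entries are then read as assignment confidences; the probabilistic interpretation above builds on this. The sketch below uses power iteration on a random symmetric affinity matrix purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical pairwise-affinity matrix over candidate assignments (symmetric,
# non-negative), as used in spectral matching; here it is random for illustration.
n = 8
M = rng.random((n, n))
M = (M + M.T) / 2.0

# Power iteration for the leading eigenvector; its (non-negative) entries are read
# as unnormalized confidences of the candidate assignments.
x = np.ones(n) / np.sqrt(n)
for _ in range(200):
    x = M @ x
    x /= np.linalg.norm(x)

confidence = x / x.sum()
print("assignment confidences:", np.round(confidence, 3))
```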

  14. bayesPop: Probabilistic Population Projections.

    PubMed

    Ševčíková, Hana; Raftery, Adrian E

    2016-12-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  15. Probabilistic graphs using coupled random variables

    NASA Astrophysics Data System (ADS)

    Nelson, Kenric P.; Barbu, Madalina; Scannell, Brian J.

    2014-05-01

    Neural network design has utilized flexible nonlinear processes which can mimic biological systems, but has suffered from a lack of traceability in the resulting network. Graphical probabilistic models ground network design in probabilistic reasoning, but the restrictions reduce the expressive capability of each node making network designs complex. The ability to model coupled random variables using the calculus of nonextensive statistical mechanics provides a neural node design incorporating nonlinear coupling between input states while maintaining the rigor of probabilistic reasoning. A generalization of Bayes rule using the coupled product enables a single node to model correlation between hundreds of random variables. A coupled Markov random field is designed for the inferencing and classification of UCI's MLR `Multiple Features Data Set' such that thousands of linear correlation parameters can be replaced with a single coupling parameter with just a (3%, 4%) reduction in (classification, inference) performance.

  16. Probabilistic Cue Combination: Less is More

    PubMed Central

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2012-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the dilution effect, predictions made from the combination of two cues of different strengths are less accurate than those made from the stronger cue alone. Here we show that dilution is an adult problem; 11-month-old infants combine strong and weak predictors normatively. These results extend and add support for the less is more hypothesis: limited cognitive resources can lead children to represent probabilistic information differently from adults, and this difference in representation can have important downstream consequences for prediction. PMID:23432826
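
    Normative combination of two independent cues is just Bayes' rule, with the combined posterior exceeding that of the stronger cue alone rather than being diluted by the weaker one. The sketch below works through the odds-form calculation under the simplifying (hypothetical) assumption that each cue's predictive value is reported against the same prior.

```python
def posterior_two_cues(prior, p_strong, p_weak):
    """Normative combination of two independent cues predicting a binary outcome.
    p_strong and p_weak are P(outcome | cue present) for each cue in isolation,
    assuming the prior P(outcome) is the same in both cases (hypothetical setup)."""
    prior_odds = prior / (1.0 - prior)

    def likelihood_ratio(p_cue):
        return (p_cue / (1.0 - p_cue)) / prior_odds

    combined_odds = prior_odds * likelihood_ratio(p_strong) * likelihood_ratio(p_weak)
    return combined_odds / (1.0 + combined_odds)

p = posterior_two_cues(prior=0.5, p_strong=0.8, p_weak=0.6)
print(f"strong cue alone: 0.80, strong + weak combined: {p:.2f}")   # combined exceeds 0.80
```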

  17. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.
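
    A present-day analogue of the strategy, distributing independent Monte Carlo fatigue-life draws across processes, is sketched below. The Basquin-type life relation and all input distributions are hypothetical placeholders and are not the analysis problem used in the Phase I study.

```python
import numpy as np
from multiprocessing import Pool

def fatigue_life_samples(args):
    """One worker's batch of Monte Carlo fatigue-life draws using a Basquin-type
    relation N = C / stress**b with uncertain inputs (all values hypothetical)."""
    n_samples, seed = args
    rng = np.random.default_rng(seed)
    C = rng.lognormal(mean=np.log(1e12), sigma=0.3, size=n_samples)
    stress = rng.normal(200.0, 20.0, size=n_samples)      # stress amplitude [MPa]
    b = 3.0
    return C / stress ** b

if __name__ == "__main__":
    batches = [(250_000, seed) for seed in range(4)]       # distribute work across processes
    with Pool(processes=4) as pool:
        lives = np.concatenate(pool.map(fatigue_life_samples, batches))
    print(f"probability of life < 1e5 cycles: {np.mean(lives < 1e5):.4f}")
```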

  18. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis Hastings algorithm, which is a well-known Markov chain Monte Carlo method (MCMC). This multiple hypothesis testing

  19. Probabilistic quantum teleportation via thermal entanglement

    NASA Astrophysics Data System (ADS)

    Fortes, Raphael; Rigolin, Gustavo

    2017-08-01

    We study the probabilistic (conditional) teleportation protocol when the entanglement needed for its implementation is given by thermal entanglement, i.e., when the entangled resource connecting Alice and Bob is an entangled mixed state described by the canonical ensemble density matrix. Specifically, the entangled resource we employ here is given by two interacting spin-1/2 systems (two qubits) in equilibrium with a thermal reservoir at temperature T. The interaction between the qubits is described by a Heisenberg-like Hamiltonian, encompassing the Ising, XX, XY, XXX, and XXZ models, with or without external fields. For all those models, we show analytically that the probabilistic protocol is exactly equal to the deterministic one whenever we have no external field. However, when we turn on the field, the probabilistic protocol outperforms the deterministic one in several interesting ways. Under certain scenarios, for example, the efficiency (average fidelity) of the probabilistic protocol is greater than the deterministic one and increases with increasing temperature, a counterintuitive behavior. We also show regimes in which the probabilistic protocol operates with relatively high success rates and, at the same time, with efficiency greater than the classical limit 2/3, a threshold that cannot be surpassed by any protocol using only classical resources (no entanglement shared between Alice and Bob). The deterministic protocol's efficiency under the same conditions is below 2/3, highlighting that the probabilistic protocol is the only one yielding a genuine quantum teleportation. We also show that near the quantum critical points for almost all those models the qualitative and quantitative behaviors of the efficiency change considerably, even at finite T.

  20. Probabilistic Learning by Rodent Grid Cells

    PubMed Central

    Cheung, Allen

    2016-01-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments are reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  1. Probabilistic Learning by Rodent Grid Cells.

    PubMed

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments are reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  2. Risk-based decision making for terrorism applications.

    PubMed

    Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas

    2009-03-01

    This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.

  3. Risk-based approach to petroleum hydrocarbon remediation. Research study

    SciTech Connect

    Miller, R.N.; Haas, P.; Faile, M.; Taffinder, S.

    1994-12-31

    The risk-based approach utilizes tools developed under the BTEX, Intrinsic Remediation (natural attenuation), Bioslurper, and Bioventing Initiatives of the Air Force Center for Environmental Excellence Technology Transfer Division (AFCEE/ERT) to construct a risk-based, cost-effective approach to the cleanup of petroleum-contaminated sites. The AFCEE Remediation Matrix (Enclosure 1) identifies natural attenuation as the first remediation alternative for soil and ground water contaminated with petroleum hydrocarbons. The intrinsic remediation (natural attenuation) alternative requires a scientifically defensible risk assessment based on contaminant sources, pathways, and receptors. For fuel-contaminated sites, the first step is to determine contaminants of interest. For the ground water pathway (usually considered most important by regulators), these will normally be the most soluble, mobile, and toxic compounds, namely benzene, toluene, ethylbenzene, and o-, m-, and p-xylene (BTEX).

  4. Coupling risk-based remediation with innovative technology

    SciTech Connect

    Goodheart, G.F.; Teaf, C.M.; Manning, M.J.

    1998-05-01

    Tiered risk-based cleanup approaches have been effectively used at petroleum sites, pesticide sites and other commercial/industrial facilities. For example, the Illinois Environmental Protection Agency (IEPA) has promulgated guidance for a Tiered Approach to Corrective Action Objectives (TACO) to establish site-specific remediation goals for contaminated soil and groundwater. As in the case of many other state programs, TACO is designed to provide for adequate protection of human health and the environment based on potential risks posed by site conditions. It also incorporates site-related information that may allow more cost-effective remediation. IEPA developed TACO to provide flexibility to site owners/operators when formulating site-specific remediation activities, as well as to hasten property redevelopment to return sites to more productive use. Where appropriate, risk-based cleanup objectives as set by TACO-type programs may be coupled with innovative remediation technologies such as air sparging, bioremediation and soil washing.

  5. Probabilistic flood damage modelling at the meso-scale

    NASA Astrophysics Data System (ADS)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
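
    The core idea above, an ensemble of bagged decision trees whose member predictions form a damage distribution rather than a point estimate, can be illustrated with a minimal scikit-learn sketch. The predictors, training data, and settings below are hypothetical stand-ins, not the published BT-FLEMO configuration or the ATKIS-based inputs.

    ```python
    # Minimal sketch: bagged regression trees giving a predictive damage distribution.
    # Feature names and data are hypothetical, not the BT-FLEMO training set.
    import numpy as np
    from sklearn.ensemble import BaggingRegressor
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(0)
    # Hypothetical predictors: water depth [m], flow velocity [m/s], building value [kEUR]
    X = rng.uniform([0.1, 0.0, 50], [3.0, 2.0, 500], size=(200, 3))
    y = 0.1 * X[:, 0] * X[:, 2] + rng.normal(0, 5, size=200)   # synthetic loss [kEUR]

    model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0)
    model.fit(X, y)

    # Per-tree predictions for a new land-use unit approximate a damage distribution
    x_new = np.array([[1.5, 0.5, 300]])
    per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
    print("median damage:", np.median(per_tree),
          "5-95% range:", np.percentile(per_tree, [5, 95]))
    ```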

  6. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures, such as aircraft, rotorcraft and aviation transports. Both metallic materials and composite materials are widely used and affected by fatigue damage. Huge uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic materials and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. Following this, the developed model is extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening effects). Due to the inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models of metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for the uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.
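
    As a hedged illustration of the probabilistic life-prediction idea (not the dissertation's EIFS/IFORM formulation), the sketch below propagates an uncertain initial flaw size and Paris-law coefficient through a closed-form crack-growth integration to obtain a fatigue-life distribution; all constants and distributions are invented.

    ```python
    # Illustrative Monte Carlo fatigue-life distribution from Paris-law crack growth.
    # da/dN = C * (dK)^m with dK = dS * sqrt(pi * a); all values are made up.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 10_000
    a0 = rng.lognormal(mean=np.log(0.2e-3), sigma=0.3, size=n)  # initial flaw size [m]
    C = rng.lognormal(mean=np.log(1e-11), sigma=0.2, size=n)    # [m/cycle per (MPa*sqrt(m))^m]
    m = 3.0                                                     # Paris exponent
    dS = 100.0                                                  # stress range [MPa]
    a_crit = 10e-3                                              # critical crack size [m]

    # Closed-form integration of the Paris law for constant-amplitude loading (m != 2)
    def term(a):
        return a ** (1 - m / 2)

    N_f = (term(a_crit) - term(a0)) / (C * (dS * np.sqrt(np.pi)) ** m * (1 - m / 2))

    print("median life [cycles]:", np.median(N_f))
    print("1st-percentile life [cycles]:", np.percentile(N_f, 1))
    ```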

  7. A probabilistic tsunami hazard assessment for Indonesia

    NASA Astrophysics Data System (ADS)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years), the highest tsunami hazard is along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of >3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
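
    A minimal sketch of the event-based Monte Carlo idea behind such a hazard assessment: build a long synthetic earthquake catalogue, map each event to a tsunami height at one site with a toy source-to-site relation, and count exceedances. The rates, magnitude bounds, and height scaling are invented and are not the parameters of the Indonesian PTHA.

    ```python
    # Sketch of an event-based Monte Carlo hazard curve: annual probability of exceeding
    # a tsunami height at one coastal site. All source parameters are illustrative only.
    import numpy as np

    rng = np.random.default_rng(2)
    years = 100_000                      # length of synthetic catalogue
    rate = 0.05                          # mean rate of tsunamigenic earthquakes per year
    b = 1.0                              # Gutenberg-Richter b-value
    m_min, m_max = 7.0, 9.0

    n_events = rng.poisson(rate * years)
    # Truncated Gutenberg-Richter magnitudes via inverse-CDF sampling
    u = rng.uniform(size=n_events)
    beta = b * np.log(10)
    mags = m_min - np.log(1 - u * (1 - np.exp(-beta * (m_max - m_min)))) / beta
    # Toy "source-to-site" model: log-height grows with magnitude, lognormal scatter
    heights = np.exp(1.1 * (mags - 7.5) + rng.normal(0, 0.5, size=n_events))

    for h in (0.5, 1.0, 3.0):
        annual_rate = np.sum(heights > h) / years
        print(f"height > {h} m: annual exceedance probability ~ {1 - np.exp(-annual_rate):.4f}")
    ```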

  8. Ensemble Flow Forecasts for Risk Based Reservoir Operations of Lake Mendocino in Mendocino County, California

    NASA Astrophysics Data System (ADS)

    Delaney, C.; Hartman, R. K.; Mendoza, J.; Evans, K. M.; Evett, S.

    2016-12-01

    Forecast informed reservoir operations (FIRO) is a methodology that incorporates short- to mid-range precipitation or flow forecasts to inform the flood operations of reservoirs. Previous research and modeling for flood control reservoirs has shown that FIRO can reduce flood risk and increase water supply for many reservoirs. The risk-based method of FIRO presents a unique approach that incorporates flow forecasts made by NOAA's California-Nevada River Forecast Center (CNRFC) to model and assess the risk of meeting or exceeding identified management targets or thresholds. Forecast risk is evaluated against set risk tolerances to determine reservoir flood releases. A water management model was developed for Lake Mendocino, a 116,500 acre-foot reservoir located near Ukiah, California. Lake Mendocino is a dual-use reservoir, which is owned and operated for flood control by the United States Army Corps of Engineers and is operated by the Sonoma County Water Agency for water supply. Due to recent changes in the operations of an upstream hydroelectric facility, this reservoir has been plagued with water supply reliability issues since 2007. FIRO is applied to Lake Mendocino by simulating daily hydrologic conditions from 1985 to 2010 in the Upper Russian River from Lake Mendocino to the City of Healdsburg, approximately 50 miles downstream. The risk-based method is simulated using a 15-day, 61-member streamflow hindcast from the CNRFC. Model simulation results of risk-based flood operations demonstrate a 23% increase in average end-of-water-year (September 30) storage levels over current operations. Model results show no increase in occurrence of flood damages for points downstream of Lake Mendocino. This investigation demonstrates that FIRO may be a viable flood control operations approach for Lake Mendocino and warrants further investigation through additional modeling and analysis.
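
    A minimal sketch of the risk-based release logic described above: turn an ensemble forecast into a probability of exceeding a storage threshold and increase the release until that probability falls below a tolerance. The storage volumes, ensemble, step size, and tolerance are hypothetical, not actual Lake Mendocino or CNRFC values.

    ```python
    # Sketch of a risk-based release rule: compute the forecast probability that storage
    # exceeds a flood-pool threshold and release only as much as needed to meet a tolerance.
    # All volumes and the ensemble are hypothetical.
    import numpy as np

    rng = np.random.default_rng(3)
    storage_now = 68_000                 # current storage [acre-feet]
    flood_pool = 75_000                  # storage threshold [acre-feet]
    risk_tolerance = 0.10                # acceptable probability of exceedance

    # Stand-in for a 61-member, 15-day ensemble of cumulative inflow volumes [acre-feet]
    ensemble_inflow = rng.gamma(shape=4.0, scale=2_500.0, size=61)

    def exceedance_risk(release_volume):
        projected = storage_now + ensemble_inflow - release_volume
        return np.mean(projected > flood_pool)

    # Smallest release (in 500 AF steps) that meets the risk tolerance
    release = 0.0
    while exceedance_risk(release) > risk_tolerance:
        release += 500.0
    print(f"risk with no release: {exceedance_risk(0.0):.2f}, chosen release: {release:.0f} AF")
    ```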

  9. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2000-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.
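
    A minimal sketch of the propagation step, assuming a toy thrust relation in place of the NEPP/WATE cycle and weight analyses: sample uncertain component efficiencies and flows from probability distributions, evaluate the performance model for each sample, and report percentiles of the resulting CDF.

    ```python
    # Sketch of probabilistic propagation: sample uncertain component efficiencies and flows,
    # push them through a toy performance model, and summarize the CDF of net thrust.
    # The "performance model" here is a placeholder, not NEPP or WATE.
    import numpy as np

    rng = np.random.default_rng(4)
    n = 20_000
    eta_comp = rng.normal(0.88, 0.01, n)      # compressor efficiency
    eta_turb = rng.normal(0.90, 0.01, n)      # turbine efficiency
    mass_flow = rng.normal(100.0, 2.0, n)     # core mass flow [kg/s]

    # Toy model: thrust scales with mass flow and a nonlinear efficiency product
    thrust = 250.0 * mass_flow * (eta_comp * eta_turb) ** 1.5   # [N]

    for p in (5, 50, 95):
        print(f"{p}th percentile net thrust: {np.percentile(thrust, p) / 1000:.1f} kN")
    ```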

  10. The probabilistic approach to human reasoning.

    PubMed

    Oaksford, M; Chater, N

    2001-08-01

    A recent development in the cognitive science of reasoning has been the emergence of a probabilistic approach to the behaviour observed on ostensibly logical tasks. According to this approach the errors and biases documented on these tasks occur because people import their everyday uncertain reasoning strategies into the laboratory. Consequently participants' apparently irrational behaviour is the result of comparing it with an inappropriate logical standard. In this article, we contrast the probabilistic approach with other approaches to explaining rationality, and then show how it has been applied to three main areas of logical reasoning: conditional inference, Wason's selection task and syllogistic reasoning.

  11. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    2000-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.

  12. A Probabilistic Approach to Aeropropulsion System Assessment

    NASA Technical Reports Server (NTRS)

    Tong, Michael T.

    1999-01-01

    A probabilistic approach is described for aeropropulsion system assessment. To demonstrate this approach, the technical performance of a wave rotor-enhanced gas turbine engine (i.e. engine net thrust, specific fuel consumption, and engine weight) is assessed. The assessment accounts for the uncertainties in component efficiencies/flows and mechanical design variables, using probability distributions. The results are presented in the form of cumulative distribution functions (CDFs) and sensitivity analyses, and are compared with those from the traditional deterministic approach. The comparison shows that the probabilistic approach provides a more realistic and systematic way to assess an aeropropulsion system.

  13. Probabilistic assessment of uncertain adaptive hybrid composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Singhal, Surendra N.; Chamis, Christos C.

    1994-01-01

    Adaptive composite structures using actuation materials, such as piezoelectric fibers, were assessed probabilistically utilizing intraply hybrid composite mechanics in conjunction with probabilistic composite structural analysis. Uncertainties associated with the actuation material as well as the uncertainties in the regular (traditional) composite material properties were quantified and considered in the assessment. Static and buckling analyses were performed for rectangular panels with various boundary conditions and different control arrangements. The probability density functions of the structural behavior, such as maximum displacement and critical buckling load, were computationally simulated. The results of the assessment indicate that improved design and reliability can be achieved with actuation material.

  14. Towards Probabilistic Modelling in Event-B

    NASA Astrophysics Data System (ADS)

    Tarasyuk, Anton; Troubitsyna, Elena; Laibinis, Linas

    Event-B provides us with a powerful framework for correct-by-construction system development. However, while developing dependable systems we should not only guarantee their functional correctness but also quantitatively assess their dependability attributes. In this paper we investigate how to conduct probabilistic assessment of reliability of control systems modeled in Event-B. We show how to transform an Event-B model into a Markov model amenable to probabilistic reliability analysis. Our approach enables integration of reasoning about correctness with quantitative analysis of reliability.

  15. Probabilistic Signal Recovery and Random Matrices

    DTIC Science & Technology

    2016-12-08

    AFRL-AFOSR-VA-TR-2016-0369: Probabilistic Signal Recovery and Random Matrices. Roman Vershynin, University of Michigan. Final Report, 12/08/2016; Grant FA9550-14-1-0009, Program Element 61102F. The work includes computing the permanents of matrices with non-negative entries and, in computational graph theory, a randomized algorithm for estimating the number of

  16. Why are probabilistic laws governing quantum mechanics and neurobiology?

    NASA Astrophysics Data System (ADS)

    Kröger, Helmut

    2005-08-01

    We address the question: Why are dynamical laws governing in quantum mechanics and in neuroscience of probabilistic nature instead of being deterministic? We discuss some ideas showing that the probabilistic option offers advantages over the deterministic one.

  17. Decision making in flood risk based storm sewer network design.

    PubMed

    Sun, S A; Djordjević, S; Khu, S T

    2011-01-01

    It is widely recognised that flood risk needs to be taken into account when designing a storm sewer network. Flood risk is generally a combination of flood consequences and flood probabilities. This paper aims to explore decision making in flood risk based storm sewer network design. A multi-objective optimization is proposed to find the Pareto front of optimal designs in terms of low construction cost and low flood risk. The decision making process then follows this multi-objective optimization to select the best design from the Pareto front. The traditional way of designing a storm sewer system based on a predefined design storm is used as one of the decision making criteria. Additionally, three commonly used risk based criteria, i.e., the expected flood risk based criterion, the Hurwicz criterion and the stochastic dominance based criterion, are investigated and applied in this paper. Different decisions are made according to different criteria as a result of different concerns represented by the criteria. The proposed procedure is applied to a simple storm sewer network design to demonstrate its effectiveness, and the different criteria are compared.
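
    A minimal sketch of how different decision criteria can pick different designs from the same Pareto front; the three candidate designs, scenario probabilities, damages, and the Hurwicz optimism weight are invented for illustration.

    ```python
    # Sketch of choosing among Pareto-optimal sewer designs with different criteria.
    # Costs and flood-damage scenarios are invented for illustration.
    import numpy as np

    designs = {   # design -> (construction cost, damages per flood scenario)
        "A": (1.0e6, np.array([0.0e6, 0.5e6, 4.0e6])),
        "B": (1.5e6, np.array([0.0e6, 0.2e6, 2.0e6])),
        "C": (2.2e6, np.array([0.0e6, 0.1e6, 0.8e6])),
    }
    probs = np.array([0.80, 0.15, 0.05])   # scenario probabilities
    alpha = 0.4                            # Hurwicz optimism weight

    def expected_total(cost, damages):
        return cost + np.dot(probs, damages)

    def hurwicz_total(cost, damages):
        return cost + alpha * damages.min() + (1 - alpha) * damages.max()

    best_expected = min(designs, key=lambda k: expected_total(*designs[k]))
    best_hurwicz = min(designs, key=lambda k: hurwicz_total(*designs[k]))
    print("best by expected risk:", best_expected, "| best by Hurwicz:", best_hurwicz)
    ```

    With these numbers the expected-risk criterion favors the cheapest design while the Hurwicz criterion favors the most protective one, mirroring the paper's point that different criteria encode different concerns.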

  18. Game Theory and Risk-Based Levee System Design

    NASA Astrophysics Data System (ADS)

    Hui, R.; Lund, J. R.; Madani, K.

    2014-12-01

    Risk-based analysis has been developed for optimal levee design for economic efficiency. Along many rivers, two levees on opposite riverbanks act as a simple levee system. Being rational and self-interested, land owners on each river bank would tend to independently optimize their levees with risk-based analysis, resulting in a Pareto-inefficient levee system design from the social planner's perspective. Game theory is applied in this study to analyze the decision-making process in a simple levee system in which the land owners on each river bank develop their design strategies using risk-based economic optimization. For each land owner, the annual expected total cost includes expected annual damage cost and annualized construction cost. The non-cooperative Nash equilibrium is identified and compared to the social planner's optimal distribution of flood risk and damage cost throughout the system, which results in the minimum total flood cost for the system. The social planner's optimal solution is not feasible without an appropriate level of compensation for the transferred flood risk to guarantee and improve conditions for all parties. Therefore, cooperative game theory is then employed to develop an economically optimal design that can be implemented in practice. By examining the game in the reversible and irreversible decision-making modes, the cost of decision-making myopia is calculated to underline the significance of considering the externalities and evolution path of dynamic water resource problems for optimal decision making.

  19. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
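
    A minimal sketch of the risk-based sizing idea, assuming invented hydrology, fragility, damage, and cost functions rather than the study's: sum annualized construction cost and expected annual damage from overtopping plus a width-dependent through-seepage fragility, and search a small grid of heights and crown widths.

    ```python
    # Sketch of risk-based levee sizing: annualized construction cost plus expected annual
    # damage from overtopping and through-seepage. All inputs are illustrative.
    import numpy as np

    damage_if_fail = 50e6                  # flood damage given levee failure [$]

    def annual_cost(height, crown_width, n_levels=2000):
        # Annual peak water level [m]: equal-probability quantiles of an exponential model
        u = np.linspace(1e-4, 1 - 1e-4, n_levels)
        level = -np.log(1 - u) * 1.5                      # inverse CDF, mean 1.5 m
        weight = 1.0 / n_levels                           # equal-probability slices
        p_overtop = level > height
        # Through-seepage fragility: rises with water level, falls with crown width
        p_seep = 1.0 / (1.0 + np.exp(-(level - 0.6 * height) * 2.0)) * np.exp(-crown_width / 10.0)
        p_fail = np.clip(p_overtop + (1 - p_overtop) * p_seep, 0, 1)
        ead = np.sum(p_fail * weight) * damage_if_fail            # expected annual damage
        construction = 1e5 * height ** 1.5 + 4e4 * crown_width    # annualized cost
        return ead + construction

    grid = [(h, w) for h in np.arange(2.0, 8.1, 0.5) for w in np.arange(3.0, 15.1, 1.0)]
    best = min(grid, key=lambda hw: annual_cost(*hw))
    print("optimal (height, crown width):", best, "annual cost:", round(annual_cost(*best)))
    ```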

  1. Impact of Probabilistic Weather on Flight Routing Decisions

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Sridhar, Banavar; Mulfinger, Daniel

    2006-01-01

    Flight delays in the United States have been found to increase year after year, along with the increase in air traffic. During the four-month period from May through August of 2005, weather-related delays accounted for roughly 70% of all reported delays. The current weather prediction in the tactical (within 2 hours) timeframe is at manageable levels; however, the state of forecasting weather for the strategic (2-6 hours) timeframe is still not dependable for long-term planning. In the absence of reliable severe weather forecasts, decision-making for flights longer than two hours is challenging. This paper deals with an approach of using probabilistic weather prediction for Traffic Flow Management use, and a general method using this prediction for estimating expected values of flight length and delays in the National Airspace System (NAS). The current state-of-the-art convective weather forecasting is employed to aid the decision makers in arriving at decisions for traffic flow and flight planning. The six-agency effort working on the Next Generation Air Transportation System (NGATS) has considered weather-assimilated decision-making as one of the principal foci out of a list of eight. The Weather Integrated Product Team has considered integrated weather information and improved aviation weather forecasts as two of the main efforts (Ref. 1, 2). Recently, research has focused on the concept of operations for strategic traffic flow management (Ref. 3) and how weather data can be integrated for improved decision-making for efficient traffic management initiatives (Ref. 4, 5). An overview of the weather data needs and benefits of various participants in the air traffic system along with available products can be found in Ref. 6. Previous work related to use of weather data in identifying and categorizing pilot intrusions into severe weather regions (Ref. 7, 8) has demonstrated a need for better forecasting in the strategic planning timeframes and moving towards a

  2. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
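
    A minimal sketch of the two statistical ingredients named above, principal component analysis for indicator weights and K-means for class boundaries, using synthetic data in place of the 37 surveyed landfill sites; the PCA weighting scheme shown is one common choice, not necessarily the exact formula used in the study.

    ```python
    # Sketch: PCA-derived indicator weights plus K-means clustering of the composite
    # risk score into low/medium/high classes. The indicator data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(5)
    X = rng.normal(size=(37, 14))                  # 37 sites x 14 indicators (synthetic)
    Xs = StandardScaler().fit_transform(X)

    pca = PCA().fit(Xs)
    # One common weighting: variance-weighted sum of squared loadings per indicator
    weights = (pca.explained_variance_ratio_[:, None] * pca.components_ ** 2).sum(axis=0)
    weights /= weights.sum()

    score = Xs @ weights                           # composite risk score per site
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(score.reshape(-1, 1))

    # Relabel clusters by mean score so 0 = low, 1 = medium, 2 = high risk
    order = np.argsort([score[labels == k].mean() for k in range(3)])
    rank = np.empty(3, dtype=int)
    rank[order] = np.arange(3)
    classes = rank[labels]
    print("sites per class (low, medium, high):", np.bincount(classes))
    ```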

  3. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills

    NASA Astrophysics Data System (ADS)

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.

  4. Risk-based analyses in support of California hazardous site remediation

    SciTech Connect

    Ringland, J.T.

    1995-08-01

    The California Environmental Enterprise (CEE) is a joint program of the Department of Energy (DOE), Lawrence Livermore National Laboratory, Lawrence Berkeley Laboratory, and Sandia National Laboratories. Its goal is to make DOE laboratory expertise accessible to hazardous site cleanups in the state. This support might involve working directly with parties responsible for individual cleanups or it might involve working with the California Environmental Protection Agency to develop tools that would be applicable across a broad range of sites. As part of its initial year's activities, the CEE supported a review to examine where laboratory risk and risk-based systems analysis capabilities might be most effectively applied. To this end, this study draws the following observations. The labs have a clear role in analyses supporting the demonstration and transfer of laboratory characterization or remediation technologies. The labs may have opportunities in developing broadly applicable analysis tools and computer codes for problems such as site characterization or efficient management of resources. Analysis at individual sites, separate from supporting lab technologies or prototyping general tools, may be appropriate only in limited circumstances. In any of these roles, the labs' capabilities extend beyond health risk assessment to the broader areas of risk management and risk-based systems analysis.

  5. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Thompson, Julie; Leclaire, Rene; Edward, Bryan; Jones, Edward

    2014-06-01

    Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  6. Application of risk-based methods to inservice testing of check valves

    SciTech Connect

    Closky, N.B.; Balkey, K.R.; McAllister, W.J.

    1996-12-01

    Research efforts have been underway in the American Society of Mechanical Engineers (ASME) and industry to define appropriate methods for the application of risk-based technology in the development of inservice testing (IST) programs for pumps and valves in nuclear steam supply systems. This paper discusses a pilot application of these methods to the inservice testing of check valves in the emergency core cooling system of Georgia Power's Vogtle nuclear power station. The results of the probabilistic safety assessment (PSA) are used to divide the check valves into risk-significant and less-risk-significant groups. This information is reviewed by a plant expert panel along with the consideration of appropriate deterministic insights to finally categorize the check valves into more safety-significant and less safety-significant component groups. All of the more safety-significant check valves are further evaluated in detail using a failure modes and causes analysis (FMCA) to assist in defining effective IST strategies. A template has been designed to evaluate how effective current and emerging tests for check valves are in detecting failures or in finding significant conditions that are precursors to failure for the likely failure causes. This information is then used to design and evaluate appropriate IST strategies that consider both the test method and frequency. A few of the less safety-significant check valves are also evaluated using this process since differences exist in check valve design, function, and operating conditions. Appropriate test strategies are selected for each check valve that has been evaluated based on safety and cost considerations. Test strategies are inferred from this information for the other check valves based on similar check valve conditions. Sensitivity studies are performed using the PSA model to arrive at an overall IST program that maintains or enhances safety at the lowest achievable cost.

  7. Receptor-Specific Modulation of Risk-Based Decision Making by Nucleus Accumbens Dopamine

    PubMed Central

    Stopper, Colin M; Khayambashi, Shahin; Floresco, Stan B

    2013-01-01

    The nucleus accumbens (NAc) serves as an integral node within cortico-limbic circuitry that regulates various forms of cost–benefit decision making. The dopamine (DA) system has also been implicated in enabling organisms to overcome a variety of costs to obtain more valuable rewards. However, it remains unclear how DA activity within the NAc may regulate decision making involving reward uncertainty. This study investigated the contribution of different DA receptor subtypes in the NAc to risk-based decision making, assessed with a probabilistic discounting task. In well-trained rats, D1 receptor blockade with SCH 23390 decreased preference for larger, uncertain rewards, which was associated with enhanced negative-feedback sensitivity (i.e., an increased tendency to select a smaller/certain option after an unrewarded risky choice). Treatment with a D1 agonist (SKF 81297) optimized decision making, increasing choice of the risky option when reward probability was high, and decreasing preference under low probability conditions. In stark contrast, neither blockade of NAc D2 receptors with eticlopride, nor stimulation of these receptors with quinpirole or bromocriptine influenced risky choice. In comparison, infusion of the D3-preferring agonist PD 128907 decreased reward sensitivity and risky choice. Collectively, these results show that mesoaccumbens DA refines risk–reward decision biases via dissociable mechanisms recruiting D1 and D3, but not D2 receptors. D1 receptor activity mitigates the effect of reward omissions on subsequent choices to promote selection of reward options that may have greater long-term utility, whereas excessive D3 receptor activity blunts the impact that larger/uncertain rewards have in promoting riskier choices. PMID:23303055

  8. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem

    PubMed Central

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means the proposed model reduces the computational complexity. PMID:26180842
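
    A minimal sketch of a weighted-sum scalarization of a tiny biobjective 0-1 decision, enumerated by brute force; a real SATNFO instance would be solved with an integer-programming solver, and the flights, costs, risk scores, and weights here are invented.

    ```python
    # Sketch: weighted-sum scalarization of a tiny biobjective 0-1 flow problem,
    # enumerated by brute force. Flights, delay costs, and risk scores are invented.
    from itertools import product

    flights = ["F1", "F2", "F3", "F4"]
    delay_cost = {"F1": 10, "F2": 25, "F3": 15, "F4": 30}    # cost if flight is held
    risk_if_released = {"F1": 0.30, "F2": 0.05, "F3": 0.20, "F4": 0.10}
    capacity = 2                                             # at most 2 flights released
    w_delay, w_risk = 0.6, 0.4

    best = None
    for x in product([0, 1], repeat=len(flights)):           # x[i] = 1 -> flight released
        if sum(x) > capacity:
            continue
        delay = sum(delay_cost[f] for f, xi in zip(flights, x) if xi == 0)
        risk = sum(risk_if_released[f] for f, xi in zip(flights, x) if xi == 1)
        obj = w_delay * delay + w_risk * risk
        if best is None or obj < best[0]:
            best = (obj, x)

    released = [f for f, xi in zip(flights, best[1]) if xi]
    print("released flights:", released, "objective:", round(best[0], 2))
    ```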

  9. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereafter reduce airspace congestion and flight delays. The conventional ATFM models assume the capacities of airports or airspace sectors are all predetermined. However, the capacity uncertainties due to the dynamics of convective weather may make the deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means the proposed model reduces the computational complexity.

  10. What is the Value Added to Adaptation Planning by Probabilistic Projections of Climate Change?

    NASA Astrophysics Data System (ADS)

    Wilby, R. L.

    2008-12-01

    Probabilistic projections of climate change offer new sources of risk information to support regional impacts assessment and adaptation options appraisal. However, questions continue to surround how best to apply these scenarios in a practical context, and whether the added complexity and computational burden leads to more robust decision-making. This paper provides an overview of recent efforts in the UK to 'bench-test' frameworks for employing probabilistic projections ahead of the release of the next-generation UKCIP08 projections (in November 2008). This involves close collaboration between government agencies, research and stakeholder communities. Three examples will be cited to illustrate how probabilistic projections are already informing decisions about future flood risk management in London, water resource planning in trial river basins, and assessments of risks from rising water temperatures to Atlantic salmon stocks in southern England. When compared with conventional deterministic scenarios, ensemble projections allow exploration of a wider range of management options and highlight timescales for implementing adaptation measures. Users of probabilistic scenarios must keep in mind that other uncertainties (e.g., due to impacts model structure and parameterisation) should be handled as rigorously as those arising from climate models and emission scenarios. Finally, it is noted that a commitment to long-term monitoring is also critical for tracking environmental change, testing model projections, and for evaluating the success (or not) of any scenario-led interventions.

  11. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk-informed decision-making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
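
    As one elementary example of the kind of data assessment such guidelines cover (not the handbook's full methodology), the sketch below performs a conjugate Beta-Binomial update of a failure-on-demand probability; the prior parameters and test counts are hypothetical.

    ```python
    # Sketch of a conjugate Bayesian update for a failure-on-demand probability:
    # Beta prior + binomial evidence -> Beta posterior. Prior and counts are invented.
    from scipy import stats

    a0, b0 = 0.5, 19.5           # hypothetical Beta prior, centered near p ~ 0.025
    failures, demands = 2, 150   # hypothetical observed test data

    a_post, b_post = a0 + failures, b0 + demands - failures
    posterior = stats.beta(a_post, b_post)

    print("posterior mean p_fail:", posterior.mean())
    print("90% credible interval:", posterior.interval(0.90))
    ```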

  12. A Novel TRM Calculation Method by Probabilistic Concept

    NASA Astrophysics Data System (ADS)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for third parties to access a transmission facility. From this structure, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC) definition, ATC depends on several parameters, i.e., Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper is focused on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. A TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limits, various cases of transmission transfer capability and their related probabilistic nature can be calculated. By consideration of the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual capability of the network, which may be an alternative choice for system operators to make an appropriate decision in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
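
    A minimal sketch of the probabilistic idea, assuming simple normal error models that stand in for the paper's load-forecast and line-limit models: sample the uncertainties, form the distribution of realized transfer capability, and set TRM as the margin needed to cover a chosen confidence level. All MW figures are illustrative.

    ```python
    # Sketch of a probabilistic TRM: margin needed so that, with a chosen confidence,
    # load-forecast and line-rating uncertainties do not erode the transfer limit.
    import numpy as np

    rng = np.random.default_rng(6)
    n = 50_000
    ttc_nominal = 1200.0                                   # nominal TTC [MW]
    load_error = rng.normal(0.0, 60.0, n)                  # load-forecast error [MW]
    rating_error = rng.normal(0.0, 0.02, n) * ttc_nominal  # line-rating uncertainty [MW]

    # Transfer capability actually available under each sampled condition
    tc_realized = ttc_nominal - load_error - rating_error

    confidence = 0.95
    trm = ttc_nominal - np.quantile(tc_realized, 1 - confidence)
    print(f"TRM at {confidence:.0%} confidence: {trm:.0f} MW")
    print(f"remaining transfer margin (TTC - TRM): {ttc_nominal - trm:.0f} MW")
    ```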

  13. Translating Ensemble Weather Forecasts into Probabilistic User-Relevant Information

    NASA Astrophysics Data System (ADS)

    Steiner, Matthias; Sharman, Robert; Hopson, Thomas; Liu, Yubao; Chapman, Michael

    2010-05-01

    Weather-related decisions increasingly rely on probabilistic information as a means of assessing the risk of one potential outcome over another. Ensemble forecasting is one of the key approaches to capturing the uncertainty of weather forecasting. Moreover, in the future, decision makers will rely on tools that fully integrate weather information into the decision making process. Through these decision support tools, weather information will be translated into impact information. This presentation will highlight the translation of gridded ensemble weather forecasts into probabilistic user-relevant information. Examples will be discussed that relate to the management of air traffic, noise and pollution dispersion, missile trajectory prediction, water resources and flooding, wind energy production, and road maintenance. The primary take-home message from these examples will be that weather forecasts have to be tailored with a specific user perspective in mind rather than a "one fits all" approach, where a standard forecast product gets thrown over the fence and the user has to figure out what to do with it.

  14. A platform for probabilistic Multimodel and Multiproduct Streamflow Forecasting

    NASA Astrophysics Data System (ADS)

    Roy, Tirthankar; Serrat-Capdevila, Aleix; Gupta, Hoshin; Valdes, Juan

    2017-01-01

    We develop and test a probabilistic real-time streamflow-forecasting platform, Multimodel and Multiproduct Streamflow Forecasting (MMSF), that uses information provided by a suite of hydrologic models and satellite precipitation products (SPPs). The SPPs are bias-corrected before being used as inputs to the hydrologic models, and model calibration is carried out independently for each of the model-product combinations (MPCs). Forecasts generated from the calibrated models are further bias-corrected to compensate for the deficiencies within the models, and then probabilistically merged using a variety of model averaging techniques. Use of bias-corrected SPPs in streamflow forecasting applications can overcome several issues associated with sparsely gauged basins and enable robust forecasting capabilities. Bias correction of streamflow significantly improves the forecasts in terms of accuracy and precision for all different cases considered. Results show that the merging of individual forecasts from different MPCs provides additional improvements. All the merging techniques applied in this study produce similar results; however, Inverse Weighted Averaging (IVA) proves to be slightly superior in most cases. We demonstrate the implementation of the MMSF platform for real-time streamflow monitoring and forecasting in the Mara River basin of Africa (Kenya & Tanzania) in order to provide improved monitoring and forecasting tools to inform water management decisions.
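
    A minimal sketch of the bias-correction-then-merge step, assuming a simple mean-bias correction and weights inversely proportional to historical RMSE as a stand-in for the platform's averaging techniques; the model names, hindcast data, and forecasts are synthetic.

    ```python
    # Sketch: mean-bias-correct each model-product forecast, then merge with weights
    # inversely proportional to each one's historical RMSE. Data are synthetic.
    import numpy as np

    rng = np.random.default_rng(7)
    obs_hist = rng.gamma(2.0, 50.0, size=365)            # observed daily flow [m3/s]
    models_hist = {
        "model_A": obs_hist * 1.2 + rng.normal(0, 20, 365),
        "model_B": obs_hist * 0.9 + rng.normal(0, 10, 365),
        "model_C": obs_hist + rng.normal(0, 30, 365),
    }

    bias = {m: np.mean(f - obs_hist) for m, f in models_hist.items()}
    rmse = {m: np.sqrt(np.mean((f - bias[m] - obs_hist) ** 2)) for m, f in models_hist.items()}
    w = np.array([1.0 / rmse[m] for m in models_hist])
    w /= w.sum()

    # Merge today's (hypothetical) raw forecasts from the three model-product combinations
    forecast_today = {"model_A": 140.0, "model_B": 95.0, "model_C": 120.0}
    corrected = np.array([forecast_today[m] - bias[m] for m in models_hist])
    merged = float(np.dot(w, corrected))
    print("weights:", dict(zip(models_hist, np.round(w, 2))), "merged forecast:", round(merged, 1))
    ```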

  15. Probabilistic model for bridge structural evaluation using nondestructive inspection data

    NASA Astrophysics Data System (ADS)

    Carrion, Francisco; Lopez, Jose Alfredo; Balankin, Alexander

    2005-05-01

    A bridge management system developed for the Mexican toll highway network applies a probabilistic-reliability model to estimate load capacity and structural residual life. Basic inputs for the system are the global inspection data (visual inspections and vibration testing) and information on environmental conditions (weather, traffic, loads, earthquakes), although the model can also take into account additional non-destructive testing or permanent monitoring data. Main outputs are the periodic maintenance, rehabilitation and replacement program, and the updated inspection program. Both programs are adjusted to available funds and scheduled according to a priority assignment criterion. The probabilistic model, tailored to typical bridges, accounts for the size, age, material and structure type. Special bridges in size or type may also be included; in these cases, deterministic finite element models are also possible. A key feature is that structural qualification is given in terms of the probability of failure, calculated from fundamental degradation mechanisms and from actual direct observations and measurements, such as crack distribution and size, material properties, bridge dimensions, load deflections, and parameters for corrosion evaluation. Vibration measurements are used mainly to infer structural resistance and to monitor long-term degradation.

  16. Probabilistic Assessment of Radiation Risk for Astronauts in Space Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; DeAngelis, Giovanni; Cucinotta, Francis A.

    2009-01-01

    Accurate predictions of the health risks to astronauts from space radiation exposure are necessary for enabling future lunar and Mars missions. Space radiation consists of solar particle events (SPEs), comprised largely of medium-energy protons (less than 100 MeV), and galactic cosmic rays (GCR), which include protons and heavy ions of higher energies. While the expected frequency of SPEs is strongly influenced by the solar activity cycle, SPE occurrences themselves are random in nature. A solar modulation model has been developed for the temporal characterization of the GCR environment, which is represented by the deceleration potential, φ. The risk of radiation exposure from SPEs during extra-vehicular activities (EVAs) or in lightly shielded vehicles is a major concern for radiation protection, including determining the shielding and operational requirements for astronauts and hardware. To support the probabilistic risk assessment for EVAs, which would be up to 15% of crew time on lunar missions, we estimated the probability of SPE occurrence as a function of time within a solar cycle using a nonhomogeneous Poisson model to fit the historical database of measurements of protons with energy > 30 MeV, Φ30. The resultant organ doses and dose equivalents, as well as effective whole body doses for acute and cancer risk estimations, are analyzed for a conceptual habitat module and a lunar rover during defined space mission periods. This probabilistic approach to radiation risk assessment from SPE and GCR is in support of mission design and operational planning to manage radiation risks for space exploration.
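
    A minimal sketch of simulating SPE occurrence times from a nonhomogeneous Poisson process by thinning; the intensity curve below is an invented stand-in for the model actually fitted to the historical >30 MeV proton database.

    ```python
    # Sketch: simulate solar-particle-event times over an 11-year cycle from a
    # nonhomogeneous Poisson process via thinning. The intensity curve is invented.
    import numpy as np

    rng = np.random.default_rng(8)
    T = 11.0                                             # solar cycle length [years]

    def intensity(t):
        """Events per year; peaks mid-cycle (solar maximum)."""
        return 0.5 + 6.0 * np.exp(-((t - 5.5) ** 2) / (2 * 1.5 ** 2))

    lam_max = 6.5                                        # upper bound on intensity(t)
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)              # candidate arrival time
        if t > T:
            break
        if rng.uniform() < intensity(t) / lam_max:       # accept with prob lambda(t)/lam_max
            events.append(t)

    print(f"{len(events)} SPEs in one simulated cycle; times [yr]:", np.round(events, 2))
    ```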

  17. Probabilistic Grammars for Natural Languages. Psychology Series.

    ERIC Educational Resources Information Center

    Suppes, Patrick

    The purpose of this paper is to define the framework within which empirical investigations of probabilistic grammars can take place and to sketch how this attack can be made. The full presentation of empirical results will be left to other papers. In the detailed empirical work, the author has depended on the collaboration of E. Gammon and A.…

  18. Dynamic Probabilistic Instability of Composite Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2009-01-01

    A computationally effective method is described to evaluate the non-deterministic dynamic instability (probabilistic dynamic buckling) of thin composite shells. The method is a judicious combination of available computer codes for finite element, composite mechanics and probabilistic structural analysis. The solution method is incrementally updated Lagrangian. It is illustrated by applying it to thin composite cylindrical shell subjected to dynamic loads. Both deterministic and probabilistic buckling loads are evaluated to demonstrate the effectiveness of the method. A universal plot is obtained for the specific shell that can be used to approximate buckling loads for different load rates and different probability levels. Results from this plot show that the faster the rate, the higher the buckling load and the shorter the time. The lower the probability, the lower is the buckling load for a specific time. Probabilistic sensitivity results show that the ply thickness, the fiber volume ratio and the fiber longitudinal modulus, dynamic load and loading rate are the dominant uncertainties in that order.

  19. Pigeons' Discounting of Probabilistic and Delayed Reinforcers

    ERIC Educational Resources Information Center

    Green, Leonard; Myerson, Joel; Calvert, Amanda L.

    2010-01-01

    Pigeons' discounting of probabilistic and delayed food reinforcers was studied using adjusting-amount procedures. In the probability discounting conditions, pigeons chose between an adjusting number of food pellets contingent on a single key peck and a larger, fixed number of pellets contingent on completion of a variable-ratio schedule. In the…

  20. Championship Tennis as a Probabilistic Modelling Context.

    ERIC Educational Resources Information Center

    Galbraith, Peter

    1996-01-01

    Suggests ways for using data from championship tennis as a means for exploring probabilistic models, especially binomial probability. Examples include the probability of winning a service point and the probability of winning a service game using data from tables and graphs. (AIM)

  1. The Probabilistic Nature of Preferential Choice

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2008-01-01

    Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…

  2. A probabilistic approach to composite micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, T. A.; Bellini, P. X.; Murthy, P. L. N.; Chamis, C. C.

    1988-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material properties at the micro level. Regression results are presented to show the relative correlation between predicted and response variables in the study.

  3. Probabilistic analysis of a materially nonlinear structure

    NASA Technical Reports Server (NTRS)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
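
    A minimal sketch of the probabilistic output (a CDF of radial stress in a thick-walled cylinder under internal pressure), using plain Monte Carlo over the elastic Lamé solution rather than NESSUS or the AMV procedure; the geometry and pressure distribution are illustrative, and the plasticity central to the original problem is ignored.

    ```python
    # Sketch: Monte Carlo CDF of the elastic (Lame) radial stress at mid-wall of a
    # thick-walled cylinder under internal pressure. Plasticity is ignored here;
    # geometry and the pressure distribution are illustrative only.
    import numpy as np

    rng = np.random.default_rng(9)
    n = 100_000
    a, b = 0.10, 0.20                     # inner / outer radius [m]
    r = 0.15                              # evaluation radius [m]
    p = rng.normal(150e6, 10e6, n)        # internal pressure [Pa], normally distributed

    # Lame solution for internal pressure only: sigma_r(r) = p*a^2/(b^2 - a^2) * (1 - b^2/r^2)
    sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)   # compressive (negative)

    for q in (0.05, 0.50, 0.95):
        print(f"P{int(q * 100):02d} radial stress: {np.quantile(sigma_r, q) / 1e6:.1f} MPa")
    ```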

  5. Bayesian probabilistic population projections for all countries

    PubMed Central

    Raftery, Adrian E.; Li, Nan; Ševčíková, Hana; Gerland, Patrick; Heilig, Gerhard K.

    2012-01-01

    Projections of countries’ future populations, broken down by age and sex, are widely used for planning and research. They are mostly done deterministically, but there is a widespread need for probabilistic projections. We propose a Bayesian method for probabilistic population projections for all countries. The total fertility rate and female and male life expectancies at birth are projected probabilistically using Bayesian hierarchical models estimated via Markov chain Monte Carlo using United Nations population data for all countries. These are then converted to age-specific rates and combined with a cohort component projection model. This yields probabilistic projections of any population quantity of interest. The method is illustrated for five countries of different demographic stages, continents and sizes. The method is validated by an out of sample experiment in which data from 1950–1990 are used for estimation, and applied to predict 1990–2010. The method appears reasonably accurate and well calibrated for this period. The results suggest that the current United Nations high and low variants greatly underestimate uncertainty about the number of oldest old from about 2050 and that they underestimate uncertainty for high fertility countries and overstate uncertainty for countries that have completed the demographic transition and whose fertility has started to recover towards replacement level, mostly in Europe. The results also indicate that the potential support ratio (persons aged 20–64 per person aged 65+) will almost certainly decline dramatically in most countries over the coming decades. PMID:22908249

  6. Balkanization and Unification of Probabilistic Inferences

    ERIC Educational Resources Information Center

    Yu, Chong-Ho

    2005-01-01

    Many research-related classes in social sciences present probability as a unified approach based upon mathematical axioms, but neglect the diversity of various probability theories and their associated philosophical assumptions. Although currently the dominant statistical and probabilistic approach is the Fisherian tradition, the use of Fisherian…

  7. Probabilistic Scale-Space Filtering Program

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak; Kutulakos, Kiriakos

    1993-01-01

    Probabilistic Scale-Space Filtering (PSF) computer program implements scale-space technique to describe input signals as collections of nested hills and valleys organized in treelike structure. Helps to construct sparse representations of complicated signals. Calculates probabilities, with extracted features corresponding to physical processes. Written in C language (49 percent) and Common Lisp (51 percent).

  8. Expectancy Learning from Probabilistic Input by Infants

    PubMed Central

    Romberg, Alexa R.; Saffran, Jenny R.

    2013-01-01

    Across the first few years of life, infants readily extract many kinds of regularities from their environment, and this ability is thought to be central to development in a number of domains. Numerous studies have documented infants’ ability to recognize deterministic sequential patterns. However, little is known about the processes infants use to build and update representations of structure in time, and how infants represent patterns that are not completely predictable. The present study investigated how infants’ expectations for a simple structure develop over time, and how infants update their representations with new information. We measured 12-month-old infants’ anticipatory eye movements to targets that appeared in one of two possible locations. During the initial phase of the experiment, infants either saw targets that appeared consistently in the same location (Deterministic condition) or probabilistically in either location, with one side more frequent than the other (Probabilistic condition). After this initial divergent experience, both groups saw the same sequence of trials for the rest of the experiment. The results show that infants readily learn from both deterministic and probabilistic input, with infants in both conditions reliably predicting the most likely target location by the end of the experiment. Local context had a large influence on behavior: infants adjusted their predictions to reflect changes in the target location on the previous trial. This flexibility was particularly evident in infants with more variable prior experience (the Probabilistic condition). The results provide some of the first data showing how infants learn in real time. PMID:23439947

  9. Probabilistic Relational Structures and Their Applications

    ERIC Educational Resources Information Center

    Domotor, Zoltan

    The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…

  11. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and industry. However, these analyses at present are used for turbomachinery design with uncertainties accounted for by using safety factors. This approach may lead to overly conservative designs, thereby reducing the potential of designing higher efficiency engines. An integration of the deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses by utilizing the response properties generated by MISER. The results are a probability density function for the response properties. The probabilistic sensitivities of the response variables to uncertainty in primitive variables are obtained as a byproduct of the FPI technique. The combined analysis of aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion. Out of the total 11 design parameters, 6 are considered as having probabilistic variation. The six parameters are space-to-chord ratio (SBYC), stagger angle

  12. DEVELOPMENT OF RISK-BASED AND TECHNOLOGY-INDEPENDENT SAFETY CRITERIA FOR GENERATION IV SYSTEMS

    SciTech Connect

    William E. Kastenberg; Edward Blandford; Lance Kim

    2009-03-31

    This project has developed quantitative safety goals for Generation IV (Gen IV) nuclear energy systems. These safety goals are risk-based and technology-independent. The foundations for a new approach to risk analysis have been developed, along with a new operational definition of risk. This project has furthered the current state-of-the-art by developing quantitative safety goals for both Gen IV reactors and for the overall Gen IV nuclear fuel cycle. The risk analysis approach developed will quantify performance measures, characterize uncertainty, and address a more comprehensive view of safety as it relates to the overall system. Appropriate safety criteria are necessary to manage risk in a prudent and cost-effective manner. This study is also important for government agencies responsible for managing, reviewing, and approving advanced reactor systems because they are charged with assuring the health and safety of the public.

  13. A probabilistic strategy for parametric catastrophe insurance

    NASA Astrophysics Data System (ADS)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
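
    A minimal sketch of the core idea, assuming a logistic regression that maps a single loss-index variable (e.g. transformed rainfall) to the probability that a loss event occurred; the data, coefficients, and variable names below are hypothetical, not those of the study.

    ```python
    import numpy as np

    def fit_logistic(x, y, lr=0.1, iters=5000):
        """Fit P(loss event | index x) = sigmoid(b0 + b1*x) by gradient ascent."""
        b0, b1 = 0.0, 0.0
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
            b0 += lr * np.mean(y - p)
            b1 += lr * np.mean((y - p) * x)
        return b0, b1

    rng = np.random.default_rng(2)
    # Hypothetical calibration data: a daily loss index and whether a loss was recorded
    index = rng.gamma(2.0, 20.0, 2000)
    events = (rng.random(2000) < 1.0 / (1.0 + np.exp(-(index - 60.0) / 10.0))).astype(float)

    mu, sd = index.mean(), index.std()
    b0, b1 = fit_logistic((index - mu) / sd, events)

    def event_probability(day_index):
        """Probabilistic trigger: report P(loss) instead of a hard yes/no threshold."""
        return 1.0 / (1.0 + np.exp(-(b0 + b1 * (day_index - mu) / sd)))

    print(f"P(loss | index=40) = {event_probability(40.0):.2f}")
    print(f"P(loss | index=90) = {event_probability(90.0):.2f}")
    ```

    Payouts can then be tied to the issued probability, which is what keeps basis risk visible rather than hiding it behind a single deterministic threshold.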

  14. Probabilistic, Dynamic Analysis of Plans

    DTIC Science & Technology

    2004-03-01

    technologies under the Active Templates program. The first was Capture the Flag, a wargaming environment that includes a simulator of battalion-and-higher...level land warfare. Capture the Flag was initiated by the Active Templates program manager LTC Doug Dyer, but funded primarily by other programs, so...the Active Templates program could be developed “bottom up” by the people who use them, rather than “top down” by trained ontologists.

  15. Risk-based audit selection of dairy farms.

    PubMed

    van Asseldonk, M A P M; Velthuis, A G J

    2014-02-01

    Dairy farms are audited in the Netherlands on numerous process standards. Each farm is audited once every 2 years. Increasing demands for cost-effectiveness in farm audits can be met by introducing risk-based principles. This implies targeting subpopulations with a higher risk of poor process standards. To select farms for an audit that present higher risks, a statistical analysis was conducted to test the relationship between the outcome of farm audits and bulk milk laboratory results before the audit. The analysis comprised 28,358 farm audits and all conducted laboratory tests of bulk milk samples 12 mo before the audit. The overall outcome of each farm audit was classified as approved or rejected. Laboratory results included somatic cell count (SCC), total bacterial count (TBC), antimicrobial drug residues (ADR), level of butyric acid spores (BAB), freezing point depression (FPD), level of free fatty acids (FFA), and cleanliness of the milk (CLN). The bulk milk laboratory results were significantly related to audit outcomes. Rejected audits are likely to occur on dairy farms with higher mean levels of SCC, TBC, ADR, and BAB. Moreover, in a multivariable model, maxima for TBC, SCC, and FPD as well as standard deviations for TBC and FPD are risk factors for negative audit outcomes. The efficiency curve of a risk-based selection approach, on the basis of the derived regression results, dominated the current random selection approach. To capture 25, 50, or 75% of the population with poor process standards (i.e., audit outcome of rejected), respectively, only 8, 20, or 47% of the population had to be sampled based on a risk-based selection approach. Milk quality information can thus be used to preselect high-risk farms to be audited more frequently.
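
    The efficiency figures quoted above (e.g. capturing 50% of rejected audits by sampling 20% of farms) follow from ranking farms by a fitted risk score; the toy sketch below reproduces that calculation on synthetic data with assumed coefficients, not the study's actual regression model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 10_000

    # Hypothetical standardized bulk-milk statistics and a synthetic risk of rejection
    scc_mean = rng.normal(0, 1, n)   # somatic cell count (standardized)
    tbc_max = rng.normal(0, 1, n)    # maximum total bacterial count (standardized)
    risk = 1.0 / (1.0 + np.exp(-(-3.0 + 0.8 * scc_mean + 0.6 * tbc_max)))
    rejected = rng.random(n) < risk

    # Risk-based selection: audit farms in decreasing order of predicted risk and see
    # what fraction must be sampled to capture a target share of the rejected farms
    order = np.argsort(-risk)
    captured = np.cumsum(rejected[order]) / rejected.sum()
    sampled = np.arange(1, n + 1) / n
    for target in (0.25, 0.50, 0.75):
        need = sampled[np.searchsorted(captured, target)]
        print(f"capture {target:.0%} of rejected farms by auditing {need:.0%} of all farms")
    ```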

  16. Risk-based approach to analyzing operating events

    SciTech Connect

    Marchese, A.R.; Neogy, P.

    1996-02-01

    Existing programs for the analysis of operating events at Department of Energy (DOE) facilities do not determine risk measures for the events. An approach for the risk-based analysis of operating events has been developed and applied to two events. The approach utilizes the data now being collected in existing data programs and determines risk measures for the events that are not currently determined. Such risk measures allow risk-appropriate responses to be made to events and provide a means for comparing the safety significance of dissimilar events at different facilities.

  17. Risk-based selection of SSCs at Peach Bottom

    SciTech Connect

    Krueger, G.A.; Marie, A.J.

    1993-01-01

    The purpose of identifying risk-significant systems, structures, and components (SSCs) that are within the scope of the maintenance rule is to bring a higher level of attention to a subset of those SSCs. These risk-significant SSCs will have specific performance criteria established for them, and failure to meet these performance criteria will result in establishing goals to ensure the necessary improvement in performance. The Peach Bottom individual plant examination (IPE) results were used to provide insights for the verification of proposed probabilistic risk assessment (PRA) methods set forth in the Industry Maintenance Guidelines for Implementation of the Maintenance Rule. The objective of reviewing the methods for selection of SSCs that are considered risk significant was to ensure the methods used are logical, reproducible, and can be consistently applied.

  18. Toward a Safety Risk-Based Classification of Unmanned Aircraft

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2016-01-01

    There is a trend of growing interest and demand for greater access of unmanned aircraft (UA) to the National Airspace System (NAS) as the ongoing development of UA technology has created the potential for significant economic benefits. However, the lack of a comprehensive and efficient UA regulatory framework has constrained the number and kinds of UA operations that can be performed. This report presents initial results of a study aimed at defining a safety-risk-based UA classification as a plausible basis for a regulatory framework for UA operating in the NAS. Much of the study up to this point has been at a conceptual high level. The report includes a survey of contextual topics, analysis of safety risk considerations, and initial recommendations for a risk-based approach to safe UA operations in the NAS. The next phase of the study will develop and leverage deeper clarity and insight into practical engineering and regulatory considerations for ensuring that UA operations have an acceptable level of safety.

  19. Risk based culling for highly infectious diseases of livestock

    PubMed Central

    2011-01-01

    The control of highly infectious diseases of livestock such as classical swine fever, foot-and-mouth disease, and avian influenza is fraught with ethical, economic, and public health dilemmas. Attempts to control outbreaks of these pathogens rely on massive culling of infected farms, and farms deemed to be at risk of infection. Conventional approaches usually involve the preventive culling of all farms within a certain radius of an infected farm. Here we propose a novel culling strategy that is based on the idea that farms that have the highest expected number of secondary infections should be culled first. We show that, in comparison with conventional approaches (ring culling), our new method of risk based culling can reduce the total number of farms that need to be culled, the number of culled infected farms (and thus the expected number of human infections in case of a zoonosis), and the duration of the epidemic. Our novel risk based culling strategy requires three pieces of information, viz. the location of all farms in the area at risk, the moments when infected farms are detected, and an estimate of the distance-dependent probability of transmission. PMID:21714865
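
    A hedged sketch of the ranking step described above: given farm locations, detected infections, and an assumed distance-dependent transmission kernel, each farm's expected number of secondary infections is computed and the highest-ranked farms are culled first. The kernel form and every parameter below are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_farms = 500
    xy = rng.uniform(0, 50, (n_farms, 2))      # farm locations, km
    infected = rng.random(n_farms) < 0.02      # farms already detected as infected

    def p_transmission(d, p0=0.2, d0=2.5):
        """Illustrative distance-dependent transmission probability."""
        return p0 / (1.0 + (d / d0) ** 2)

    # Expected number of secondary infections each farm would cause if infectious
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                # no self-infection
    expected_secondary = p_transmission(d).sum(axis=1)

    # Risk-based culling: rank farms by expected secondary infections and cull the
    # top-ranked uninfected farms first (infected farms are culled in any case)
    ranking = np.argsort(-expected_secondary)
    to_cull = [int(i) for i in ranking if not infected[i]][:10]
    print("first farms to cull preventively:", to_cull)
    ```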

  20. Risk-based consequences of extreme natural hazard processes in mountain regions - Multi-hazard analysis in Tyrol (Austria)

    NASA Astrophysics Data System (ADS)

    Huttenlau, Matthias; Stötter, Johann

    2010-05-01

    weighting within the risk concept, this has significant implications for the results of risk analyses. Thus, an equal and scale-appropriate balance of those risk components is a fundamental key factor for effective natural hazard risk analyses. The results of such analyses inform decision makers in the insurance industry, the administration, and politicians about potential consequences and are the basis for appropriate risk management strategies. Here, (i) results based on an annual or probabilistic understanding of risk have to be distinguished from (ii) scenario-based analyses. The former are based on statistics of periodically or episodically occurring events, whereas the latter approach is applied especially to extreme, non-linear, stochastic events. Focusing especially on the needs of insurance companies, the former approaches are appropriate for premium pricing and reinsurance strategies with an annual perspective, whereas the latter focuses on events with extreme loss burdens under worst-case criteria to guarantee appropriate reinsurance coverage. Moreover, the demand for adequate loss modelling approaches and methods is strengthened by the risk-based requirements of the upcoming capital requirement directive Solvency II. The present study estimates the potential elements at risk, their corresponding damage potentials, and the Probable Maximum Losses (PMLs) of extreme natural hazard events in Tyrol (Austria) and adequately considers the scale dependency and balanced application of the introduced risk components. In addition to this analysis, a portfolio analysis of a regional insurance company was carried out. The geocoded insurance contracts of this portfolio were the basis for estimating spatially, socio-economically, and functionally differentiated mean insurance values for the risk categories of (i) buildings, (ii) contents or inventory, (iii) vehicles, and (iv) persons in the study area. The estimated mean insurance values were

  1. Design of a probabilistic wildfire alert system for Chile

    NASA Astrophysics Data System (ADS)

    Crawford, Ben; Dacre, Helen; Lopez Saldana, Gerardo; Charlton-Perez, Andrew

    2017-04-01

    During the past 50 years over 200,000 wildfires have burned nearly 2.3 million hectares in Chile, leading to significant economic consequences. To improve wildfire warning capabilities, statistical models have been developed by the University of Chile for 15 different geographic regions of the country to quantify wildfire risk based on a set of specific meteorological variables (air temperature, relative humidity, wind speed, accumulated precipitation, and time of year). Currently, the warning system uses data input from ground-based weather stations and alerts are issued one day ahead. This project improves upon the current system by using variables from ensemble weather prediction datasets (TIGGE archive from ECMWF) as input to the wildfire risk model. This allows development of a probabilistic alert system that takes into account uncertainties in the specific meteorological forecast variables used in the wildfire risk model. This also allows the wildfire risk index to be calculated up to seven days ahead. The integration of the statistical wildfire risk model with the ensemble weather prediction system provides additional information about uncertainty to improve resource allocation decisions. The new system is evaluated using MODIS satellite wildfire detection datasets from 2008-2015 for each of the 15 geographic wildfire risk regions. The prototype alert system is then compared to alerts made using forecast variables from the operational ensemble weather prediction system used by the Chilean Meteorological Service. Finally, a novel method to update the wildfire risk statistical model parameters in real time based on observed spatial and temporal wildfire patterns will be presented.
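
    The probabilistic alert follows from running the deterministic regional risk model once per ensemble member and reading off the fraction of members that exceed the alert threshold. The sketch below illustrates that pattern with a made-up risk index; it is not the University of Chile model, and all values are assumed.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_members = 51  # ensemble size (e.g. a TIGGE-like ensemble)

    # Hypothetical ensemble forecasts of the model inputs for one region and lead time
    temp = rng.normal(29.0, 2.0, n_members)   # air temperature, deg C
    rh = rng.normal(30.0, 8.0, n_members)     # relative humidity, %
    wind = rng.normal(6.0, 2.5, n_members)    # wind speed, m/s

    def fire_risk_index(t, h, w):
        """Stand-in for a regional statistical wildfire risk model (illustrative)."""
        return 0.05 * t - 0.03 * h + 0.08 * np.clip(w, 0.0, None) + 0.5

    risk = fire_risk_index(temp, rh, wind)
    alert_threshold = 1.2  # assumed threshold above which an alert would be issued
    prob_alert = float(np.mean(risk > alert_threshold))
    print(f"probability of exceeding the alert threshold: {prob_alert:.0%}")
    ```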

  2. How probabilistic risk assessment can mislead terrorism risk analysts.

    PubMed

    Brown, Gerald G; Cox, Louis Anthony Tony

    2011-02-01

    Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems-in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do.

  3. Probabilistic Structural Health Monitoring of the Orbiter Wing Leading Edge

    NASA Technical Reports Server (NTRS)

    Yap, Keng C.; Macias, Jesus; Kaouk, Mohamed; Gafka, Tammy L.; Kerr, Justin H.

    2011-01-01

    A structural health monitoring (SHM) system can contribute to the risk management of a structure operating under hazardous conditions. An example is the Wing Leading Edge Impact Detection System (WLEIDS) that monitors the debris hazards to the Space Shuttle Orbiter's Reinforced Carbon-Carbon (RCC) panels. Since Return-to-Flight (RTF) after the Columbia accident, WLEIDS was developed and subsequently deployed on board the Orbiter to detect ascent and on-orbit debris impacts, so as to support the assessment of wing leading edge structural integrity prior to Orbiter re-entry. As SHM is inherently an inverse problem, the analyses involved, including those performed for WLEIDS, tend to be associated with significant uncertainty. The use of probabilistic approaches to handle the uncertainty has resulted in the successful implementation of many development and application milestones.

  4. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    NASA Technical Reports Server (NTRS)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.

  5. Risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions.

    PubMed

    Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song

    2017-06-03

    Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and it showed that this region had a relatively high risk for groundwater contamination from pesticides, and that the pesticide application method was the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible, valid, and able to provide reference data related to the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;00:000-000. © 2017 SETAC.
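
    Two mechanics from the abstract can be sketched directly: class boundaries for one indicator derived from a one-dimensional K-means clustering, and a total risk obtained by multiplying per-indicator scores. The indicator, its values, and the scores below are hypothetical.

    ```python
    import numpy as np

    def kmeans_1d(values, k=4, iters=100, seed=0):
        """Tiny 1-D K-means used to derive class boundaries for one indicator."""
        rng = np.random.default_rng(seed)
        centers = np.sort(rng.choice(values, k, replace=False))
        for _ in range(iters):
            labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
            centers = np.array([values[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return np.sort(centers)

    rng = np.random.default_rng(6)
    # Hypothetical survey values for one indicator (e.g. pesticide half-life, days)
    half_life = rng.lognormal(3.0, 0.8, 300)
    centers = kmeans_1d(half_life)
    boundaries = (centers[:-1] + centers[1:]) / 2   # boundaries halfway between centers
    print("class boundaries (days):", np.round(boundaries, 1))

    def risk_class(value, boundaries):
        """Map an indicator value to a risk score 1 (low) .. 4 (high)."""
        return int(np.searchsorted(boundaries, value)) + 1

    # Total risk = product of the per-indicator risk scores (three indicators shown;
    # the second and third scores are simply assumed here)
    indicator_scores = [risk_class(120.0, boundaries), 3, 2]
    total_risk = int(np.prod(indicator_scores))
    print("indicator scores:", indicator_scores, "-> total risk:", total_risk)
    ```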

  6. Probabilistic Thermomechanical Fatigue of Polymer Matrix Composites

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    1997-01-01

    Traditional computational approaches for predicting the life and long-term behavior of materials rely on empirical data and are neither generic nor unique in nature. Also, those approaches are not easy to implement in a design procedure in an effective, integrated manner. The focus of ongoing research at the NASA Lewis Research Center has been to develop advanced integrated computational methods and related computer codes for a complete reliability-based assessment of composite structures. These methods - which account for uncertainties in all the constituent properties, fabrication process variables, and loads to predict probabilistic micromechanics, ply, laminate, and structural responses - have already been implemented in the Integrated Probabilistic Assessment of Composite Structures (IPACS) computer code. The main objective of this evaluation is to illustrate the effectiveness of the methodology to predict the long-term behavior of composites under combined mechanical and thermal cyclic loading conditions.

  7. Probabilistic Assessment of National Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M.; Chamis, C. C.

    1996-01-01

    A preliminary probabilistic structural assessment of the critical section of the National Wind Tunnel (NWT) is performed using the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) computer code. Thereby, the capabilities of the NESSUS code to address reliability issues of the NWT are demonstrated. Uncertainties in the geometry, material properties, loads, and stiffener location on the NWT are considered in the reliability assessment. Probabilistic stress, frequency, buckling, fatigue, and proof load analyses are performed. These analyses cover the major global and some local design requirements. Based on the assumed uncertainties, the results indicate a minimum reliability of 0.999 for the NWT. Preliminary life prediction analysis results show that the life of the NWT is governed by the fatigue of welds. Also, a reliability-based proof test assessment is performed.

  8. Modelling default and likelihood reasoning as probabilistic

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1990-01-01

    A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlights their approximate nature, the dualities, and the need for complementary reasoning about relevance.

  9. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.

  10. Significance testing as perverse probabilistic reasoning

    PubMed Central

    2011-01-01

    Truth claims in the medical literature rely heavily on statistical significance testing. Unfortunately, most physicians misunderstand the underlying probabilistic logic of significance tests and consequently often misinterpret their results. This near-universal misunderstanding is highlighted by means of a simple quiz which we administered to 246 physicians at two major academic hospitals, on which the proportion of incorrect responses exceeded 90%. A solid understanding of the fundamental concepts of probability theory is becoming essential to the rational interpretation of medical information. This essay provides a technically sound review of these concepts that is accessible to a medical audience. We also briefly review the debate in the cognitive sciences regarding physicians' aptitude for probabilistic inference. PMID:21356064

  11. Probabilistic Seismic Risk Model for Western Balkans

    NASA Astrophysics Data System (ADS)

    Stejskal, Vladimir; Lorenzo, Francisco; Pousse, Guillaume; Radovanovic, Slavica; Pekevski, Lazo; Dojcinovski, Dragi; Lokin, Petar; Petronijevic, Mira; Sipka, Vesna

    2010-05-01

    A probabilistic seismic risk model for insurance and reinsurance purposes is presented for an area of the Western Balkans covering the former Yugoslavia and Albania. This territory experienced many severe earthquakes during past centuries that produced significant damage to many population centres in the region. The highest hazard is related to the external Dinarides, namely the collision zone of the Adriatic plate. The model is based on a unified catalogue for the region and a seismic source model consisting of more than 30 zones covering all three main structural units - the Southern Alps, the Dinarides, and the south-western margin of the Pannonian Basin. A probabilistic methodology using Monte Carlo simulation was applied to generate the hazard component of the model. A unique set of damage functions, based on both loss experience and engineering assessments, is used to convert the modelled ground motion severity into monetary loss.
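
    The Monte Carlo hazard-to-loss chain can be caricatured as follows: simulate event counts per year, draw a ground-motion severity for each event, convert severity to a damage ratio, and accumulate monetary loss. All rates, distributions, and the damage function below are assumptions for illustration, not the Western Balkans model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_years = 100_000          # simulated years
    annual_rate = 0.3          # assumed rate of damaging events in the zone
    exposed_value = 1.0e9      # assumed total insured value

    def damage_ratio(intensity):
        """Illustrative damage function: mean damage ratio vs. ground-motion intensity."""
        return np.clip((intensity - 5.0) / 5.0, 0.0, 1.0) ** 2

    annual_loss = np.zeros(n_years)
    n_events = rng.poisson(annual_rate, n_years)
    for year in np.nonzero(n_events)[0]:
        intensity = rng.gumbel(6.0, 0.8, n_events[year])   # assumed severity distribution
        annual_loss[year] = min(damage_ratio(intensity).sum(), 1.0) * exposed_value

    print(f"average annual loss: {annual_loss.mean():,.0f}")
    print(f"200-year loss level (PML proxy): {np.quantile(annual_loss, 1 - 1/200):,.0f}")
    ```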

  12. Probabilistic Safety Assessment of Tehran Research Reactor

    SciTech Connect

    Hosseini, Seyed Mohammad Hadi; Nematollahi, Mohammad Reza; Sepanloo, Kamran

    2004-07-01

    Probabilistic Safety Assessment (PSA) is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this paper the application of Probabilistic Safety Assessment to the Tehran Research Reactor (TRR) is presented. The level 1 PSA application involved familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from the selected results. Quantification of the constructed models is done using the SAPHIRE software. This study shows that the obtained core damage frequency for the Tehran Research Reactor (8.368E-6 per year) meets the IAEA criterion for existing nuclear power plants (1E-4) with margin. Nevertheless, safety improvement suggestions are offered to reduce the likelihood of the most probable accidents. (authors)

  13. Probabilistic Methods for Structural Design and Reliability

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Whitlow, Woodrow, Jr. (Technical Monitor)

    2002-01-01

    This report describes a formal method to quantify structural damage tolerance and reliability in the presence of a multitude of uncertainties in turbine engine components. The method is based at the material behavior level, where primitive variables with their respective scatter ranges are used to describe behavior. Computational simulation is then used to propagate the uncertainties to the structural scale where damage tolerance and reliability are usually specified. Several sample cases are described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that it is mature and that it can be used to probabilistically evaluate turbine engine structural components. It may be inferred from the results that the method is suitable for probabilistically predicting the remaining life in aging or deteriorating structures, for making strategic projections and plans, and for achieving better, cheaper, faster products that give competitive advantages in world markets.

  14. Probabilistic Analysis of Gas Turbine Field Performance

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.
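
    As a simplified stand-in for the cycle simulation described above, the sketch below propagates assumed scatter in compressor and turbine efficiencies through an ideal-gas Brayton cycle to obtain a distribution of thermal efficiency; it is not the NASA cycle code, and all values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    n = 100_000

    # Fixed cycle conditions (illustrative) and uncertain component efficiencies
    T1, T3 = 288.0, 1500.0               # compressor inlet / turbine inlet temperature, K
    pr, gamma, cp = 20.0, 1.4, 1005.0    # pressure ratio, gas properties
    eta_c = rng.normal(0.85, 0.01, n)    # compressor isentropic efficiency
    eta_t = rng.normal(0.90, 0.01, n)    # turbine isentropic efficiency

    tau = pr ** ((gamma - 1.0) / gamma)
    w_comp = cp * T1 * (tau - 1.0) / eta_c           # compressor work per kg
    T2 = T1 + w_comp / cp                            # compressor exit temperature
    w_turb = eta_t * cp * T3 * (1.0 - 1.0 / tau)     # turbine work per kg
    q_in = cp * (T3 - T2)                            # heat added per kg

    eta_th = (w_turb - w_comp) / q_in
    print(f"thermal efficiency: mean {eta_th.mean():.3f}, "
          f"1%-99% range [{np.quantile(eta_th, 0.01):.3f}, {np.quantile(eta_th, 0.99):.3f}]")
    ```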

  15. Probabilistic assessment of smart composite structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael C.

    1994-01-01

    A composite wing with spars and bulkheads is used to demonstrate the effectiveness of probabilistic assessment of smart composite structures to control uncertainties in distortions and stresses. Results show that a smart composite wing can be controlled to minimize distortions and to have specified stress levels in the presence of defects. Structural responses such as changes in angle of attack, vertical displacements, and stress in the control and controlled plies are probabilistically assessed to quantify their respective uncertainties. Sensitivity factors are evaluated to identify those parameters that have the greatest influence on a specific structural response. Results show that smart composite structures can be configured to control both distortions and ply stresses to satisfy specified design requirements.

  16. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also 0.0001.

  17. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also 0.0001.

  18. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    NASA Astrophysics Data System (ADS)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This research is a developmental study of probabilistic thinking-oriented learning tools for probability material for ninth-grade students. The study aimed to produce good probabilistic thinking-oriented learning tools. The subjects were IX-A students of MTs Model Bangkalan. The development followed a 4-D development model modified into three stages: define, design, and develop. The teaching and learning tools consist of a lesson plan, a students' worksheet, teaching media, and a students' achievement test. The research instruments were a learning-tools validation sheet, a teachers' activity sheet, a students' activity sheet, a students' response questionnaire, and the students' achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of valid probabilistic thinking-oriented learning tools for teaching probability to ninth-grade students. After the tools were revised based on validation and tested in class, the teachers' classroom management was effective, students' activities were good, students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity, and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  19. 12 CFR 652.75 - Your responsibility for determining the risk-based capital level.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the risk-based capital stress test and must be able to determine your risk-based capital level at any...-based capital level. 652.75 Section 652.75 Banks and Banking FARM CREDIT ADMINISTRATION FARM CREDIT... Requirements § 652.75 Your responsibility for determining the risk-based capital level. (a) You must determine...

  20. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  1. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 5 2013-01-01 2013-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  2. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 5 2014-01-01 2014-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  3. 12 CFR 327.12 - Prepayment of quarterly risk-based assessments.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 5 2012-01-01 2012-01-01 false Prepayment of quarterly risk-based assessments... STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.12 Prepayment of quarterly risk-based assessments... pay to the FDIC a prepaid assessment, which shall equal its estimated quarterly risk-based...

  4. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 12 Banks and Banking 7 2014-01-01 2014-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  5. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 6 2011-01-01 2011-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  6. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 12 Banks and Banking 7 2013-01-01 2013-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  7. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 12 Banks and Banking 7 2012-01-01 2012-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  8. 12 CFR 652.100 - Audit of the risk-based capital stress test.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Audit of the risk-based capital stress test... the risk-based capital stress test. You must have a qualified, independent external auditor review your implementation of the risk-based capital stress test every 3 years and submit a copy of...

  9. 12 CFR 1750.13 - Risk-based capital level computation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... paragraph (a)(2)(iii) of this section, whichever would require more capital in the stress test for the... level computation. (a) Risk-Based Capital Test—OFHEO shall compute a risk-based capital level for each Enterprise at least quarterly by applying the risk-based capital test described in appendix A to this...

  10. 76 FR 41602 - Fair Credit Reporting Risk-Based Pricing Regulations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-15

    ... on the model forms. This statement mirrors a sentence on the current risk-based pricing notice... Reporting Risk-Based Pricing Regulations AGENCIES: Board of Governors of the Federal Reserve System (Board... Board and the Commission published final rules to implement the risk-based pricing provisions in section...

  11. The Diagnostic Challenge Competition: Probabilistic Techniques for Fault Diagnosis in Electrical Power Systems

    NASA Technical Reports Server (NTRS)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Reliable systems health management is an important research area of NASA. A health management system that can accurately and quickly diagnose faults in various on-board systems of a vehicle will play a key role in the success of current and future NASA missions. We introduce in this paper the ProDiagnose algorithm, a diagnostic algorithm that uses a probabilistic approach, accomplished with Bayesian Network models compiled to Arithmetic Circuits, to diagnose these systems. We describe the ProDiagnose algorithm, how it works, and the probabilistic models involved. We show by experimentation on two Electrical Power Systems based on the ADAPT testbed, used in the Diagnostic Challenge Competition (DX 09), that ProDiagnose can produce results with over 96% accuracy and less than 1 second mean diagnostic time.

  12. Probabilistic Latent Variable Models as Nonnegative Factorizations

    PubMed Central

    Shashanka, Madhusudana; Raj, Bhiksha; Smaragdis, Paris

    2008-01-01

    This paper presents a family of probabilistic latent variable models that can be used for analysis of nonnegative data. We show that there are strong ties between nonnegative matrix factorization and this family, and provide some straightforward extensions which can help in dealing with shift invariances, higher-order decompositions and sparsity constraints. We argue through these extensions that the use of this approach allows for rapid development of complex statistical models for analyzing nonnegative data. PMID:18509481
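
    A minimal sketch of the tie between probabilistic latent variable models and nonnegative factorization, assuming the symmetric model P(f,t) = sum_z P(z) P(f|z) P(t|z) applied to a nonnegative count matrix; its EM updates share their fixed points with KL-divergence NMF. The data and dimensions are hypothetical.

    ```python
    import numpy as np

    def plsa(V, k=3, iters=200, seed=0):
        """EM for P(f,t) = sum_z P(z) P(f|z) P(t|z) on a nonnegative count matrix V."""
        rng = np.random.default_rng(seed)
        F, T = V.shape
        p_z = np.full(k, 1.0 / k)
        p_f_z = rng.random((F, k))
        p_f_z /= p_f_z.sum(axis=0)
        p_t_z = rng.random((T, k))
        p_t_z /= p_t_z.sum(axis=0)
        for _ in range(iters):
            recon = (p_f_z * p_z) @ p_t_z.T + 1e-12    # current estimate of P(f,t)
            ratio = V / recon                          # E-step folded into the M-step
            new_f = p_f_z * p_z * (ratio @ p_t_z)      # unnormalized sum_t V*P(z|f,t)
            new_t = p_t_z * (ratio.T @ (p_f_z * p_z))  # unnormalized sum_f V*P(z|f,t)
            p_z = new_f.sum(axis=0) / V.sum()
            p_f_z = new_f / new_f.sum(axis=0)
            p_t_z = new_t / new_t.sum(axis=0)
        return p_z, p_f_z, p_t_z

    rng = np.random.default_rng(1)
    V = rng.poisson(2.0, (20, 30)).astype(float)       # hypothetical nonnegative data
    p_z, p_f_z, p_t_z = plsa(V)
    print("mixture weights P(z):", np.round(p_z, 3))
    ```

    The learned factors are nonnegative and normalized like probabilities, which is what makes shift-invariant, higher-order, or sparsity-constrained extensions straightforward to express in the same framework.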

  13. Probabilistically teleporting arbitrary two-qubit states

    NASA Astrophysics Data System (ADS)

    Choudhury, Binayak S.; Dhara, Arpan

    2016-12-01

    In this paper we make use of two non-maximally entangled three-qubit channels for probabilistically teleporting arbitrary two particle states from a sender to a receiver. We also calculate the success probability of the teleportation. In the protocol we use two measurements of which one is a POVM and the other is a projective measurement. The POVM provides the protocol with operational advantage.

  14. Probabilistic Anisotropic Failure Criteria for Composite Materials.

    DTIC Science & Technology

    1987-12-01

    The worksheets were based on Microsoft Excel software. ... analytically described the failure criterion and probabilistic failure states of an anisotropic composite in a combined stress state.

  15. Multiscale/Multifunctional Probabilistic Composite Fatigue

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A multilevel (multiscale/multifunctional) evaluation is demonstrated by applying it to three different sample problems. These problems include the probabilistic evaluation of a space shuttle main engine blade, an engine rotor, and an aircraft wing. The results demonstrate that the blade will fail along the highest probability path, the engine two-stage rotor will fail by fracture at the rim, and the aircraft wing will fail at 10^9 fatigue cycles with a probability of 0.9967.

  16. Fast probabilistic file fingerprinting for big data.

    PubMed

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
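
    The sampling idea can be sketched as follows (an illustration of the principle, not the pfff implementation): hash a fixed number of pseudo-randomly chosen blocks, so the cost depends on the number of samples rather than on the file size.

    ```python
    import hashlib
    import os
    import random

    def probabilistic_fingerprint(path, n_samples=64, block=64, key=0):
        """Fingerprint a file by hashing pseudo-randomly sampled blocks.

        Sketch of the sampling principle only: cost grows with n_samples * block,
        not with the file size.
        """
        size = os.path.getsize(path)
        h = hashlib.sha256()
        h.update(str(size).encode())                 # the size itself is part of the print
        rng = random.Random(size * 1_000_003 + key)  # deterministic offsets per key/size
        with open(path, "rb") as f:
            for _ in range(n_samples):
                offset = rng.randrange(size - block + 1) if size > block else 0
                f.seek(offset)
                h.update(f.read(block))
        return h.hexdigest()

    # Identical files yield identical fingerprints for the same key; files differing
    # in any sampled region are distinguished with high probability.
    # print(probabilistic_fingerprint("big_data_file.fastq"))
    ```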

  17. The probabilistic structure of planetary contamination models

    NASA Technical Reports Server (NTRS)

    Harrison, J. M.; North, W. D.

    1973-01-01

    The analytical basis for planetary quarantine standards and procedures is presented. The hierarchy of planetary quarantine decisions is explained and emphasis is placed on the determination of mission specifications to include sterilization. The influence of the Sagan-Coleman probabilistic model of planetary contamination on current standards and procedures is analyzed. A classical problem in probability theory which provides a close conceptual parallel to the type of dependence present in the contamination problem is presented.

  18. Bayesian Probabilistic Projection of International Migration.

    PubMed

    Azose, Jonathan J; Raftery, Adrian E

    2015-10-01

    We propose a method for obtaining joint probabilistic projections of migration for all countries, broken down by age and sex. Joint trajectories for all countries are constrained to satisfy the requirement of zero global net migration. We evaluate our model using out-of-sample validation and compare point projections to the projected migration rates from a persistence model similar to the method used in the United Nations' World Population Prospects, and also to a state-of-the-art gravity model.

  19. Probabilistic Anomaly Detection in Dynamic Systems

    NASA Technical Reports Server (NTRS)

    Smyth, Padhraic

    1993-01-01

    This paper describes probabilistic methods for novelty detection when using pattern recognition methods for fault monitoring of dynamic systems. The problem of novelty detection is particularly acute when prior knowledge and data only allow one to construct an incomplete prior model of the system. Hence, some allowance must be made in model design so that a classifier will be robust to data generated by classes not included in the training phase.

  20. Maritime Threat Detection Using Probabilistic Graphical Models

    DTIC Science & Technology

    2012-01-01

    A CRF, unlike an HMM, can represent local features, and does not require feature concatenation. For MLNs, we used Alchemy (Alchemy 2011), an...open source statistical relational learning and probabilistic inferencing package. Alchemy supports generative and discriminative weight learning, and...that Alchemy creates a new formula for every possible combination of the values for a1 and a2 that fit the type specified in their predicate

  1. Probabilistic Network Approach to Decision-Making

    NASA Astrophysics Data System (ADS)

    Nicolis, Grégoire; Nicolis, Stamatios C.

    2015-06-01

    A probabilistic approach to decision-making is developed in which the states of the underlying stochastic process, assumed to be of the Markov type, represent the competing options. The principal parameters determining the dominance of a particular option versus the others are identified and the transduction of information associated to the transitions between states is quantified using a set of entropy-like quantities.
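
    A small sketch of the stated ingredients, under assumed numbers: a Markov transition matrix over competing options, its stationary distribution (the long-run dominance of each option), and an entropy-like measure of the information carried by transitions.

    ```python
    import numpy as np

    # Illustrative transition matrix over three competing options (rows sum to 1)
    P = np.array([[0.70, 0.20, 0.10],
                  [0.15, 0.75, 0.10],
                  [0.25, 0.25, 0.50]])

    # Stationary distribution: left eigenvector of P for eigenvalue 1
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi = pi / pi.sum()
    print("stationary probabilities (dominance of each option):", np.round(pi, 3))

    # Entropy rate of the chain: average uncertainty of the next transition
    with np.errstate(divide="ignore", invalid="ignore"):
        row_entropy = -np.nansum(np.where(P > 0, P * np.log2(P), 0.0), axis=1)
    entropy_rate = float(pi @ row_entropy)
    print(f"entropy rate: {entropy_rate:.3f} bits per transition")
    ```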

  2. Dynamic competitive probabilistic principal components analysis.

    PubMed

    López-Rubio, Ezequiel; Ortiz-de-Lazcano-Lobato, Juan Miguel

    2009-04-01

    We present a new neural model which extends the classical competitive learning (CL) by performing a Probabilistic Principal Components Analysis (PPCA) at each neuron. The model also has the ability to learn the number of basis vectors required to represent the principal directions of each cluster, so it overcomes a drawback of most local PCA models, where the dimensionality of a cluster must be fixed a priori. Experimental results are presented to show the performance of the network with multispectral image data.
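
    A crude sketch of the competitive PPCA idea (k-means seeding, one probabilistic PCA per cluster, samples reassigned to the most likely local model); it is illustrative only, assumes clusters stay non-empty, and omits the paper's online learning of per-cluster dimensionality:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.datasets import make_blobs
      from sklearn.decomposition import PCA

      def competitive_ppca(X, n_clusters=3, n_components=2, n_iter=5):
          # Seed clusters with k-means, fit one PCA per cluster (score_samples
          # gives a probabilistic-PCA log-likelihood), then reassign each sample
          # to the local model under which it is most likely.
          labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)
          for _ in range(n_iter):
              models = [PCA(n_components=n_components).fit(X[labels == k])
                        for k in range(n_clusters)]
              labels = np.column_stack([m.score_samples(X) for m in models]).argmax(axis=1)
          return labels, models

      X, _ = make_blobs(n_samples=600, centers=3, n_features=5, random_state=1)
      labels, models = competitive_ppca(X)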

  3. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
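
    The flavor of such an analysis can be conveyed with a plain Monte Carlo sketch (the cited work uses fast probability integration with a finite-element model; the stress model and distributions below are hypothetical):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200_000

      # Hypothetical random inputs for a simple axially loaded member.
      load = rng.normal(3.0e4, 5.0e3, n)                 # N
      thickness = rng.normal(5.0e-3, 2.0e-4, n)          # m
      width = 0.05                                       # m, deterministic
      strength = rng.lognormal(np.log(1.8e8), 0.10, n)   # Pa

      stress = load / (width * thickness)                # structural response
      print("P(failure) =", np.mean(stress > strength))  # probability stress exceeds strength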

  4. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the property of inevitable noise addition on canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition under which probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  5. Probabilistic Modeling of Space Shuttle Debris Impact

    NASA Technical Reports Server (NTRS)

    Huyse, Luc J.; Asce, M.; Waldhart, Chris J.; Riha, David S.; Larsen, Curtis E.; Gomez, Reynaldo J.; Stuart, Phillip C.

    2007-01-01

    On Feb 1, 2003, the Shuttle Columbia was lost during its return to Earth. As a result of the conclusion that debris impact caused the damage to the left wing of the Columbia Space Shuttle Vehicle (SSV) during ascent, the Columbia Accident Investigation Board recommended that an assessment be performed of the debris environment experienced by the SSV during ascent. A flight rationale based on probabilistic assessment is used for the SSV return to flight. The assessment entails identifying all potential debris sources, their probable geometric and aerodynamic characteristics, and their potential for impacting and damaging critical Shuttle components. A probabilistic analysis tool, based on the SwRI-developed NESSUS probabilistic analysis software, predicts the probability of impact and damage to the space shuttle wing leading edge and thermal protection system components. Among other parameters, the likelihood of unacceptable damage depends on the time of release (Mach number of the orbiter) and the divot mass, as well as the impact velocity and impact angle. Probability of impact and damage, as well as their sensitivities with respect to the distribution assumptions, can be computed and visualized at each point on the orbiter or summarized per wing panel or tile zone.

  6. Multiclient Identification System Using Adaptive Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Lin, Chin-Teng; Siana, Linda; Shou, Yu-Wen; Yang, Chien-Ting

    2010-12-01

    This paper integrates detection and identification of human faces into a practical, real-time face recognition system. The proposed face detection system is based on the cascade Adaboost method to improve precision and robustness under unstable ambient lighting. Our Adaboost method compensates for environmental lighting through histogram-based lighting normalization and accurately locates face regions through a region-based clustering process. We also address the problem of multi-scale faces by using 12 scales of search windows and 5 orientations for each client, in pursuit of view-independent face identification. Our face identification system has two main methodological parts: PCA (principal component analysis) facial feature extraction and an adaptive probabilistic model (APM). The APM constructs likelihood functions from a weighted combination of simple probabilistic functions, with a probabilistic constraint on the similarity measures. In addition, the APM allows new clients to be added online and the information of registered clients to be updated. Experimental results show the superior performance of the proposed system in both offline and real-time online testing.
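
    A toy stand-in for the PCA-plus-APM pipeline is sketched below; the class name, Gaussian components, and update scheme are assumptions for illustration, not the authors' model:

      import numpy as np
      from sklearn.decomposition import PCA

      class AdaptiveProbabilisticModel:
          def __init__(self, n_components=20):
              self.pca = PCA(n_components=n_components)
              self.clients = {}  # client name -> (feature vectors, weights)

          def fit_pca(self, face_images):
              # face_images: array of shape (n_images, height, width)
              self.pca.fit(face_images.reshape(len(face_images), -1))

          def enroll(self, name, face_images):
              # Adding a client online only stores its PCA features and weights.
              feats = self.pca.transform(face_images.reshape(len(face_images), -1))
              self.clients[name] = (feats, np.full(len(feats), 1.0 / len(feats)))

          def identify(self, face_image, sigma=5.0):
              # Each client's likelihood is a weighted combination of simple
              # Gaussian kernels around that client's stored feature vectors.
              x = self.pca.transform(face_image.reshape(1, -1))[0]

              def likelihood(feats, weights):
                  d2 = ((feats - x) ** 2).sum(axis=1)
                  return float(np.sum(weights * np.exp(-d2 / (2.0 * sigma ** 2))))

              return max(self.clients, key=lambda c: likelihood(*self.clients[c]))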

  7. Probabilistic Graph Layout for Uncertain Network Visualization.

    PubMed

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network-not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
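
    The Monte Carlo decomposition step can be sketched as follows (layouts from networkx's spring_layout stand in for the paper's embedding; the graph and edge probabilities are hypothetical):

      import random
      import networkx as nx

      def sample_layouts(prob_edges, nodes, n_samples=100, seed=0):
          # Keep each edge with its probability to obtain concrete graph
          # instances, then lay out each instance; the spread of a node's
          # positions across samples conveys its layout uncertainty.
          rng = random.Random(seed)
          layouts = []
          for _ in range(n_samples):
              g = nx.Graph()
              g.add_nodes_from(nodes)
              g.add_edges_from(e for e, p in prob_edges.items() if rng.random() < p)
              layouts.append(nx.spring_layout(g, seed=rng.randrange(2**31)))
          return layouts

      # A triangle in which one edge exists only half of the time.
      clouds = sample_layouts({("a", "b"): 1.0, ("b", "c"): 1.0, ("a", "c"): 0.5},
                              nodes=["a", "b", "c"])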

  8. Asteroid Risk Assessment: A Probabilistic Approach.

    PubMed

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability (but not the consequences) of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth.
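
    The modularized simulation idea can be illustrated with a toy Monte Carlo over a 100-year horizon; the impact rate, energy distribution, and casualty scaling below are invented placeholders, not the paper's calibrated inputs:

      import numpy as np

      rng = np.random.default_rng(1)
      n_years, n_runs = 100, 50_000
      rate_per_year = 2e-3                    # hypothetical damaging-impact rate

      casualties = np.zeros(n_runs)
      for i in range(n_runs):
          n_impacts = rng.poisson(rate_per_year * n_years)
          if n_impacts:
              energy_mt = rng.lognormal(0.5, 1.5, n_impacts)   # megatons, hypothetical
              exposure = rng.uniform(0.0, 1.0, n_impacts)      # fraction hitting populated areas
              casualties[i] = np.sum(1e3 * energy_mt ** 0.8 * exposure)

      print("P(any casualties in 100 yr) =", np.mean(casualties > 0))
      print("95th percentile of casualties =", np.quantile(casualties, 0.95))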

  9. Integrating Sequence Evolution into Probabilistic Orthology Analysis.

    PubMed

    Ullah, Ikram; Sjöstrand, Joel; Andersson, Peter; Sennblad, Bengt; Lagergren, Jens

    2015-11-01

    Orthology analysis, that is, determining whether a pair of homologous genes are orthologs (stemming from a speciation) or paralogs (stemming from a gene duplication), is of central importance in computational biology, genome annotation, and phylogenetic inference. In particular, an orthologous relationship makes functional equivalence of the two genes highly likely. A major approach to orthology analysis is to reconcile a gene tree to the corresponding species tree (most commonly using the most parsimonious reconciliation, MPR). However, most such phylogenetic orthology methods infer the gene tree without considering the constraints implied by the species tree and, perhaps even more importantly, only allow the gene sequences to influence the orthology analysis through the a priori reconstructed gene tree. We propose a sound, comprehensive Bayesian Markov chain Monte Carlo-based method, DLRSOrthology, to compute orthology probabilities. It efficiently sums over the possible gene trees and jointly takes into account the current gene tree, all possible reconciliations to the species tree, and the, typically strong, signal conveyed by the sequences. We compare our method with PrIME-GEM, a probabilistic orthology approach built on a probabilistic duplication-loss model, and with MrBayesMPR, a probabilistic orthology approach based on conventional Bayesian inference coupled with MPR. We find that DLRSOrthology outperforms these competing approaches on synthetic data as well as on biological data sets and is robust to incomplete taxon sampling artifacts.

  10. A method for probabilistic flash flood forecasting

    NASA Astrophysics Data System (ADS)

    Hardy, Jill; Gourley, Jonathan J.; Kirstetter, Pierre-Emmanuel; Hong, Yang; Kong, Fanyou; Flamig, Zachary L.

    2016-10-01

    Flash flooding is one of the most costly and deadly natural hazards in the United States and across the globe. This study advances the use of high-resolution quantitative precipitation forecasts (QPFs) for flash flood forecasting. The QPFs are derived from a storm-scale ensemble prediction system and used within a distributed hydrological model framework to yield basin-specific, probabilistic flash flood forecasts (PFFFs). Before creating the PFFFs, it is important to characterize QPF uncertainty, particularly in terms of location, which is the most problematic aspect for hydrological use of QPFs. The SAL methodology (Wernli et al., 2008), which stands for structure, amplitude, and location, is used for this error quantification, with a focus on location. Finally, a PFFF methodology that produces probabilistic hydrological forecasts is proposed. The main advantages of this method are: (1) identifying specific basin scales that are forecast to be impacted by flash flooding; (2) yielding probabilistic information about the forecast hydrologic response that accounts for the locational uncertainties of the QPFs; (3) improving lead time by using storm-scale NWP ensemble forecasts; and (4) not requiring multiple simulations, which are computationally demanding.
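
    At its simplest, a basin-scale PFFF reduces to the fraction of ensemble members whose simulated response exceeds a flash flood threshold, as in this sketch (flows and threshold are hypothetical):

      import numpy as np

      def flash_flood_probability(ensemble_peak_flows, flood_threshold):
          # Fraction of ensemble hydrologic simulations whose peak flow exceeds
          # the basin's flash flood threshold.
          flows = np.asarray(ensemble_peak_flows, dtype=float)
          return float(np.mean(flows > flood_threshold))

      # Ten members driven by different storm-scale QPFs (peak flows in m^3/s).
      members = [120, 95, 210, 180, 60, 300, 150, 90, 240, 110]
      print(flash_flood_probability(members, flood_threshold=170))  # -> 0.4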

  11. Earthquake insurance pricing: a risk-based approach.

    PubMed

    Lin, Jeng-Hsiang

    2017-05-23

    Flat earthquake premiums are 'uniformly' set for a variety of buildings in many countries, neglecting the fact that the risk of earthquake damage to buildings depends on a wide range of factors. How these factors should influence insurance premiums merits further study. Proposed herein is a risk-based approach to estimating the earthquake insurance rates of buildings. The approach was applied to example buildings located in Taipei City, Taiwan, and the earthquake insurance rates for the buildings investigated were calculated and tabulated. For insurance rating, the buildings were classified into 15 model building types according to their construction materials and building height. Seismic design levels were also considered in the rating, reflecting the effects of seismic zone and the construction year of each building. This paper may be of interest to insurers, actuaries, and the private and public sectors of insurance. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.
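
    The logic of a risk-based rate can be shown with a pure-premium (expected annual loss) sketch; the hazard probabilities and damage ratios below are hypothetical, not values from the paper:

      def pure_premium(hazard_curve, damage_ratios, insured_value):
          # Expected annual loss: sum over shaking levels of annual occurrence
          # probability x damage ratio for this building type x insured value.
          return sum(prob * damage_ratios[level] * insured_value
                     for level, prob in hazard_curve.items())

      hazard = {"moderate": 0.02, "strong": 0.005, "severe": 0.001}  # per year
      damage = {"moderate": 0.02, "strong": 0.15, "severe": 0.60}    # fraction of value
      print(pure_premium(hazard, damage, insured_value=1_000_000))   # -> 1750.0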

  12. Nuclear insurance risk assessment using risk-based methodology

    SciTech Connect

    Wendland, W.G.

    1992-01-01

    This paper presents American Nuclear Insurers' (ANI's) and Mutual Atomic Energy Liability Underwriters' (MAELU's) process and experience for conducting nuclear insurance risk assessments using a risk-based methodology. The process is primarily qualitative and uses traditional insurance risk assessment methods and an approach developed under the auspices of the American Society of Mechanical Engineers (ASME), of which ANI/MAELU is an active sponsor. This process assists ANI's technical resources in identifying where to look for insurance risk in an industry in which insurance exposure tends to be dynamic and nonactuarial. The process is an evolving one that also seeks to minimize the impact on insureds while maintaining a mutually agreeable risk tolerance.

  13. On the optimal risk based design of highway drainage structures

    NASA Astrophysics Data System (ADS)

    Tung, Y.-K.; Bao, Y.

    1990-12-01

    For a proposed highway bridge or culvert, the total cost to the public during its expected service life includes capital investment in the structure, regular operation and maintenance costs, and various flood-related costs. The flood-related damage costs include items such as replacement and repair costs of the highway bridge or culvert, flood plain property damage costs, user costs from traffic interruptions and detours, and others. As the design discharge increases, the required capital investment increases but the corresponding flood-related damage costs decrease. Risk-based hydraulic design of a bridge or culvert chooses, among the alternatives, the one associated with the least total expected cost. In this paper, the risk-based design procedure is applied to pipe culvert design, and the effects of hydrologic uncertainties, such as sample size and the type of flood distribution model, on the optimal culvert design parameters, including design return period and total expected cost, are examined.
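
    The trade-off between capital cost and expected flood damage can be sketched as follows; the candidate designs, damage figure, and discount rate are hypothetical placeholders:

      def least_cost_design(designs, damage_if_exceeded, service_life=50, discount_rate=0.05):
          # Choose the design return period minimizing capital cost plus the
          # present value of expected annual flood damage over the service life.
          pv = sum(1.0 / (1.0 + discount_rate) ** t for t in range(1, service_life + 1))
          totals = {T: cost + pv * (1.0 / T) * damage_if_exceeded
                    for T, cost in designs.items()}
          best = min(totals, key=totals.get)
          return best, totals[best]

      designs = {10: 50_000, 25: 70_000, 50: 90_000, 100: 120_000}  # return period: capital cost
      print(least_cost_design(designs, damage_if_exceeded=20_000))  # -> (25, ...)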

  14. Risk-based decision-making framework for the selection of sediment dredging option.

    PubMed

    Manap, Norpadzlihatun; Voulvoulis, Nikolaos

    2014-10-15

    The aim of this study was to develop a risk-based decision-making framework for selecting a sediment dredging option. The newly integrated, holistic, and staged framework is described using case studies. The first stage feeds historical dredging monitoring data and contamination levels in environmental media into Ecological Risk Assessment phases that have been streamlined for cost, time, and simplicity. In the next stage, Multi-Criteria Decision Analysis (MCDA) is used to analyze and prioritize dredging areas based on environmental, socio-economic, and managerial criteria. The MCDA results are then integrated into the Ecological Risk Assessment to characterize the degree of contamination in the prioritized areas. The final stage uses these findings, again analyzed with MCDA, to identify the best sediment dredging option, accounting for the economic, environmental, and technical aspects of dredging; this is beneficial for the dredging and sediment management industries.
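
    The MCDA prioritization stage can be illustrated with a simple weighted-sum sketch (the paper's framework may use a different aggregation; criteria, weights, and scores are hypothetical):

      def rank_dredging_areas(scores, weights):
          # Rank candidate areas by the weighted sum of their normalized
          # criterion scores (0-1, higher = higher dredging priority).
          return sorted(scores.items(),
                        key=lambda kv: sum(weights[c] * v for c, v in kv[1].items()),
                        reverse=True)

      weights = {"environmental": 0.5, "socio_economic": 0.3, "managerial": 0.2}
      scores = {
          "area_A": {"environmental": 0.9, "socio_economic": 0.4, "managerial": 0.6},
          "area_B": {"environmental": 0.5, "socio_economic": 0.8, "managerial": 0.7},
      }
      print(rank_dredging_areas(scores, weights))  # area_A ranks first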

  15. Homeland security R&D roadmapping : risk-based methodological options.

    SciTech Connect

    Brandt, Larry D.

    2008-12-01

    The Department of Energy (DOE) National Laboratories support the Department of Homeland Security (DHS) in the development and execution of a research and development (R&D) strategy to improve the nation's preparedness against terrorist threats. Current approaches to planning and prioritization of DHS research decisions are informed by risk assessment tools and processes intended to allocate resources to programs that are likely to have the highest payoff. Early applications of such processes have faced challenges in several areas, including characterization of the intelligent adversary and linkage to strategic risk management decisions. The risk-based analysis initiatives at Sandia Laboratories could augment the methodologies currently being applied by the DHS and could support more credible R&D roadmapping for national homeland security programs. Implementation and execution issues facing homeland security R&D initiatives within the national laboratories emerged as a particular concern in this research.

  16. Options for improving hazardous waste cleanups using risk-based criteria

    SciTech Connect

    Elcock, D.

    1995-06-01

    This paper explores how risk- and technology-based criteria are currently used in the RCRA and CERCLA cleanup programs. It identifies ways in which risk could be further incorporated into RCRA and CERCLA cleanup requirements and the implications of risk-based approaches. The more universal use of risk assessment, as embodied in the risk communication and risk improvement bills before Congress, is not addressed. Incorporating risk into the laws and regulations governing hazardous waste cleanup will allow the best available scientific information to be used to further the goal of environmental protection in the United States while containing costs, and may help set an example for other countries developing cleanup programs, thereby contributing to enhanced global environmental management.

  17. Application of Risk-Based Inspection method for gas compressor station

    NASA Astrophysics Data System (ADS)

    Zhang, Meng; Liang, Wei; Qiu, Zeyang; Lin, Yang

    2017-05-01

    Because of their complex processes and large amount of equipment, gas compressor stations carry significant risks, yet research on their integrity management remains limited. In this paper, the basic principles and methodology of Risk-Based Inspection (RBI) are studied, and an RBI process for gas compressor stations is developed. The corrosion loops and logistics loops of the station are determined by studying its corrosion mechanisms and process flow. The probability of failure is calculated using modified coefficients, and the consequence of failure is calculated with a quantitative method. In particular, we address the application of the RBI methodology to a gas compressor station; in the case study, the resulting risk ranking helps identify the best preventive inspection plan.
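
    The core RBI quantity, risk as probability of failure times consequence of failure, drives the inspection ranking; here is a minimal sketch with hypothetical equipment items and values:

      def rbi_ranking(equipment):
          # Rank equipment by risk = probability of failure x consequence of failure.
          return sorted(((name, pof * cof) for name, (pof, cof) in equipment.items()),
                        key=lambda item: item[1], reverse=True)

      equipment = {
          "inlet separator":   (1e-3, 5e6),   # (PoF per year, consequence in currency units)
          "compressor piping": (5e-4, 2e7),
          "aftercooler":       (2e-3, 1e6),
      }
      for name, risk in rbi_ranking(equipment):
          print(f"{name}: expected annual risk {risk:,.0f}")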

  18. Multicriteria Decision Framework for Cybersecurity Risk Assessment and Management.

    PubMed

    Ganin, Alexander A; Quach, Phuoc; Panwar, Mahesh; Collier, Zachary A; Keisler, Jeffrey M; Marchese, Dayton; Linkov, Igor

    2017-09-05

    Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and their complex network structures, often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate the biases and subjectivity involved in selecting countermeasures, but it provides justifiable methods for selecting risk management actions consistent with stakeholder and decision-maker values and technical data. Published 2017. This article is a U.S. Government work and is in the public domain in the U.S.A.

  19. Heart failure disease management programs: a cost-effectiveness analysis.

    PubMed

    Chan, David C; Heidenreich, Paul A; Weinstein, Milton C; Fonarow, Gregg C

    2008-02-01

    Heart failure (HF) disease management programs have shown impressive reductions in hospitalizations and mortality, but in studies limited to short time frames and high-risk patient populations. Current guidelines thus recommend disease management only for high-risk patients with HF. This study applied a new technique to infer the degree to which clinical trials have targeted patients by risk, based on observed rates of hospitalization and death. A Markov model was used to assess the incremental life expectancy and cost of providing disease management for high-risk through low-risk patients. Sensitivity analyses of various long-term scenarios and of reduced effectiveness in low-risk patients were also considered. The incremental cost-effectiveness ratio of extending coverage to all patients was $9700 per life-year gained in the base case. In aggregate, universal coverage almost quadrupled the life-years saved compared with coverage of only the highest quintile of risk. A worst-case analysis with simultaneous conservative assumptions yielded an incremental cost-effectiveness ratio of $110,000 per life-year gained. In a probabilistic sensitivity analysis, 99.74% of possible incremental cost-effectiveness ratios were <$50,000 per life-year gained. Heart failure disease management programs are likely cost-effective in the long term across the whole spectrum of patient risk. Health gains could be extended by enrolling a broader group of patients with HF in disease management.
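
    The headline figure is an incremental cost-effectiveness ratio; the arithmetic is sketched below with invented per-patient costs and life expectancies, chosen only so that the ratio reproduces the $9700 base case quoted above:

      def icer(cost_new, effect_new, cost_old, effect_old):
          # Incremental cost-effectiveness ratio: extra cost per life-year gained
          # when moving from the old coverage policy to the new one.
          return (cost_new - cost_old) / (effect_new - effect_old)

      # Hypothetical per-patient values for universal vs. high-risk-only coverage.
      print(icer(cost_new=28_000, effect_new=6.2,
                 cost_old=22_180, effect_old=5.6))  # -> 9700.0 dollars per life-year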

  20. Probabilistic structural analysis of aerospace components using NESSUS

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.