Science.gov

Sample records for risk analysis method

  1. Bayes Method Plant Aging Risk Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g., for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e., possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
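
    The kind of update DORIAN performs can be illustrated with a short sketch: a discrete set of candidate trend models, a prior over them, and a Poisson likelihood for yearly failure counts. All model forms, rates, and data below are hypothetical assumptions, not DORIAN's actual models.

    ```python
    import numpy as np
    from scipy.stats import poisson

    # Observed failure counts per year for one component (hypothetical data)
    years = np.arange(10)
    failures = np.array([1, 0, 2, 1, 2, 3, 2, 4, 3, 5])

    # Candidate aging models: each maps year -> expected failure rate
    models = {
        "constant":    lambda t: np.full_like(t, 2.0, dtype=float),
        "linear":      lambda t: 0.5 + 0.4 * t,
        "exponential": lambda t: 0.8 * np.exp(0.18 * t),
    }
    priors = {"constant": 1 / 3, "linear": 1 / 3, "exponential": 1 / 3}

    # Bayesian update: posterior proportional to prior times likelihood
    post = {}
    for name, rate in models.items():
        lik = poisson.pmf(failures, rate(years)).prod()
        post[name] = priors[name] * lik
    z = sum(post.values())
    post = {k: v / z for k, v in post.items()}

    # Posterior-probability-weighted mean trend, as DORIAN compiles
    mean_trend = sum(post[m] * models[m](years) for m in models)
    print({k: round(v, 3) for k, v in post.items()})
    print(np.round(mean_trend, 2))
    ```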

  2. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating, and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing, and ranking and selection procedures.

  3. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    SciTech Connect

    C. Robert Kenley; John W. Collins; John M. Beck; Harold J. Heydt; Chad B. Garcia

    2001-10-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
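
    The AHP step the paper builds on is standard enough to sketch: priority weights for risk categories come from the principal eigenvector of a pairwise comparison matrix, checked with a consistency ratio. The matrix entries and category names below are illustrative assumptions, not the paper's data.

    ```python
    import numpy as np

    # Pairwise comparisons among three hypothetical risk categories
    # (technical vs. programmatic vs. operational), Saaty's 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                         # principal-eigenvector priority weights

    # Consistency ratio: lambda_max vs. n, scaled by the random index RI(3) = 0.58
    lambda_max = eigvals.real[k]
    ci = (lambda_max - 3) / (3 - 1)
    cr = ci / 0.58
    print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))
    ```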

  4. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluate the partial failure effect on current Probability Risk Assessments (PRAs). An integrated methodology of the thermal-hydraulic analysis and fuzzy logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis used in this approach is to identify partial operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance of the partial failure effect and inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full scope PRA to reduce core damage frequency. An example of this application of the approach is presented. The partial failure data used in the example is from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  5. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  6. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  7. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.
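
    As a rough illustration of the maximum entropy idea (not the authors' model), one can choose the distribution over post-acquisition outcomes that maximizes entropy subject to moment constraints known before the deal; the outcome grid and constraint value below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    outcomes = np.array([-0.3, -0.1, 0.0, 0.1, 0.3])  # hypothetical return levels
    target_mean = 0.02                                 # assumed pre-acquisition estimate

    def neg_entropy(p):
        p = np.clip(p, 1e-12, 1.0)
        return np.sum(p * np.log(p))

    cons = [
        {"type": "eq", "fun": lambda p: p.sum() - 1.0},          # probabilities sum to 1
        {"type": "eq", "fun": lambda p: p @ outcomes - target_mean},  # mean constraint
    ]
    res = minimize(neg_entropy, x0=np.full(5, 0.2), constraints=cons,
                   bounds=[(0.0, 1.0)] * 5)
    p = res.x
    downside_risk = p[outcomes < 0].sum()  # P(loss) under the ME distribution
    print(np.round(p, 3), f"P(loss) = {downside_risk:.3f}")
    ```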

  8. Finding and fixing systems weaknesses: probabilistic methods and applications of engineering risk analysis.

    PubMed

    Paté-Cornell, Elisabeth

    2002-04-01

    Methods of engineering risk analysis are based on a functional analysis of systems and on the probabilities (generally Bayesian) of the events and random variables that affect their performances. These methods allow identification of a system's failure modes, computation of its probability of failure or performance deterioration per time unit or operation, and of the contribution of each component to the probabilities and consequences of failures. The model has been extended to include the human decisions and actions that affect components' performances, and the management factors that affect behaviors and can thus be root causes of system failures. By computing the risk with and without proposed measures, one can then set priorities among different risk management options under resource constraints. In this article, I briefly present the engineering risk analysis method, then several illustrations of risk computations that can be used to identify a system's weaknesses and the most cost-effective way to fix them. The first example concerns the heat shield of the space shuttle orbiter and shows the relative risk contribution of the tiles in different areas of the orbiter's surface. The second application is to patient risk in anesthesia and demonstrates how the engineering risk analysis method can be used in the medical domain to rank the benefits of risk mitigation measures, in that case, mostly organizational. The third application is a model of seismic risk analysis and mitigation, with application to the San Francisco Bay area for the assessment of the costs and benefits of different seismic provisions of building codes. In all three cases, some aspects of the results were not intuitively obvious. The probabilistic risk analysis (PRA) method allowed identification of system weaknesses and of the most cost-effective ways to fix them. PMID:12022679
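
    A minimal sketch of the prioritization step described above, assuming hypothetical failure probabilities and mitigation costs: compute the risk with and without each proposed measure and rank measures by risk averted per unit cost.

    ```python
    # All numbers are illustrative assumptions, not data from the article.
    baseline_risk = 4.2e-3  # probability of system failure per mission (assumed)

    # Each measure: (cost in $M, failure probability if the measure is adopted)
    measures = {
        "harden_tiles":     (3.0, 3.1e-3),
        "extra_inspection": (0.8, 3.8e-3),
        "redesign_seal":    (5.5, 2.6e-3),
    }

    ranked = sorted(
        measures.items(),
        key=lambda kv: (baseline_risk - kv[1][1]) / kv[1][0],  # risk averted per $M
        reverse=True,
    )
    for name, (cost, risk) in ranked:
        print(f"{name}: averts {baseline_risk - risk:.1e} per mission, "
              f"{(baseline_risk - risk) / cost:.2e} per $M")
    ```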

  9. Review of Research Trends and Methods in Nano Environmental, Health, and Safety Risk Analysis.

    PubMed

    Erbis, Serkan; Ok, Zeynep; Isaacs, Jacqueline A; Benneyan, James C; Kamarthi, Sagar

    2016-08-01

    Despite the many touted benefits of nanomaterials, concerns remain about their possible environmental, health, and safety (EHS) risks in terms of their toxicity, long-term accumulation effects, or dose-response relationships. The published studies on EHS risks of nanomaterials have increased significantly over the past decade and a half, with most focused on nanotoxicology. Researchers are still learning about the health consequences of nanomaterials and how to make environmentally responsible decisions regarding their production. This article characterizes the scientific literature on nano-EHS risk analysis to map the state-of-the-art developments in this field and chart guidance for future directions. First, an analysis of keyword co-occurrence networks is investigated for nano-EHS literature published in the past decade to identify the intellectual turning points and research trends in nanorisk analysis studies. The exposure groups targeted in emerging nano-EHS studies are also assessed. System engineering methods for risk, safety, uncertainty, and system reliability analysis are reviewed, followed by detailed descriptions of how these methods are applied to analyze nanomaterial EHS risks. Finally, the trends, methods, future directions, and opportunities of system engineering methods in nano-EHS research are discussed. The analysis of nano-EHS literature presented in this article provides important insights on risk assessment and risk management tools associated with nanotechnology, nanomanufacturing, and nano-enabled products. PMID:26882074

  10. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    SciTech Connect

    2007-09-30

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire; and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.
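
    RAMPART™'s rule base and data are not reproduced here; the sketch below only illustrates the general pattern of multi-hazard risk aggregation the record describes, combining annual hazard frequencies with conditional losses per loss category. All frequencies, losses, and category names are invented for illustration.

    ```python
    # Illustrative multi-hazard expected annual loss (EAL) aggregation.
    hazard_freq = {"earthquake": 0.01, "hurricane": 0.05, "flood": 0.02}  # events/yr, assumed

    loss_given_event = {  # conditional loss per event, arbitrary units, assumed
        "earthquake": {"property": 800.0, "contents": 200.0, "injury": 50.0},
        "hurricane":  {"property": 300.0, "contents": 120.0, "building_use": 90.0},
        "flood":      {"property": 150.0, "contents": 250.0, "building_use": 60.0},
    }

    # EAL per hazard and loss category = annual frequency x conditional loss
    expected_annual_loss = {
        hazard: {cat: hazard_freq[hazard] * loss for cat, loss in losses.items()}
        for hazard, losses in loss_given_event.items()
    }

    total_by_category = {}
    for losses in expected_annual_loss.values():
        for cat, eal in losses.items():
            total_by_category[cat] = total_by_category.get(cat, 0.0) + eal
    print(total_by_category)
    ```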

  11. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    Energy Science and Technology Software Center (ESTSC)

    2007-09-30

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire; and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  12. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

    NASA Astrophysics Data System (ADS)

    Tao, Kunwang; Wang, Liang; Qian, Xinlin

    2015-04-01

    Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate their effects is to evaluate landslide risk and take measures to avoid losses in advance. Many factors must be considered in landslide risk assessment, so the assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment is proposed and carried out in three steps: first, GIS technology is used to divide the study area into analysis grids that serve as the base analysis units; second, the available information, slope, lithology, faults, land use, etc., are taken as the evaluation factors; finally, based on years of observed landslide data, the risk assessment is realized with a multi-factor constrained assessment model in which the weight of each factor is determined by an information model. The region of Gongliu, located in the Xinjiang Ili River basin at altitudes of 600 to 4000 m, was selected as the experimental area; its terrain is elongated east to west and narrow north to south, and its abundant rainfall causes frequent landslides. Selecting a 500 m * 500 m analysis grid covering the whole study area, a comprehensive assessment of landslide risk in the region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. Statistically, 94.04% of the observed landslide hazard points fall in the high and moderately high risk zones, 4.64% in the low risk zone, and 1.32% in the lowest risk zone, indicating a high probability of landslides in the areas the assessment ranks as high risk.
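
    The "information model" weighting mentioned above is commonly implemented as an information value statistic, I(class) = ln(P(landslide | factor class) / P(landslide)), summed over factors for each grid cell. A minimal sketch under that assumption, with synthetic rasters standing in for the study's slope, lithology, fault, and land-use layers:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    slope_class = rng.integers(0, 4, size=(200, 200))   # hypothetical classified slope raster
    landslide = rng.random((200, 200)) < 0.02           # hypothetical inventory mask

    overall_density = landslide.mean()
    info_value = {}
    for c in np.unique(slope_class):
        in_class = slope_class == c
        cond_density = landslide[in_class].mean()
        # Small epsilon guards against log(0) in classes with no recorded landslides
        info_value[c] = np.log((cond_density + 1e-9) / overall_density)

    # Per-cell score for this factor; summing such scores across all factors
    # gives the hazard index that is then binned into risk classes
    score = np.vectorize(info_value.get)(slope_class)
    print({int(c): round(v, 3) for c, v in info_value.items()})
    ```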

  13. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights for the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the operation of a Turn 55 CNC lathe. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPNs and yield a more accurate and reasonable risk assessment. As a result, the stability of product and process can be assured.
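
    The crisp baseline the paper criticizes, and a simple fuzzy alternative, can be sketched briefly. The triangular fuzzy arithmetic and centroid defuzzification below are a generic illustration, not the authors' exact rule base, and the ratings are hypothetical.

    ```python
    # Crisp RPN versus a simple fuzzy variant. Ratings for S, O, D are
    # triangular fuzzy numbers (a, b, c); values are invented for illustration.
    def crisp_rpn(s, o, d):
        return s * o * d

    def tri_mult(x, y):
        # Approximate product of triangular fuzzy numbers via endpoint arithmetic
        return tuple(xi * yi for xi, yi in zip(x, y))

    def centroid(tri):
        a, b, c = tri
        return (a + b + c) / 3.0  # centroid defuzzification of a triangle

    failure_modes = {  # hypothetical lathe failure modes with fuzzy S, O, D ratings
        "spindle_bearing_wear": ((6, 7, 8), (3, 4, 5), (4, 5, 6)),
        "tool_breakage":        ((7, 8, 9), (2, 3, 4), (2, 3, 4)),
    }
    for name, (S, O, D) in failure_modes.items():
        fuzzy = centroid(tri_mult(tri_mult(S, O), D))
        print(name, "crisp:", crisp_rpn(S[1], O[1], D[1]), "fuzzy:", round(fuzzy, 1))
    ```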

  14. Genotype relative risks: methods for design and analysis of candidate-gene association studies.

    PubMed Central

    Schaid, D J; Sommer, S S

    1993-01-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. We present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which we condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. The purpose of tier 1 is to detect the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. The purpose of tier 2 is to test for association of the abnormal variant with disease, such as by the likelihood methods presented. The purpose of tier 3 is to confirm positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible.

  15. Genotype relative risks: Methods for design and analysis of candidate-gene association studies

    SciTech Connect

    Schaid, D.J.; Sommer, S.S.

    1993-11-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. The authors present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which the authors condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. Tier 1 detects the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. Tier 2 tests for association of the abnormal variant with disease, such as by the likelihood methods presented. Tier 3 confirms positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible. 19 refs., 2 figs., 2 tabs.
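
    The HWE-based likelihood in both records admits a compact sketch: among cases, each genotype's probability is proportional to its relative risk times its Hardy-Weinberg frequency, and the two relative risks are estimated by maximum likelihood. Counts and allele frequency below are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Genotypes carry 0, 1, or 2 copies of the candidate variant; psi0 = 1.
    case_counts = np.array([40, 90, 70])   # hypothetical case genotype counts
    p = 0.4                                # variant allele frequency (assumed known)
    hwe = np.array([(1 - p) ** 2, 2 * p * (1 - p), p ** 2])

    def neg_log_lik(log_psi):
        psi = np.array([1.0, np.exp(log_psi[0]), np.exp(log_psi[1])])
        probs = psi * hwe / np.sum(psi * hwe)   # P(genotype | case) under HWE
        return -np.sum(case_counts * np.log(probs))

    fit = minimize(neg_log_lik, x0=[0.0, 0.0])
    psi1, psi2 = np.exp(fit.x)
    print(f"heterozygote RR ~ {psi1:.2f}, homozygote RR ~ {psi2:.2f}")
    ```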

  16. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  17. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
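
    A minimal sketch of the kind of calculation described (not the SOCRATES code itself): a standby component's mean unavailability grows with its surveillance test interval roughly as lambda*T/2, and the core damage frequency impact scales with the component's Birnbaum importance from the PRA. All rates below are assumed.

    ```python
    lam = 1e-5         # standby failure rate per hour (assumed)
    birnbaum = 2e-4    # dCDF/du for this component, from a plant PRA (assumed)
    base_cdf = 5e-5    # baseline core damage frequency per year (assumed)

    def mean_unavailability(test_interval_h):
        # Mean unavailability of a periodically tested standby component
        return lam * test_interval_h / 2.0

    for T_old, T_new in [(720.0, 1440.0)]:  # e.g., monthly -> bimonthly testing
        delta_u = mean_unavailability(T_new) - mean_unavailability(T_old)
        delta_cdf = birnbaum * delta_u
        print(f"STI {T_old:.0f}h -> {T_new:.0f}h: dCDF = {delta_cdf:.2e} "
              f"({100 * delta_cdf / base_cdf:.2f}% of baseline)")
    ```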

  18. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to increased system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with a tradeoff between socioeconomic development and environmental sustainability. PMID:26310705

  19. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  20. Uncertainty analysis in regulatory programs: Application factors versus probabilistic methods in ecological risk assessments of chemicals

    SciTech Connect

    Moore, D.R.J.; Elliot, B.

    1995-12-31

    In assessments of toxic chemicals, sources of uncertainty may be dealt with by two basic approaches: application factors and probabilistic methods. In regulatory programs, the most common approach is to calculate a quotient by dividing the predicted environmental concentration (PEC) by the predicted no effects concentration (PNEC). PNECs are usually derived from laboratory bioassays, thus requiring the use of application factors to account for uncertainty introduced by the extrapolation from the laboratory to the field, and from measurement to assessment endpoints. Using this approach, often with worst-case assumptions about exposure and species sensitivities, the hope is that chemicals with a quotient of less than one will have a very low probability of causing adverse ecological effects. This approach has received widespread criticism recently, particularly because it tends to be overly conservative and does not adequately estimate the magnitude and probability of causing adverse effects. On the plus side, application factors are simple to use, accepted worldwide, and may be used with limited effects data in a quotient calculation. The alternative approach is to use probabilistic methods such as Monte Carlo simulation, Bayes' theorem or other techniques to estimate risk. Such methods often have rigorous statistical assumptions and may have large data requirements. Stating an effect in probabilistic terms, however, forces the identification of sources of uncertainty and quantification of their impact on risk estimation. In this presentation the authors discuss the advantages and disadvantages of using application factors and probabilistic methods in dealing with uncertainty in ecological risk assessments of chemicals. Based on this analysis, recommendations are presented to assist in choosing the appropriate approach for different types of regulatory programs dealing with toxic chemicals.
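
    The contrast the authors draw can be made concrete with a toy example: a deterministic PEC/PNEC quotient with an application factor versus a Monte Carlo estimate of how often exposure exceeds the effect threshold. Distributions and numbers are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Deterministic quotient approach with an application factor
    pec_point, pnec_point, app_factor = 2.0, 50.0, 100.0  # ug/L, assumed
    quotient = pec_point / (pnec_point / app_factor)
    print(f"PEC/PNEC quotient with application factor: {quotient:.2f}")

    # Probabilistic alternative: sample exposure and species sensitivity directly
    pec = rng.lognormal(mean=np.log(2.0), sigma=0.6, size=100_000)
    noec = rng.lognormal(mean=np.log(50.0), sigma=0.9, size=100_000)
    print(f"P(exposure exceeds effect threshold) = {(pec > noec).mean():.4f}")
    ```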

  21. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667

  22. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

    Combined effects of socio-economic, environmental, technological and political factors impact fresh cooling water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from a thermoelectric systems operation and regulation perspective. A systematic approach to characterize a stream environment in relation to extreme drought occurrence, duration and deficit-volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water-supply risk analysis and projection methods from a thermoelectric power systems operation and management perspective. The study defines thermoelectric drought as a shortage of cooling water due to stressed supply or water beyond operable temperature limits for an extended period of time, requiring power plants to reduce production or completely shut down. It presents a thermoelectric drought risk characterization framework that considers the heat content and water quantity facets of adequate water availability for uninterrupted operation of such plants and safety of their surroundings. In addition, it outlines mechanisms to identify rates of occurrence of such droughts and stochastically quantify subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound nonhomogeneous Poisson process. This study also demonstrates how
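
    A compound nonhomogeneous Poisson process of the type mentioned can be sketched with the standard thinning algorithm: drought events arrive at a time-varying rate, and each event carries a random loss. The intensity function and loss distribution below are illustrative assumptions, not the study's calibrated model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def intensity(t_years):
        # Assumed slowly rising drought rate, events per year
        return 0.2 + 0.01 * t_years

    def simulate_total_loss(horizon=30.0, lam_max=0.6):
        # Ogata thinning: lam_max must dominate intensity(t) on [0, horizon]
        t, total = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / lam_max)            # candidate event time
            if t > horizon:
                return total
            if rng.random() < intensity(t) / lam_max:      # accept w.p. lambda(t)/lam_max
                total += rng.lognormal(mean=2.0, sigma=1.0)  # loss per drought event

    losses = [simulate_total_loss() for _ in range(10_000)]
    print(f"mean 30-yr loss {np.mean(losses):.1f}, 95th pct {np.percentile(losses, 95):.1f}")
    ```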

  23. Risk analysis methods and techniques for veterinary biologicals used in Australia.

    PubMed

    Owusu, J

    1995-12-01

    Advances in modern science and technology and the globalisation of the veterinary manufacturing industry, coupled with the relaxation of trade restrictions by the General Agreement on Tariffs and Trade treaty on sanitary and phytosanitary measures, call for an international approach to standards of acceptable risk and risk analysis methodology. In Australia, different elements of risk analysis are undertaken by different agencies. The agencies employ screening risk assessment, which uses simple worst-case scenarios and conservative data to set priorities and identify issues of limited risk. The approach is multi-factorial, assessing risk to public health, animals and the environment. The major components of the analysis process are risk assessment, risk management, and procedures for communicating and monitoring risk. The author advocates the possible use of quantitative risk assessment, based on acceptable international standards, in making international trade decisions. In the absence of acceptable international standards, it is proposed that countries adopt mutual recognition of comparable standards and specifications employed by national agencies. PMID:8639948

  24. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.

  25. [Working tasks with upper limbs repetitive movements: analysis of different methods for risk assessment].

    PubMed

    Occhipinti, E

    2008-01-01

    A review of different methods for the risk assessment of upper limb repetitive movements is carried out, mainly with reference to a recent ISO standard (ISO 11228-3). This standard establishes ergonomic recommendations for tasks involving manual handling of low loads at high frequency (repetitive work). It is a "voluntary" standard and provides information for all professionals involved in occupational prevention as well as in job and product design. It refers to a four-step approach involving both risk assessment and risk reduction (hazard identification, risk estimation, risk evaluation and risk reduction). General reference is made to a general model reported in a Consensus Document published by the IEA Technical Committee "Musculoskeletal Disorders", with the endorsement of ICOH. Apart from risk identification, the standard addresses and suggests several methods for a simple risk estimation (i.e., PLIBEL, OSHA Checklist, Upper Limb Expert Tool, QEC, Checklist OCRA). If the risk estimation results in the 'yellow' or 'red' zone, or if the job is composed of two or more repetitive tasks, a more detailed risk assessment is recommended. For a detailed risk assessment, the OCRA method is suggested as "preferred"; however, other methods (Strain Index; HAL-TLV-ACGIH) can also be used. The applicative limits of the methods mentioned above, considering the purposes of the standard, are briefly discussed together with their recent scientific updates and application perspectives. The standard, with the suggested risk assessment procedures and methods, represents a useful tool for all OSH operators involved in the application of European and National legislation regarding the prevention of UL WMSDs. PMID:19288787

  26. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    PubMed

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity × Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various different tools, such as HAZOP, FMEA, etc.; our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis, therefore, focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were, thus, able to identify a number of critical factors such as the incoming gas lines and the ignition electrode. PMID:17624666

  27. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges NASA to send astronauts instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save one of the major achievements of the American space program, in the words of Louis J. Lanzerotti, its chairman.

  28. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, covering five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision. The model relates these elements to the domains of experts and decisionmakers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). PMID:24919396

  29. Does simplifying transport and exposure yield reliable results? An analysis of four risk assessment methods.

    PubMed

    Zhang, Q; Crittenden, J C; Mihelcic, J R

    2001-03-15

    Four approaches for predicting the risk of chemicals to humans and fish under different scenarios were compared to investigate whether it is appropriate to simplify risk evaluations in situations where an individual is making environmentally conscious manufacturing decisions or interpreting toxics release inventory (TRI) data: (1) the relative risk method, that compares only a chemical's relative toxicity; (2) the toxicity persistence method, that considers a chemical's relative toxicity and persistence; (3) the partitioning, persistence toxicity method, that considers a chemical's equilibrium partitioning to air, land, water, and sediment, persistence in each medium, and its relative toxicity; and (4) the detailed chemical fate and toxicity method, that considers the chemical's relative toxicity, and realistic attenuation mechanisms such as advection, mass transfer and reaction in air, land, water, and sediment. In all four methods, the magnitude of the risk was estimated by comparing the risk of the chemical's release to that of a reference chemical. Three comparative scenarios were selected to evaluate the four approaches for making pollution prevention decisions: (1) evaluation of nine dry cleaning solvents, (2) evaluation of four reaction pathways to produce glycerine, and (3) comparison of risks for the chemical manufacturing and petroleum industry. In all three situations, it was concluded that ignoring or simplifying exposure calculations is not appropriate, except in cases where either the toxicity was very great or when comparing chemicals with similar fate. When the toxicity is low to moderate and comparable for chemicals, the chemicals' fate influences the results; therefore, we recommend using a detailed chemical fate and toxicity method because the fate of chemicals in the environment is assessed with consideration of more realistic attenuation mechanisms than the other three methods. In addition, our study shows that evaluating the risk associated
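
    A toy version of the third approach (partitioning, persistence, toxicity) shows the relative-to-reference scoring common to all four methods; the medium fractions, half-lives, and toxicity weights below are invented for illustration.

    ```python
    # Illustrative partitioning-persistence-toxicity score, relative to a
    # reference chemical. All values are hypothetical, not the study's data.
    chemicals = {
        # medium fractions (air, water, soil/sediment), half-lives in days, toxicity weight
        "solvent_A": {"frac": (0.7, 0.2, 0.1), "half_life": (5, 40, 120), "tox": 0.02},
        "reference": {"frac": (0.5, 0.3, 0.2), "half_life": (3, 20, 60),  "tox": 0.01},
    }

    def ppt_score(c):
        # Partition-weighted persistence, scaled by relative toxicity
        persistence = sum(f * h for f, h in zip(c["frac"], c["half_life"]))
        return persistence * c["tox"]

    rel = ppt_score(chemicals["solvent_A"]) / ppt_score(chemicals["reference"])
    print(f"risk of solvent_A relative to reference: {rel:.2f}")
    ```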

  30. FOOD RISK ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  31. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to its smaller cells were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  32. A method for risk analysis across governance systems: a Great Barrier Reef case study

    NASA Astrophysics Data System (ADS)

    Dale, Allan; Vella, Karen; Pressey, Robert L.; Brodie, Jon; Yorkston, Hugh; Potts, Ruth

    2013-03-01

    Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint those domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. This method is then tested through its preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north eastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for the Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for use in testing and reviewing the application of improved tools for governance risk assessment.

  33. Method for improved prediction of bone fracture risk using bone mineral density in structural analysis

    NASA Technical Reports Server (NTRS)

    Cann, Christopher E. (Inventor); Faulkner, Kenneth G. (Inventor)

    1992-01-01

    A non-invasive in-vivo method of analyzing a bone for fracture risk includes obtaining data from the bone, such as by computed tomography or projection imaging, which represents a measure of bone material characteristics such as bone mineral density. The distribution of the bone material characteristics is used to generate a finite element method (FEM) mesh from which the load capability of the bone can be determined. In determining load capability, the bone is mathematically compressed, and stress, strain, force, and force/area versus bone material characteristics are determined.
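
    The density-to-stiffness step behind such models is often a power-law mapping from bone mineral density to elastic modulus (a Carter-Hayes-style relation); the sketch below uses placeholder constants and a 1-D column of elements in place of a full FEM mesh.

    ```python
    import numpy as np

    # Hypothetical CT-derived densities along the loading axis, g/cm^3
    rho = np.array([0.3, 0.5, 0.8, 1.2, 0.9])
    # Assumed power-law density-to-modulus mapping, MPa (placeholder constants)
    E = 3790.0 * rho ** 3

    area_mm2 = 500.0
    load_n = 1000.0
    stress = load_n / area_mm2   # uniform stress in a series column, MPa
    strain = stress / E          # element-wise strain; weakest element strains most
    print("peak strain at lowest-density element:", strain.max())
    ```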

  34. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.; Mutlu, S.

    2015-06-01

    A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide hazards must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these hazards separately and then combining the analyses would ensure a more realistic risk evaluation according to population density than analyzing several risks based on a single parameter. In this study, an integrated seismic risk analysis of central Eskişehir was performed based on two earthquake related parameters, liquefaction and amplification. The analysis used a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geological and the topographical conditions of the region. According to the integrated seismic risk analysis of the Eskişehir residential area, the populated area is found to be generally at medium to high risk during a potential earthquake.
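
    A simple weighting overlay of the study's two parameters can be sketched directly; the weights, grids, and class thresholds below are illustrative assumptions rather than the study's calibrated values.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    liquefaction = rng.random((100, 100))    # normalized 0-1 susceptibility grid (synthetic)
    amplification = rng.random((100, 100))   # normalized 0-1 amplification grid (synthetic)
    weights = {"liquefaction": 0.6, "amplification": 0.4}  # assumed weights

    # Weighted linear combination, then binning into four risk classes
    risk = (weights["liquefaction"] * liquefaction
            + weights["amplification"] * amplification)
    classes = np.digitize(risk, [0.25, 0.5, 0.75])  # 0 = low ... 3 = high
    print(np.bincount(classes.ravel(), minlength=4) / classes.size)
    ```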

  35. Recruitment Methods and Show Rates to a Prostate Cancer Early Detection Program for High-Risk Men: A Comprehensive Analysis

    PubMed Central

    Giri, Veda N.; Coups, Elliot J.; Ruth, Karen; Goplerud, Julia; Raysor, Susan; Kim, Taylor Y.; Bagden, Loretta; Mastalski, Kathleen; Zakrzewski, Debra; Leimkuhler, Suzanne; Watkins-Bruner, Deborah

    2009-01-01

    Purpose Men with a family history (FH) of prostate cancer (PCA) and African American (AA) men are at higher risk for PCA. Recruitment and retention of these high-risk men into early detection programs have been challenging. We report a comprehensive analysis of recruitment methods, show rates, and participant factors from the Prostate Cancer Risk Assessment Program (PRAP), which is a prospective, longitudinal PCA screening study. Materials and Methods Men 35–69 years are eligible if they have a FH of PCA, are AA, or have a BRCA1/2 mutation. Recruitment methods were analyzed with respect to participant demographics and show to the first PRAP appointment using standard statistical methods. Results Out of 707 men recruited, 64.9% showed to the initial PRAP appointment. More individuals were recruited via radio than from referral or other methods (χ2 = 298.13, p < .0001). Men recruited via radio were more likely to be AA (p<0.001), less educated (p=0.003), not married or partnered (p=0.007), and have no FH of PCA (p<0.001). Men recruited via referrals had higher incomes (p=0.007). Men recruited via referral were more likely to attend their initial PRAP visit than those recruited by radio or other methods (χ2 = 27.08, p < .0001). Conclusions This comprehensive analysis finds that radio leads to higher recruitment of AA men with lower socioeconomic status. However, these are the high-risk men that have lower show rates for PCA screening. Targeted motivational measures need to be studied to improve show rates for PCA risk assessment for these high-risk men. PMID:19758657

  36. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity for both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.
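
    A qualitative scheme of this kind reduces, mechanically, to binning severity and likelihood categories into risk classes. The sketch below is a generic illustration with invented category labels and thresholds, not SNL's actual criteria.

    ```python
    # Generic severity/likelihood binning (labels and thresholds are illustrative).
    severity_levels = ["negligible", "marginal", "critical", "catastrophic"]
    likelihood_levels = ["extremely unlikely", "unlikely", "anticipated"]

    def risk_class(sev, lik):
        # Combine the two ordinal scales into a single low/medium/high class
        s = severity_levels.index(sev)
        l = likelihood_levels.index(lik)
        score = s + l
        return ["low", "medium", "high"][min(score // 2, 2)]

    print(risk_class("critical", "anticipated"))   # -> high
    ```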

  37. An Improved Breast Epithelial Sampling Method for Molecular Profiling and Biomarker Analysis in Women at Risk for Breast Cancer

    PubMed Central

    Danforth, David N; Warner, Andrew C; Wangsa, Darawalee; Ried, Thomas; Duelli, Dominik; Filie, Armando C; Prindiville, Sheila A

    2015-01-01

    BACKGROUND There is a strong need to define the molecular changes in normal at-risk breast epithelium to identify biomarkers and new targets for breast cancer prevention and to develop a molecular signature for risk assessment. Improved methods of breast epithelial sampling are needed to promote whole-genome molecular profiling, increase ductal epithelial cell yield, and reduce sample cell heterogeneity. METHODS We developed an improved method of breast ductal sampling with ductal lavage through a 22-gauge catheter and collection of ductal samples with a microaspirator. Women at normal risk or increased risk for breast cancer were studied. Ductal epithelial samples were analyzed for cytopathologic changes, cellular yield, epithelial cell purity, quality and quantity of DNA and RNA, and use in multiple downstream molecular applications. RESULTS We studied 50 subjects, including 40 subjects at normal risk for breast cancer and 37 subjects with non-nipple aspirate fluid-yielding ducts. This method provided multiple 1.0 mL samples of high ductal epithelial cell content (median ≥8 samples per subject of ≥5,000 cells per sample) with 80%–100% epithelial cell purity. Extraction of a single intact ductal sample (fluid and cells) or the separate frozen cellular component provided DNA and RNA for multiple downstream studies, including quantitative reverse transcription-polymerase chain reaction (PCR) for microRNA, quantitative PCR for the human telomerase reverse transcriptase gene, whole-genome DNA amplification, and array comparative genomic hybridization analysis. CONCLUSION An improved breast epithelial sampling method has been developed, which should significantly expand the acquisition and biomarker analysis of breast ductal epithelium in women at risk for breast cancer. PMID:26078587

  38. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  19. A survey of probabilistic methods used in reliability, risk and uncertainty analysis: Analytical techniques 1

    SciTech Connect

    Robinson, D.G.

    1998-06-01

    This report provides an introduction to the various probabilistic methods developed roughly between 1956 and 1985 for performing reliability or probabilistic uncertainty analysis on complex systems. This exposition does not include the traditional reliability methods (e.g., parallel-series systems) that might be found in the many reliability texts and reference materials. Rather, the report centers on the relatively new, and certainly less well known across the engineering community, analytical techniques. Discussion of the analytical methods has been broken into two reports. This particular report is limited to those methods developed between 1956 and 1985. While a bit dated, methods described in the later portions of this report still dominate the literature and provide a necessary technical foundation for more current research. A second report (Analytical Techniques 2) addresses methods developed since 1985. The flow of this report roughly follows the historical development of the various methods, so each new technique builds on the discussion of strengths and weaknesses of previous techniques. To facilitate the understanding of the various methods discussed, a simple 2-dimensional problem is used throughout the report. The problem is used for discussion purposes only; conclusions regarding the applicability and efficiency of particular methods are based on secondary analyses and a number of years of experience by the author. This document should be considered a living document in the sense that as new methods or variations of existing methods are developed, the document and references will be updated to reflect the current state of the literature as much as possible. For those scientists and engineers already familiar with these methods, the discussion will at times become rather obvious. However, the goal of this effort is to provide a common basis for future discussions and, as such, will hopefully be useful to those more intimate with

  20. Assessing the Risk of Secondary Transfer Via Fingerprint Brush Contamination Using Enhanced Sensitivity DNA Analysis Methods.

    PubMed

    Bolivar, Paula-Andrea; Tracey, Martin; McCord, Bruce

    2016-01-01

    Experiments were performed to determine the extent of cross-contamination of DNA resulting from secondary transfer due to fingerprint brushes used on multiple items of evidence. Analysis of both standard and low copy number (LCN) STR was performed. Two different procedures were used to enhance sensitivity, post-PCR cleanup and increased cycle number. Under standard STR typing procedures, some additional alleles were produced that were not present in the controls or blanks; however, there was insufficient data to include the contaminant donor as a contributor. Inclusion of the contaminant donor did occur for one sample using post-PCR cleanup. Detection of the contaminant donor occurred for every replicate of the 31 cycle amplifications; however, using LCN interpretation recommendations for consensus profiles, only one sample would include the contaminant donor. Our results indicate that detection of secondary transfer of DNA can occur through fingerprint brush contamination and is enhanced using LCN-DNA methods. PMID:26300550

  1. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  2. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis for a decentralized energy system of the North was carried out. Based on an analysis of the damages caused by accidents in energy systems, a risk indicator structure was selected, and a method for determining North energy system risk was proposed.

  3. Methods for Multitemporal Analysis of Satellite Data Aimed at Environmental Risk Monitoring

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Scognamiglio, A.

    2012-08-01

    In recent years the topic of environmental monitoring has gained particular importance, owing in part to the reduced short-term stability and predictability of climatic events. Facing this situation, often as an emergency, involves high and unpredictable costs for public agencies. Preventing damage caused by natural disasters does not concern weather forecasts alone, but requires constant attention and the practice of monitoring and controlling human activity on the territory. Practically, the problem is not knowing whether and when an event will affect a determined area, but recognizing the possible damages if such an event happened, adopting adequate measures to reduce them to a minimum, and providing the necessary tools for a timely intervention. On the other hand, surveying technologies should be as accurate and updatable as possible in order to guarantee high standards, which involves the analysis of a great amount of data. The management of such data requires integration and calculation systems with specialized software and fast, reliable connection and communication networks. To meet these requirements, current satellite technology, with recurrent data acquisition for the timely generation of cartographic products that are updated and coherent with the territorial investigation, offers the possibility to fill the temporal gap between the need for urgent information and official reference information. Among evolved image processing techniques, change detection analysis is useful for identifying environmental temporal variations, helping to reduce user intervention through process automation and progressively improving the qualitative and quantitative accuracy of results. The research investigates automatic methods for detecting land cover transformations by means of change detection techniques executable on satellite data that are heterogeneous in spatial and spectral resolution, with homogenization and registration in a unique
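
    As a minimal illustration of the change detection idea described above, the sketch below flags pixels whose inter-date difference is anomalous. It assumes two co-registered, radiometrically normalized single-band images and a simple global threshold, which is far simpler than the multitemporal workflow the authors describe.

```python
import numpy as np

def change_mask(img_t1: np.ndarray, img_t2: np.ndarray, k: float = 2.0) -> np.ndarray:
    """Pixel-wise change detection between two co-registered, radiometrically
    normalized single-band images: flag pixels whose difference deviates from
    the mean difference by more than k standard deviations."""
    diff = img_t2.astype(float) - img_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    return np.abs(diff - mu) > k * sigma

# Toy example: a 100x100 scene where a 10x10 patch changes between dates.
rng = np.random.default_rng(0)
t1 = rng.normal(100, 5, (100, 100))
t2 = t1 + rng.normal(0, 1, (100, 100))
t2[40:50, 40:50] += 40  # simulated land-cover change
print(change_mask(t1, t2).sum())  # roughly the 100 changed pixels
```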

  4. Risk/Stress Analysis.

    ERIC Educational Resources Information Center

    Schwerdtfeger, Don; Howell, Richard E.

    1986-01-01

    Identifies stress as a definite health hazard and risk factor involved in a variety of health situations. Proposes that stress identification efforts be considered in environmental analysis so that a more complete approach to risk assessment and management and health hazard prevention can occur. (ML)

  5. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
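
    A minimal sketch of the Mahalanobis-distance idea follows. It is not NASA's MRISK implementation: the choice of measuring distance from the origin and the toy covariance data are assumptions for illustration.

```python
import numpy as np

def combined_score(consequence: np.ndarray, history: np.ndarray) -> float:
    """Combined multidimensional risk score as the Mahalanobis distance of a
    risk's consequence vector, using the covariance of consequence dimensions
    estimated from a set of previously scored risks (`history`). When the
    dimensions are uncorrelated this reduces to Euclidean distance normalized
    by each dimension's variance."""
    cov = np.cov(history, rowvar=False)
    cov_inv = np.linalg.inv(cov)
    return float(np.sqrt(consequence @ cov_inv @ consequence))

# Hypothetical scores on three consequence dimensions (cost, schedule, safety)
history = np.array([[1, 2, 1], [3, 3, 2], [2, 1, 4], [4, 4, 3], [2, 3, 2.0]])
print(round(combined_score(np.array([3, 3, 3.0]), history), 2))
```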

  6. Applying Data Envelopment Analysis to Preventive Medicine: A Novel Method for Constructing a Personalized Risk Model of Obesity

    PubMed Central

    Narimatsu, Hiroto; Nakata, Yoshinori; Nakamura, Sho; Sato, Hidenori; Sho, Ri; Otani, Katsumi; Kawasaki, Ryo; Kubota, Isao; Ueno, Yoshiyuki; Kato, Takeo; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa

    2015-01-01

    Data envelopment analysis (DEA) is a method of operations research that has not yet been applied in the field of obesity research. However, DEA might be used to evaluate individuals’ susceptibility to obesity, which could help establish effective risk models for the onset of obesity. Therefore, we conducted this study to evaluate the feasibility of applying DEA to predict obesity, by calculating efficiency scores and evaluating the usefulness of risk models. In this study, we evaluated data from the Takahata study, which was a population-based cohort study (with a follow-up study) of Japanese people older than 40 years. For our analysis, we used the input-oriented Charnes-Cooper-Rhodes model of DEA, and defined the decision-making units (DMUs) as individual subjects. The inputs were defined as (1) exercise (measured as calories expended) and (2) the inverse of food intake (measured as calories ingested). The output was defined as the inverse of body mass index (BMI). Using the β coefficients for the participants’ single nucleotide polymorphisms, we then calculated their genetic predisposition score (GPS). Both efficiency scores and GPS were available for 1,620 participants from the baseline survey, and for 708 participants from the follow-up survey. To compare the strengths of the associations, we used multiple linear regression models. To evaluate the effects of genetic factors and efficiency score on body mass index (BMI), we used multiple linear regression analysis, with BMI as the dependent variable, GPS and efficiency scores as the explanatory variables, and several demographic controls, including age and sex. Our results indicated that all factors were statistically significant (p < 0.05), with an adjusted R2 value of 0.66. Therefore, it is possible to use DEA to predict environmentally driven obesity, and thus to establish a well-fitted model for risk of obesity. PMID:25973987
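
    The input-oriented Charnes-Cooper-Rhodes (CCR) model the authors use can be written as a small linear program per decision-making unit. The sketch below is a generic CCR envelopment-form solver with invented toy data; it is not the Takahata study's actual model or data.

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency of unit j0 (envelopment form): minimize
    theta such that a nonnegative combination of all units uses at most
    theta * inputs of j0 while producing at least its outputs.
    X: (n_units, n_inputs), Y: (n_units, n_outputs)."""
    n, m = X.shape
    _, s = Y.shape
    c = np.r_[1.0, np.zeros(n)]                 # decision vector: [theta, lambdas]
    A_in = np.c_[-X[j0].reshape(m, 1), X.T]     # sum_j l_j x_ij - theta x_i,j0 <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y.T]       # -sum_j l_j y_rj <= -y_r,j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.fun

# Toy data mirroring the abstract's setup: inputs = (exercise calories,
# 1/food intake), output = 1/BMI; values are invented.
X = np.array([[2.0, 1.0], [1.0, 2.0], [2.0, 2.0]])
Y = np.array([[1.0], [1.0], [1.0]])
print(round(dea_ccr_efficiency(X, Y, 2), 3))  # unit 2 is dominated -> 0.75
```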

  7. Application of a risk analysis method to different technologies for producing a monoclonal antibody employed in hepatitis B vaccine manufacturing.

    PubMed

    Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams

    2012-03-01

    CB.Hep-1 monoclonal antibody (mAb) is used for manufacturing a recombinant hepatitis B vaccine, which is included in a worldwide vaccination program against hepatitis B disease. The use of this mAb as an immunoligand has been incorporated into one of the most efficient steps of the active pharmaceutical ingredient purification process. In this regard, Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. As a main conclusion, FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The analysis of risk severity and occurrence showed that very high severity risks accounted for 31.0-38.7% of all risks, and the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Numbers, in descending order, were: transgenic plants (2636), ascites (2577), transgenic animals (2046), and hollow fiber bioreactors (1654), which also corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. PMID:22285820
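
    FMEA prioritization of the kind described rests on the Risk Priority Number (RPN), classically the product of severity, occurrence, and detection ratings, which can then be summed per technology as in the study. The failure modes and ratings below are invented for illustration and do not come from the study.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    severity: int    # 1 (negligible) .. 10 (catastrophic)
    occurrence: int  # 1 (rare) .. 10 (frequent)
    detection: int   # 1 (always detected) .. 10 (undetectable)

    @property
    def rpn(self) -> int:
        # Classical FMEA Risk Priority Number
        return self.severity * self.occurrence * self.detection

# Hypothetical failure modes for one production technology
modes = [
    FailureMode("column fouling", 7, 3, 4),
    FailureMode("viral contamination", 9, 2, 6),
    FailureMode("low antibody titer", 5, 5, 2),
]
for fm in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"{fm.name:22s} RPN={fm.rpn}")
print("additive RPN for the technology:", sum(m.rpn for m in modes))
```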

  8. Analysis of the LaSalle Unit 2 nuclear power plant: Risk Methods Integration and Evaluation Program (RMIEP). Volume 8, Seismic analysis

    SciTech Connect

    Wells, J.E.; Lappa, D.A.; Bernreuter, D.L.; Chen, J.C.; Chuang, T.Y.; Johnson, J.J.; Campbell, R.D.; Hashimoto, P.S.; Maslenikov, O.R.; Tiong, L.W.; Ravindra, M.K.; Kincaid, R.H.; Sues, R.H.; Putcha, C.S.

    1993-11-01

    This report describes the methodology used and the results obtained from the application of a simplified seismic risk methodology to the LaSalle County Nuclear Generating Station Unit 2. This study is part of the Level I analysis being performed by the Risk Methods Integration and Evaluation Program (RMIEP). Using the RMIEP-developed event and fault trees, the analysis resulted in a seismically induced core damage frequency point estimate of 6.0E-7/yr. This result, combined with the component importance analysis, indicated that system failures were dominated by random events. The dominant components included diesel generator failures (failure to swing, failure to start, failure to run after starting) and the condensate storage tank.

  9. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. A bibliography (150 entries) and a program risk analysis checklist are provided.
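
    All of the surveyed techniques rest on encoding subjective probability estimates elicited from experts. One common encoding, sketched below, converts a three-point (low / most-likely / high) estimate into the mean and variance of a PERT-style distribution; this is only one illustrative example, not necessarily one of the five distribution types the handbook covers.

```python
def pert_mean_var(low: float, mode: float, high: float, lam: float = 4.0):
    """Classical PERT encoding of a three-point subjective estimate into the
    mean and variance of a beta-like distribution (lambda = 4 is standard)."""
    mean = (low + lam * mode + high) / (lam + 2.0)
    var = ((high - low) / 6.0) ** 2
    return mean, var

# An expert's three-point estimate for a task duration (months)
print(pert_mean_var(2.0, 3.0, 8.0))  # -> (3.67, 1.0)
```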

  10. Recasting risk analysis methods in terms of object-oriented modeling techniques

    SciTech Connect

    Wyss, G.D.; Craft, R.L.; Vandewart, R.L.; Funkhouser, D.R.

    1998-08-01

    For more than two decades, risk analysts have relied on powerful logic-based models to perform their analyses. However, the applicability of these models has been limited because they can be complex and expensive to develop. Analysts must frequently start from scratch when analyzing a new (but similar) system because the understanding of how the system works exists only in the mind of the analyst and is only incompletely instantiated in the actual logic model. This paper introduces the notion of using explicit object-oriented system models, such as those embodied in computer-aided software engineering (CASE) tools, to document the analyst's understanding of the system and appropriately capture how the system works. It also shows that from these models, standard assessment products, such as fault trees and event trees, can be automatically derived.

  11. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report focuses on new methods based on the following models: synergetic, physical, and computational. It concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to solving problems of self-controlled synthesis of structures and the creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered with regard to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for TPS is presented considering some deterministic and stochastic factors. The last approach concerns results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment which provides alignment of temperature fields in a product for the formation of welded joints of equal strength is

  12. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that, of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claims notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek was flooded again, killing 31 people over two days, and insurance firms paid around €150 million for claims. To solve these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification through image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi-Criteria Decision Analysis (MCDA) part of this software. Criteria and their subcriteria were weighted, and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. The obtained results were verified using ground truth data and it has been clearly
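
    In AHP, criteria weights are typically derived from a reciprocal pairwise-comparison matrix via its principal eigenvector, together with a consistency check. The sketch below illustrates this generic computation; the comparison values are invented and are not those of the Ayamama study.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray):
    """Criteria weights from a reciprocal pairwise-comparison matrix as the
    normalized principal eigenvector (Saaty's AHP), plus the consistency
    ratio CR = ((lambda_max - n) / (n - 1)) / RI."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random consistency index
    cr = ((eigvals[k].real - n) / (n - 1)) / ri
    return w, cr

# Hypothetical comparisons for three criteria (e.g., slope, elevation, land use)
A = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])
w, cr = ahp_weights(A)
print(np.round(w, 3), round(cr, 3))  # CR < 0.1 indicates acceptable consistency
```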

  13. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate such risks. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results identified eleven sources with relatively high risk values. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The results for risk level indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV), and three at the highest risk level (i.e., level V). Also, risks of industrial discharge were higher than those of the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce water pollution risks in source water areas. PMID:27016678
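
    The entropy weight method used for the indicator weights is straightforward to sketch: indicators whose values vary more across the assessed sources carry more information and receive larger weights. The matrix below is invented for illustration; it is not the Shiyan data.

```python
import numpy as np

def entropy_weights(data: np.ndarray) -> np.ndarray:
    """Entropy weight method for indicator weighting.
    data: (n_sources, n_indicators), all values positive."""
    p = data / data.sum(axis=0)                   # column-wise proportions
    n = data.shape[0]
    e = -(p * np.log(p)).sum(axis=0) / np.log(n)  # entropy per indicator
    d = 1.0 - e                                   # degree of diversification
    return d / d.sum()

# Hypothetical indicator matrix: rows = pollution sources, cols = indicators
data = np.array([[0.9, 0.2, 0.4],
                 [0.5, 0.3, 0.4],
                 [0.1, 0.25, 0.4]])
print(np.round(entropy_weights(data), 3))  # the constant indicator gets weight 0
```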

  14. Targeted assets risk analysis.

    PubMed

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians. PMID:23615063

  15. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  16. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study

    PubMed Central

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F.; Vehik, Kendra; Huang, Shuai; Rewers, Marian; Barriga, Katherine; Baxter, Judith; Eisenbarth, George; Frank, Nicole; Gesualdo, Patricia; Hoffman, Michelle; Norris, Jill; Ide, Lisa; Robinson, Jessie; Waugh, Kathleen; She, Jin-Xiong; Schatz, Desmond; Hopkins, Diane; Steed, Leigh; Choate, Angela; Silvis, Katherine; Shankar, Meena; Huang, Yi-Hua; Yang, Ping; Wang, Hong-Jie; Leggett, Jessica; English, Kim; McIndoe, Richard; Dequesada, Angela; Haller, Michael; Anderson, Stephen W.; Ziegler, Anette G.; Boerschmann, Heike; Bonifacio, Ezio; Bunk, Melanie; Försch, Johannes; Henneberger, Lydia; Hummel, Michael; Hummel, Sandra; Joslowski, Gesa; Kersting, Mathilde; Knopff, Annette; Kocher, Nadja; Koletzko, Sibylle; Krause, Stephanie; Lauber, Claudia; Mollenhauer, Ulrike; Peplow, Claudia; Pflüger, Maren; Pöhlmann, Daniela; Ramminger, Claudia; Rash-Sur, Sargol; Roth, Roswith; Schenkel, Julia; Thümer, Leonore; Voit, Katja; Winkler, Christiane; Zwilling, Marina; Simell, Olli G.; Nanto-Salonen, Kirsti; Ilonen, Jorma; Knip, Mikael; Veijola, Riitta; Simell, Tuula; Hyöty, Heikki; Virtanen, Suvi M.; Kronberg-Kippilä, Carina; Torma, Maija; Simell, Barbara; Ruohonen, Eeva; Romo, Minna; Mantymaki, Elina; Schroderus, Heidi; Nyblom, Mia; Stenius, Aino; Lernmark, Åke; Agardh, Daniel; Almgren, Peter; Andersson, Eva; Andrén-Aronsson, Carin; Ask, Maria; Karlsson, Ulla-Marie; Cilio, Corrado; Bremer, Jenny; Ericson-Hallström, Emilie; Gard, Thomas; Gerardsson, Joanna; Gustavsson, Ulrika; Hansson, Gertie; Hansen, Monica; Hyberg, Susanne; Håkansson, Rasmus; Ivarsson, Sten; Johansen, Fredrik; Larsson, Helena; Lernmark, Barbro; Markan, Maria; Massadakis, Theodosia; Melin, Jessica; Månsson-Martinez, Maria; Nilsson, Anita; Nilsson, Emma; Rahmati, Kobra; Rang, Sara; Järvirova, Monica Sedig; Sibthorpe, Sara; Sjöberg, Birgitta; Törn, Carina; Wallin, Anne; Wimar, Åsa; Hagopian, William A.; Yan, Xiang; Killian, Michael; Crouch, Claire Cowen; Hay, Kristen M.; Ayres, Stephen; Adams, Carissa; Bratrude, Brandi; Fowler, Greer; Franco, Czarina; Hammar, Carla; Heaney, Diana; Marcus, Patrick; Meyer, Arlene; Mulenga, Denise; Scott, Elizabeth; Skidmore, Jennifer; Small, Erin; Stabbert, Joshua; Stepitova, Viktoria; Becker, Dorothy; Franciscus, Margaret; Dalmagro-Elias Smith, MaryEllen; Daftary, Ashi; Krischer, Jeffrey P.; Abbondondolo, Michael; Ballard, Lori; Brown, Rasheedah; Cuthbertson, David; Eberhard, Christopher; Gowda, Veena; Lee, Hye-Seung; Liu, Shu; Malloy, Jamie; McCarthy, Cristina; McLeod, Wendy; Smith, Laura; Smith, Stephen; Smith, Susan; Uusitalo, Ulla; Yang, Jimin; Akolkar, Beena; Briese, Thomas; Erlich, Henry; Oberste, Steve

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions. PMID:27561809

  17. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions. PMID:27561809

  18. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  19. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency is analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of the flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate that the entropy method performs best in quantifying the distribution of flood duration. Bivariate hydrologic risk is then generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for flood volumes less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without a significant decrease in the bivariate risk for flood peak-duration.
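
    Joint return periods of the kind used here follow directly from a copula of the marginal non-exceedance probabilities. The sketch below uses the Gumbel-Hougaard family, one common choice for flood variable pairs; the parameter values are illustrative assumptions, not the fitted values of the study.

```python
import math

def gumbel_copula(u: float, v: float, theta: float) -> float:
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1 / theta))

def joint_return_periods(u, v, theta, mu=1.0):
    """Return periods for a flood variable pair with marginal non-exceedance
    probabilities u, v; mu is the mean interarrival time in years.
    'OR': either variable exceeds its threshold; 'AND': both exceed."""
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)
    t_and = mu / (1.0 - u - v + c)
    return t_or, t_and

# Hypothetical 10-year marginal quantiles (u = v = 0.9) with theta = 2
t_or, t_and = joint_return_periods(0.9, 0.9, 2.0)
print(round(t_or, 1), round(t_and, 1))  # the OR event recurs more often than AND
```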

  20. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy

  1. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation

  2. Analysis of the LaSalle Unit 2 Nuclear Power Plant, Risk Methods Integration and Evaluation Program (RMIEP)

    SciTech Connect

    Ferrell, W.L.; Payne, A.C. Jr.; Daniel, S.L.

    1992-10-01

    This report is a description of the internal flood analysis performed on the LaSalle County Nuclear Generating Station, Unit 2. A more detailed integration with the internal events analysis than in prior flood risk assessments was accomplished. The same system fault trees used for the internal events analysis were also used for the flood analysis, which included modeling of components down to the contact pair level. Subsidiary equations were created to map the effects of pipe failures. All component locations were traced and mapped into the fault trees. The effects of floods were then mapped directly onto the internal plant model and their relative importance was evaluated. A detailed screening analysis was performed which showed that most plant areas had a negligible contribution to the flood-induced core damage frequency. This was influenced strongly by the fact that the LaSalle plant was designed with a high level of concern about the effects of external events such as fire and flood and significant separation was maintained between systems in the original design. Detailed analysis of the remaining flood scenarios identified only two that contributed significantly to risk. The flood analysis resulted in a total (mean) core damage frequency of 3.23E-6 per year.

  3. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways located in urban and intercity areas are generally prone to many kinds of natural disaster risks. Natural hazards and disasters that may occur from the highway design stage through construction and operation, and later during highway maintenance and repair, have to be taken into consideration. Assessment of the risks posed by such adverse situations is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of probable hazards on the highways. However, the assets at risk and the impacts of the events must also be examined and rated in their own right. Through these activities, the intended improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economical and effective solution. Besides supporting measures against the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows: • Natural disasters: 1. Meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. Geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters: 1. Transport accidents (traffic accidents) originating from road surface defects (icing

  4. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

    The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach. However, the advantages of quantitative risk assessment are achieved at a greater expenditure of money, time, and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources to minimize risk in an expeditious and fiscally responsible manner.

  5. [Comparative analysis of two different methods for risk assessment of groundwater pollution: a case study in Beijing plain].

    PubMed

    Wang, Hong-na; He, Jiang-tao; Ma, Wen-jie; Xu, Zhen

    2015-01-01

    Groundwater contamination risk assessment is of great significance for groundwater contamination prevention planning and for evaluating groundwater exploitation potential. Recently, the UN assessment system and the WP assessment system have become focuses of international research. In both systems, the assessment framework and indices are drawn from five aspects: intrinsic vulnerability, aquifer storage, groundwater quality, groundwater resource protection zone, and contamination load. However, the five factors are combined in different ways. In order to expound the difference between the UN and WP assessment systems, and to explain the main reasons for it, the two systems were applied to the Beijing Plain, China. The maps constructed from the UN and WP risk assessment systems were compared. The results showed that both groundwater contamination risk assessment maps were in accordance with actual conditions and were similar in spatial distribution trends. However, the coverage areas at the same risk level differed quite significantly. The study also revealed that, during system construction, the structural hierarchy, the relevant overlay principles, and the classification method may all affect the resulting groundwater contamination risk assessment map. PMID:25898663

  6. Comparison of nonlinear methods symbolic dynamics, detrended fluctuation, and Poincaré plot analysis in risk stratification in patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Voss, Andreas; Schroeder, Rico; Truebner, Sandra; Goernig, Matthias; Figulla, Hans Reiner; Schirdewan, Alexander

    2007-03-01

    Dilated cardiomyopathy (DCM) has an incidence of about 20/100,000 new cases per annum and accounts for nearly 10,000 deaths per year in the United States. Approximately 36% of patients with dilated cardiomyopathy (DCM) suffer cardiac death within five years after diagnosis. Currently applied methods for early risk prediction in DCM patients are rather insufficient. The objective of this study was to investigate the suitability of the short-term nonlinear methods symbolic dynamics (STSD), detrended fluctuation analysis (DFA), and Poincaré plot analysis (PPA) for risk stratification in these patients. From 91 DCM patients and 30 healthy subjects (REF), heart rate and blood pressure variability (HRV, BPV), STSD, DFA, and PPA were analyzed. Measures from BPV analysis, DFA, and PPA revealed highly significant differences (p<0.0011) discriminating REF and DCM. For risk stratification in DCM patients, four parameters from BPV analysis, STSD, and PPA revealed significant differences between low and high risk (maximum sensitivity: 90%, specificity: 90%). These results suggest that STSD and PPA are useful nonlinear methods for enhanced risk stratification in DCM patients.
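
    Of the methods compared, Poincaré plot analysis is the simplest to sketch: each interval is plotted against the next, and the standard descriptors SD1 and SD2 summarize the cloud's width and length. The sketch below uses the standard SD1/SD2 formulas on invented RR intervals; it is not the study's analysis pipeline.

```python
import numpy as np

def poincare_sd1_sd2(rr: np.ndarray):
    """Poincaré plot descriptors of an RR-interval series: SD1 captures
    short-term variability (spread perpendicular to the line of identity),
    SD2 long-term variability (spread along it)."""
    x, y = rr[:-1], rr[1:]
    sd1 = np.sqrt(0.5 * np.var(y - x))
    sd2 = np.sqrt(0.5 * np.var(y + x))
    return sd1, sd2

# Hypothetical RR intervals (ms)
rr = np.array([812, 790, 805, 821, 798, 810, 795, 802], dtype=float)
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 1), round(sd2, 1))
```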

  7. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of government management. In conjunction with methods of mathematical modeling, spatial-temporal analysis, and economic tools, risk assessment in situation analysis makes it possible to determine the level of safety of the population, workers, and consumers, and to select priority resources and threat factors as points for exerting effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazard and danger. At the realization stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected for concentrating efforts on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products, and services. Further tasks appear relevant: enforcement of the methodology of health risk analysis in the field of assurance of sanitary and epidemiological well-being and workers' health; the development of an informational and analytical base with respect to establishing "exposure-response" models for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  8. Risk Analysis Virtual ENvironment

    Energy Science and Technology Software Center (ESTSC)

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of the probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE based applications. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and it is enhanced by using modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. For the acceleration of the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and collocation stochastic polynomials chaos are implemented. The same methodologies here described could be used to solve optimization and uncertainties propagation problems using the RAVEN framework.

  9. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of the probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE based applications. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and it is enhanced by using modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. For the acceleration of the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and collocation stochastic polynomials chaos are implemented. The same methodologies here described could be used to solve optimization and uncertainties propagation problems using the RAVEN framework.
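
    Of the sampling strategies mentioned in the RAVEN description, Latin Hypercube sampling is easy to sketch generically: stratify each input dimension into equal-probability intervals and sample each stratum exactly once. The following is a minimal illustration, not RAVEN's actual implementation.

```python
import numpy as np

def latin_hypercube(n_samples: int, n_dims: int, seed=None) -> np.ndarray:
    """Latin Hypercube sample on [0, 1]^d: each dimension's range is split
    into n_samples equal strata and each stratum is sampled exactly once,
    giving better space coverage than plain Monte Carlo for the same n."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])  # decouple the strata across dimensions
    return u

# Five stratified samples of two uncertain input parameters
print(np.round(latin_hypercube(5, 2, seed=42), 3))
```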

  10. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.

    2014-11-01

    A large part of the residential areas in Turkey is at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor-quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide risks must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these risks separately and then combining the analyses is more realistic than producing several hazard maps each based on a single parameter. In this study, an integrated seismic hazard map of central Eskişehir was created from two earthquake-related parameters, liquefaction and amplification, by using a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geologic and topographic conditions of the region. According to the integrated seismic hazard map of the Eskişehir residential area, the area is generally at medium-high risk during a potential earthquake.
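
    A simple weighting method of this kind amounts to a weighted overlay of normalized hazard layers. The sketch below shows the generic computation; the grids, weights, and layer names are invented for illustration and are not the Eskişehir data.

```python
import numpy as np

def integrated_hazard(layers: dict, weights: dict) -> np.ndarray:
    """Simple weighting method: combine hazard layers (each normalized to
    [0, 1]) into one integrated map as a weighted sum."""
    total = sum(weights.values())
    return sum(weights[k] * layers[k] for k in layers) / total

# Hypothetical 3x3 normalized hazard scores for two parameters
liquefaction = np.array([[0.2, 0.5, 0.9], [0.1, 0.4, 0.8], [0.0, 0.3, 0.7]])
amplification = np.array([[0.3, 0.3, 0.6], [0.2, 0.5, 0.9], [0.1, 0.4, 0.8]])
layers = {"liquefaction": liquefaction, "amplification": amplification}
risk_map = integrated_hazard(layers, {"liquefaction": 0.5, "amplification": 0.5})
print(np.round(risk_map, 2))
```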

  11. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making them very challenging to prioritize. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. A holistic approach was considered instead, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding a step that transforms the ordinal ranking into a ratio-scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. PMID:26224206

  12. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

    Background: Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study illustrates this approach with three health risks mapped at the street scale for a coastal community in Haiti. Methods: Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentration of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, shows the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results: Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions: Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study

  13. Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.

    PubMed

    Price, Bertram; MacNicoll, Michael

    2015-05-01

    A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. PMID:25644783
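
    The game-theoretic allocation discussed above can be illustrated with a small sketch. Under the stated result, apportioning interactions with equal weights coincides with the Shapley value of a cooperative game whose value function is the excess risk of each subset of factors. The code below computes that allocation generically; the two factors and their relative risks are invented for illustration and are not from the article.

        from itertools import permutations

        def shapley_allocation(factors, risk):
            """Average each factor's marginal contribution to total risk
            over all orderings of the factors (the Shapley value)."""
            shares = {f: 0.0 for f in factors}
            orderings = list(permutations(factors))
            for order in orderings:
                included = frozenset()
                for f in order:
                    shares[f] += risk(included | {f}) - risk(included)
                    included = included | {f}
            return {f: s / len(orderings) for f, s in shares.items()}

        # Invented relative risks for each subset of two interacting factors.
        rr = {frozenset(): 1.0,
              frozenset({"smoking"}): 3.0,
              frozenset({"asbestos"}): 2.0,
              frozenset({"smoking", "asbestos"}): 8.0}

        # Apportion the excess relative risk among the factors.
        excess = lambda s: rr[frozenset(s)] - 1.0
        print(shapley_allocation(["smoking", "asbestos"], excess))
        # Each factor receives its own excess risk plus half the interaction:
        # {'smoking': 4.0, 'asbestos': 3.0}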

  14. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. The workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…), oriented toward affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. Planning of the study and definition of stakeholders; 2. Hazard identification; 3. Risk analysis; 4. Risk assessment; 5. Proposition of mitigation measures; 6. Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The risk matrix and other graphical tools provide a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, participants can compare results between groups and print out a summary report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level as well as by external persons dealing with risk analysis.
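
    A minimal sketch of the kind of risk-matrix lookup used in step 4 follows. The likelihood and severity categories, the scoring rule, and the band thresholds are invented for illustration; a real workshop would calibrate them to the hazards and objects at hand.

        # Invented five-point category scales for a qualitative risk matrix.
        LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "frequent"]
        SEVERITY = ["negligible", "minor", "moderate", "major", "catastrophic"]

        def risk_level(likelihood: str, severity: str) -> str:
            """Map a (likelihood, severity) pair to a risk band."""
            score = (LIKELIHOOD.index(likelihood) + 1) * (SEVERITY.index(severity) + 1)
            if score >= 15:
                return "high - mitigation required"
            if score >= 6:
                return "medium - evaluate mitigation via cost-benefit analysis"
            return "low - accept and monitor"

        print(risk_level("possible", "major"))       # medium band
        print(risk_level("likely", "catastrophic"))  # high band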

  15. Analysis of the LaSalle Unit 2 Nuclear Power Plant, Risk Methods Integration and Evaluation Program (RMIEP). Volume 10, Internal flood analysis

    SciTech Connect

    Ferrell, W.L.; Payne, A.C. Jr.; Daniel, S.L.

    1992-10-01

    This report is a description of the internal flood analysis performed on the LaSalle County Nuclear Generating Station, Unit 2. A more detailed integration with the internal events analysis than in prior flood risk assessments was accomplished. The same system fault trees used for the internal events analysis were also used for the flood analysis, which included modeling of components down to the contact pair level. Subsidiary equations were created to map the effects of pipe failures. All component locations were traced and mapped into the fault trees. The effects of floods were then mapped directly onto the internal plant model and their relative importance was evaluated. A detailed screening analysis was performed which showed that most plant areas had a negligible contribution to the flood-induced core damage frequency. This was influenced strongly by the fact that the LaSalle plant was designed with a high level of concern about the effects of external events such as fire and flood and significant separation was maintained between systems in the original design. Detailed analysis of the remaining flood scenarios identified only two that contributed significantly to risk. The flood analysis resulted in a total (mean) core damage frequency of 3.23E-6 per year.

  16. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

    Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  17. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  18. Bivariate random effects models for meta-analysis of comparative studies with binary outcomes: methods for the absolute risk difference and relative risk.

    PubMed

    Chu, Haitao; Nie, Lei; Chen, Yong; Huang, Yi; Sun, Wei

    2012-12-01

    Multivariate meta-analysis is increasingly utilised in biomedical research to combine data from multiple comparative clinical studies for evaluating drug efficacy and safety profiles. When the probability of the event of interest is rare, or when the individual study sample sizes are small, a substantial proportion of studies may not have any event of interest. Conventional meta-analysis methods either exclude such studies or include them through ad hoc continuity corrections, adding an arbitrary positive value to each cell of the corresponding 2 × 2 tables, which may result in less accurate conclusions. Furthermore, different continuity corrections may result in inconsistent conclusions. In this article, we discuss a bivariate Beta-binomial model derived from the Sarmanov family of bivariate distributions and a bivariate generalised linear mixed effects model for binary clustered data to make valid inferences. These bivariate random effects models use all available data without ad hoc continuity corrections, and naturally account for the potential correlation between treatment (or exposure) and control groups within studies. We then utilise the bivariate random effects models to reanalyse two recent meta-analysis data sets. PMID:21177306
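
    The sensitivity to ad hoc corrections that motivates these models is easy to demonstrate. The sketch below (with an invented zero-event trial, not data from the article) shows how the estimated log relative risk moves as the arbitrary continuity correction changes:

        import math

        def log_rr_with_correction(events_t, n_t, events_c, n_c, cc):
            """Log relative risk after adding correction cc to each cell."""
            a, b = events_t + cc, n_t - events_t + cc
            c, d = events_c + cc, n_c - events_c + cc
            return math.log((a / (a + b)) / (c / (c + d)))

        # An invented small trial with zero events in the control arm.
        for cc in (0.5, 0.25, 0.1):
            print(cc, round(log_rr_with_correction(2, 50, 0, 50, cc), 2))
        # Prints roughly 1.61, 2.20, and 3.04: the estimate shifts markedly
        # with the arbitrary correction, which bivariate random effects
        # models avoid by modeling the zero-event studies exactly.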

  19. Reinterpretation of the results of a pooled analysis of dietary carotenoid intake and breast cancer risk by using the interval collapsing method

    PubMed Central

    2016-01-01

    OBJECTIVES: A pooled analysis of 18 prospective cohort studies, reported in 2012, evaluated carotenoid intakes and breast cancer risk defined by estrogen receptor (ER) and progesterone receptor (PR) status using the “highest versus lowest intake” method (HLM). By applying the interval collapsing method (ICM) to maximize the use of the estimated information, we reevaluated the results of the previous analysis in order to reinterpret the inferences made. METHODS: In order to estimate the summary effect size (sES) and its 95% confidence interval (CI), meta-analyses with the random-effects model were conducted for adjusted relative risks and their 95% CIs from the second to the fifth interval according to five kinds of carotenoids and ER/PR status. RESULTS: The following new findings were identified: α-carotene and β-cryptoxanthin have protective effects on overall breast cancer. All five kinds of carotenoids showed protective effects on ER− breast cancer. β-Carotene increased the risk of ER+ or ER+/PR+ breast cancer. α-Carotene, β-carotene, lutein/zeaxanthin, and lycopene showed a protective effect on ER−/PR+ or ER−/PR− breast cancer. CONCLUSIONS: The new findings support the hypothesis that carotenoids exerting anticancer effects through antioxidant activity might reduce the risk of ER− breast cancer. Based on these findings, the modification of the effects of α-carotene, β-carotene, and β-cryptoxanthin should be evaluated according to PR and ER status. PMID:27283141
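
    The random-effects pooling step named in the METHODS can be sketched compactly. The snippet below is a generic DerSimonian-Laird estimator, not the authors' ICM code, and the study-level relative risks for one hypothetical intake interval are invented:

        import numpy as np

        def dersimonian_laird(log_rr, se):
            """Random-effects pooled RR via the DerSimonian-Laird estimator."""
            w = 1.0 / se**2                           # fixed-effect weights
            mu_fe = np.sum(w * log_rr) / np.sum(w)
            q = np.sum(w * (log_rr - mu_fe) ** 2)     # Cochran's Q
            c = np.sum(w) - np.sum(w**2) / np.sum(w)
            tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)  # between-study variance
            w_re = 1.0 / (se**2 + tau2)
            mu = np.sum(w_re * log_rr) / np.sum(w_re)
            se_mu = np.sqrt(1.0 / np.sum(w_re))
            return np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu])

        # Invented adjusted RRs (and SEs on the log scale) for one interval.
        log_rr = np.log(np.array([0.85, 0.92, 0.78, 1.01]))
        se = np.array([0.08, 0.10, 0.12, 0.09])
        ses, lo, hi = dersimonian_laird(log_rr, se)
        print(f"sES = {ses:.2f} (95% CI {lo:.2f}-{hi:.2f})")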

  20. Safety analysis, risk assessment, and risk acceptance criteria

    SciTech Connect

    Jamali, K.; Stack, D.W.; Sullivan, L.H.; Sanzo, D.L.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude.
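
    The RAC question raised at the end can be illustrated with a toy comparison. In the sketch below, each accident sequence quantified by a PRA is checked against a single numerical criterion of constant risk (frequency times consequence); the criterion value and the sequence estimates are invented, and real RACs are typically more elaborate (e.g., frequency-consequence curves with multiple bands):

        # Invented acceptance criterion: risk (frequency x consequence)
        # must not exceed 1e-2 consequence-units per year.
        CRITERION = 1e-2

        sequences = {                 # name: (frequency per year, consequence)
            "seq_A": (1e-4, 10.0),
            "seq_B": (1e-6, 5e4),
            "seq_C": (3e-3, 1.0),
        }

        for name, (freq, cons) in sequences.items():
            risk = freq * cons
            verdict = "acceptable" if risk <= CRITERION else "exceeds criterion"
            print(f"{name}: risk {risk:.1e}/yr -> {verdict}")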

  1. [The cascade scheme as a methodical platform for analysis of health risks in space flight and partially and fully analog conditions].

    PubMed

    Ushakov, I B; Poliakov, A V; Usov, V M

    2011-01-01

    Space anthropoecology, a subsection of human ecology, studies various aspects of physiological, psychological, social, and professional adaptation to the extreme environment of space flight and to human life and work in partially and fully analogous conditions on Earth. Both space flight and simulated extreme conditions are characterized by high human safety standards and a substantial analytic base that supports on-line analysis of torrents of information. Evaluating and responding to emerging undesired developments, so as to curb their impact on the functioning of the crew-vehicle-environment system and on human health, draws on the complete body of knowledge about risks to human health and performance. Space crew safety issues are tackled by experts of many specialties, which emphasizes the importance of integral methodical approaches to risk estimation and mitigation, to setting up barriers against adverse trends in human physiology and psychology in challenging conditions, and to minimizing delayed effects on professional longevity and behavioral disorders. PMID:21970036

  2. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  3. Failure risk assessment by analysis and testing

    NASA Technical Reports Server (NTRS)

    Moore, N.; Ebbeler, D.; Creager, M.

    1992-01-01

    The sources of information on which to base an evaluation of reliability or failure risk of an aerospace flight system are (1) experience from tests and flights and (2) engineering analysis. It is rarely feasible to establish high reliability at high confidence by testing aerospace systems or components. Moreover, failure prediction by conventional, deterministic methods of engineering analysis can become arbitrary and subject to serious misinterpretation when uncertain or approximate information is used to establish analysis parameter values and to calibrate the accuracy of engineering models. The limitations of testing to evaluate failure risk are discussed, and a statistical approach which incorporates both engineering analysis and testing is presented.

  4. At-Risk Youngsters: Methods That Work.

    ERIC Educational Resources Information Center

    Obiakor, Festus E.

    This paper examines problems faced by youngsters at risk of failure in school, and discusses methods for helping them succeed in educational programs. At-risk youngsters confront many problems in school and in mainstream society, and are frequently misidentified, misdiagnosed, and improperly instructed. Problems faced by at-risk youngsters…

  5. Post hoc Analysis for Detecting Individual Rare Variant Risk Associations Using Probit Regression Bayesian Variable Selection Methods in Case-Control Sequencing Studies.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Albright, Lisa Cannon; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham; MacInnis, Robert; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catolona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2016-09-01

    Rare variants (RVs) have been shown to be significant contributors to complex disease risk. By definition, these variants have very low minor allele frequencies and traditional single-marker methods for statistical analysis are underpowered for typical sequencing study sample sizes. Multimarker burden-type approaches attempt to identify aggregation of RVs across case-control status by analyzing relatively small partitions of the genome, such as genes. However, it is generally the case that the aggregative measure would be a mixture of causal and neutral variants, and these omnibus tests do not directly provide any indication of which RVs may be driving a given association. Recently, Bayesian variable selection approaches have been proposed to identify RV associations from a large set of RVs under consideration. Although these approaches have been shown to be powerful at detecting associations at the RV level, there are often computational limitations on the total quantity of RVs under consideration and compromises are necessary for large-scale application. Here, we propose a computationally efficient alternative formulation of this method using a probit regression approach specifically capable of simultaneously analyzing hundreds to thousands of RVs. We evaluate our approach to detect causal variation on simulated data and examine sensitivity and specificity in instances of high RV dimensionality as well as apply it to pathway-level RV analysis results from a prostate cancer (PC) risk case-control sequencing study. Finally, we discuss potential extensions and future directions of this work. PMID:27312771
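
    The general technique described here (Bayesian variable selection in a probit model) can be sketched with a standard spike-and-slab, stochastic-search prior inside an Albert-Chib data-augmentation Gibbs sampler. This is a generic illustration under invented priors, dimensions, and synthetic data, not the authors' formulation, which differs in its prior specification and scale:

        import numpy as np
        from scipy.stats import truncnorm

        rng = np.random.default_rng(0)

        # Synthetic case-control data: 3 causal rare variants out of 50.
        n, p = 500, 50
        X = (rng.random((n, p)) < 0.02).astype(float)   # rare-variant carriers
        beta_true = np.zeros(p); beta_true[:3] = 1.5
        y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

        tau0, tau1, prior_inc = 0.05, 2.0, 0.1   # spike SD, slab SD, P(include)
        beta, gamma = np.zeros(p), np.zeros(p, dtype=bool)
        inclusion = np.zeros(p)
        n_iter, burn = 2000, 500

        for it in range(n_iter):
            # 1. Latent utilities z | beta, y (Albert-Chib augmentation):
            #    z_i ~ N(x_i' beta, 1), truncated so z_i > 0 iff y_i = 1.
            mu = X @ beta
            lo = np.where(y == 1, -mu, -np.inf)
            hi = np.where(y == 1, np.inf, -mu)
            z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
            # 2. beta | z, gamma: conjugate normal with spike/slab variances.
            d_inv = np.where(gamma, tau1, tau0) ** -2.0
            V = np.linalg.inv(X.T @ X + np.diag(d_inv))
            beta = rng.multivariate_normal(V @ (X.T @ z), V)
            # 3. gamma_j | beta_j: Bernoulli with log-odds from the ratio of
            #    slab to spike normal densities at the current beta_j.
            log_odds = (np.log(prior_inc / (1 - prior_inc)) + np.log(tau0 / tau1)
                        + 0.5 * beta**2 * (1 / tau0**2 - 1 / tau1**2))
            gamma = rng.random(p) < 1.0 / (1.0 + np.exp(-np.clip(log_odds, -30, 30)))
            if it >= burn:
                inclusion += gamma

        # Posterior inclusion probabilities; the causal variants come first.
        print(np.round(inclusion / (n_iter - burn), 2)[:5])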

  6. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful. PMID:11051070
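
    At its core, a multiattribute utility theory analysis of this kind scores each protective strategy with a weighted sum of single-attribute utilities. The sketch below is only a schematic of that aggregation step; the attributes, weights, and utilities are invented, whereas in the reported exercise they were elicited from the concerned parties with a facilitator:

        # Invented attribute weights (must sum to 1 in an additive model).
        weights = {"averted_dose": 0.5, "cost": 0.2, "social_disruption": 0.3}

        # Invented single-attribute utilities on a 0-1 scale per strategy.
        strategies = {
            "do_nothing": {"averted_dose": 0.0, "cost": 1.0, "social_disruption": 1.0},
            "sheltering": {"averted_dose": 0.6, "cost": 0.8, "social_disruption": 0.7},
            "evacuation": {"averted_dose": 0.9, "cost": 0.3, "social_disruption": 0.2},
        }

        def overall_utility(u):
            """Additive multiattribute utility of one strategy."""
            return sum(weights[a] * u[a] for a in weights)

        for name, u in sorted(strategies.items(),
                              key=lambda kv: -overall_utility(kv[1])):
            print(f"{name}: {overall_utility(u):.2f}")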

  7. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  8. GROWING NEED FOR RISK ANALYSIS

    EPA Science Inventory

    Risk analysis has been increasingly receiving attention in making environmental decisions. For example, in its May 18, 1993 Combustion Strategy announcement, EPA required that any issuance of a new hazardous waste combustion permit be preceded by the performance of a complete (dir...

  9. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data. PMID:20002893
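
    The defender-attacker-defender structure can be shown with a toy enumeration: the defender invests first, the intelligent adversary observes the investment and picks the most damaging attack, and the defender chooses the investment that minimizes the consequence of that best response. The options and consequence numbers below are notional, and the toy omits the post-attack defender stage and the uncertainty nodes of the full model:

        # Notional expected consequences for each (defense, attack) pair.
        defenses = ["baseline", "detectors", "stockpile"]
        attacks = ["agent_A", "agent_B"]
        consequence = {
            ("baseline", "agent_A"): 100, ("baseline", "agent_B"): 80,
            ("detectors", "agent_A"): 40,  ("detectors", "agent_B"): 70,
            ("stockpile", "agent_A"): 60,  ("stockpile", "agent_B"): 30,
        }

        def attacker_best_response(defense):
            """The adversary observes the defense and maximizes consequence."""
            return max(attacks, key=lambda a: consequence[(defense, a)])

        # The defender minimizes the consequence of the attacker's response.
        best = min(defenses,
                   key=lambda d: consequence[(d, attacker_best_response(d))])
        attack = attacker_best_response(best)
        print(best, attack, consequence[(best, attack)])   # stockpile agent_A 60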

  10. Risk analysis of spent fuel transportation

    SciTech Connect

    Not Available

    1986-01-01

    This book discusses the kinds of judgments that must go into a technical analysis of risk and moves on to the sociopolitical aspects of risk analysis, where the same set of facts can be honestly but differently interpreted. It also outlines the options available in risk management and reviews the courts' involvement with risk analysis.

  11. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  12. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-Ion Cells

    SciTech Connect

    Xia, Yuzhi; Li, Dr. Tianlei; Ren, Prof. Fei; Gao, Yanfei; Wang, Hsin

    2014-01-01

    Recently a pinch-torsion test was developed for safety testing of Li-ion batteries (Ren et al., J. Power Sources, 2013). It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is discovered that the addition of the torsion component significantly increases the maximum principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly less than in the pure pinch test, dramatically improving the applicability of this method to ultra-thick batteries, which otherwise require loads in excess of machine capability. It is further found that separator failure is achieved in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum principal strain is also examined.

  13. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

    In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. The methodology is flexible in that it can incorporate both semiparametric methods using the Cox proportional hazards model and parametric competing risk models. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development on a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300

  14. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes in order to make institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  15. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  16. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  17. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2014-07-01 2014-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  18. Draft Waste Management Programmatic Environmental Impact Statement for managing treatment, storage, and disposal of radioactive and hazardous waste. Volume 3, Appendix A: Public response to revised NOI, Appendix B: Environmental restoration, Appendix C, Environmental impact analysis methods, Appendix D, Risk

    SciTech Connect

    1995-08-01

    Volume three contains appendices for the following: Public comments on DOE's proposed revisions to the scope of the waste management programmatic environmental impact statement; Environmental restoration sensitivity analysis; Environmental impact analysis methods; and Waste management facility human health risk estimates.

  19. Comparison of Intake Gate Closure Methods at Lower Granite, Little Goose, Lower Monumental, and McNary Dams Using Risk-Based Analysis

    SciTech Connect

    Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.; Mara, Neil L.; Phan, Hahn K.; Bardy, David M.; Hollenbeck, Robert E.

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.

  1. Association analysis of the dopamine D2 receptor gene in Tourette's syndrome using the haplotype relative risk method

    SciTech Connect

    Noethen, M.M.; Cichon, S.; Propping, P.

    1994-09-15

    Comings et al. have recently reported a highly significant association between Tourette's syndrome (TS) and a restriction fragment length polymorphism (RFLP) of the dopamine D2 receptor gene (DRD2) locus. The A1 allele of the DRD2 Taq I RFLP was present in 45% of the Tourette patients compared with 25% of controls. We tried to replicate this finding by using the haplotype relative risk (HRR) method for association analysis. This method overcomes a major problem of conventional case-control studies, where undetected ethnic differences between patients and controls may result in a false-positive finding, by using parental alleles not inherited by the proband as control alleles. Sixty-one nuclear families encompassing an affected child and parents were typed for the DRD2 Taq I polymorphism. No significant differences in DRD2 A1 allele frequency were observed between TS probands, sub-populations of probands classified according to tic severity, or parental control alleles. Our data do not support the hypothesis that the DRD2 locus may act as a modifying gene in the expression of the disorder in TS probands. 40 refs., 1 tab.
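
    The HRR comparison lends itself to a very small sketch: the proband (transmitted) alleles are compared against the parental alleles that were not transmitted, which serve as internal controls. The allele counts below are invented, not the study's data:

        def haplotype_relative_risk(a1_trans, other_trans, a1_untrans, other_untrans):
            """HRR as the odds ratio of the transmitted vs. non-transmitted
            2x2 allele table (probands vs. internal parental controls)."""
            freq_probands = a1_trans / (a1_trans + other_trans)
            freq_controls = a1_untrans / (a1_untrans + other_untrans)
            hrr = (a1_trans * other_untrans) / (other_trans * a1_untrans)
            return freq_probands, freq_controls, hrr

        # Invented counts for 61 trios (122 transmitted, 122 non-transmitted).
        p1, p0, hrr = haplotype_relative_risk(30, 92, 28, 94)
        print(f"A1 frequency: probands {p1:.2f}, controls {p0:.2f}, HRR {hrr:.2f}")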

  2. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of these arenas currently being investigated involves collaborative studies that illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU than in the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper addresses the relative progress of new risk assessment methodologies, such as fuzzy logic, within the framework of current risk assessment methods at these two institutes.

  3. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1-Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumers' risks; the development and validation of new modelling approaches; the harmonization of risk assessment principles, methodologies and terminology; the standardization of methods and systems used for the safety evaluation of transgenic food; the provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; and the development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; to the development of a communication platform for GM organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; the development and validation of new methods for safety testing of transgenic food; the evaluation of the safety and efficacy of iron supplementation in pregnant women; and the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here. PMID:11761126

  4. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    A review of the literature shows several methods for the risk assessment of biomechanical overload of the musculoskeletal system in activities with repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk and the verification of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors that must be taken into account in occupational medicine to implement a process of validation of these methods. In conclusion, we believe that new methods able to analyze and reduce the risk already in the design phase of the production process will be necessary in the future. PMID:25558718

  5. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based on the materials behavior level where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that the method is mature and that it can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
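
    The core computation described in the abstract, propagating scatter in primitive variables up to a structural reliability, can be sketched with plain Monte Carlo. The distributions below are invented stand-ins for a material strength and an applied stress, not the report's models; they are chosen so the failure probability comes out near the 0.0001 order of magnitude quoted:

        import numpy as np

        # Monte Carlo propagation of primitive-variable scatter to risk:
        # estimate P(failure) = P(stress > strength). The distributions are
        # invented for illustration (units: MPa).
        rng = np.random.default_rng(1)
        n = 1_000_000
        strength = rng.normal(loc=450.0, scale=25.0, size=n)
        stress = rng.lognormal(mean=np.log(320.0), sigma=0.08, size=n)

        p_failure = np.mean(stress > strength)
        print(f"estimated failure probability: {p_failure:.1e}")   # ~1e-4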

  6. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  7. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  9. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land-use/land-cover) differ from those of natural terrain, altered interactions between the surface and the atmosphere (especially the urban boundary layer, UBL) or between the surface and subsurface can affect hydrologic behavior and hence flood hazards. In this research we focus on three main aspects of urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be mathematically described in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these somewhat disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network while scrutinizing energy exchange across locations. We can extend our understanding not only of the influence of cities on local climate in rural areas or at larger scales, but also of the interaction of cities and nature affecting each other.

  10. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargetting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that the system component failure rate is the limiting factor in attaining high mission reliability. It is also shown, however, that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
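
    A stripped-down version of such a multi-stage failure simulation is sketched below. Thruster counts, burn durations, failure rate, and redundancy requirements are all invented; the actual procedure also modeled retargetting after failures, which this toy omits:

        import numpy as np

        # Toy multi-stage failure simulation: each thruster can fail during
        # each burn; the mission succeeds if enough thrusters survive every
        # stage. All parameters are illustrative.
        rng = np.random.default_rng(5)
        n_missions = 100_000
        n_thrusters, n_needed = 8, 6
        burn_hours = np.array([2000.0, 1500.0, 2500.0])   # three mission stages
        failure_rate = 2e-5                               # per thruster-hour

        p_fail_stage = 1.0 - np.exp(-failure_rate * burn_hours)
        survivors = np.full(n_missions, n_thrusters)
        for p in p_fail_stage:
            survivors -= rng.binomial(survivors, p)       # failures this stage
        print(f"mission reliability: {np.mean(survivors >= n_needed):.3f}")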

  11. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to the time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess the risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power law dependence on frequency over a selected frequency range, such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure fluctuations in the intervals are used to assess the risk of an adverse clinical event.
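
    The described computation, a power spectrum of the interbeat-interval series with a log-log power-law fit over 10^-4 to 10^-2 Hz, is easy to sketch. The series below is synthetic 1/f-like noise rather than patient data, and an evenly resampled representation of the RR intervals is assumed:

        import numpy as np

        # Synthesize a ~24 h, 0.5 Hz interval series with a 1/f spectrum by
        # shaping white noise in the frequency domain (stand-in for data).
        rng = np.random.default_rng(7)
        fs = 0.5
        n = int(24 * 3600 * fs)
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        shape = np.zeros_like(freqs)
        shape[1:] = freqs[1:] ** -0.5
        spectrum = shape * (rng.normal(size=freqs.size)
                            + 1j * rng.normal(size=freqs.size))
        rr = np.fft.irfft(spectrum, n)

        # Power spectrum and power-law fit over the selected band.
        power = np.abs(np.fft.rfft(rr)) ** 2 / n
        band = (freqs >= 1e-4) & (freqs <= 1e-2)
        slope, intercept = np.polyfit(np.log10(freqs[band]),
                                      np.log10(power[band]), 1)
        print(f"fitted spectral exponent: {slope:.2f}")   # ~ -1 for 1/f noise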

  12. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects--level of detail, and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though often we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In Section 2 we provide context for this report

  13. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  14. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
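
    A minimal version of the Monte Carlo side of such a spreadsheet exercise is sketched below in Python rather than Excel. All input distributions and the discount rate are invented for illustration:

        import numpy as np

        # Monte Carlo risk analysis of an investment: sample uncertain inputs,
        # compute NPV for each draw, and report the downside probability.
        rng = np.random.default_rng(3)
        n = 100_000
        capital = rng.triangular(9e6, 10e6, 12e6, size=n)   # initial cost ($)
        cash = rng.normal(2.0e6, 0.8e6, size=(10, n))       # annual net cash flow
        rate = 0.10                                         # discount rate

        years = np.arange(1, 11).reshape(-1, 1)
        npv = (cash / (1 + rate) ** years).sum(axis=0) - capital

        print(f"mean NPV: ${npv.mean():,.0f}")
        print(f"P(NPV < 0): {np.mean(npv < 0):.1%}")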

  15. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  16. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply as the multifunctional representation. The structural component is modeled by finite elements. The solution for the structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also about 0.0001.

  18. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail has become one of the necessities of modern life for correspondence between individuals. Given this, it is important that e-mail messages, servers, and clients, and the correspondence exchanged between different people, have acceptable security, so that people can use this technology with confidence. In the information age, many financial and non-financial transactions are performed electronically and data are exchanged via the internet, so theft and manipulation of data can impose exorbitant costs in terms of integrity as well as in financial, political, economic, and cultural terms. E-mail correspondence is no exception, and its security is very important. Our review found that no method focusing on risk assessment of e-mail systems has been provided. We therefore examine assessment methods for other systems, along with their strengths and weaknesses, and then apply Convery's method for assessing network risks to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  19. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative risk assessment following standard API 581 places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk evaluation was carried out with the aim of reducing risk by optimizing the risk assessment activities.
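
    A semi-quantitative rating such as "4C" combines a probability-of-failure category (1-5) with a consequence-of-failure category (A-E) on a risk matrix. The sketch below mirrors that style only; the banding rule is invented and is not the actual API 581 risk matrix:

        def risk_rating(pof: int, cof: str) -> str:
            """Combine PoF category 1-5 and CoF category A-E into a risk band
            (illustrative banding, not the API 581 tables)."""
            cell = f"{pof}{cof}"
            score = pof + (ord(cof) - ord("A") + 1)
            if score >= 9:
                band = "high"
            elif score >= 7:
                band = "medium high"
            elif score >= 5:
                band = "medium"
            else:
                band = "low"
            return f"{cell}: {band}"

        for item, (pof, cof) in {"HP superheater": (4, "C"),
                                 "HP evaporator": (4, "C"),
                                 "HP economizer": (3, "C")}.items():
            print(item, "->", risk_rating(pof, cof))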

  20. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent 7.0 M earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. In particular in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability correlates with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype for a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map, and only contemplates resident population from the census as a proxy for human exposure. In the present work we map and analyze the spatio-temporal distribution of

  1. Barriers to uptake among high-risk individuals declining participation in lung cancer screening: a mixed methods analysis of the UK Lung Cancer Screening (UKLS) trial

    PubMed Central

    Ali, Noor; Lifford, Kate J; Carter, Ben; McRonald, Fiona; Yadegarfar, Ghasem; Baldwin, David R; Weller, David; Hansell, David M; Duffy, Stephen W; Field, John K; Brain, Kate

    2015-01-01

    Objective The current study aimed to identify the barriers to participation among high-risk individuals in the UK Lung Cancer Screening (UKLS) pilot trial. Setting The UKLS pilot trial is a randomised controlled trial of low-dose CT (LDCT) screening that has recruited high-risk people using a population approach in the Cambridge and Liverpool areas. Participants High-risk individuals aged 50–75 years were invited to participate in UKLS. Individuals were excluded if an LDCT scan had been performed within the last year, if they were unable to provide consent, or if LDCT screening could not be carried out because of coexisting comorbidities. Outcome measures Statistical associations between individual characteristics and UKLS uptake were examined using multivariable regression modelling. In those who completed a non-participation questionnaire (NPQ), thematic analysis of free-text data was undertaken to identify reasons for not taking part, with subsequent exploratory linkage of key themes to risk factors for non-uptake. Results Comparative data were available from 4061 high-risk individuals who consented to participate in the trial and 2756 who declined participation. Of those declining participation, 748 (27.1%) completed an NPQ. Factors associated with non-uptake included: female gender (OR=0.64, p<0.001), older age (OR=0.73, p<0.001), current smoking (OR=0.70, p<0.001), lower socioeconomic group (OR=0.56, p<0.001) and higher affective risk perception (OR=0.52, p<0.001). Among non-participants who provided a reason, two main themes emerged reflecting practical and emotional barriers. Smokers were more likely to report emotional barriers to participation. Conclusions A profile of risk factors for non-participation in lung screening has emerged, with underlying reasons largely relating to practical and emotional barriers. Strategies for engaging high-risk, hard-to-reach groups are critical for the equitable uptake of a potential future lung cancer screening programme.

  2. Landsafe: Landing Site Risk Analysis Software Framework

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Bostelmann, J.; Cornet, Y.; Heipke, C.; Philippe, C.; Poncelet, N.; de Rosa, D.; Vandeloise, Y.

    2012-08-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high-resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high-resolution DTM is showcased.
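
    As an illustration of the kind of product such a framework derives, the sketch below computes a slope map from a synthetic DTM and thresholds it into a landing-hazard mask. The 10-degree limit and the terrain are assumptions for the sketch, not LandSAfe parameters.

```python
import numpy as np

# Minimal sketch: derive a slope map from a DTM by finite differences and
# threshold it into a binary landing-hazard mask. The 10-degree limit and the
# synthetic DTM are assumptions for illustration only.

def slope_deg(dtm: np.ndarray, cell_size: float) -> np.ndarray:
    """Slope in degrees from a regular-grid DTM (elevations in metres)."""
    dz_dy, dz_dx = np.gradient(dtm, cell_size)
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

rng = np.random.default_rng(0)
dtm = np.cumsum(rng.normal(0, 0.2, (50, 50)), axis=0)  # synthetic rough terrain
slopes = slope_deg(dtm, cell_size=1.0)
hazard_mask = slopes > 10.0   # cells too steep for a safe touchdown
print(f"{hazard_mask.mean():.1%} of cells flagged as slope hazards")
```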

  3. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1990 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
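
    The tail measure described here is straightforward to compute from simulated or historical returns. A minimal sketch, assuming the partition point is placed at the 5th percentile of the return distribution (the PMRM leaves this partitioning choice to the analyst):

```python
import numpy as np

# Sketch of the extreme-risk measure f4 described above: the conditional
# expectation of portfolio return given that it falls in the lower tail.
# The 5th-percentile partition point and the simulated returns are assumptions.

def f4(returns: np.ndarray, tail_prob: float = 0.05) -> float:
    """E[R | R <= lower-tail partition point]."""
    threshold = np.quantile(returns, tail_prob)
    return returns[returns <= threshold].mean()

rng = np.random.default_rng(1)
simulated_returns = rng.normal(0.08, 0.20, 100_000)  # hypothetical annual returns
print(f"expected return: {simulated_returns.mean():+.3f}")
print(f"f4 (lower 5% conditional mean): {f4(simulated_returns):+.3f}")
```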

  4. Novel methods to evaluate fracture risk models

    PubMed Central

    Donaldson, M.G.; Cawthon, P. M.; Schousboe, J.T.; Ensrud, K.E.; Lui, L.Y.; Cauley, J.A.; Hillier, T.A.; Taylor, B.C.; Hochberg, M.C.; Bauer, D.C.; Cummings, S.R.

    2013-01-01

    Fracture prediction models help identify individuals at high risk who may benefit from treatment. Area Under the Curve (AUC) is used to compare prediction models. However, the AUC has limitations and may miss important differences between models. Novel reclassification methods quantify how accurately models classify patients who benefit from treatment and the proportion of patients above/below treatment thresholds. We applied two reclassification methods, using the NOF treatment thresholds, to compare two risk models: femoral neck BMD and age (“simple model”) and FRAX (“FRAX model”). The Pepe method classifies based on case/non-case status and examines the proportion of each above and below thresholds. The Cook method examines fracture rates above and below thresholds. We applied these to the Study of Osteoporotic Fractures. There were 6036 (1037 fractures) and 6232 (389 fractures) participants with complete data for major osteoporotic and hip fracture, respectively. Both models had similar AUCs for major osteoporotic fracture (0.68 vs. 0.69) and hip fracture (0.75 vs. 0.76). In contrast, using reclassification methods, each model classified a substantial number of women differently. Using the Pepe method, the FRAX model (vs. simple model) missed treating 70 (7%) cases of major osteoporotic fracture but avoided treating 285 (6%) non-cases. For hip fracture, the FRAX model missed treating 31 (8%) cases but avoided treating 1026 (18%) non-cases. The Cook method (both models, both fracture outcomes) had similar fracture rates above/below the treatment thresholds. Compared with the AUC, new methods provide more detailed information about how models classify patients. PMID:21351143
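
    A Pepe-style tabulation reduces to counting cases and non-cases that cross the treatment threshold when moving from one model to the other. The predicted risks, threshold, and outcomes below are simulated stand-ins, not Study of Osteoporotic Fractures data.

```python
import numpy as np

# Sketch of the case/non-case reclassification tabulation (Pepe-style).
# Inputs are randomly generated stand-ins: predicted risks from two models
# plus observed fracture status, with a single treatment threshold.

rng = np.random.default_rng(2)
n = 6000
fracture = rng.random(n) < 0.15                  # observed case status (random here)
risk_simple = np.clip(rng.normal(0.12, 0.08, n), 0, 1)
risk_frax   = np.clip(risk_simple + rng.normal(0, 0.04, n), 0, 1)
threshold = 0.20                                 # treat if predicted risk >= 20%

treat_simple = risk_simple >= threshold
treat_frax   = risk_frax   >= threshold

# Cases the second model would miss treating relative to the first,
# and non-cases it would avoid treating:
missed_cases     = np.sum(fracture  & treat_simple & ~treat_frax)
avoided_noncases = np.sum(~fracture & treat_simple & ~treat_frax)
print(f"cases reclassified below threshold: {missed_cases}")
print(f"non-cases reclassified below threshold: {avoided_noncases}")
```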

  5. RISK ANALYSIS: CASE HISTORY OF PUCCINIA JACEAE ON YELLOW STARTHISTLE

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Risk analysis has five components: Risk awareness, Risk perception, Risk assessment, Risk management, and Risk communication. Using the case with the foreign plant pathogen, Puccinia jaceae, under evaluation for biological control of yellow starthistle (Centaurea solstitialis, YST), approaches and...

  6. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2005-10-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, nuclear forensics and environmental monitoring. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for validating the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  7. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2006-02-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, environmental monitoring, and verification of treaties and agreements. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for verifying the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  8. [Risk sharing methods in middle income countries].

    PubMed

    Inotai, András; Kaló, Zoltán

    2012-01-01

    The pricing strategy of innovative medicines is based on the therapeutic value in the largest pharmaceutical markets. The cost-effectiveness of new medicines with value based ex-factory price is justifiable. Due to the international price referencing and parallel trade the ex-factory price corridor of new medicines has been narrowed in recent years. Middle income countries have less negotiation power to change the narrow drug pricing corridor, although their fair intention is to buy pharmaceuticals at lower price from their scarce public resources compared to higher income countries. Therefore the reimbursement of new medicines at prices of Western-European countries may not be justifiable in Central-Eastern European countries. Confidential pricing agreements (i.e. confidential price discounts, claw-back or rebate) in lower income countries of the European Union can alleviate this problem, as prices of new medicines can be adjusted to local purchasing power without influencing the published ex-factory price and so the accessibility of patients to these drugs in other countries. In order to control the drug budget payers tend to apply financial risk sharing agreements for new medicines in more and more countries to shift the consequences of potential overspending to pharmaceutical manufacturers. The major paradox of financial risk-sharing schemes is that increased mortality, poor persistence of patients, reduced access to healthcare providers, and no treatment reduce pharmaceutical spending. Consequently, payers have started to apply outcome based risk sharing agreements for new medicines recently to improve the quality of health care provision. Our paper aims to review and assess the published financial and outcome based risk sharing methods. Introduction of outcome based risk-sharing schemes can be a major advancement in the drug reimbursement strategy of payers in middle income countries. These schemes can help to reduce the medical uncertainty in coverage

  9. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  10. A comparison of radiological risk assessment methods for environmental restoration

    SciTech Connect

    Dunning, D.E. Jr.; Peterson, J.M.

    1993-09-01

    Evaluation of risks to human health from exposure to ionizing radiation at radioactively contaminated sites is an integral part of the decision-making process for determining the need for remediation and selecting remedial actions that may be required. At sites regulated under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a target risk range of 10⁻⁴ to 10⁻⁶ incremental cancer incidence over a lifetime is specified by the US Environmental Protection Agency (EPA) as generally acceptable, based on the reasonable maximum exposure to any individual under current and future land use scenarios. Two primary methods currently being used in conducting radiological risk assessments at CERCLA sites are compared in this analysis. Under the first method, the radiation dose equivalent (i.e., Sv or rem) to the receptors of interest over the appropriate period of exposure is estimated and multiplied by a risk factor (cancer risk/Sv). Alternatively, incremental cancer risk can be estimated by combining the EPA's cancer slope factors (previously termed potency factors) for radionuclides with estimates of radionuclide intake by ingestion and inhalation, as well as radionuclide concentrations in soil that contribute to external dose. The comparison of the two methods has demonstrated that resulting estimates of lifetime incremental cancer risk under these different methods may differ significantly, even when all other exposure assumptions are held constant, with the magnitude of the discrepancy depending upon the dominant radionuclides and exposure pathways for the site. The basis for these discrepancies, the advantages and disadvantages of each method, and the significance of the discrepant results for environmental restoration decisions are presented.
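
    The two calculation routes can be contrasted in a few lines. Every coefficient below is a placeholder chosen for illustration; actual dose conversion factors and slope factors are radionuclide- and pathway-specific, which is exactly why the two routes can disagree.

```python
# Sketch of the two calculation routes compared above, with made-up numbers.
# Route 1: dose equivalent times a nominal risk factor.
# Route 2: slope factors applied to intakes (ingestion/inhalation) plus an
# external-exposure term from soil concentration. All values are placeholders,
# so the size of the gap between the two results is an artifact of the sketch.

dose_sv = 0.02                       # lifetime committed dose equivalent (Sv)
risk_factor = 5e-2                   # cancer risk per Sv (nominal)
risk_method1 = dose_sv * risk_factor

intake_bq = {"ingestion": 4.0e3, "inhalation": 1.5e2}   # lifetime intakes (Bq)
slope = {"ingestion": 2.0e-9, "inhalation": 1.1e-8}     # risk per Bq (placeholder)
soil_bq_per_g, external_slope = 0.5, 3.0e-10            # risk per (Bq/g)-year
years = 30

risk_method2 = (sum(intake_bq[p] * slope[p] for p in intake_bq)
                + soil_bq_per_g * external_slope * years)

print(f"method 1 (dose x risk factor): {risk_method1:.2e}")
print(f"method 2 (slope factors):      {risk_method2:.2e}")
```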

  11. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  12. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  13. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2012-07-01 2012-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  14. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2013-07-01 2013-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  15. RISK ASSESSMENT FOR BENEFITS ANALYSIS

    EPA Science Inventory

    Among the important types of information considered in decision making at the U.S. Environmental Protection Agency (EPA) are the outputs of risk assessments and benefit-cost analyses. Risk assessments present estimates of the adverse consequences of exposure to environmental poll...

  16. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  17. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  18. Nuclear risk analysis of the Ulysses mission

    SciTech Connect

    Bartram, B.W.; Vaughan, F.R.; Englehart, D.R.W.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
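
    A minimal sketch of this Monte Carlo combination, assuming a simple multiplicative relationship among three lognormally distributed factors (the real analysis used mission-specific factors, distributions, and functional relationships):

```python
import numpy as np

# Minimal Monte Carlo sketch of the consequence-propagation idea: sample each
# contributing factor from its distribution, combine them through an assumed
# functional relationship, and report the CCDF of the result. Distributions
# and the multiplicative form are illustrative assumptions.

rng = np.random.default_rng(3)
n = 200_000
source_term = rng.lognormal(mean=np.log(1e3), sigma=1.0, size=n)   # released activity
dispersion  = rng.lognormal(mean=np.log(1e-6), sigma=0.5, size=n)  # dilution factor
dose_factor = rng.lognormal(mean=np.log(5e-2), sigma=0.3, size=n)  # effects per unit

consequences = source_term * dispersion * dose_factor  # calculated health effects

levels = np.logspace(-6, 0, 7)
ccdf = [(consequences >= c).mean() for c in levels]
for c, p in zip(levels, ccdf):
    print(f"P(consequence >= {c:.0e}) = {p:.4f}")
```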

  19. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  20. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are global physical hazards with negative environmental and socio-economic impacts on local and regional scales. The technological evolution of recent decades, especially in the field of geoinformatics, has brought new advantages to hydrological modelling. This study uses this technology to quantify flood risk. The study area is an ungauged catchment; using mainly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model, a series of outcomes was produced. More specifically, this paper examines the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was carried out on a Digital Elevation Model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to flooding. Unit Hydrographs are, as is known, useful when there is a lack of data, and in this work, based on the time-area method, a sequence of flood risk assessments was made using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
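
    The time-area construction at the heart of such a GIS-based unit hydrograph is a discrete convolution: the fraction of basin area draining to the outlet in each time step is convolved with the effective-rainfall series. The histogram, storm, and basin area below are hypothetical, not Kladeos values.

```python
import numpy as np

# Sketch of the time-area unit hydrograph: convolve the basin's time-area
# histogram (fraction of area reaching the outlet per time step, from GIS
# travel-time analysis) with effective rainfall to get direct runoff.
# All numbers are invented for illustration.

area_km2 = 120.0
time_area = np.array([0.05, 0.20, 0.35, 0.25, 0.10, 0.05])  # per 1-h step, sums to 1
rain_eff_mm = np.array([0.0, 4.0, 10.0, 6.0, 1.0])          # effective rainfall per hour

# mm over km^2 per hour -> m^3/s  (1 mm on 1 km^2 = 1000 m^3; 1 h = 3600 s)
runoff = np.convolve(rain_eff_mm, time_area) * area_km2 * 1000.0 / 3600.0
for t, q in enumerate(runoff):
    print(f"t = {t:2d} h   Q = {q:7.2f} m^3/s")
```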

  1. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S.

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other hand. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  2. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.

  3. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to request design revisions or contingency plans in a timely manner, before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  4. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico.

    PubMed

    Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L

    2011-05-01

    In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875

  5. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  6. DETERMINING SIGNIFICANT ENDPOINTS FOR ECOLOGICAL RISK ANALYSIS

    EPA Science Inventory

    Risk analyses, both human health and ecological, will be important factors in determining which DOE sites should be cleaned up and in deciding if acceptable performance standards have been met. Risk analysis procedures for humans use the individual as the 'unit' of observation, a...

  7. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  8. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  9. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
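
    The surrogate idea can be demonstrated end to end with a made-up limit state: fit a second-degree polynomial to a handful of "computer analysis" runs near the design point, then integrate the failure probability on the cheap surrogate. Plain Monte Carlo stands in below for the Rackwitz-Fiessler fast probability integration used in the study; the limit-state function, design point, and input distributions are all invented.

```python
import numpy as np

# Sketch: fit a quadratic surrogate to an (assumed expensive) limit-state
# function g(X), then estimate P(g < 0) on the surrogate. Monte Carlo is
# substituted here for the fast probability integration of the paper.

def g_expensive(x1, x2):            # stand-in for the local strain analysis
    return 3.0 - x1**2 / 4.0 - x2   # failure when g < 0

rng = np.random.default_rng(4)
# Sampling points around the assumed design point (1.0, 1.0):
pts = rng.normal([1.0, 1.0], 0.5, size=(30, 2))
y = g_expensive(pts[:, 0], pts[:, 1])

# Least-squares fit of g ~ full quadratic in (x1, x2):
x1, x2 = pts[:, 0], pts[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Failure probability from the surrogate under assumed input distributions:
xs = rng.normal([1.0, 1.0], [0.4, 0.6], size=(500_000, 2))
B = np.column_stack([np.ones(len(xs)), xs[:, 0], xs[:, 1],
                     xs[:, 0]**2, xs[:, 1]**2, xs[:, 0] * xs[:, 1]])
pf = np.mean(B @ coef < 0.0)
print(f"estimated probability of failure: {pf:.4f}")
```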

  10. Risk analysis in bioequivalence and biowaiver decisions.

    PubMed

    Kubbinga, Marlies; Langguth, Peter; Barends, Dirk

    2013-07-01

    This article evaluates the current biowaiver guidance documents published by the FDA, EU and WHO from a risk-based perspective. The authors introduce the use of a Failure Mode and Effect Analysis (FMEA) risk calculation tool to show that current regulatory documents implicitly limit the risk of bioinequivalence after granting a biowaiver by reducing the incidence, improving the detection, and limiting the severity of any unforeseen bioinequivalent product. In addition, the authors use the risk calculation to expose as yet unexplored options for future extension of comparative in vitro tools for biowaivers. PMID:23280474
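
    The FMEA calculation the authors refer to is, at its core, a severity x occurrence x detection product per failure mode. The modes and 1-10 scores below are invented solely to show the mechanics, not taken from the article.

```python
# Minimal FMEA-style risk calculation of the kind described above: each
# failure mode gets 1-10 scores for severity, occurrence (incidence), and
# detection, and a risk priority number RPN = S x O x D. Modes and scores
# are hypothetical.

failure_modes = [
    # (description,                                  severity, occurrence, detection)
    ("bioinequivalence, narrow therapeutic index",   9, 3, 5),
    ("bioinequivalence, wide therapeutic index",     4, 3, 5),
    ("dissolution failure caught by in vitro test",  6, 4, 2),
]

# Rank failure modes by descending RPN:
for name, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN {s * o * d:4d}  <- {name}")
```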

  11. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...

  12. ANALYSIS OF LAMB MORTALITY USING COMPETING RISKS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A competing risks model was used to describe lamb mortality up to four weeks of age in a composite sheep flock with 8,642 lamb records. Discrete survival methods were applied using sire and animal models. The results indicated that substantial variation exists in the risk of lambs dying from diffe...

  13. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-05-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for lowest level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bar and featured risk is made.

  14. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for lowest level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bar and featured risk is made.

  15. Risk based requirements for long term stewardship: A proof-of-principle analysis of an analytic method tested on selected Hanford locations

    SciTech Connect

    Jarvis, T.T.; Andrews, W.B.; Buck, J.W.

    1998-03-01

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  16. Risk Based Requirements for Long Term Stewardship: A Proof-of-Principle Analysis of an Analytic Method Tested on Selected Hanford Locations

    SciTech Connect

    GM Gelston; JW Buck; LR Huesties; MS Peffers; TB Miley; TT Jarvis; WB Andrews

    1998-12-03

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  17. Occupational safety and HIV risk among female sex workers in China: A mixed-methods analysis of sex-work harms and mommies

    PubMed Central

    Yi, Huso; Zheng, Tiantian; Wan, Yanhai; Mantell, Joanne E.; Park, Minah; Csete, Joanne

    2013-01-01

    Female sex workers (FSWs) in China are exposed to multiple work-related harms that increase HIV vulnerability. Using mixed methods, we explored the social-ecological aspects of sexual risk among 348 FSWs in Beijing. Sex-work harms were assessed by property stolen, being underpaid or not paid at all, verbal and sexual abuse, forced drinking, and forced sex more than once. The majority (90%) reported at least one type of harm, 38% received harm protection from ‘mommies’ (i.e., managers) and 32% reported unprotected sex with clients. In multivariate models, unprotected sex was significantly associated with longer involvement in sex work, greater exposure to harms, and no protection from mommies. Mommies’ protection moderated the effect of sex-work harms on unprotected sex with clients. Our ethnography indicated that mommies played a core role in sex-work networks. Such networks provide a basis for social capital; they are not only profitable economically, but also protect FSWs from sex-work harms. Effective HIV prevention interventions for FSWs in China must address the occupational safety and health of FSWs by facilitating social capital and protection agency (e.g., mommies) in the sex-work industry. PMID:22375698

  18. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  19. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with a premise that men and women are equal and should be treated equally. These frameworks place emphasis on equal distribution of resources between men and women and believe that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender in gender analysis; it is based on a perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued based on existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory for resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  20. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  1. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate risk analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry also faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and finally a fuzzy comprehensive evaluation method is presented. The method is not only scientifically sound in theory but also reliable in application, providing an effective means of real estate investment risk assessment and giving investors guidance on risk factors and forecasts.
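
    A minimal sketch of the fuzzy comprehensive evaluation step, assuming a weighted-average operator, four illustrative risk factors, and three grades; in practice the weights and the membership matrix would come from expert scoring rather than the made-up values below.

```python
import numpy as np

# Sketch of a fuzzy comprehensive evaluation: B = W . R, where W is the
# factor weight vector and R the membership matrix of each risk factor over
# the evaluation grades. Factors, weights, and memberships are illustrative.

factors = ["market risk", "policy risk", "financial risk", "construction risk"]
W = np.array([0.35, 0.20, 0.30, 0.15])           # factor weights, sum to 1

grades = ["low", "medium", "high"]
R = np.array([                                   # membership of factor in grade
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
    [0.1, 0.4, 0.5],
    [0.5, 0.3, 0.2],
])

B = W @ R            # weighted-average composition operator M(., +)
B /= B.sum()         # normalise (already ~1 with these operators)
verdict = grades[int(np.argmax(B))]
print(dict(zip(grades, np.round(B, 3))), "->", verdict)
```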

  2. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  3. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).

  4. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces. PMID:21571428
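
    The basic element described, attaching a pdf to a measurement sequence and reading off an exceedance (risk) probability, can be sketched as follows. The lognormal choice, the synthetic (40)K data, and the 400 Bq/kg level are assumptions for illustration, not values from the paper.

```python
import numpy as np
from math import erf, sqrt

# Sketch: fit a lognormal pdf to a sequence of (40)K activity measurements
# and compute the probability of exceeding an assumed action level.

rng = np.random.default_rng(5)
measurements = rng.lognormal(mean=np.log(250), sigma=0.4, size=120)  # Bq/kg, synthetic

# Fit lognormal parameters from the log-transformed data:
mu = np.log(measurements).mean()
sigma = np.log(measurements).std(ddof=1)

def lognorm_exceedance(level: float) -> float:
    """P(X > level) for the fitted lognormal, via the standard normal CDF."""
    z = (np.log(level) - mu) / sigma
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

print(f"fitted mu={mu:.2f}, sigma={sigma:.2f}")
print(f"P(activity > 400 Bq/kg) = {lognorm_exceedance(400.0):.3f}")
```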

  5. Risk-based methods applicable to ranking conceptual designs

    SciTech Connect

    Breeding, R.J.; Ortiz, K.; Ringland, J.T.; Lim, J.J.

    1993-11-01

    In Genichi Taguchi's latest book on quality engineering, an emphasis is placed on robust design processes in which quality engineering techniques are brought "upstream," that is, they are utilized as early as possible, preferably in the conceptual design stage. This approach was used in a study of possible future safety system designs for weapons. As an experiment, a method was developed for using probabilistic risk analysis (PRA) techniques to rank conceptual designs for performance against a safety metric for ultimate incorporation into a Pugh matrix evaluation. This represents a high-level UW application of PRA methods to weapons. As with most conceptual designs, details of the implementation were not yet developed; many of the components had never been built, let alone tested. Therefore, our application of risk assessment methods was forced to be at such a high level that the entire evaluation could be performed on a spreadsheet. Nonetheless, the method produced numerical estimates of safety in a manner that was consistent, reproducible, and scrutable. The results enabled us to rank designs to identify areas where returns on research efforts would be the greatest. The numerical estimates were calibrated against what is achievable by current weapon safety systems. The use of expert judgement is inescapable, but these judgements are explicit and the method is easily implemented in a spreadsheet computer program.

  6. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine is subject to debate, suffers from a number of limitations, and is currently still under development. This state of the art review paper will discuss the various aspects and approaches to benefit-risk assessment in medicine in a chronological pathway. The review will discuss all types of benefit-risk assessment a medicinal product will undergo during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review we conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvements not only in methodology but also in cooperation between organizations can strengthen benefit-risk assessment. PMID:21683115

  7. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and a high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to the OSR-97B map (return period T=1,000 years), the big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: percent of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in

  8. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinse, dried, and weighed) and redox reactions (i.e., mineral is part of an oxidation-reduction reaction, and product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab®; for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  9. Risk analysis for critical asset protection.

    PubMed

    McGill, William L; Ayyub, Bilal M; Kaminskiy, Mark

    2007-10-01

    This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested. PMID:18076495
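
    One way to picture the adaptive threat-likelihood step is to let the relative likelihood of each attack profile track the adversary's expected utility, so that hardening one profile shifts preference toward the others. The profiles, payoffs, success probabilities, and softmax weighting below are modeling assumptions for the sketch, not the article's elicitation.

```python
import numpy as np

# Sketch of an adaptive threat-likelihood model: relative attack likelihoods
# follow the adversary's expected utility per profile, so a security
# investment against one profile pushes preference toward alternatives.
# All inputs and the softmax form are illustrative assumptions.

profiles  = ["intrusion", "standoff attack", "insider theft"]
payoff    = np.array([8.0, 5.0, 6.0])      # adversary-perceived gain if successful
p_success = np.array([0.30, 0.60, 0.40])   # before a new security investment

def attack_likelihoods(p_succ: np.ndarray, sharpness: float = 1.0) -> np.ndarray:
    eu = payoff * p_succ                   # expected utility per profile
    w = np.exp(sharpness * eu)             # softmax preference weighting
    return w / w.sum()

print("before:", dict(zip(profiles, attack_likelihoods(p_success).round(3))))
p_after = p_success * np.array([0.5, 1.0, 1.0])   # harden against intrusion
print("after: ", dict(zip(profiles, attack_likelihoods(p_after).round(3))))
```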

  10. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
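
    For a system M q'' + C q' + K q = 0, the eigenvalue criterion and a sampling wrapper around it fit in a few lines. The two-degree-of-freedom rotor with cross-coupled stiffness below is an invented example, and plain Monte Carlo stands in for the fast probability integration and adaptive importance sampling methods of the paper.

```python
import numpy as np

# Sketch of the probability-of-instability computation: sample the uncertain
# parameters, build the first-order state matrix A = [[0, I], [-M^-1 K, -M^-1 C]],
# and apply the eigenvalue criterion max Re(lambda) > 0. All distributions
# and the cross-coupled-stiffness model are illustrative assumptions.

rng = np.random.default_rng(6)

def unstable(m, c, k, kxy):
    M = np.diag([m, m])
    C = np.diag([c, c])
    K = np.array([[k, kxy], [-kxy, k]])  # cross-coupled stiffness (destabilising)
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    return np.linalg.eigvals(A).real.max() > 0.0

n = 20_000
hits = sum(unstable(rng.normal(1.0, 0.05),    # mass
                    rng.normal(0.08, 0.03),   # direct damping
                    rng.normal(4.0, 0.4),     # direct stiffness
                    rng.normal(0.10, 0.05))   # cross-coupling
           for _ in range(n))
print(f"estimated probability of instability: {hits / n:.4f}")
```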

  11. Risk determination method for accidental water basin contamination based on risk source coupling with sensitive targets.

    PubMed

    Li, Zongfeng; Zeng, Bo; Zhou, Tinggang; Li, Guowei; Zhu, Xiaobo

    2016-01-01

    Accidental water basin pollution seriously threatens human health and ecological security, but rapid, effective methods for evaluating this threat are lacking. This paper aims to develop a risk evaluation method for basin accidents by coupling the risk source with sensitive targets to evaluate the zone accident risk levels of basins and prevent the accidental environmental pollution of water. This method incorporates the interplay between risk sources and sensitive targets by evaluating the zone risk levels of water environments from different sources, effectiveness of the risk source control mechanisms, vulnerability of sensitive targets and spatial and temporal relationships between these sources and targets. Using the Three Gorges Reservoir region as an example, a risk system for water basin pollution incidents consisting of a risk indicator quantification system, a risk zoning method and a verification method for the zoning results is developed and implemented. The results were verified in a field investigation, which showed that the risk zoning model provides rapid, effective and reliable zoning results. This research method could serve as a theoretical reference and technological support for evaluating water basin accident risks. Furthermore, the results are useful for evaluating and protecting the aquatic environments in the Three Gorges Reservoir region. PMID:26207430

  12. NATO PILOT STUDY ON ADVANCED CANCER RISK ASSESSMENT METHODS

    EPA Science Inventory

    NCEA scientists are participating in a study of advanced cancer risk assessment methods, conducted under the auspices of NATO's Committee on the Challenges of Modern Society. The product will be a book of case studies that illustrate advanced cancer risk assessment methods, avail...

  13. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and to establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  14. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill-and-blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods associated with each step in the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst, an estimation method for assessing the potential consequences of rockburst, an evaluation method that uses a new quantitative index combining occurrence probability and consequences to determine the level of rockburst risk, and dynamic updating of the assessment. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified on cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total length of 11.6 km of D&B tunnels).

  15. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  16. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  17. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  18. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  19. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  20. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares four different risk analysis tools developed for confined spaces by applying them to three hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A); (2) a checklist with a risk scale (Tool B); (3) a risk calculation without a formal hazard identification stage (Tool C); and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350

  1. Method for Analyzing District Level IAI Data Bases to Identify Learning Opportunity Risks.

    ERIC Educational Resources Information Center

    Milazzo, Patricia; And Others

    A learning opportunity risk is defined as an absence of instruction or insufficient attention to proficiency at an early grade of instruction in a subject matter which will generate serious learning problems in later grades. A method for identifying such risks has been derived from analysis of district-level Instructional Accomplishment…

  2. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ...The United States Environmental Protection Agency (EPA) is requesting information and citations on approaches and methods for the planning, analysis, assessment, and characterization of cumulative risks to human populations and the environment. The EPA is developing guidelines for the assessment of cumulative risk as defined and characterized in the EPA 2003 publication Framework for......

  3. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we describe both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  4. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk-informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting the specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk-informed analysis are closely coupled with a firm understanding of the capability of NDE.

  5. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T.; Rasmuson, D. M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  6. Comparing risks from low-level radioactive waste disposal on land and in the ocean: A review of agreements/statutes, scenarios, processing/packaging/disposal technologies, models, and decision analysis methods

    SciTech Connect

    Moskowitz, P. D.; Kalb, P. D.; Morris, S. C.; Rowe, M. D.; Marietta, M.; Anspaugh, L.; McKone, T.

    1989-10-01

    This report gives background information on: The history of LLW disposal in the US; agreements, statutes and regulations for the disposal of LLW; disposal scenarios and alternative treatment options for LLW; methods and models which could be used to assess and compare risks associated with land and ocean options for LLW disposal; technical and methodological issues associated with comparing risks of different options; and, roles of decision making approaches in comparing risks across media. 63 refs., 23 figs., 33 tabs.

  7. Reliability/Risk Methods and Design Tools for Application in Space Programs

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Smart, Christian

    1999-01-01

    Since 1984 NASA has funded several major programs to develop reliability/risk methods and tools for engineers to apply in the design and assessment of aerospace hardware. Two probabilistic software tools that show great promise for practical application are the finite element code NESSUS and the system risk analysis code QRAS. This paper examines NASA's past, present, and future directions in reliability and risk engineering applications. Both the NESSUS and QRAS software tools are detailed.

  8. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to, and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define ''consequences'' by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on the scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools that are well established and readily available to support environmental health protection have not yet been developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  9. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking. PMID:11538075

  10. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  11. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  12. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  13. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  14. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  15. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  16. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartment fires that could result in failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2), contributing more than 7.5% of CDF, where a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, which leads directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  17. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.

  18. Analysis of exogenous components of mortality risks.

    PubMed

    Blinkin, V L

    1998-04-01

    A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate D_ij (where i corresponds to calendar time (year or interval of years) and j denotes the number of the corresponding age group) was represented as D_ij = A_j + B_i*C_j, and the unknown quantities A_j, B_i, and C_j were estimated by a special procedure based on the least-squares principle. The coefficients of variation do not exceed 10%. It is shown that the term A_j can be interpreted as the endogenous component of the death rate and the term B_i*C_j as the exogenous component. The aggregate of endogenous components A_j can be described by a regression function corresponding to the Gompertz-Makeham law, A(tau) = gamma + beta*exp(alpha*tau), where gamma, beta, and alpha are constants and tau is age; A_j is then identified with A(tau_j), where tau_j is the value of age tau in the jth age group. The coefficients of variation for such a representation do not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980-1995 shows that since 1992 all components of exogenous risk in the Moscow population had been increasing up to 1994. The greatest contribution to the total level of exogenous risk came from lethal diseases, whose death rate was 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of change in exogenous mortality risk during 1990-1994 in the Moscow population and in the Russian population excluding Moscow were identical: the risk had been increasing, and its value in the Russian population had been higher than that in the Moscow population. PMID:9637078
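
    The Gompertz-Makeham regression step can be sketched numerically. The fragment below is illustrative only; the age-group values are synthetic placeholders, not the Moscow or Russian data. It fits A(tau) = gamma + beta*exp(alpha*tau) to endogenous components by nonlinear least squares.

      # Sketch: least-squares fit of the Gompertz-Makeham form
      # A(tau) = gamma + beta * exp(alpha * tau) to endogenous death-rate
      # components A_j. Data points below are synthetic placeholders.
      import numpy as np
      from scipy.optimize import curve_fit

      def gompertz_makeham(tau, gamma, beta, alpha):
          return gamma + beta * np.exp(alpha * tau)

      tau_j = np.array([30, 40, 50, 60, 70, 80], dtype=float)  # age-group midpoints
      A_j = np.array([1.2e-3, 2.0e-3, 4.1e-3, 9.3e-3, 2.2e-2, 5.6e-2])

      popt, _ = curve_fit(gompertz_makeham, tau_j, A_j, p0=[1e-3, 1e-5, 0.08])
      gamma, beta, alpha = popt
      print(f"gamma={gamma:.2e}, beta={beta:.2e}, alpha={alpha:.3f}")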

  19. Predicted 10-year risk of cardiovascular disease is influenced by the risk equation adopted: a cross-sectional analysis

    PubMed Central

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Mellalieu, Stephen D; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2014-01-01

    Background Validated risk equations are currently recommended to assess individuals to determine those at ‘high risk’ of cardiovascular disease (CVD). However, there is no longer a risk ‘equation of choice’. Aim This study examined the differences between four commonly-used CVD risk equations. Design and setting Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, south Wales. Method Analysis of 790 individuals (474 females, 316 males) with no prior diagnosis of CVD or diabetes. Ten-year CVD risk was predicted by entering the relevant variables into the QRISK2, Framingham Lipids, Framingham BMI, and JBS2 risk equations. Results The Framingham BMI and JBS2 risk equations predicted a higher absolute risk than the QRISK2 and Framingham Lipids equations, and CVD risk increased concomitantly with age irrespective of which risk equation was adopted. Only a small proportion of females (0–2.1%) were predicted to be at high risk of developing CVD using any of the risk algorithms. The proportion of males predicted at high risk ranged from 5.4% (QRISK2) to 20.3% (JBS2). After age stratification, few differences between isolated risk factors were observed in males, although a greater proportion of males aged ≥50 years were predicted to be at ‘high risk’ independent of risk equation used. Conclusions Different risk equations can influence the predicted 10-year CVD risk of individuals. More males were predicted at ‘high risk’ using the JBS2 or Framingham BMI equations. Consideration should also be given to the number of isolated risk factors, especially in younger adults when evaluating CVD risk. PMID:25267049

  20. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-01

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPN) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices of up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks. PMID:19640668
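
    The RPN ranking described above is straightforward to reproduce. A minimal sketch follows, with invented failure modes and O/D/S scores on the 1-10 scale (not the study's actual modes):

      # Sketch: FMEA Risk Priority Number ranking, RPN = O * D * S,
      # with O, D, S each on a 1-10 scale. Failure modes are invented examples.
      failure_modes = [
          # (description, occurrence O, detectability D, severity S)
          ("operator places sample in wrong orientation", 6, 7, 5),
          ("spectral library out of date",                3, 8, 7),
          ("instrument wavelength drift",                 2, 4, 6),
      ]

      ranked = sorted(
          ((desc, o * d * s) for desc, o, d, s in failure_modes),
          key=lambda item: item[1],
          reverse=True,
      )
      for desc, rpn in ranked:
          print(f"RPN {rpn:4d}  {desc}")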

  1. Incorporating Classification Uncertainty in Competing- risks Nest- failure Analysis

    EPA Science Inventory

    Nesting birds risk nest failure due to many causes. Though partitioning risk of failure among causes has long been of interest to ornithologists, formal methods for estimating competing risk have been lacking.

  2. Conceptual issues with risk analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Jaboyedoff, Michel; Lévy, Sébastien

    2015-04-01

    Risk analysis is a tricky procedure in which one can easily make mistakes. Although risk equations are rather general, transferring a methodology to another context or hazard type can often lead to inaccuracies or even significant errors. To illustrate this, common mistakes made with the Swiss methodology are presented, together with possible solutions. These include the following. Risk analysis for moving objects only takes the process dimension into account (e.g. the length of a road section potentially affected by a landslide), but not the object dimension (e.g. the car's length). This is a fair simplification as long as the object dimension is considerably smaller than the process dimension. However, when the object is large compared to the process (e.g. rockfalls on a train), the results will be wrong. This problem can be illustrated by considering two blocks: according to this methodology, a 1 m diameter block would be twice as likely to reach a train as a 50 cm block, which is obviously not correct. In rockfall risk analyses for roads or railways found in the literature, the block dimension is usually neglected in favour of the object dimension, which is a fair assumption in that context. However, it is possible to include both dimensions by using the sum of the two lengths instead of one of them (see the sketch below). Risk analysis is usually performed using 3 different scenarios, for 3 different ranges of return periods, namely 1-30, 30-100 and 100-300 years. In order to be conservative, the operator commonly considers the magnitude of the worst event occurring with a return period within the class bounds, i.e. the magnitude reached or exceeded with a return period of 30, 100 and 300 years respectively. Then, since these magnitudes correspond to the upper bounds of the classes, risk is calculated using the frequency corresponding to these return periods and not to the middle of the class (and also subtracting the
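
    The block-size paradox and the proposed fix can be illustrated numerically. A minimal sketch (all numbers invented) takes the exposed window as the sum of the object and process lengths:

      # Sketch: spatial encounter window for a falling block of diameter d and
      # a moving object of length L, taken as (L + d) rather than d or L alone.
      def encounter_window(block_diameter_m, object_length_m):
          return block_diameter_m + object_length_m

      train_length = 150.0   # m (illustrative)
      car_length = 4.5       # m (illustrative)

      for d in (0.5, 1.0):
          print(f"block {d} m vs train: window = {encounter_window(d, train_length):6.1f} m")
          print(f"block {d} m vs car:   window = {encounter_window(d, car_length):6.1f} m")
      # For the train the window barely changes with block size (150.5 vs 151.0 m),
      # so a 1 m block is NOT twice as likely to strike it as a 0.5 m block.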

  3. A meta-analysis on depression and subsequent cancer risk

    PubMed Central

    2007-01-01

    Background The authors tested the hypothesis that depression is a possible factor influencing the course of cancer by reviewing prospective epidemiological studies and calculating summary relative risks. Methods Studies were identified by computerized searches of Medline, Embase and PsycINFO, as well as manual searches of the reference lists of selected publications. Inclusion criteria were cohort design, population-based sample, structured measurement of depression, and outcome of cancer known for depressed and non-depressed subjects. Results Thirteen eligible studies were identified. Based on eight studies with complete crude data on overall cancer, our summary relative risk (95% confidence interval) was 1.19 (1.06–1.32). After adjustment for confounders we pooled a summary relative risk of 1.12 (0.99–1.26). No significant association was found between depression and subsequent breast cancer risk, based on seven heterogeneous studies, with or without adjustment for possible confounders. Subgroup analysis of studies with a follow-up of ten years or more, however, resulted in a statistically significant summary relative risk of 2.50 (1.06–5.91). No significant associations were found for lung, colon or prostate cancer. Conclusion This review suggests a tendency towards a small and marginally significant association between depression and subsequent overall cancer risk, and towards a stronger increase of breast cancer risk emerging many years after a previous depression. PMID:18053168
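
    Pooling summary relative risks of this kind is commonly done by fixed-effect inverse-variance weighting on the log scale. The review does not state its exact pooling model, and the study values below are invented placeholders, so this is only a sketch of the standard technique:

      # Sketch: fixed-effect inverse-variance pooling of relative risks on the
      # log scale. (RR, lower 95% CI, upper 95% CI) per study; values invented.
      import numpy as np

      studies = [(1.10, 0.90, 1.34), (1.25, 1.02, 1.53), (1.15, 0.88, 1.50)]

      log_rr = np.array([np.log(rr) for rr, lo, hi in studies])
      # Recover each study's standard error from its 95% CI width.
      se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for rr, lo, hi in studies])
      w = 1.0 / se**2

      pooled = np.sum(w * log_rr) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      ci = np.exp([pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se])
      print(f"Summary RR = {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")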

  4. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss-of-crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the risk that debris resulting from an explosion of the launch vehicle would strike the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris, and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort, and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
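
    The shape of such a strike-probability estimate can be conveyed with a toy Monte Carlo sketch. The real model propagates a full debris catalog along trajectories; the version below reduces everything to a planar miss-distance check, and every number and distribution is invented:

      # Sketch: toy Monte Carlo estimate of debris strike probability on a
      # crew module, reduced to a planar miss-distance check.
      import numpy as np

      rng = np.random.default_rng(1)
      n_fragments = 200          # fragments per explosion event (assumed)
      n_events = 5_000           # Monte Carlo samples of the explosion

      module_radius = 3.0        # m, effective strike radius (assumed)

      strikes = 0
      for _ in range(n_events):
          # Fragment miss distances (m) at the crew-module plane:
          # assumed Rayleigh-distributed scatter.
          miss = rng.rayleigh(scale=120.0, size=n_fragments)
          strikes += np.any(miss < module_radius)

      print(f"strike probability per event: {strikes / n_events:.4f}")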

  5. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
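
    The constrained alternating least-squares factorization D = C S^T at the heart of the method can be sketched as follows. This is a naive nonnegativity-clipped ALS on synthetic data; the patent's weighting and unweighting steps are omitted for brevity:

      # Sketch: nonnegativity-constrained alternating least squares factoring
      # a data matrix D (pixels x channels) into concentrations C and spectral
      # shapes S, D ~= C @ S.T. D below is synthetic.
      import numpy as np

      rng = np.random.default_rng(2)
      n_pix, n_chan, n_comp = 500, 128, 3
      C_true = rng.random((n_pix, n_comp))
      S_true = rng.random((n_chan, n_comp))
      D = C_true @ S_true.T + 0.01 * rng.random((n_pix, n_chan))

      C = rng.random((n_pix, n_comp))
      for _ in range(200):
          # Solve for S with C fixed, then for C with S fixed, clipping
          # negatives to enforce the nonnegativity constraint.
          S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0.0, None)
          C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0.0, None)

      print("relative residual:", np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))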

  6. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E. )

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  7. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. PMID:26133501
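
    The Bayesian model averaging step over opponent rationality paradigms amounts to updating model posteriors with the likelihood each model assigns to the opponent's observed decisions. A toy sketch, with invented likelihood values standing in for each model's predictive probabilities:

      # Sketch: Bayesian model averaging over opponent rationality models.
      # posterior(m) ∝ prior(m) * p(observed decisions | m).
      import numpy as np

      models = ["random", "nash", "level-k", "prospect"]
      prior = np.array([0.25, 0.25, 0.25, 0.25])

      # p(decision | model) for three observed decisions (invented values)
      likelihoods = np.array([
          [0.20, 0.20, 0.20],   # random play
          [0.55, 0.10, 0.40],   # Nash equilibrium
          [0.50, 0.45, 0.50],   # level-k thinking
          [0.30, 0.40, 0.35],   # prospect theory
      ])

      posterior = prior * np.prod(likelihoods, axis=1)
      posterior /= posterior.sum()
      for m, p in zip(models, posterior):
          print(f"P({m} | data) = {p:.3f}")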

  8. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and .gamma.-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy .gamma. rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The .gamma.-ray portion of each spectrum is analyzed by a standard Ge .gamma.-ray analysis program. This method can be applied to any analysis involving x- and .gamma.-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the .gamma.-ray analysis and accommodated during the x-ray analysis.

  9. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and [gamma]-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy [gamma] rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The [gamma]-ray portion of each spectrum is analyzed by a standard Ge [gamma]-ray analysis program. This method can be applied to any analysis involving x- and [gamma]-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the [gamma]-ray analysis and accommodated during the x-ray analysis.
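
    The linear least-squares spectral fitting step can be sketched as an ordinary least-squares fit of the observed region to reference line shapes plus a background term. The Gaussian shapes, channel positions, and amplitudes below are synthetic stand-ins for measured response functions:

      # Sketch: linear least-squares fit of an observed x-ray region as a
      # linear combination of reference line shapes plus a flat background.
      import numpy as np

      chan = np.arange(256, dtype=float)
      gauss = lambda c, w: np.exp(-0.5 * ((chan - c) / w) ** 2)

      # Design matrix: two reference line shapes + constant background
      A = np.column_stack([gauss(80, 4.0), gauss(120, 4.0), np.ones_like(chan)])

      true_amps = np.array([500.0, 150.0, 20.0])
      rng = np.random.default_rng(3)
      observed = A @ true_amps + rng.normal(0.0, 3.0, chan.size)

      amps, *_ = np.linalg.lstsq(A, observed, rcond=None)
      print("fitted amplitudes:", np.round(amps, 1))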

  10. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  11. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
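
    The core computation in such a program is the annual rate of exceeding a ground-motion level, integrating a magnitude distribution against an attenuation function with lognormal scatter. A minimal sketch under assumed, illustrative constants (not McGuire's FORTRAN implementation):

      # Sketch: annual exceedance rate of a PGA level a* for one source at a
      # fixed distance, with a truncated Gutenberg-Richter magnitude pdf and
      # a toy attenuation relation. All constants are illustrative.
      import numpy as np
      from scipy.stats import norm

      nu = 0.2                    # events/yr above Mmin on the source
      m_min, m_max, b = 5.0, 7.5, 1.0
      r_km = 30.0                 # source-to-site distance

      def ln_pga_median(m, r):    # toy attenuation: ln PGA (g)
          return -3.5 + 0.9 * m - 1.2 * np.log(r + 10.0)

      sigma_ln = 0.6
      a_star = 0.2                # g, target level

      m = np.linspace(m_min, m_max, 400)
      beta = b * np.log(10.0)
      # Truncated exponential magnitude pdf
      f_m = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
      # P(PGA > a* | m, r) with lognormal scatter
      p_exc = 1.0 - norm.cdf((np.log(a_star) - ln_pga_median(m, r_km)) / sigma_ln)

      rate = nu * np.sum(f_m * p_exc) * (m[1] - m[0])
      print(f"annual exceedance rate: {rate:.2e}")
      print(f"50-yr exceedance prob (Poisson): {1 - np.exp(-rate * 50):.3f}")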

  12. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles requires a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals many shortcomings in this field, which translate into varied managerial risks. The article identifies and analyses some of them and gives brief guidance for their appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  13. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties of mainly two types: natural (aleatory), derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with flawed procedures employed in the study of these processes. There is an abundant scientific and technical literature on uncertainty estimation for each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main focus of this work; in particular, we seek to understand how far uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much a proper flood risk analysis may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments, and Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation), and socio-demographic data (damage estimation), and to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we use the Polynomial Chaos Theory (PCT) method. It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, by using random variables and polynomials to handle the effects of uncertainty. Results of applying the method show better robustness than traditional analysis
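
    A minimal non-intrusive polynomial chaos sketch for a single Gaussian input follows; a toy function stands in for the hydrologic/hydraulic model chain, since the study's actual PCT setup is not specified here:

      # Sketch: non-intrusive polynomial chaos with probabilists' Hermite
      # polynomials for Y = f(X), X ~ N(0, 1). Coefficients come from
      # Gauss-Hermite projection; mean and variance follow from them.
      import numpy as np
      from numpy.polynomial.hermite_e import hermegauss, hermeval
      from math import factorial

      def f(x):                      # toy stand-in for the flood model
          return np.exp(0.3 * x) + 0.1 * x**2

      order = 6
      nodes, weights = hermegauss(32)            # weight function exp(-x^2/2)
      weights = weights / np.sqrt(2 * np.pi)     # normalize to the N(0,1) pdf

      # c_k = E[f(X) He_k(X)] / k!   (He_k orthogonal with E[He_k^2] = k!)
      coeffs = []
      for k in range(order + 1):
          basis = np.zeros(k + 1); basis[k] = 1.0
          ck = np.sum(weights * f(nodes) * hermeval(nodes, basis)) / factorial(k)
          coeffs.append(ck)

      mean = coeffs[0]
      var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
      print(f"PC mean = {mean:.4f}, PC variance = {var:.4f}")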

  14. Forecasting method of national-level forest fire risk rating

    NASA Astrophysics Data System (ADS)

    Qin, Xian-lin; Zhang, Zi-hui; Li, Zeng-yuan; Yi, Hao-ruo

    2008-11-01

    The risk level of forest fire depends not only on weather, topography, human activities, and socio-economic conditions, but is also closely related to the types, growth, moisture content, and quantity of forest fuel on the ground. Timely acquisition of information on the growth and moisture content of forest fuel and on climate across the whole country is therefore critical to national-level forest fire risk forecasting. The development and application of remote sensing (RS), geographic information systems (GIS), databases, the internet, and other modern information technologies have provided important technical means for macro-regional forest fire risk forecasting. In this paper, quantified forecasting of national-level forest fire risk was studied using a Fuel State Index (FSI) and a Background Composite Index (BCI). The FSI was estimated using Moderate Resolution Imaging Spectroradiometer (MODIS) data. National meteorological data and other basic data on the distribution of fuel types and forest fire risk rating were standardized on the ArcGIS platform to calculate the BCI. The FSI and the BCI were used to calculate the Forest Fire Danger Index (FFDI), which is regarded as a quantitative indicator for national forest fire risk forecasting and forest fire risk rating, shifting from qualitative description to quantitative estimation. Major forest fires that occurred in recent years were taken as examples to validate the above method, and the results indicated that the method can be used for quantitative forecasting of national-level forest fire risks.
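
    The abstract does not spell out how the FSI and BCI combine into the FFDI, so the sketch below assumes a simple weighted sum and illustrative class breaks, purely to show the shape of such an index-to-rating pipeline:

      # Sketch: combining a Fuel State Index (FSI) and Background Composite
      # Index (BCI) into a Forest Fire Danger Index (FFDI) and binning it
      # into ratings. Weights and class breaks are assumptions.
      import numpy as np

      def ffdi(fsi, bci, w_fuel=0.6, w_background=0.4):
          return w_fuel * np.asarray(fsi) + w_background * np.asarray(bci)

      def risk_rating(index):
          breaks = [0.2, 0.4, 0.6, 0.8]      # class boundaries on a 0-1 index
          labels = ["low", "moderate", "high", "very high", "extreme"]
          return labels[int(np.digitize(index, breaks))]

      fsi, bci = 0.72, 0.55                  # both normalized to [0, 1]
      index = ffdi(fsi, bci)
      print(f"FFDI = {index:.2f} -> {risk_rating(index)}")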

  15. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, model-based applications and project results have been published, based on a variety of approaches, parametrizations and calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs in avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question starts from simply demonstrating the differences in release risk areas and intensities obtained by applying identical models to DEMs with different properties, and then extends this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established based on simple value ranges, probabilities, as well as fuzzy expressions and fractal metrics. As a specific approach, the work on DEM resolution-dependent 'slope spectra' is considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study, focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large-area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  16. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, which can have complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, together with the time delays that may reasonably arise. Renewable energy technologies face a number of constraints that delay the scaling-up of their production and transport processes, affect equipment reliability, etc., so implementing these types of projects requires complex specialized teams whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion places the analysis of our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  17. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, and could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second, the severity phase, comprises geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of the different risks in coastal cities.
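
    The AHP weighting step can be sketched as extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio. The comparison values below are invented, not those elicited in the study:

      # Sketch: AHP weights for five second-class indicators from a pairwise
      # comparison matrix (Saaty scale) via the principal eigenvector.
      import numpy as np

      # Order: economy, society, living standards, geographic exposure,
      # natural disasters. Entries are illustrative and reciprocal.
      A = np.array([
          [1,   2,   3,   2,   1/2],
          [1/2, 1,   2,   1,   1/3],
          [1/3, 1/2, 1,   1/2, 1/4],
          [1/2, 1,   2,   1,   1/3],
          [2,   3,   4,   3,   1  ],
      ])

      eigvals, eigvecs = np.linalg.eig(A)
      k = np.argmax(eigvals.real)
      w = np.abs(eigvecs[:, k].real)
      w /= w.sum()

      n = A.shape[0]
      ci = (eigvals[k].real - n) / (n - 1)   # consistency index
      cr = ci / 1.12                         # random index RI = 1.12 for n = 5
      print("weights:", np.round(w, 3), f"CR = {cr:.3f}")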

  18. System Analysis and Risk Assessment System.

    Energy Science and Technology Software Center (ESTSC)

    2000-11-20

    Version 00 SARA4.16 is a program that allows the user to review the results of a Probabilistic Risk Assessment (PRA) and to perform limited sensitivity analysis on these results. This tool is intended to be used by a less technically oriented user and does not require the level of understanding of PRA concepts required by a full PRA analysis tool. With this program a user can review the information generated by a PRA analyst and compare the results to those generated by making limited modifications to the data in the PRA. Also included in this program is the ability to graphically display the information stored in the database. This information includes event trees, fault trees, P&IDs and uncertainty distributions. SARA 4.16 is incorporated in the SAPHIRE 5.0 code package.

  19. [Risk analysis in radiation therapy: state of the art].

    PubMed

    Mazeron, R; Aguini, N; Deutsch, É

    2013-01-01

    Five radiotherapy accidents, two of them serial, occurred in France from 2003 to 2007 and led the authorities to establish a roadmap for securing radiotherapy. By analogy with industrial processes, a 2008 technical decision of the French Nuclear Safety Authority requires radiotherapy professionals to conduct analyses of risks to patients. The risk analysis process had been tested in three pilot centers, before the occurrence of the accidents, with the creation of experience-feedback cells. The regulation now requires all radiotherapy departments to have similar structures to collect precursor events, incidents and accidents, to perform analyses following rigorous methods, and to initiate corrective actions. At the same time, departments are also required to conduct a priori analyses, which are less intuitive and usually require the help of a quality engineer, with the aim of reducing risk. The progressive implementation of these devices is part of an overall policy to improve the quality of radiotherapy. Since 2007, no radiotherapy accident has been reported. PMID:23787020

  20. Project 6: Cumulative Risk Assessment (CRA) Methods and Applications

    EPA Science Inventory

    Project 6: CRA Methods and Applications addresses the need to move beyond traditional risk assessment practices by developing CRA methods to integrate and evaluate impacts of chemical and nonchemical stressors on the environment and human health. Project 6 has three specific obje...

  1. Best self visualization method with high-risk youth.

    PubMed

    Schussel, Lorne; Miller, Lisa

    2013-08-01

    The healing process of the Best Self Visualization Method (BSM) is described within the framework of meditation, neuroscience, and psychodynamic theory. Cases are drawn from the treatment of high-risk youth, who have histories of poverty, survival of sexual and physical abuse, and/or current risk for perpetrating abuse. Clinical use of BSM is demonstrated in two case illustrations, one of group psychotherapy and another of individual therapy. PMID:23775428

  2. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods

    PubMed Central

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-01-01

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large lesions or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems to be more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent of these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive methods including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, high-risk cases for post-ESD bleeding, which include cases with the use of antithrombotic agents or which require large resection, are increasing. Although there have been many reports about associated risk factors and methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we have given an overview of risk factors and methods of preventing post-ESD bleeding from previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  3. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods.

    PubMed

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-07-14

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large lesions or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems to be more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent of these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive methods including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, high-risk cases for post-ESD bleeding, which include cases with the use of antithrombotic agents or which require large resection, are increasing. Although there have been many reports about associated risk factors and methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we have given an overview of risk factors and methods of preventing post-ESD bleeding from previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  4. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  5. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  6. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  7. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  8. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  9. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  10. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  11. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  12. Code System to Calculate Integrated Reliability and Risk Analysis.

    Energy Science and Technology Software Center (ESTSC)

    2002-02-18

    Version 04. IRRAS Version 4.16, the latest in a series (2.0, 2.5, 4.0, 4.15), is a program developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA). This program includes functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included in this program are features to allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. IRRAS Version 4.16 is the latest in the stand-alone IRRAS series (2.0, 2.5, 4.0, 4.15). Be sure to review the PSR-405/SAPHIRE 7.06 package, which was released in January 2000 and includes four programs: the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors.
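
    A minimal sketch of the kind of cut-set quantification such codes perform, using the rare-event and min-cut upper bound approximations; the event names and probabilities below are hypothetical, not IRRAS data or output.

        # Quantify an accident sequence from its minimal cut sets.
        cut_sets = [
            {"DG-A-FAILS": 1e-2, "DG-B-FAILS": 1e-2},        # both diesels fail
            {"DG-A-FAILS": 1e-2, "BATTERY-DEPLETED": 5e-3},  # diesel A + battery
        ]

        def cut_set_prob(cs):
            p = 1.0
            for prob in cs.values():
                p *= prob
            return p

        # Rare-event approximation: sum of cut set probabilities.
        rare_event = sum(cut_set_prob(cs) for cs in cut_sets)

        # Min-cut upper bound: 1 - prod(1 - P(cut set)), slightly tighter.
        mcub = 1.0
        for cs in cut_sets:
            mcub *= 1.0 - cut_set_prob(cs)
        mcub = 1.0 - mcub

        print(f"rare-event approx: {rare_event:.3e}, MCUB: {mcub:.3e}")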

  13. OVERVIEW: MYCOTOXIN RISK ANALYSIS IN THE USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of the risk assessment process is to provide the basis for decisions that will allow successful management of the risk. Poor risk management decisions that result from inadequate risk assessment can be enormously costly, both economically and in terms of human or animal health. Accco...

  14. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiographic method is an informational source for characterizing the functional state of the heart. Electrocardiogram parameters form an integrated map of the many component characteristics of the cardiac system and depend on the disturbances acting on each component. In this work we attempt to construct a block diagram of perturbations of the cardiac system by describing its basic components and the connections between them through transfer functions, written as first- and second-order differential equations, in order to build up and analyze the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose a coordinate system attached to the heart itself. A comparative analysis of electrocardiograms was conducted using the correlation method.

  15. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions and cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  16. Working session 5: Operational aspects and risk analysis

    SciTech Connect

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  17. [Irrationality and risk--a problem analysis].

    PubMed

    Bergler, R

    1995-04-01

    The way one experiences risks and one's risk behaviour is not centrally the result of rational considerations and decisions. Contradictions and incompatibilities can be proved empirically: (1) General paranoiac hysterical fear of risks, (2) complete repression of health risks, (3) unsureness and inability in dealing with risks, (4) prejudice against risks, (5) admiring risk behaviour, (6) seeking dangerous ways to test one's own limits, (7) readiness for aggression by not subjectively being able to control the risk, (8) naively assessing a risk positively thus intensifying pleasurable sensation, (9) increased preparedness to take risks in groups, (10) increased preparedness to take risk in anonymity, (11) accepting risks as a creative challenge for achievement motivation, (12) loss of innovation when avoiding risks. The awareness and assessment of risk factors is dependent on (1) personality factors (achievement-motivated persons experience risk as a creative challenge; risk behaviour being a variation in stimulus and exploration of one's environment), (2) the social back-ground (booster function), (3) the culture-related assessment of being able to control risks and (4) age (youthful risk behaviour being a way of coping with developmental exercises, e.g. purposely breaking parent's rules, provocatively demonstrating adult behaviour, a means to solve frustrating performance failures, advantages through social acceptance). The way one copes with risk factors is based on subjective estimation of risk; this assessment is made anew for each area of behaviour according to the specific situation. Making a decision and acting upon it is the result of simultaneously assessing the possible psychological benefit-cost factors of taking risks. The extent, to which the threat of risk factors is felt, depends on the importance one attaches to certain needs and benefits of quality of life and, in addition to this, what the subjective probabilities are that one is likely to be

  18. Corticosteroids and Pediatric Septic Shock Outcomes: A Risk Stratified Analysis

    PubMed Central

    Atkinson, Sarah J.; Cvijanovich, Natalie Z.; Thomas, Neal J.; Allen, Geoffrey L.; Anas, Nick; Bigham, Michael T.; Hall, Mark; Freishtat, Robert J.; Sen, Anita; Meyer, Keith; Checchia, Paul A.; Shanley, Thomas P.; Nowak, Jeffrey; Quasney, Michael; Weiss, Scott L.; Banschbach, Sharon; Beckman, Eileen; Howard, Kelli; Frank, Erin; Harmon, Kelli; Lahni, Patrick; Lindsell, Christopher J.; Wong, Hector R.

    2014-01-01

    Background The potential benefits of corticosteroids for septic shock may depend on initial mortality risk. Objective We determined associations between corticosteroids and outcomes in children with septic shock who were stratified by initial mortality risk. Methods We conducted a retrospective analysis of an ongoing, multi-center pediatric septic shock clinical and biological database. Using a validated biomarker-based stratification tool (PERSEVERE), 496 subjects were stratified into three initial mortality risk strata (low, intermediate, and high). Subjects receiving corticosteroids during the initial 7 days of admission (n = 252) were compared to subjects who did not receive corticosteroids (n = 244). Logistic regression was used to model the effects of corticosteroids on 28-day mortality and complicated course, defined as death within 28 days or persistence of two or more organ failures at 7 days. Results Subjects who received corticosteroids had greater organ failure burden, higher illness severity, higher mortality, and a greater requirement for vasoactive medications, compared to subjects who did not receive corticosteroids. PERSEVERE-based mortality risk did not differ between the two groups. For the entire cohort, corticosteroids were associated with increased risk of mortality (OR 2.3, 95% CI 1.3–4.0, p = 0.004) and a complicated course (OR 1.7, 95% CI 1.1–2.5, p = 0.012). Within each PERSEVERE-based stratum, corticosteroid administration was not associated with improved outcomes. Similarly, corticosteroid administration was not associated with improved outcomes among patients with no comorbidities, nor in groups of patients stratified by PRISM. Conclusions Risk stratified analysis failed to demonstrate any benefit from corticosteroids in this pediatric septic shock cohort. PMID:25386653
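
    For readers unfamiliar with the odds-ratio arithmetic underlying such logistic-regression findings, a minimal sketch follows; the 2x2 counts are invented for illustration, not the PERSEVERE cohort data.

        import math

        # Odds ratio with a 95% CI (Woolf/log method) for exposure vs outcome.
        a, b = 40, 212   # corticosteroids: died, survived (hypothetical)
        c, d = 20, 224   # no corticosteroids: died, survived (hypothetical)

        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
        lo = math.exp(math.log(or_) - 1.96 * se_log_or)
        hi = math.exp(math.log(or_) + 1.96 * se_log_or)
        print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")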

  19. Pressure Systems Stored-Energy Threshold Risk Analysis

    SciTech Connect

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective February 2007, brought to light potential weaknesses in the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based upon pressure or any other criteria. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as an appropriate criterion to establish a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.
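
    As a rough illustration of a stored-energy screening calculation, the sketch below uses the isothermal ideal-gas expansion formula E = P1*V*ln(P1/P0). This is one common textbook choice; the report does not state which formula PNNL adopts, and the pressures and volume are invented.

        import math

        P1 = 1.0e6      # absolute vessel pressure, Pa (assumed)
        P0 = 101325.0   # ambient pressure, Pa
        V  = 0.05       # pressurized gas volume, m^3 (assumed)

        E_joules = P1 * V * math.log(P1 / P0)
        E_lbf_ft = E_joules * 0.737562   # 1 J = 0.737562 lbf-ft

        print(f"stored energy: {E_joules:.0f} J = {E_lbf_ft:.0f} lbf-ft")
        print("exceeds 1000 lbf-ft threshold:", E_lbf_ft > 1000.0)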

  20. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Center (JRC), as part of its institutional research activities. The scope of GHSL is to develop, test, and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications supporting global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geospatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observation on SB-04-Global Urban Observation and Information, and with various international partners, including World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  1. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups. PMID:23160540

  2. Movement recognition technology as a method of assessing spontaneous general movements in high risk infants.

    PubMed

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D; Trenell, Michael; Plötz, Thomas

    2014-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954

  3. Movement Recognition Technology as a Method of Assessing Spontaneous General Movements in High Risk Infants

    PubMed Central

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D.; Trenell, Michael; Plötz, Thomas

    2015-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954

  4. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  5. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering
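
    A minimal sketch of a parametric Monte Carlo cost-risk curve of the kind described, with an invented cost-estimating relationship cost = a * weight^b and assumed input distributions; none of the numbers come from the paper.

        import random
        import statistics

        random.seed(1)
        a, b = 12.0, 0.8                 # hypothetical CER coefficients
        samples = []
        for _ in range(20000):
            weight = random.triangular(900, 1500, 1100)   # low, high, mode
            cer_error = random.lognormvariate(0.0, 0.15)  # multiplicative noise
            samples.append(a * weight**b * cer_error)

        samples.sort()
        for pct in (0.10, 0.50, 0.80, 0.95):
            print(f"P{int(pct*100):02d} cost: {samples[int(pct*len(samples))]:.0f}")
        print("mean cost:", round(statistics.mean(samples)))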

  6. Methods for Cancer Epigenome Analysis

    PubMed Central

    Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

    2014-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods. PMID:22956508

  7. Structural reliability analysis and seismic risk assessment

    SciTech Connect

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of stationary Gaussian process with a zero mean and a Kanai-Tajimi Spectrum. All possible seismic hazard at a site represented by a hazard curve is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented and a fragility curve for PRA studies is also constructed.
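
    The sketch below illustrates the hazard-fragility combination described, numerically integrating a lognormal fragility curve against an assumed power-law seismic hazard curve; all parameter values are illustrative, not those of the containment study.

        import math

        k0, k = 1e-4, 2.5          # hazard: H(a) = k0 * a**(-k), a in g (assumed)
        am, beta = 0.9, 0.4        # fragility: median capacity, log-std (assumed)

        def fragility(a):          # P(failure | PGA = a), lognormal CDF
            return 0.5 * (1.0 + math.erf(math.log(a / am) / (beta * math.sqrt(2))))

        # Annual failure frequency: integral of fragility(a) * |dH/da| da.
        n, a_lo, a_hi = 2000, 0.05, 5.0
        da = (a_hi - a_lo) / n
        pf = 0.0
        for i in range(n):
            a = a_lo + (i + 0.5) * da
            dH_da = k0 * k * a ** (-k - 1)   # magnitude of hazard-curve slope
            pf += fragility(a) * dH_da * da
        print(f"annual limit-state frequency ~ {pf:.2e}")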

  8. INTERIM REPORT IMPROVED METHODS FOR INCORPORATING RISK IN DECISION MAKING

    SciTech Connect

    Clausen, M. J.; Fraley, D. W.; Denning, R. S.

    1980-08-01

    This paper reports observations and preliminary investigations in the first phase of a research program covering methodologies for making safety-related decisions. The objective has been to gain insight into NRC perceptions of the value of formal decision methods, their possible applications, and how risk is, or may be, incorporated in decision making. The perception of formal decision making techniques, held by various decision makers, and what may be done to improve them, were explored through interviews with NRC staff. An initial survey of decision making methods, an assessment of the applicability of formal methods vis-a-vis the available information, and a review of methods of incorporating risk and uncertainty have also been conducted.

  9. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be compromised by failures or operational mistakes of the DMCS. To avoid such situations, the DMCS must undergo sufficient risk assessment and appropriate precautions must be taken. We propose an operational RM method using FMEA for DMCS. To develop the method, we gathered and compared FMEA results for DMCS and developed a list of failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that have not been found before.
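
    A minimal sketch of classic FMEA prioritization via the risk priority number RPN = severity x occurrence x detection; the failure modes and 1-10 scores are invented, and the paper's own scoring scheme may differ.

        failure_modes = [
            # (description, severity, occurrence, detection), all 1-10 scales
            ("wrong batch record loaded",  8, 3, 4),
            ("sensor calibration drift",   6, 5, 5),
            ("operator skips e-signature", 7, 4, 2),
        ]

        ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3],
                        reverse=True)
        for name, s, o, d in ranked:
            print(f"RPN {s*o*d:3d}  {name}")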

  10. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Limitation of risk: Protective methods. 223.11 Section 223.11 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE SURETY COMPANIES DOING BUSINESS WITH THE UNITED STATES §...

  11. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  12. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity conducted during fiscal year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.
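
    To convey the flavor of simulation-based PRA for an SBO sequence, the sketch below samples uncertain timings and counts core-damage outcomes. The real analyses couple RAVEN with RELAP simulator runs; the distributions and grace time here are pure assumptions.

        import random

        random.seed(7)
        trials, core_damage = 100000, 0
        for _ in range(trials):
            battery_life = random.uniform(4.0, 8.0)    # hours of DC power (assumed)
            recovery = random.expovariate(1.0 / 6.0)   # offsite power recovery, h
            grace = 1.0                                # hours after battery loss
            if recovery > battery_life + grace:        # cooling lost too long
                core_damage += 1
        print(f"P(core damage | SBO) ~ {core_damage / trials:.3f}")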

  15. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
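
    A minimal sketch of the weighted attack-graph idea: finding the least-effort (and thus highest-risk) attack path with Dijkstra's algorithm. The network, nodes, and effort weights are hypothetical and do not reproduce the patented tool's internals.

        import heapq

        graph = {
            "internet":   [("dmz-web", 2.0), ("vpn", 5.0)],
            "dmz-web":    [("app-server", 3.0)],
            "vpn":        [("app-server", 1.0)],
            "app-server": [("database", 2.0)],
            "database":   [],
        }

        def cheapest_path(src, dst):
            pq, seen = [(0.0, src, [src])], set()
            while pq:
                cost, node, path = heapq.heappop(pq)
                if node == dst:
                    return cost, path
                if node in seen:
                    continue
                seen.add(node)
                for nxt, effort in graph[node]:
                    if nxt not in seen:
                        heapq.heappush(pq, (cost + effort, nxt, path + [nxt]))
            return float("inf"), []

        print(cheapest_path("internet", "database"))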

  16. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. The method thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce its influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality. PMID:25509315
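
    At its core, such a network propagates conditional probabilities by Bayes' rule; the two-node sketch below shows the update, with probabilities invented rather than taken from the Shengmai plant analysis.

        # P(cause | effect observed) for a single cause-effect node pair.
        p_cause = 0.02                       # prior: sterilization fault (assumed)
        p_effect_given_cause = 0.90          # contamination if fault (assumed)
        p_effect_given_no_cause = 0.01       # contamination otherwise (assumed)

        p_effect = (p_effect_given_cause * p_cause
                    + p_effect_given_no_cause * (1.0 - p_cause))
        posterior = p_effect_given_cause * p_cause / p_effect
        print(f"P(fault | contamination observed) = {posterior:.2f}")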

  17. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. PMID:20731790
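
    A minimal sketch of the nonparametric Kaplan-Meier estimator with Greenwood's variance for survival-with-distance data; the mission distances and loss/censoring flags are invented, not Autosub3 records.

        import math

        # (distance in km, True = vehicle lost, False = censored), time-sorted.
        data = [(12, True), (30, False), (45, True), (50, False),
                (60, True), (80, True), (95, False)]

        s, greenwood, at_risk = 1.0, 0.0, len(data)
        for t, lost in data:
            if lost:
                d, n = 1, at_risk
                s *= (n - d) / n                      # KM survival step
                greenwood += d / (n * (n - d))        # Greenwood accumulator
                se = s * math.sqrt(greenwood)
                lo_ci = max(0.0, s - 1.96 * se)
                hi_ci = min(1.0, s + 1.96 * se)
                print(f"d={t:3d} km  S={s:.3f}  95% CI ({lo_ci:.3f}, {hi_ci:.3f})")
            at_risk -= 1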

  18. Methods development to evaluate the risk of upgrading to DCS: The human factor

    SciTech Connect

    Ostrom, L.T.; Wilhelmsen, C.A.

    1995-04-01

    The NRC recognizes that a more complete technical basis for understanding and regulating advanced digital technologies in commercial nuclear power plants is needed. A concern is that the introduction of digital safety systems may have an impact on risk. There is currently no standard methodology for measuring digital system reliability. A tool currently used to evaluate NPP risk in analog systems is the probabilistic risk assessment (PRA). The use of this tool to evaluate the digital system risk was considered to be a potential methodology for determining the risk. To test this hypothesis, it was decided to perform a limited PRA on a single dominant accident sequence. However, a review of existing human reliability analysis (HRA) methods showed that they were inadequate to analyze systems utilizing digital technology. A four-step process was used to adapt existing HRA methodologies to digital environments and to develop new techniques. The HRA methods were then used to analyze an NPP that had undergone a backfit to digital technology in order to determine, as a first step, whether the methods were effective. The very small-break loss of coolant accident sequence was analyzed to determine whether the upgrade to the Eagle-21 process protection system had an effect on risk. The analysis of the very small-break LOCA documented in the Sequoyah PRA was used as the basis of the analysis. The analysis of the results of the HRA showed that the mean human error probabilities for the Eagle-21 PPS were slightly less than those for the analog system it replaced. One important observation from the analysis is that the operators have increased confidence stemming from the better level of control provided by the digital system. The analysis of the PRA results, which included the human error component and the Eagle-21 PPS, disclosed that the reactor protection system had a higher failure rate than the analog system, although the difference was not statistically significant.

  19. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In the last decades we have witnessed a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The State of the Art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength s_u and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, which is probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements, collected by Monaco et al. (2006), indicates that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates Beta probability distributions at research sites where DMT tests and observed settlements are available. References: Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
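
    The sketch below shows how a four-parameter Beta distribution could describe the variability of a DMT-derived design modulus, in the spirit of the Failmezger et al. suggestion; the shape parameters and bounds are assumptions, not fitted site data.

        from scipy import stats

        a, b = 2.0, 3.0           # Beta shape parameters (assumed)
        lo, hi = 10.0, 40.0       # plausible bounds on M at the site, MPa (assumed)
        M = stats.beta(a, b, loc=lo, scale=hi - lo)

        print(f"mean M       : {M.mean():.1f} MPa")
        print(f"conservative : {M.ppf(0.10):.1f} MPa (10th percentile)")
        print(f"optimistic   : {M.ppf(0.90):.1f} MPa (90th percentile)")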

  20. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand); 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions), assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given

  1. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  2. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  3. Will HIV Vaccination Reshape HIV Risk Behavior Networks? A Social Network Analysis of Drug Users' Anticipated Risk Compensation

    PubMed Central

    Young, April M.; Halgin, Daniel S.; DiClemente, Ralph J.; Sterk, Claire E.; Havens, Jennifer R.

    2014-01-01

    Background An HIV vaccine could substantially impact the epidemic. However, risk compensation (RC), or post-vaccination increase in risk behavior, could present a major challenge. The methodology used in previous studies of risk compensation has been almost exclusively individual-level in focus, and has not explored how increased risk behavior could affect the connectivity of risk networks. This study examined the impact of anticipated HIV vaccine-related RC on the structure of high-risk drug users' sexual and injection risk network. Methods A sample of 433 rural drug users in the US provided data on their risk relationships (i.e., those involving recent unprotected sex and/or injection equipment sharing). Dyad-specific data were collected on likelihood of increasing/initiating risk behavior if they, their partner, or they and their partner received an HIV vaccine. Using these data and social network analysis, a "post-vaccination network" was constructed and compared to the current network on measures relevant to HIV transmission, including network size, cohesiveness (e.g., diameter, component structure, density), and centrality. Results Participants reported 488 risk relationships. Few reported an intention to decrease condom use or increase equipment sharing (4% and 1%, respectively). RC intent was reported in 30 existing risk relationships and vaccination was anticipated to elicit the formation of five new relationships. RC resulted in a 5% increase in risk network size (n = 142 to n = 149) and a significant increase in network density. The initiation of risk relationships resulted in the connection of otherwise disconnected network components, with the largest doubling in size from five to ten. Conclusions This study demonstrates a new methodological approach to studying RC and reveals that behavior change following HIV vaccination could potentially impact risk network connectivity. These data will be valuable in parameterizing future network models
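
    A minimal sketch of the before/after network comparison described, using networkx to track density and component sizes as hypothetical post-vaccination risk ties are added; the toy graph is not the study's 488-relationship network.

        import networkx as nx

        current = nx.Graph([(1, 2), (2, 3), (4, 5), (6, 7), (7, 8)])
        post = current.copy()
        post.add_edges_from([(3, 4), (5, 6)])   # hypothetical new risk ties

        for name, g in (("current", current), ("post-vaccination", post)):
            comps = sorted((len(c) for c in nx.connected_components(g)),
                           reverse=True)
            print(f"{name:17s} density={nx.density(g):.2f} components={comps}")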

  4. A Comparison of Methods for Assessing Mortality Risk

    PubMed Central

    Levine, ME; Crimmins, EM

    2014-01-01

    Objectives Concepts such as Allostatic Load, Framingham Risk Score, and Biological Age were developed to combine information from multiple measures into a single latent variable that can be used to quantify a person's biological state. Given these varying approaches, the goal of this paper is to compare how well these three measures predict subsequent all-cause and disease-specific mortality within a large nationally representative U.S. sample. Methods Our study population consisted of 9,942 adults, ages 30 and above, from National Health and Nutrition Examination Survey III. Receiver Operating Characteristic curves and Cox Proportional Hazard models for the whole sample and for stratified age groups were used to compare how well Allostatic Load, Framingham Risk Score, and Biological Age predict ten-year all-cause and disease-specific mortality in the sample, for whom there were 1,076 deaths over 96,420 person years of exposure. Results Overall, Biological Age predicted 10-year mortality more accurately than other measures for the full age range, as well as for participants ages 50-69 and 70+. Additionally, out of the three measures, Biological Age had the strongest association with all-cause and cancer mortality, while the Framingham Risk Score had the strongest association with CVD mortality. Conclusions Methods for quantifying biological risk provide important approaches to improving our understanding of the causes and consequences of changes in physiological function and dysregulation. Biological Age offers an alternative and, in some cases more accurate summary approach to the traditionally used methods, such as Allostatic Load and Framingham Risk Score. PMID:25088793
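
    A minimal sketch of the AUC comparison mechanics using simulated risk scores; the signal strengths and outcome rate are invented for illustration and do not reproduce the NHANES III results.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        died = rng.binomial(1, 0.11, size=2000)       # ~11% ten-year deaths

        def noisy_score(signal):                      # higher score = riskier
            return died * signal + rng.normal(0, 1, size=died.size)

        for name, signal in (("Allostatic Load", 0.6),
                             ("Framingham Risk", 0.8),
                             ("Biological Age", 1.1)):
            auc = roc_auc_score(died, noisy_score(signal))
            print(f"{name:16s} AUC = {auc:.3f}")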

  5. Risk analysis of a gas-processing complex in India.

    PubMed

    Garg, R K; Khan, A A

    1991-09-01

    ONGC's Hazira Gas-Processing Complex (HGPC) consists of facilities for receiving natural gas along with associated condensate from an off-shore field at a rate of 20 MMNm3 per day. After separating the condensate, which is processed in condensate fractionation units, the gas is processed through various steps to recover LPG and to reduce its dew point to less than 5 degrees C in order to make it suitable for transportation over long distances. The acid gas recovered during the gas-sweetening step is processed to obtain sulphur. The major products manufactured at HGPC therefore are lean sweet gas, LPG, NGL, and sulphur. The Oil and Natural Gas Commission awarded the assignment on Hazard Study and Risk Analysis of their Hazira Gas-Processing Complex (HGPC) to the Council of Scientific and Industrial Research (CSIR) in association with the Netherlands Organisation for Applied Scientific Research (TNO). The scope of this assignment covered a number of closely related and fully defined activities normally encountered in this type of work. Identification of hazards through the most appropriate methods, assigning frequencies of occurrence to major unwanted incidents, quantification and assessment of probable damage to plant equipment, environment, and human and animal life due to an unexpected event, and evaluation of various methods for reducing risk together constituted the methodology for this assignment. Detailed recommendations aimed at reducing risk and enhancing reliability of plant and machinery were made. This article gives an overview of the assignment. PMID:1947347

  6. Application of fuzzy expert systems for EOR project risk analysis

    SciTech Connect

    Ting-Horng, Chung; Carroll, H.B.; Lindsey, R.

    1995-12-31

    This work describes a new method for enhanced oil recovery (EOR) project preassessment. Instead of using the conventional costly simulation approach, a fuzzy expert system was developed. A database of EOR project costs and oil prices from past decades was incorporated into the expert system. The EOR project risk analysis includes three stages: (1) preliminary screening of EOR methods, (2) field performance estimation, and (3) economic analysis. Since this fuzzy expert system incorporates both implemented EOR technology and experts' experience, it reduces the requirement for massive laboratory and field data as input. Estimates of displacement efficiency (Ed) and sweep efficiency (Ev) were formulated for each EOR process. Ed and Ev were treated as fuzzy variables. The overall recovery efficiency is evaluated from the product of Ed and Ev using fuzzy set arithmetic. Economic analysis is based on the estimated recovery efficiency, residual oil in place, oil price, and operating costs. An example of the application of the developed method to a CO2-flooding project analysis is presented.
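
    The fuzzy product of the two efficiencies can be sketched with simple alpha-cut interval arithmetic. The triangular values below are hypothetical illustrations, not figures from the paper:

      # Ed and Ev as triangular fuzzy numbers (low, mode, high), propagated to an
      # overall recovery efficiency via alpha-cut interval arithmetic.
      def alpha_cut(tri, a):
          lo, mode, hi = tri
          return (lo + (mode - lo) * a, hi - (hi - mode) * a)

      Ed = (0.50, 0.60, 0.70)   # displacement efficiency (hypothetical)
      Ev = (0.40, 0.55, 0.65)   # sweep efficiency (hypothetical)

      for a in (0.0, 0.5, 1.0):
          (l1, h1), (l2, h2) = alpha_cut(Ed, a), alpha_cut(Ev, a)
          # supports are positive, so the product interval is [l1*l2, h1*h2]
          print(f"alpha={a}: recovery efficiency in [{l1*l2:.3f}, {h1*h2:.3f}]")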

  7. The conditional risk probability-based seawall height design method

    NASA Astrophysics Data System (ADS)

    Yang, Xing; Hu, Xiaodong; Li, Zhiqing

    2015-11-01

    The determination of the required seawall height is usually based on the combination of wind speed (or wave height) and still water level according to a specified return period, e.g., 50-year return period wind speed and 50-year return period still water level. In reality, the two variables are partially correlated. This may lead to the over-design (and excess cost) of seawall structures. The above-mentioned return period for the design of a seawall depends on the economy, society, and natural environment of the region. This means a specified risk level of overtopping or damage to a seawall structure is usually allowed. The aim of this paper is to present a conditional risk probability-based seawall height design method which incorporates the correlation of the two variables. For purposes of demonstration, wind speeds and water levels collected from Jiangsu, China are analyzed. The results show this method can improve seawall height design accuracy.
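
    A minimal sketch of the underlying idea, assuming a Gaussian dependence model and a hypothetical correlation value: mapping both design variables to standard normal scores shows how far the joint exceedance probability of the two 50-year values falls below the full-dependence assumption implicit in simply pairing them:

      from scipy.stats import norm, multivariate_normal

      p = 1 / 50                     # marginal 50-year exceedance probability
      u = norm.ppf(1 - p)            # normal score of each design value
      rho = 0.4                      # assumed wind/water-level correlation (hypothetical)

      joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
      p_both = 1 - 2 * norm.cdf(u) + joint.cdf([u, u])   # P(both exceeded)

      print("full dependence :", p)        # implicit when both 50-year values are paired
      print("independence    :", p * p)
      print("partial (rho)   :", p_both)   # between the two extremes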

  8. Probabilistic methods for structural response analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.; Cruse, T. A.

    1988-01-01

    This paper addresses current work to develop probabilistic structural analysis methods for integration with a specially developed probabilistic finite element code. The goal is to establish distribution functions for the structural responses of stochastic structures under uncertain loadings. Several probabilistic analysis methods are proposed, covering efficient structural probabilistic analysis methods, correlated random variables, and the response of linear systems under stationary random loading.

  9. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Credit risk premium analysis. 260.17 Section 260.17 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  10. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent both a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  11. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposures at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites were evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.

  12. A Meteorological Risk Assessment Method for Power Lines Based on GIS and Multi-Sensor Integration

    NASA Astrophysics Data System (ADS)

    Lin, Zhiyong; Xu, Zhimin

    2016-06-01

    Power lines, exposed in the natural environment, are vulnerable to various kinds of meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line, and lacks a comprehensive evaluation and analysis of the combined effects of multiple meteorological factors. In this paper, we use meteorological monitoring data obtained by multiple sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate a meteorological raster map from discrete meteorological monitoring data using spatial interpolation. Secondly, an expert-scoring-based analytic hierarchy process (AHP) is used to compute the power-line risk index for each kind of meteorological condition and to establish a mathematical model of meteorological risk. By applying this model in the ArcGIS raster calculator, we obtain a raster map showing the overall meteorological risk for power lines. Finally, by overlaying the power-line buffer layer on that raster map, we obtain the exact risk index around any given part of the power line, which provides significant guidance for power-line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province, China (wind, lightning, rain, ice, and temperature), we carry out the meteorological risk analysis for real power lines, and the experimental results demonstrate the feasibility and validity of the proposed method.
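
    The AHP step described above can be sketched as follows: expert pairwise comparisons of the meteorological factors go into a reciprocal matrix, whose principal eigenvector gives the factor weights used in the raster risk model. The comparison scores here are hypothetical:

      import numpy as np

      # wind, lightning, rain, ice, temperature (Saaty 1-9 scale, hypothetical)
      A = np.array([
          [1,   3,   2,   1/2, 4],
          [1/3, 1,   1/2, 1/4, 2],
          [1/2, 2,   1,   1/3, 3],
          [2,   4,   3,   1,   5],
          [1/4, 1/2, 1/3, 1/5, 1],
      ])

      vals, vecs = np.linalg.eig(A)
      k = np.argmax(vals.real)
      w = np.abs(vecs[:, k].real)
      w /= w.sum()                                  # normalised factor weights
      ci = (vals.real[k] - len(A)) / (len(A) - 1)   # consistency index
      print("weights:", np.round(w, 3), "CI:", round(ci, 3))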

  13. A structured elicitation method to identify key direct risk factors for the management of natural resources.

    PubMed

    Smith, Michael; Wallace, Ken; Lewis, Loretta; Wagner, Christian

    2015-11-01

    The high level of uncertainty inherent in natural resource management requires planners to apply comprehensive risk analyses, often in situations where there are few resources. In this paper, we demonstrate a broadly applicable, novel and structured elicitation approach to identify important direct risk factors. This new approach combines expert calibration and fuzzy-based mathematics to capture and aggregate subjective expert estimates of the likelihood that a set of direct risk factors will cause management failure. A specific case study is used to demonstrate the approach; however, the described methods are widely applicable in risk analysis. For the case study, the management target was to retain all species that characterise a set of natural biological elements. The analysis was bounded by the spatial distribution of the biological elements under consideration and a 20-year time frame. Fourteen biological elements were expected to be at risk. Eleven important direct risk factors were identified that related to surrounding land use practices, climate change, problem species (e.g., feral predators), fire and hydrological change. In terms of their overall influence, the two most important risk factors were salinisation and a lack of water, which together pose a considerable threat to the survival of nine biological elements. The described approach successfully overcame two concerns arising from previous risk analysis work: (1) the lack of an intuitive, yet comprehensive scoring method enabling the detection and clarification of expert agreement and associated levels of uncertainty; and (2) the difficulty of interpreting and communicating results while preserving a rich level of detail essential for informed decision making. PMID:27441228

  14. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    SciTech Connect

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16--18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  15. Risk prediction with machine learning and regression methods.

    PubMed

    Steyerberg, Ewout W; van der Ploeg, Tjeerd; Van Calster, Ben

    2014-07-01

    This is a discussion of issues in risk prediction based on the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler. PMID:24615859

  16. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt (1), T.R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711; (2) RTI International, RTP, NC. Several...

  17. Prioritizing risk analysis quality control plans based on Sigma-metrics.

    PubMed

    Westgard, Sten

    2013-03-01

    Six Sigma provides data-driven techniques that can enhance and improve the EP23 risk management approach for formulating quality control (QC) Plans. Risk analysis has significant drawbacks in its ability to identify and appropriately prioritize hazards and failure modes for mitigation of risks. Six Sigma quality management is inherently risk oriented on the basis of the required tolerance limits that define defective products. Six Sigma QC tools provide a quantitative assessment of method performance and an objective selection/design of statistical QC procedures. Furthermore, the observed sigma performance of a method is useful for prioritizing the need for development of QC plans. PMID:23331728
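
    For reference, the sigma-metric referred to above is commonly computed from the allowable total error (TEa), the observed bias, and the imprecision (CV), all expressed in percent. A minimal sketch with hypothetical values:

      # sigma-metric commonly used to grade laboratory method performance
      def sigma_metric(tea_pct, bias_pct, cv_pct):
          return (tea_pct - abs(bias_pct)) / cv_pct

      # e.g. allowable total error 10%, observed bias 1.5%, imprecision 2%
      print(sigma_metric(10.0, 1.5, 2.0))   # 4.25 sigma -> moderate QC effort needed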

  18. Contextual Risk Factors for Low Birth Weight: A Multilevel Analysis

    PubMed Central

    Kayode, Gbenga A.; Amoakoh-Coleman, Mary; Agyepong, Irene Akua; Ansah, Evelyn; Grobbee, Diederick E.; Klipstein-Grobusch, Kerstin

    2014-01-01

    Background Low birth weight (LBW) remains a leading cause of neonatal death and a major contributor to infant and under-five mortality. Its prevalence has not declined in the last decade in sub-Saharan Africa (SSA) and Asia. Some individual-level factors have been identified as risk factors for LBW, but knowledge of contextual risk factors for LBW is limited, especially in SSA. Methods Contextual risk factors for LBW in Ghana were identified by performing multivariable multilevel logistic regression analysis of 6,900 mothers dwelling in 412 communities that participated in the 2003 and 2008 Demographic and Health Surveys in Ghana. Results Contextual-level factors were significantly associated with LBW: Being a rural dweller increased the likelihood of having a LBW infant by 43% (OR 1.43; 95% CI 1.01–2.01; P-value <0.05) while living in poverty-concentrated communities increased the risk of having a LBW infant twofold (OR 2.16; 95% CI 1.29–3.61; P-value <0.01). In neighbourhoods with a high coverage of safe water supply the odds of having a LBW infant reduced by 28% (OR 0.74; 95% CI 0.57–0.96; P-value <0.05). Conclusion This study showed contextual risk factors to have independent effects on the prevalence of LBW infants. Being a rural dweller, living in a community with a high concentration of poverty and a low coverage of safe water supply were found to increase the prevalence of LBW infants. Implementing appropriate community-based intervention programmes will likely reduce the occurrence of LBW infants. PMID:25360709

  19. Ethnographic Analysis of Instructional Method.

    ERIC Educational Resources Information Center

    Brooks, Douglas M.

    1980-01-01

    Instructional methods are operational exchanges between participants within environments that attempt to produce a learning outcome. The classroom teacher's ability to produce a learning outcome is the measure of instructional competence within that learning method. (JN)

  20. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  1. Methods and Techniques for Risk Prediction of Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Hoffman, Chad R.; Pugh, Rich; Safie, Fayssal

    1998-01-01

    Since the Space Shuttle accident in 1986, NASA has been trying to incorporate probabilistic risk assessment (PRA) into decisions concerning the Space Shuttle and other NASA projects. One major study NASA is currently conducting is the establishment of an overall PRA model for the Space Shuttle system. The model is intended to provide a tool to predict Shuttle risk and to perform sensitivity analyses and trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) and its prime contractors, including Pratt and Whitney (P&W), are part of the NASA team conducting the PRA study. MSFC's responsibility involves modeling the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). A major challenge that faced the PRA team was modeling the Shuttle upgrades, mainly the P&W High Pressure Fuel Turbopump (HPFTP) and the High Pressure Oxidizer Turbopump (HPOTP). The purpose of this paper is to discuss the various methods and techniques used for predicting the risk of the P&W redesigned HPFTP and HPOTP.

  2. Stratospheric Aerosol and Gas Experiment, SAGE III on ISS, An Earth Science Mission on the International Space Station, Schedule Risk Analysis, A Project Perspective

    NASA Technical Reports Server (NTRS)

    Bonine, Lauren

    2015-01-01

    The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.

  3. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 % to 107 % of the current value, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis replicability at regional and national scale.
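
    The risk quantification described above rests on integrating consequences over event probabilities. A minimal sketch of that step, with hypothetical return periods and affected-population figures rather than the study's data:

      import numpy as np

      aep = np.array([0.1, 0.04, 0.01, 0.002])     # 10-, 25-, 100-, 500-year events
      affected = np.array([50, 400, 2500, 9000])   # people affected per event (hypothetical)

      # expected annual affected population: trapezoidal area under the risk curve
      ead = np.sum(0.5 * (affected[1:] + affected[:-1]) * (aep[:-1] - aep[1:]))
      print(f"expected annual affected population: {ead:.0f}")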

  4. Risk Analysis and the Construction of News.

    ERIC Educational Resources Information Center

    Wilkins, Lee; Patterson, Philip

    1987-01-01

    Explains that the news media commit fundamental errors of attribution in covering risk situations by (1) treating them as novelties, (2) failing to analyze the entire system, and (3) using insufficiently analytical language. (NKA)

  5. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  6. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  7. Application of advanced reliability methods to local strain fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, T. T.; Wirsching, P. H.

    1983-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) might become extremely difficult or very inefficient. This study suggests using a simple, and easily constructed, second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
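
    The approach lends itself to a compact sketch: fit a quadratic surrogate to a handful of "computer analysis" runs, then estimate the failure probability on the cheap surrogate. In the toy example below, an analytic limit state stands in for the local strain analysis, and plain Monte Carlo stands in for the Rackwitz-Fiessler fast probability integration; all values are hypothetical:

      import numpy as np

      rng = np.random.default_rng(1)

      def limit_state(x1, x2):          # g <= 0 means failure (toy stand-in model)
          return 3.0 - x1**2 / 4.0 - x2

      # fit a second-degree polynomial surrogate from a few "analysis" runs
      X = rng.normal(size=(50, 2))
      g = limit_state(X[:, 0], X[:, 1])
      design = np.column_stack([np.ones(50), X, X**2, X[:, :1] * X[:, 1:]])
      coef, *_ = np.linalg.lstsq(design, g, rcond=None)

      # estimate risk cheaply on the surrogate
      Xs = rng.normal(size=(200_000, 2))
      D = np.column_stack([np.ones(len(Xs)), Xs, Xs**2, Xs[:, :1] * Xs[:, 1:]])
      print("P(failure) ~", np.mean(D @ coef <= 0))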

  8. Proposal of a management method of rockfall risk induced on a road

    NASA Astrophysics Data System (ADS)

    Mignelli, C.; Peila, D.; Lo Russo, S.

    2012-04-01

    Many kilometers of roads have adjacent rock slopes that are prone to rockfall. The analysis of risks associated with these types of instabilities is a complex operation requiring the precise assessment of the hazard, the vulnerability, and therefore the risk to vehicles on roads along the foothills. Engineering design of protection devices should aim to minimize risk while taking advantage of the most advanced technologies. Decision makers should be equipped with technical tools permitting them to choose the best solution within the context of local maximum acceptable risk levels. The fulfilment of safety requirements for mountainside routes involves in many cases the implementation of protective measures and devices to control and manage rockfall, and the evaluation of the positive effects of such measures in terms of risk reduction is of key importance. A risk analysis management procedure for roads subject to rockfall phenomena, using a specifically developed method named Rockfall Risk Management (RO.MA.), is presented and discussed. The method is based on statistical tools, using as input data coming both from in situ surveys and from historical records. It is important to highlight that historical databases are often not available, and useful information is frequently missing because parameters were recorded incompletely; an analysis based only on historical data can therefore be difficult to develop. For this purpose, a specific database collection system has been developed to provide a geotechnical and geomechanical description of the studied rockside. These parameters, together with the data collected from historical databases, define the inputs of the RO.MA. method. Moreover, to allow the quantification of harm, data from the monitoring of the road by the road manager are required. The value of harm is proportional to the number of persons on the road (i.e., people in a vehicle) and the following traffic characteristics: type of vehicles (i.e. bicycles

  9. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    SciTech Connect

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs.

  10. Survey and evaluation of aging risk assessment methods and applications

    SciTech Connect

    Sanzo, D.L.; Kvam, P.; Apostolakis, G.; Wu, J.; Milici, T.; Ghoniem, N.; Guarro, S.

    1993-11-01

    The Nuclear Regulatory Commission (NRC) initiated the nuclear power plant aging research (NPAR) program about 6 years ago to gather information about nuclear power plant aging. Since then, this program has collected a significant amount of information, largely qualitative, on plant aging and its potential effects on plant safety. However, this body of knowledge has not yet been integrated into formalisms that can be used effectively and systematically to assess plant risk resulting from aging, although models for assessing the effect of increasing failure rates on core damage frequency have been proposed. The purpose of this review is to survey the work conducted to address the aging of systems, structures, and components (SSCs) of nuclear power plants (NPPs), as well as the associated databases. The review takes a critical look at the need to revise probabilistic risk assessments (PRAs) so that they will include the contribution to risk from plant aging, the adequacy of existing methods for evaluating this contribution, and the adequacy of the data that have been used in these evaluation methods. A preliminary framework is identified for integrating the aging of SSCs into the PRA, including the identification of needed data for such an integration.

  11. The role of risk analysis in understanding bioterrorism.

    PubMed

    Haas, Charles N

    2002-08-01

    Recent events have made the domestic risk from bioterrorism more tangible. The risk management process so far, however, has not benefited from many of the contributions that analysts, communicators, and managers can make to the public discourse. Risk professionals can contribute much to the understanding of and solutions to bioterrorist events and threats. This article will provide an overview of the bioterrorism problem and outline a number of areas to which members of the Society for Risk Analysis, and other risk practitioners, could usefully contribute. PMID:12224741

  12. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty of the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with calculated launch and deployment accident scenarios is low.

  13. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy

    2016-04-01

    This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damage, using a physical model of solid-state lasers and predictable levels of ionizing radiation and space weather. The following main subjects are covered by our report: (a) the solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented with consideration of deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. This method is based on the Conditional Value-at-Risk measure (CVaR) and the expected loss exceeding Value-at-Risk (VaR). We propose to use a new dynamical-information approach for radiation damage risk assessment of laser elements exposed to cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of the influence of space ionizing radiation on laser elements were developed for risk assessment in laser safety analysis. These are so-called 'black box' or 'input-output' models, which seek only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on laser systems and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of
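
    The two risk measures named above can be illustrated in a few lines: VaR is a quantile of the loss distribution, and CVaR is the mean loss beyond it. The loss distribution here is hypothetical:

      import numpy as np

      rng = np.random.default_rng(0)
      losses = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)   # hypothetical losses

      alpha = 0.95
      var = np.quantile(losses, alpha)        # Value-at-Risk at 95%
      cvar = losses[losses >= var].mean()     # expected loss given VaR is exceeded
      print(f"VaR(95%)  = {var:.3f}")
      print(f"CVaR(95%) = {cvar:.3f}")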

  14. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described. PMID:22317712
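
    For orientation, the classical FMEA ranking that the paper improves upon multiplies severity, occurrence, and detection scores into a risk priority number (RPN = S x O x D); the fuzzy and grey relational refinements are not reproduced here. The failure modes and scores below are hypothetical:

      # hypothetical use-error modes with (severity, occurrence, detection) scores, 1-10
      failure_modes = {
          "wrong patient orientation set": (7, 4, 3),
          "dose parameter mistyped":       (9, 3, 4),
          "emergency stop not located":    (10, 2, 2),
      }

      ranked = sorted(failure_modes.items(),
                      key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                      reverse=True)
      for mode, (s, o, d) in ranked:
          print(f"RPN={s*o*d:4d}  {mode}")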

  15. Metabolic Disease Risk in Children by Salivary Biomarker Analysis

    PubMed Central

    Goodson, J. Max; Kantarci, Alpdogan; Hartman, Mor-Li; Denis, Gerald V.; Stephens, Danielle; Hasturk, Hatice; Yaskell, Tina; Vargas, Jorel; Wang, Xiaoshan; Cugini, Maryann; Barake, Roula; Alsmadi, Osama; Al-Mutawa, Sabiha; Ariga, Jitendra; Soparkar, Pramod; Behbehani, Jawad; Behbehani, Kazem; Welty, Francine

    2014-01-01

    Objective The study of obesity-related metabolic syndrome or Type 2 diabetes (T2D) in children is particularly difficult because of fear of needles. We tested a non-invasive approach to study inflammatory parameters in an at-risk population of children to provide proof-of-principle for future investigations of vulnerable subjects. Design and Methods We evaluated metabolic differences in 744 11-year-old children selected from underweight, normal healthy weight, overweight and obese categories by analyzing fasting saliva samples for 20 biomarkers. Saliva supernatants were obtained following centrifugation and used for analyses. Results Salivary C-reactive protein (CRP) was 6 times higher, salivary insulin and leptin were 3 times higher, and adiponectin was 30% lower in obese children compared to healthy normal weight children (all P<0.0001). Categorical analysis suggested that there might be three types of obesity in children. Distinctly inflammatory characteristics appeared in 76% of obese children while in 13%, salivary insulin was high but not associated with inflammatory mediators. The remaining 11% of obese children had high insulin and reduced adiponectin. Forty percent of the non-obese children were found in groups which, based on biomarker characteristics, may be at risk for becoming obese. Conclusions Significantly altered levels of salivary biomarkers in obese children from a high-risk population suggest the potential for developing non-invasive screening procedures to identify T2D-vulnerable individuals and a means to test preventative strategies. PMID:24915044

  16. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L. Liu, H. Helen; Liao Zhongxing; Wei Xiong; Wang Shulian; Jin Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
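
    For context, the standard dose-volume-only Lyman model that the paper generalizes is commonly written (this is background from the NTCP literature, not the paper's generalized formulation) as

      NTCP = \Phi(t), \qquad
      t = \frac{\mathrm{gEUD} - TD_{50}}{m \, TD_{50}}, \qquad
      \mathrm{gEUD} = \Bigl( \sum_i v_i D_i^{1/n} \Bigr)^{n}

    where \Phi is the standard normal distribution function, TD_{50} is the dose giving a 50% complication rate, and m sets the slope. With n = 1 the generalized equivalent uniform dose reduces to the mean dose, consistent with the lung result quoted above.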

  17. Site Response and Liquefaction Risk Analysis for Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Hannich, D.; Ehret, D.; Hoetzl, H.; Grandas, C.; Huber, G.; Bala, A.

    2007-12-01

    Bucharest, the capital of Romania, with more than 2 million inhabitants, is considered, after Istanbul, the second-most earthquake-endangered metropolis in Europe. Four major earthquakes with moment magnitudes between 6.9 and 7.7 have hit Bucharest in the last 65 years. All disastrous earthquakes are generated within a small epicentral area, the Vrancea region, about 150 km northeast of Bucharest. Thick unconsolidated sedimentary layers in the area of Bucharest amplify the arriving seismic shear waves, causing severe destruction. Thus, pertinent site response analysis combined with liquefaction risk analysis for the city area is of the highest priority for disaster prevention and the mitigation of earthquake effects. Within the frame of the Collaborative Research Center (CRC) 461, "Strong Earthquakes: A Challenge for Geosciences and Civil Engineering," at the University of Karlsruhe, Germany, detailed field investigations of the near-surface soil layers in Bucharest were recently performed. These include Seismic Cone Penetration Tests (SCPTU) and seismic refraction measurements at shallow depths. SCPTU is used to obtain a detailed distribution of the shear wave velocities and in situ state parameters of soils. The results are used for site response analysis with linear and non-linear wave-propagation models as well as for a liquefaction risk analysis. Commonly, linear-equivalent models are used to simulate the response of the soil during earthquakes. Nevertheless, the linear-equivalent model cannot reproduce non-linear effects like liquefaction, layer isolation, and consolidation during shaking. Thus, a 1-D wave propagation model was used to study the influence of non-linear effects in the site response analysis. Cone resistance, sleeve friction, and pore water pressure, recorded continuously by the SCPTU method down to depths of 35 m, permit the factor of safety and the probability of liquefaction to be determined by simplified empirical methods for different soil layers at
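
    The simplified empirical liquefaction check mentioned above typically compares a cyclic stress ratio (CSR, after Seed and Idriss) with a cyclic resistance ratio (CRR) derived from the CPT data. A minimal sketch with hypothetical soil and ground-motion values:

      # CSR = 0.65 * (a_max / g) * (sigma_v / sigma_v') * r_d  (Seed-Idriss simplified form)
      def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, r_d):
          return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

      # hypothetical values: peak acceleration 0.24 g, stresses in kPa, depth factor 0.95
      csr = cyclic_stress_ratio(a_max_g=0.24, sigma_v=95.0, sigma_v_eff=60.0, r_d=0.95)
      crr = 0.22   # would come from the CPT-based correlation in practice
      print(f"CSR={csr:.3f}  FS={crr / csr:.2f}")   # FS < 1 -> liquefaction likely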

  18. Pesticide residues in cashew apple, guava, kaki and peach: GC-μECD, GC-FPD and LC-MS/MS multiresidue method validation, analysis and cumulative acute risk assessment.

    PubMed

    Jardim, Andréia Nunes Oliveira; Mello, Denise Carvalho; Goes, Fernanda Caroline Silva; Frota Junior, Elcio Ferreira; Caldas, Eloisa Dutra

    2014-12-01

    A multiresidue method for the determination of 46 pesticides in fruits was validated. Samples were extracted with acidified ethyl acetate, MgSO4 and CH3COONa and cleaned up by dispersive SPE with PSA. The compounds were analysed by GC-FPD, GC-μECD or LC-MS/MS, with LOQs from 1 to 8 μg/kg. The method was used to analyse 238 kaki, cashew apple, guava, and peach fruit and pulp samples, which were also analysed for dithiocarbamates (DTCs) using a spectrophotometric method. Over 70% of the samples were positive, with DTC present in 46.5%, λ-cyhalothrin in 37.1%, and omethoate in 21.8% of the positive samples. GC-MS/MS confirmed the identities of the compounds detected by GC. None of the pesticides found in kaki, cashew apple and guava was authorised for these crops in Brazil. The risk assessment has shown that the cumulative acute intake of organophosphorus or pyrethroid compounds from the consumption of these fruits is unlikely to pose a health risk to consumers. PMID:24996324

  19. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  20. Risk analysis of Finnish peacekeeping in Kosovo.

    PubMed

    Lehtomäki, Kyösti; Pääkkönen, Rauno J; Rantanen, Jorma

    2005-04-01

    The research team interviewed over 90 Finnish battalion members in Kosovo, visited 22 units or posts, registered its observations, and made any necessary measurements. Key persons were asked to list the most important risks for occupational safety and health in their area of responsibility. Altogether, 106 accidents and 40 cases of disease resulted in compensation claims in 2000. The risks to the peacekeeping force were about twice those of the permanent staff of military trainees in Finland. Altogether, 21 accidents or cases of disease resulted in sick leave for at least 3 months after service. Biological, chemical, and physical factors caused 8 to 9 occupational illnesses each. Traffic accidents, operational factors, and munitions and mines were evaluated to be the three most important risk factors, followed by occupational hygiene, living conditions (mold, fungi, dust), and general hygiene. Possible fatal risks, such as traffic accidents and munitions and explosives, received a high ranking in both the subjective and the objective evaluations. One permanent injury resulted from an explosion, and two traffic accidents involved a fatality, although not of a peacekeeper. The reduction of sports and military training accidents, risk-control programs, and, for some tasks, better personal protection is considered a development challenge for the near future. PMID:15876212

  1. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method of which the order is equal to the smaller order of the two original methods.

  2. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing if a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome. For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or

  3. Human-centered risk management for medical devices - new methods and tools.

    PubMed

    Janß, Armin; Plogmann, Simon; Radermacher, Klaus

    2016-04-01

    Studies regarding adverse events with technical devices in the medical context showed that in most cases non-usable interfaces are the cause of use deficiencies and therefore a potential source of harm for the patient and third parties. This is partially due to the lack of suitable methods for interlinking usability engineering and human-centered risk management. Especially regarding the early identification of human-induced errors and the systematic control of these failures, medical device manufacturers and in particular the developers have to be supported in order to guarantee reliable design and error-tolerant human-machine interfaces (HMI). In this context, we developed the HiFEM methodology and a corresponding software tool (mAIXuse) for model-based human risk analysis. Based on a two-fold approach, HiFEM provides a task-type-sensitive modeling structure with integrated temporal relations in order to represent and analyze the use process in a detailed way. The approach can be used from early developmental stages up to the validation process. Results of a comparative study with the HiFEM method and a classical process-failure mode and effect analysis (FMEA) show that the new modeling and analysis technique clearly outperforms the FMEA. In addition, we implemented a new method for systematic human risk control (mAIXcontrol). Accessing information from the method's knowledge base enables the operator to detect the most suitable countermeasures for a respective risk. Forty-one approved generic countermeasure principles have been indexed in a matrix as combinations of root causes and failures. The methodology has been tested in comparison to a conventional approach as well. Evaluation of the matrix and the reassessment of the risk priority numbers by a blind expert demonstrate a substantial benefit of the new mAIXcontrol method. PMID:26985681

  4. Oil shale health and environmental risk analysis

    SciTech Connect

    Gratt, L.B.

    1983-04-01

    The potential human health and environmental risks of a hypothetical one-million-barrel-per-day oil shale industry have been analyzed to serve as an aid in the formulation and management of a program of environmental research. The largest uncertainties for expected fatalities are in the public sector from air pollutants, although the occupational sector is estimated to have 60% more expected fatalities than the public sector. Occupational safety and illness have been analyzed for the oil shale fuel cycle from extraction to delivery of products for end use. Pneumoconiosis from the dust environment is the worker disease resulting in the greatest number of fatalities, followed by chronic bronchitis, internal cancer, and skin cancers, respectively. Research recommendations are presented for reducing the uncertainties in the risks analyzed and for filling data gaps to estimate other risks.

  5. Perceived Risks Associated with Contraceptive Method Use among Men and Women in Ibadan and Kaduna, Nigeria.

    PubMed

    Schwandt, Hilary M; Skinner, Joanna; Hebert, Luciana E; Saad, Abdulmumin

    2015-12-01

    Research shows that side effects are often the most common reason for contraceptive non-use in Nigeria; however, research to date has not explored the underlying factors that influence risk and benefit perceptions associated with specific contraceptive methods in Nigeria. A qualitative study design using focus group discussions was used to explore social attitudes and beliefs about family planning methods in Ibadan and Kaduna, Nigeria. A total of 26 focus group discussions were held in 2010 with men and women of reproductive age, disaggregated by city, sex, age, marital status, neighborhood socioeconomic status, and--for women only--family planning experience. A discussion guide was used that included specific questions about the perceived risks and benefits associated with the use of six different family planning methods. A thematic content analytic approach guided the analysis. Participants identified a spectrum of risks encompassing perceived threats to health (both real and fictitious) and social concerns, as well as benefits associated with each method. By exploring Nigerian perspectives on the risks and benefits associated with specific family planning methods, programs aiming to increase contraceptive use in Nigeria can be better equipped to highlight recognized benefits, address specific concerns, and work to dispel misperceptions associated with each family planning method. PMID:27337851

  6. Convex geometry analysis method of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gong, Yanjun; Wang, XiChang; Qi, Hongxing; Yu, BingXi

    2003-06-01

    We present a matrix expression of the convex geometry analysis method for hyperspectral data based on the linear mixing model and establish a mathematical model of endmembers. A 30-band remote sensing image is used to test the model. The results of the analysis reveal that the method can address mixed-pixel problems. Targets smaller than an Earth-surface pixel can be identified by applying the method.
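
    Under the linear mixing model referred to above, a pixel spectrum is a non-negative combination of endmember spectra, and abundances can be recovered by constrained least squares. The endmember matrix below is a hypothetical 4-band example:

      import numpy as np
      from scipy.optimize import nnls

      E = np.array([[0.9, 0.1, 0.2],     # 4 bands x 3 endmembers (hypothetical)
                    [0.8, 0.2, 0.3],
                    [0.2, 0.9, 0.4],
                    [0.1, 0.8, 0.5]])
      a_true = np.array([0.6, 0.3, 0.1])
      x = E @ a_true                     # synthetic mixed pixel

      a, resid = nnls(E, x)              # non-negative abundance estimate
      print("abundances:", np.round(a / a.sum(), 3))   # re-impose sum-to-one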

  7. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  8. Risk analysis: Wet weather flows in S.E. Michigan

    SciTech Connect

    Bulkley, J.W.

    1994-12-31

    Institutional aspects of risk analysis are examined in two separate activities in the state of Michigan. The state of Michigan's relative risk project, patterned after the US EPA's risk reduction report, provided an institutional mechanism to rank environmental issues facing the state of Michigan in terms of relative risk in 1991--92. This project identified 24 important environmental issues facing the state. These were categorized into four primary groups ranging from highest priority to medium priority. Seventeen of the identified issues are directly related to water resources, including water pollution in the Great Lakes. Combined sewer overflows are identified as the primary remaining focus for point-source discharges. The second institutional aspect of risk analysis considered in this paper is the development of a demonstration program for combined sewer overflow in the Rouge River Basin, which will help establish the trade-off between increased and risk reduction from such overflows.

  9. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design of an expert-system computer program, intended for use as a project-management tool, has been announced, along with the development of a prototype of the program. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  10. What risks do people perceive in everyday life? A perspective gained from the experience sampling method (ESM).

    PubMed

    Hogarth, Robin M; Portell, Mariona; Cuxart, Anna

    2007-12-01

    The experience sampling method (ESM) was used to collect data from 74 part-time students who described and assessed the risks involved in their current activities when interrupted at random moments by text messages. The major categories of perceived risk were short term in nature and involved "loss of time or materials" related to work and "physical damage" (e.g., from transportation). Using techniques of multilevel analysis, we demonstrate effects of gender, emotional state, and types of risk on assessments of risk. Specifically, females do not differ from males in assessing the potential severity of risks but they see these as more likely to occur. Also, participants assessed risks to be lower when in more positive self-reported emotional states. We further demonstrate the potential of ESM by showing that risk assessments associated with current actions exceed those made retrospectively. We conclude by noting advantages and disadvantages of ESM for collecting data about risk perceptions. PMID:18093044

  11. Analysis of driver casualty risk for different work zone types.

    PubMed

    Weng, Jinxian; Meng, Qiang

    2011-09-01

    Using driver casualty data from the Fatality Analysis Reporting System, this study examines driver casualty risk and investigates the risk contributing factors in construction, maintenance and utility work zones. The multiple t-test results show that driver casualty risk is statistically different depending on the work zone type. Moreover, construction work zones have the largest driver casualty risk, followed by maintenance and utility work zones. Three separate logistic regression models are developed to predict driver casualty risk for the three work zone types because of their unique features. Finally, the effects of risk factors on driver casualty risk for each work zone type are examined and compared. For all three work zone types, five significant risk factors including road alignment, truck involvement, most harmful event, vehicle age and notification time are associated with increased driver casualty risk, while traffic control devices and restraint use are associated with reduced driver casualty risk. However, one finding is that three risk factors (light condition, gender and day of week) exhibit opposing effects on driver casualty risk in different types of work zones. This may largely be due to different work zone features and driver behavior in different types of work zones. PMID:21658509
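
    The modeling step described above can be sketched with a standard logistic regression, one model per work-zone type. The data frame and predictor names below are hypothetical placeholders for the FARS variables, not the study's actual data:

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      n = 500
      df = pd.DataFrame({                       # hypothetical construction-zone records
          "casualty":  rng.integers(0, 2, n),
          "truck":     rng.integers(0, 2, n),
          "restraint": rng.integers(0, 2, n),
      })

      fit = smf.logit("casualty ~ truck + restraint", data=df).fit(disp=0)
      print(np.exp(fit.params))                 # odds ratios for each risk factor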

  12. On methods for assessing water-resource risks and vulnerabilities

    NASA Astrophysics Data System (ADS)

    Gleick, Peter H.

    2015-11-01

    Because of the critical role that freshwater plays in maintaining ecosystem health and supporting human development through agricultural and industrial production, there have been numerous efforts over the past few decades to develop indicators and indices of water vulnerability. Each of these efforts has tried to identify key factors that both offer insights into water-related risks and suggest strategies that might be useful for reducing those risks. These kinds of assessments have serious limitations associated with data, the complexity of water challenges, and the changing nature of climatic and hydrologic variables. This new letter by Padowski et al (2015 Environ. Res. Lett. 10 104014) adds to the field by broadening the kinds of measures that should be integrated into such tools, especially in the area of institutional characteristics, and by analyzing them in a way that provides new insights into the similarities and differences in water risks facing different countries; but much more can and should be done with new data and methods to improve our understanding of water challenges.

  13. 2007 Wholesale Power Rate Case Final Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2006-07-01

    BPA's operating environment is filled with numerous uncertainties, and thus the rate-setting process must take into account a wide spectrum of risks. The objective of the Risk Analysis is to identify, model, and analyze the impacts that key risks have on BPA's net revenue (total revenues less total expenses). This is carried out in two distinct steps: a risk analysis step, in which the distributions, or profiles, of operating and non-operating risks are defined, and a risk mitigation step, in which different rate tools are tested to assess their ability to recover BPA's costs in the face of this uncertainty. Two statistical models are used in the risk analysis step for this rate proposal, the Risk Analysis Model (RiskMod) and the Non-Operating Risk Model (NORM), while a third model, the ToolKit, is used to test the effectiveness of rate tool options in the risk mitigation step. RiskMod is discussed in Sections 2.1 through 2.4, the NORM is discussed in Section 2.5, and the ToolKit is discussed in Section 3. The models function together so that BPA can develop rates that cover all of its costs and provide a high probability of making its Treasury payments on time and in full during the rate period. By law, BPA's payments to Treasury are the lowest priority for revenue application, meaning that payments to Treasury are the first to be missed if financial reserves are insufficient to pay all bills on time. For this reason, BPA measures its potential for recovering costs in terms of the probability of being able to make Treasury payments on time (also known as Treasury Payment Probability or TPP).
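
    The TPP idea reduces to a simple Monte Carlo, sketched here with invented figures; BPA's actual RiskMod/NORM/ToolKit machinery is far more detailed:

        import numpy as np

        rng = np.random.default_rng(1)
        trials, years = 100_000, 2                          # two-year rate period
        reserves_start, treasury_payment = 500.0, 700.0     # $M, hypothetical

        net_revenue = rng.normal(750.0, 150.0, size=(trials, years))  # $M per year
        reserves = np.full(trials, reserves_start)
        paid_on_time = np.ones(trials, dtype=bool)
        for year in range(years):
            reserves += net_revenue[:, year] - treasury_payment
            paid_on_time &= reserves >= 0.0     # a miss in any year fails the test

        print(f"Treasury Payment Probability: {paid_on_time.mean():.1%}")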

  14. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. PMID:26010201
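
    The failure probability can also be approximated by simulation where closed-form expressions like the paper's are unavailable; below is a Monte Carlo sketch with an illustrative drifting Brownian risk process and a linearly rising critical level, both stand-ins:

        import numpy as np

        rng = np.random.default_rng(2)
        T, steps, paths = 10.0, 500, 10_000
        dt = T / steps
        t = np.linspace(dt, T, steps)
        critical = 5.0 + 0.2 * t                  # time-dependent critical risk level

        # stand-in risk process: Brownian motion with drift
        increments = 0.3 * dt + 0.9 * np.sqrt(dt) * rng.standard_normal((paths, steps))
        risk = np.cumsum(increments, axis=1)

        failed = (risk >= critical).any(axis=1)   # first passage within [0, T]
        print(f"estimated failure probability: {failed.mean():.3f}")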

  15. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
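
    As one concrete pairing of the valuation methods named, here is the Black-Scholes formula next to a Monte Carlo estimate for a European call; parameter values are arbitrary illustration numbers:

        import numpy as np
        from scipy.stats import norm

        S, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0   # spot, strike, rate, vol, years

        d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
        d2 = d1 - sigma * np.sqrt(T)
        bs = S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

        # risk-neutral simulation of the terminal price, then discounted payoff
        rng = np.random.default_rng(3)
        ST = S * np.exp((r - 0.5 * sigma**2) * T
                        + sigma * np.sqrt(T) * rng.standard_normal(1_000_000))
        mc = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
        print(f"Black-Scholes {bs:.4f} vs Monte Carlo {mc:.4f}")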

  16. Metabolic syndrome risk factors and dry eye syndrome: a Meta-analysis

    PubMed Central

    Tang, Ye-Lei; Cheng, Ya-Lan; Ren, Yu-Ping; Yu, Xiao-Ning; Shentu, Xing-Chao

    2016-01-01

    AIM To explore the relationship between metabolic risk factors and dry eye syndrome (DES). METHODS Studies on the association of metabolic syndrome risk factors (hypertension, hyperglycemia, obesity, and hyperlipidemia) with DES were retrieved from PubMed, Web of Science, and the Cochrane Library in December 2015. Odds ratios (ORs) with 95% confidence intervals (CIs) were pooled to evaluate the final relationship. Subgroup analyses were conducted according to the diagnostic criteria of DES. RESULTS Nine cross-sectional studies and three case-control studies were included in this Meta-analysis. The pooled results showed that people with hypertension, hyperglycemia, or hyperlipidemia had a higher risk of suffering from DES (P<0.05), especially for typical DES symptoms. On the other hand, obesity did not increase the risk of DES. CONCLUSION The present Meta-analysis suggests that all metabolic risk factors except obesity were risk factors for DES. PMID:27500114
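
    The pooling step is standard inverse-variance (fixed-effect) arithmetic; a sketch with placeholder ORs and CIs, not the twelve included studies:

        import numpy as np

        # (OR, CI low, CI high) per study, placeholders
        studies = [(1.4, 1.1, 1.8), (1.2, 0.9, 1.6), (1.6, 1.2, 2.1)]
        log_or = np.log([s[0] for s in studies])
        se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)

        w = 1.0 / se**2                       # inverse-variance weights
        pooled = np.sum(w * log_or) / np.sum(w)
        se_pooled = np.sqrt(1.0 / np.sum(w))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"pooled OR {np.exp(pooled):.2f} (95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")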

  17. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  18. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

    The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both projects provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  19. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, to monitor remedy effectiveness, and to evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID

  20. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, to monitor remedy effectiveness, and to evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr
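
    At equilibrium, the Cfree estimate itself is a one-line partitioning calculation; a back-of-envelope sketch with hypothetical values and assumed units (real PSM studies must also correct for non-equilibrium uptake):

        c_polymer = 2.5e5        # ng sorbed per kg polymer, measured (hypothetical)
        log_kpw = 4.5            # log10 polymer-water partition coefficient, L/kg (assumed)

        c_free = c_polymer / 10.0**log_kpw   # ng per L freely dissolved
        print(f"Cfree ~ {c_free:.1f} ng/L")  # ~7.9 ng/L for these made-up values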

  1. A risk analysis model in concurrent engineering product development.

    PubMed

    Wu, Desheng Dash; Kefan, Xie; Gang, Chen; Ping, Gui

    2010-09-01

    Concurrent engineering has been widely accepted as a viable strategy for companies to reduce time to market and achieve overall cost savings. This article analyzes various risks and challenges in product development under the concurrent engineering environment. A three-dimensional early warning approach for product development risk management is proposed by integrating graphical evaluation and review technique (GERT) and failure modes and effects analysis (FMEA). Simulation models are created to solve our proposed concurrent engineering product development risk management model. Solutions lead to identification of key risk controlling points. This article demonstrates the value of our approach to risk analysis as a means to monitor various risks typical in the manufacturing sector. This article has three main contributions. First, we establish a conceptual framework to classify various risks in concurrent engineering (CE) product development (PD). Second, we propose use of existing quantitative approaches for PD risk analysis purposes: GERT, FMEA, and product database management (PDM). Based on quantitative tools, we create our approach for risk management of CE PD and discuss solutions of the models. Third, we demonstrate the value of applying our approach using data from a typical Chinese motor company. PMID:20840492

  2. Hybrid methods for cybersecurity analysis :

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  3. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    PubMed

    Graves, Tabitha A; Royle, J Andrew; Kendall, Katherine C; Beier, Paul; Stetz, Jeffrey B; Macleod, Amy C

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against

  4. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    USGS Publications Warehouse

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against
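
    For orientation, a bare-bones single-method N-mixture likelihood (after Royle 2004) fitted to simulated counts; the study's hierarchical models add a second detection component and landscape covariates, which this sketch omits:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom, poisson

        rng = np.random.default_rng(4)
        R, J, lam_true, p_true, K = 150, 4, 3.0, 0.4, 40   # sites, visits, truth, N cutoff
        N = rng.poisson(lam_true, R)                        # latent abundance per site
        y = rng.binomial(N[:, None], p_true, size=(R, J))   # observed counts

        def neg_loglik(theta):
            lam = np.exp(theta[0])
            p = 1.0 / (1.0 + np.exp(-theta[1]))
            n = np.arange(K + 1)
            pr_n = poisson.pmf(n, lam)                      # prior over latent N
            # P(y_i | N=n, p), product over visits, for every site and candidate n
            lik = np.prod(binom.pmf(y[:, :, None], n[None, None, :], p), axis=1)
            return -np.sum(np.log(lik @ pr_n + 1e-300))     # marginalize N, sum sites

        fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
        lam_hat = np.exp(fit.x[0]); p_hat = 1.0 / (1.0 + np.exp(-fit.x[1]))
        print(f"lambda ~ {lam_hat:.2f} (true 3.0), p ~ {p_hat:.2f} (true 0.4)")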

  5. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    SciTech Connect

    Pawel, David; Leggett, Richard Wayne; Eckerman, Keith F; Nelson, Christopher

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  6. Dairy consumption and lung cancer risk: a meta-analysis of prospective cohort studies

    PubMed Central

    Yu, Yi; Li, Hui; Xu, Kaiwu; Li, Xin; Hu, Chunlin; Wei, Hongyan; Zeng, Xiaoyun; Jing, Xiaoli

    2016-01-01

    Background Lung cancer is the leading cause of cancer-related deaths worldwide. We conducted a meta-analysis to evaluate the relationship between dairy consumption and lung cancer risk. Methods The databases included EMBASE, Medline (PubMed), and Web of Science. The relationship between dairy consumption and lung cancer risk was analyzed by relative risk or odds ratio estimates with 95% confidence intervals (CIs). We identified eight prospective cohort studies, which amounted to 10,344 cases and 61,901 participants. Results For milk intake, relative risk was 0.95 (95% CI: 0.76–1.15); heterogeneity was 70.2% (P=0.003). For total dairy product intake, relative risk was 0.96 (95% CI: 0.89–1.03); heterogeneity was 68.4% (P=0.004). Conclusion There was no significant association between dairy consumption and lung cancer risk. PMID:26766916
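
    With I2 around 70%, a random-effects pooled estimate is the natural computation; a DerSimonian-Laird sketch with placeholder log-RRs and standard errors, not the eight cohorts:

        import numpy as np

        log_rr = np.log(np.array([0.80, 1.05, 0.90, 1.10, 0.85]))   # placeholders
        se = np.array([0.10, 0.12, 0.15, 0.09, 0.20])

        w = 1.0 / se**2
        fixed = np.sum(w * log_rr) / np.sum(w)
        Q = np.sum(w * (log_rr - fixed) ** 2)                 # Cochran's Q
        df = len(log_rr) - 1
        tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_star = 1.0 / (se**2 + tau2)                         # random-effects weights
        pooled = np.sum(w_star * log_rr) / np.sum(w_star)
        se_p = np.sqrt(1.0 / np.sum(w_star))
        i2 = max(0.0, (Q - df) / Q) * 100
        print(f"RR {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(pooled - 1.96*se_p):.2f}-{np.exp(pooled + 1.96*se_p):.2f}), "
              f"I2 = {i2:.0f}%")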

  7. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
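
    The Mantel-Haenszel fixed-effects OR named in the Methods is simple to compute from 2x2 trial tables; event counts below are invented placeholders, not the 58 trials:

        # (events_treated, n_treated, events_control, n_control) per trial
        tables = [(12, 1000, 15, 1000), (8, 800, 6, 790), (20, 1500, 22, 1480)]

        num = sum(a * (n2 - c) / (n1 + n2) for a, n1, c, n2 in tables)
        den = sum(c * (n1 - a) / (n1 + n2) for a, n1, c, n2 in tables)
        print(f"Mantel-Haenszel OR = {num / den:.2f}")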

  8. Analysis methods for photovoltaic applications

    SciTech Connect

    1980-01-01

    Because photovoltaic power systems are being considered for an ever-widening range of applications, it is appropriate for system designers to have knowledge of and access to photovoltaic power systems simulation models and design tools. This brochure gives brief descriptions of a variety of such aids and was compiled after surveying both manufacturers and researchers. Services available through photovoltaic module manufacturers are outlined, and computer codes for systems analysis are briefly described. (WHK)

  9. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
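
    A toy version of the propagation analysis: breadth-first search over directed interaction paths from hazard sources to vulnerable targets. Subsystem names and edges are invented, and the And-Or tree logic is omitted:

        from collections import deque

        edges = {                      # directed influence/interaction paths
            "power_bus": ["avionics", "pump_ctrl"],
            "pump_ctrl": ["coolant_loop"],
            "coolant_loop": ["avionics"],
            "avionics": [],
        }
        hazard_sources = ["power_bus"]
        vulnerable = {"avionics", "coolant_loop"}

        def reachable(src):
            seen, queue = {src}, deque([src])
            while queue:
                for nxt in edges.get(queue.popleft(), []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return seen

        for src in hazard_sources:
            hits = reachable(src) & vulnerable
            print(f"{src} can impact: {sorted(hits)}")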

  10. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period: the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control by relying solely on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202

  11. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas.

    PubMed

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period: the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control by relying solely on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202
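
    One plausible reading of the S-shaped R-D curve is a logistic in log return period; a curve-fitting sketch with invented damage points (the paper's fitted parameters are not reproduced here):

        import numpy as np
        from scipy.optimize import curve_fit

        T = np.array([5.0, 10.0, 20.0, 50.0, 66.0, 100.0, 200.0])  # return period, yr
        D = np.array([0.5, 1.5, 4.0, 9.0, 11.0, 13.0, 14.5])       # damage, invented units

        def r_d_curve(T, d_max, k, t0):      # logistic in log return period
            return d_max / (1.0 + np.exp(-k * (np.log(T) - np.log(t0))))

        (d_max, k, t0), _ = curve_fit(r_d_curve, T, D, p0=[15.0, 2.0, 30.0])
        print(f"D_max={d_max:.1f}, k={k:.2f}, inflection near T~{t0:.0f} yr")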

  12. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this approach. The main advantage of the methodology presented herein is that it provides a quantitative estimate of flooding risk before and after investing in non-structural risk mitigation measures, which can be of great interest for decision makers as it provides rational and solid information.
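
    The before/after risk estimate reduces to comparing expected annual damage (EAD), the integral of damage over exceedance probability; a sketch with invented numbers and a hypothetical non-structural measure:

        import numpy as np

        return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0])   # years
        p_exceed = 1.0 / return_periods
        damage_before = np.array([0.1, 0.8, 2.0, 4.5, 7.0, 10.0])   # M EUR, invented
        damage_after = damage_before * 0.7   # hypothetical effect of early warning

        def ead(p, d):
            # trapezoidal integral of damage over exceedance probability
            p, d = p[::-1], d[::-1]          # sort by ascending probability
            return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

        print(f"EAD before: {ead(p_exceed, damage_before):.2f} M EUR/yr")
        print(f"EAD after:  {ead(p_exceed, damage_after):.2f} M EUR/yr")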

  13. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  14. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis of the economics of STOL transports in air passenger traffic established the probability of achieving the expected internal rate of financial return, or better, in a hypothetical scheduled Washington/New York intercity operation.
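
    In the same spirit, a compact Monte Carlo of economic risk: simulate yearly cash flows, compute each trial's internal rate of return by bisection, and report the probability of meeting a target (all figures are invented, not the study's):

        import numpy as np

        rng = np.random.default_rng(5)
        trials, years, invest, target = 2_000, 10, 100.0, 0.10

        def irr(cashflows, lo=-0.99, hi=1.0):
            npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
            f_lo = npv(lo)
            if f_lo * npv(hi) > 0:
                return float("nan")          # IRR not bracketed in (lo, hi)
            for _ in range(60):              # bisection on NPV(r) = 0
                mid = 0.5 * (lo + hi)
                if f_lo * npv(mid) > 0:
                    lo, f_lo = mid, npv(mid)
                else:
                    hi = mid
            return 0.5 * (lo + hi)

        flows = rng.normal(18.0, 6.0, size=(trials, years))    # $M/yr, invented
        rates = np.array([irr([-invest, *cf]) for cf in flows])
        # NaN trials (no bracketed root) count as not meeting the target
        print(f"P(IRR >= {target:.0%}) ~ {(rates >= target).mean():.1%}")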

  15. Nonstationary risk analysis of climate extremes

    NASA Astrophysics Data System (ADS)

    Chavez-Demoulin, V.; Davison, A. C.; Suveges, M.

    2009-04-01

    There is growing interest in the modelling of the size and frequency of rare events in a changing climate. Standard models for extreme events are based on the modelling of annual maxima or exceedances over high or under low thresholds: in either case appropriate probability distributions are fitted to the data, and extrapolation to rare events is based on the fitted models. Very often, however, extremal models do not take full advantage of techniques that are standard in other domains of statistics. Smoothing methods are now well-established in many domains of statistics, and are increasingly used in the analysis of extremal data. The crucial idea of smoothing is to replace a simple linear or quadratic form of dependence of one variable on another by a more flexible form, and thus to 'allow the data to speak for themselves' and, perhaps, to reveal unexpected features. There are many approaches to smoothing in the context of linear regression, of which the use of spline smoothing and of local polynomial modelling are perhaps the most common. Under the first, a basis of spline functions is used to represent the dependence; often this is called generalised additive modelling. Under the second, polynomial models are fitted locally to the data, resulting in a more flexible overall fit. The selection of the degree of smoothing is crucial, and there are automatic ways to do this. The talk will describe some applications of smoothing to data on temperature extremes, elucidating the relation between cold winter weather in the Alps and the North Atlantic Oscillation, and changes in the lengths of unusually hot and cold spells in Britain. The work mixes classical models for extremes, generalised additive modelling, local polynomial smoothing, and the bootstrap. References: Chavez-Demoulin, V. and Davison, A. C. (2005) Generalized additive modelling of sample extremes. Applied Statistics, 54, 207-222. Süveges, M. (2007) Likelihood estimation of the extremal index. Extremes, 10
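
    A stationary starting point for such extreme-value work, before any smoothing or covariates are added: fitting a GEV to simulated annual maxima with scipy and reading off a return level (note that scipy's shape parameter c is minus the usual xi):

        import numpy as np
        from scipy.stats import genextreme

        rng = np.random.default_rng(6)
        annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=3.0, size=60,
                                    random_state=rng)    # simulated annual maxima

        c, loc, scale = genextreme.fit(annual_max)       # maximum likelihood fit
        level_100yr = genextreme.ppf(1.0 - 1.0 / 100.0, c, loc, scale)
        print(f"shape={c:.2f}, loc={loc:.1f}, scale={scale:.1f}, "
              f"100-yr return level ~ {level_100yr:.1f}")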

  16. SARA (System Analysis and Risk Assessment): An NRC risk management tool

    SciTech Connect

    Sattison, M.B.; Robinson, R.C.

    1987-01-01

    The System Analysis and Risk Assessment (SARA) system is being developed at the Idaho National Engineering Laboratory under the direction and guidance of the Office of Nuclear Regulatory Research. The objective of this project is to provide a personal computer (PC)-based, user-friendly system for the computation and analysis of information on nuclear power plant risk characteristics. The latest released version of SARA (Version 2.0) contains the probabilistic risk assessment (PRA) results from four of the five plants included in the NUREG-1150 study. These plants are Surry, Peach Bottom, Sequoyah, and Grand Gulf.

  17. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  18. Anticipating risk for human subjects participating in clinical research: application of Failure Mode and Effects Analysis.

    PubMed

    Cody, Robert J

    2006-03-01

    Failure Mode and Effects Analysis (FMEA) is a method applied in various industries to anticipate and mitigate risk. This methodology can be more systematically applied to the protection of human subjects in research. The purpose of FMEA is simple: prevent problems before they occur. By applying FMEA process analysis to the elements of a specific research protocol, the failure severity, occurrence, and detection rates can be estimated for calculation of a "risk priority number" (RPN). Methods can then be identified to reduce the RPN to levels where the risk/benefit ratio favors human subject benefit, to a greater magnitude than existed in the pre-analysis risk profile. At the very least, the approach provides a checklist of issues that can be individualized for specific research protocols or human subject populations. PMID:16537191
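
    The RPN arithmetic is exactly as described; a sketch with made-up protocol failure modes, each scored 1-10 for severity, occurrence, and (non-)detection:

        # hypothetical failure modes: (severity, occurrence, detection), each 1-10
        failure_modes = {
            "consent form misunderstood":  (6, 4, 5),
            "dose miscalculation":         (9, 2, 3),
            "missed adverse-event report": (8, 3, 6),
        }
        rpn = {m: s * o * d for m, (s, o, d) in failure_modes.items()}
        for mode, score in sorted(rpn.items(), key=lambda kv: -kv[1]):
            print(f"RPN {score:3d}  {mode}")   # mitigate the highest RPNs first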

  19. A Fuzzy Set Risk Analysis Technique for Water Supply under Climate Change

    NASA Astrophysics Data System (ADS)

    Bogardi, I.

    2012-04-01

    The main purpose of the risk analysis technique is to select/rank management actions for local waterworks/water authorities under climate change. Risk of water supply is influenced by the available water supply, water demand and the consequences of water shortage. Due to the inherently uncertain climate change estimation, all these quantities influenced by climate change are also uncertain. Thus, the combination of water shortages and consequences may be accomplished in a risk analysis framework. Often, frequency-based statistical information is unavailable, so common probabilistic risk analysis may not be applicable. To this end, a non-probabilistic risk analysis is presented that is relatively simple, practical and applicable with available data/information. The method is based on simplified fuzzy set mathematics; supply, demand and consequences are represented as uncertain (fuzzy) numbers. The methodology has four main parts: (1) formulation of alternative management actions; (2) definition of the structure of ranking criteria; (3) estimation of ranking criteria values for each management action; and (4) ranking of the management actions according to the ranking criteria. Management actions are evaluated according to several criteria. One group of criteria considers water supply risk reduction for the various users. Another group of criteria, related to the realization of the actions, may also be necessary. Both water quantity and quality risks are considered. The use of the risk analysis technique is illustrated by a case example.
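
    A minimal sketch of the simplified fuzzy arithmetic such a method can rest on, assuming triangular fuzzy numbers and centroid-based ranking; all numbers are invented:

        def tri_add(a, b):                    # (low, mode, high) triangular numbers
            return tuple(x + y for x, y in zip(a, b))

        def centroid(a):                      # defuzzify a triangle for ranking
            return sum(a) / 3.0

        # fuzzy shortage consequence = fuzzy quantity risk + fuzzy quality risk
        consequence = {
            "new reservoir":     tri_add((1, 3, 6), (1, 2, 4)),   # -> (2, 5, 10)
            "demand management": tri_add((2, 5, 9), (1, 3, 5)),   # -> (3, 8, 14)
            "new wellfield":     tri_add((2, 4, 7), (2, 3, 6)),   # -> (4, 7, 13)
        }
        for action in sorted(consequence, key=lambda k: centroid(consequence[k])):
            print(f"{centroid(consequence[action]):5.2f}  {action}")   # lowest risk first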

  20. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  1. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2008-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related to the different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:19048472

  2. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related to the different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs into the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:21384330

  3. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  4. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  5. Environmental transport in the Oil Shale Risk Analysis.

    PubMed

    Feerer, J L; Gratt, L B

    1983-06-01

    The Oil Shale Risk Analysis differs from similar efforts in coal and nuclear energy in that the industry is not yet developed to a commercial scale. Many assumptions are necessary to predict the future oil shale industry pollutants, the environmental transport of these pollutants, and subsequent human health and environmental effects. The environmental transport analysis in the Oil Shale Risk Analysis is used as an example of applying assumptions to the best available data to predict potential environmental effects of a future commercial industry. The analysis provides information to aid in formulating and managing a program of environmental research focused on reducing uncertainties in critical areas. PMID:6879167

  6. Analysis of Risk Management in Adapted Physical Education Textbooks

    ERIC Educational Resources Information Center

    Murphy, Kelle L.; Donovan, Jacqueline B.; Berg, Dominck A.

    2016-01-01

    Physical education teacher education (PETE) programs vary on how the topics of safe teaching and risk management are addressed. Common practices to cover such issues include requiring textbooks, lesson planning, peer teaching, videotaping, reflecting, and reading case law analyses. We used a mixed methods design to examine how risk management is…

  7. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling the detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. PMID:22410502
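
    The probabilistic modification in a few lines: occurrence and detection become estimated relative frequencies, severity stays categorical, and the frequency of undetected failures follows directly. The failure modes and frequencies below are invented, not the paper's NIR example:

        # name: (P(occurrence) per analysis, P(detection | occurrence), severity 1-10)
        modes = {
            "baseline drift":           (0.020, 0.90, 4),
            "wrong reference spectrum": (0.005, 0.60, 9),
            "sample mix-up":            (0.010, 0.95, 8),
        }
        for name, (p_occ, p_det, severity) in modes.items():
            p_undetected = p_occ * (1.0 - p_det)    # occurs and slips through
            print(f"{name:26s} P(undetected) = {p_undetected:.4f}  severity {severity}")

        # rare-event approximation for the full procedure (sum of small probabilities)
        total = sum(p * (1.0 - d) for p, d, _ in modes.values())
        print(f"any undetected failure per analysis: ~{total:.4f}")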

  8. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails considerable costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. PMID:26061899

  9. PRIME VALUE METHOD TO PRIORITIZE RISK HANDLING STRATEGIES

    SciTech Connect

    Noller, D

    2007-10-31

    Funding for implementing risk handling strategies typically is allocated according to either the risk-averse approach (the worst risk first) or the cost-effective approach (the greatest risk reduction per implementation dollar first). This paper introduces a prime value approach in which risk handling strategies are prioritized according to how nearly they meet the goals of the organization that disburses funds for risk handling. The prime value approach factors in the importance of the project in which the risk has been identified, elements of both risk-averse and cost-effective approaches, and the time period in which the risk could happen. This paper also presents a prioritizer spreadsheet, which employs weighted criteria to calculate a relative rank for the handling strategy of each risk evaluated.
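
    A few-line stand-in for the prioritizer spreadsheet, assuming three weighted criteria in the spirit of the prime value approach; weights, criteria and scores are invented:

        weights = {"importance": 0.4, "reduction_per_dollar": 0.4, "urgency": 0.2}
        strategies = {                     # scores on a 0-10 scale, hypothetical
            "redundant pump":   {"importance": 8, "reduction_per_dollar": 4, "urgency": 6},
            "added inspection": {"importance": 5, "reduction_per_dollar": 9, "urgency": 4},
            "design change":    {"importance": 9, "reduction_per_dollar": 3, "urgency": 9},
        }
        score = {s: sum(weights[c] * v for c, v in crit.items())
                 for s, crit in strategies.items()}
        for s, v in sorted(score.items(), key=lambda kv: -kv[1]):
            print(f"{v:4.1f}  {s}")        # fund from the top until the budget runs out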

  10. Land Use Adaptation Strategies Analysis in Landslide Risk Region

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Ching; Chang, Chin-Hsin; Chen, Ying-Tung

    2013-04-01

    In order to respond to the impact of climate and environmental change on Taiwanese mountain regions, this study used the GTZ (2004) risk analysis guidelines to assess the landslide risk for 178 Taiwanese mountain towns. Seven indicators were used to assess landslide risk: rainfall distribution, natural environment vulnerability (e.g., rainfall threshold criterion for debris flow, historical disaster frequency, landslide ratio, and road density), physical vulnerability (e.g., population density) and socio-economic vulnerability (e.g., population with higher education, death rate and income). The landslide risk map is obtained by multiplying the seven indicators together and ranking the product. The map has five risk ranges; towns within the range of 4 to 5 are high landslide risk regions and have high priority for risk reduction. This study collected the high landslide risk regions and analyzed the differences after Typhoon Morakot (2009). The spatial distribution showed that after significant environmental damage, high landslide risk regions moved from central to southern Taiwan. The changing pattern of risk regions points to the necessity of updating the risk map periodically. Based on the landslide risk map and the land use investigation data provided by the National Land Surveying and Mapping Center in 2007, this study calculated the size of the land use area exposed to landslide risk. Based on these results, appropriate land use adaptation strategies can be suggested for reducing landslide risk under the impact of climate and environmental change.
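
    The map algebra reduces to a product of normalized indicators cut into five classes; a sketch with random stand-in indicator values for the 178 towns (the quantile binning is an assumption, not the study's exact ranking rule):

        import numpy as np

        rng = np.random.default_rng(9)
        towns, k = 178, 7
        indicators = rng.uniform(0.1, 1.0, size=(towns, k))   # normalized indicator values
        product = indicators.prod(axis=1)

        # cut the ranked product into five risk classes (1 = lowest, 5 = highest)
        edges = np.quantile(product, [0.2, 0.4, 0.6, 0.8])
        risk_class = np.digitize(product, edges) + 1
        print("towns in classes 4-5 (priority for risk reduction):",
              int((risk_class >= 4).sum()))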

  11. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials. PMID:24239259
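
    The per-route expected-cost arithmetic that such an analysis aggregates nationwide looks roughly like this; all rates and costs are invented placeholders, not HMTECM outputs:

        cars_per_year = 5_000
        route_miles = 800.0
        accident_rate = 1.5e-7     # accidents per car-mile, assumed
        p_release = 0.05           # release given an accident (tank car design), assumed
        cleanup_cost = 2.0e6       # USD per release, hypothetical

        annual_risk = (cars_per_year * route_miles * accident_rate
                       * p_release * cleanup_cost)
        per_car_mile = annual_risk / (cars_per_year * route_miles)
        print(f"annual risk cost ~ ${annual_risk:,.0f}; ${per_car_mile:.4f} per car-mile")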

  12. Poultry consumption and prostate cancer risk: a meta-analysis

    PubMed Central

    He, Qian; Wan, Zheng-ce; Xu, Xiao-bing; Wu, Jing

    2016-01-01

    Background. Several kinds of foods are hypothesized to be potential factors contributing to the variation of prostate cancer (PCa) incidence. But the effect of poultry on PCa is still inconsistent, and no quantitative assessment has been published to date. We therefore conducted this meta-analysis to clarify the association between them. Materials and Methods. We conducted a literature search of PubMed and Embase for studies examining the association between poultry consumption and PCa up to June, 2015. Pooled risk ratio (RR) and corresponding 95% confidence interval (CI) of the highest versus lowest poultry consumption categories were calculated by fixed-effect model or random-effect model. Results. A total of 27 (12 cohort and 15 case-control) studies comprising 23,703 cases and 469,986 noncases were eligible for inclusion. The summary RR of total PCa incidence was 1.03 (95% CI [0.95–1.11]) for the highest versus lowest categories of poultry intake. The heterogeneity between studies was not statistically significant (P = 0.768, I2 = 28.5%). Synthesized analysis of 11 studies on high stage PCa and 8 studies on chicken exposure also demonstrated null association. We also did not obtain a significant association in the cohort study subgroup (RR = 1.04, 95% CI [0.98–1.10]), nor in the population-based and hospital-based case-control subgroups. The studies were then divided into three geographic groups: Western countries, Asia and South America. The pooled RRs in these areas did not reveal a statistically significant association between poultry and PCa. Conclusions. This meta-analysis suggests no association between poultry consumption and PCa risk. Further well-designed studies are warranted to confirm the result. PMID:26855875

  13. The Qualitative Method of Impact Analysis.

    ERIC Educational Resources Information Center

    Mohr, Lawrence B.

    1999-01-01

    Discusses qualitative methods of impact analysis and provides an introductory treatment of one such approach. Combines an awareness of an alternative causal epistemology with current knowledge of qualitative methods of data collection and measurement to produce an approach to the analysis of impacts. (SLD)

  14. Comparison of 3 Methods for Identifying Dietary Patterns Associated With Risk of Disease

    PubMed Central

    DiBello, Julia R.; Kraft, Peter; McGarvey, Stephen T.; Goldberg, Robert; Campos, Hannia

    2008-01-01

    Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994–2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%–46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals. PMID:18945692
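
    PCA and PLS can be compared side by side on synthetic intake data: PCA extracts maximal-variance patterns ignoring the outcome, while PLS pulls patterns toward covariance with disease status. Food groups and effects are invented, and reduced rank regression is omitted for brevity:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(7)
        n, foods = 500, 6                     # columns: hypothetical food groups
        X = rng.normal(size=(n, foods))
        y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + rng.normal(size=n)   # synthetic outcome

        pca_pattern = PCA(n_components=1).fit(X).components_[0]
        pls = PLSRegression(n_components=1).fit(X, y)
        pls_pattern = pls.x_weights_[:, 0]

        print("PCA loadings:", np.round(pca_pattern, 2))
        print("PLS loadings:", np.round(pls_pattern, 2))   # weighted toward y-related foods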

  15. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems, or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  16. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  17. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is expected, in most parts of Sweden, to result in increased precipitation and increased sea water levels, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate-related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for ex-ante landslide consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases to quantify the consequences, represented by ex-ante estimated monetary losses. The results will be used at the national and regional levels, and as an indication of the risk at the local level, to assess the need for measures to mitigate the risk. The costs and environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have pronounced a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability
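
    The risk definition used here, probability combined with consequence, can be illustrated with a minimal raster calculation. The sketch below multiplies a synthetic landslide-probability grid by a synthetic expected-loss grid; both grids are invented stand-ins for the GIS layers described above.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic rasters over a small grid (stand-ins for GIS layers):
    # annual landslide probability per cell and monetary loss if one occurs.
    p_landslide = rng.uniform(0.0, 0.05, size=(4, 5))     # probability / year
    loss_if_hit = rng.uniform(0.0, 2.0e6, size=(4, 5))    # expected loss (EUR)

    # Risk = probability x consequence, cell by cell.
    risk = p_landslide * loss_if_hit                      # EUR / year

    print("expected annual loss per cell (kEUR):")
    print((risk / 1e3).round(1))
    print(f"total expected annual loss: {risk.sum()/1e6:.2f} MEUR")
    ```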

  18. FREQUENCY ANALYSIS OF PESTICIDE CONCENTRATIONS FOR RISK ASSESSMENT (FRANCO MODEL)

    EPA Science Inventory

    This report describes a method for statistically characterizing the occurrence and duration of pesticide concentrations in surface waters receiving runoff from agricultural lands. The characterization bridges the gap between simulated instream pesticide modeling and the risk asse...

  19. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute a system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities, and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk-informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  20. Issues in benchmarking human reliability analysis methods : a literature review.

    SciTech Connect

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.; Hendrickson, Stacey M. Langfitt; Boring, Ronald L.

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  1. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    SciTech Connect

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester; Tuan Q. Tran; Erasmia Lois

    2010-06-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  2. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
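
    The static (lower-bound) formulation of plastic analysis can be written as a small linear program. The sketch below is a toy instance rather than the author's models: it finds the plastic collapse load factor of a fixed-ended beam under a central point load with scipy's linprog, and the section values are arbitrary.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Static (lower-bound) plastic analysis of a fixed-ended beam with a central
    # point load, as a toy instance of the LP models described above.
    # Unknowns: end moments M_A, M_B, midspan moment M_C, load factor lam.
    mp, p, length = 100.0, 10.0, 4.0   # plastic moment, load, span (units arbitrary)

    # Equilibrium: M_C - (M_A + M_B)/2 = lam * p * length / 4
    a_eq = np.array([[-0.5, -0.5, 1.0, -p * length / 4.0]])
    b_eq = np.array([0.0])

    # Maximize lam (linprog minimizes, so negate), with |M| <= mp at each section.
    c = np.array([0.0, 0.0, 0.0, -1.0])
    bounds = [(-mp, mp), (-mp, mp), (-mp, mp), (0, None)]

    res = linprog(c, A_eq=a_eq, b_eq=b_eq, bounds=bounds)
    print(f"collapse load factor lam = {res.x[3]:.2f}")  # analytic: 8*mp/(p*length) = 20.0
    ```

    The dual of this static problem is the kinematic (mechanism) formulation mentioned in the abstract, so the same LP solver recovers both bounds.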

  3. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  4. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  5. Relative risk analysis of several manufactured nanomaterials: an insurance industry context.

    PubMed

    Robichaud, Christine Ogilvie; Tanzil, Dicksen; Weilenmann, Ulrich; Wiesner, Mark R

    2005-11-15

    A relative risk assessment is presented for the industrial fabrication of several nanomaterials. The production processes for five nanomaterials were selected for this analysis, based on their current or near-term potential for large-scale production and commercialization: single-walled carbon nanotubes, buckyballs (C60), one variety of quantum dots, alumoxane nanoparticles, and nano-titanium dioxide. The assessment focused on the activities surrounding the fabrication of nanomaterials, exclusive of any impacts or risks associated with the nanomaterials themselves. A representative synthesis method was selected for each nanomaterial based on its potential for scaleup. A list of input materials, output materials, and waste streams for each step of fabrication was developed and entered into a database that included key process characteristics such as temperature and pressure. The physical-chemical properties and quantities of the inventoried materials were used to assess relative risk based on factors such as volatility, carcinogenicity, flammability, toxicity, and persistence. These factors were first used to qualitatively rank risk, and then combined using an actuarial protocol developed by the insurance industry for the purpose of calculating insurance premiums for chemical manufacturers. This protocol ranks three categories of risk relative to a 100-point scale (where 100 represents maximum risk): incident risk, normal operations risk, and latent contamination risk. Results from this analysis determined that the relative environmental risk from manufacturing each of these five materials was comparatively low in relation to other common industrial manufacturing processes. PMID:16323804
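
    The 100-point ranking idea can be sketched generically. The snippet below combines hypothetical factor scores with assumed weights into a 0-100 relative risk value for two of the materials named above; the actual actuarial protocol, its factors and its weights are not public, so everything here is invented for illustration.

    ```python
    import numpy as np

    # Illustrative scoring of fabrication processes on a 100-point relative risk
    # scale, loosely following the protocol described above (factor weights and
    # scores are invented assumptions).
    factors = ["volatility", "carcinogenicity", "flammability", "toxicity", "persistence"]
    weights = np.array([0.15, 0.30, 0.20, 0.25, 0.10])   # assumed factor weights

    # Factor scores per process on a 0-10 scale (hypothetical).
    processes = {
        "carbon nanotubes": np.array([3, 5, 4, 4, 6]),
        "nano-TiO2":        np.array([1, 3, 1, 3, 7]),
    }

    for name, scores in processes.items():
        # Weighted average rescaled to the 0-100 relative risk scale.
        risk_100 = float(weights @ scores / 10 * 100)
        print(f"{name:18s} relative risk = {risk_100:.0f} / 100")
    ```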

  6. In vitro method for medical risk assessment of laser fumes

    NASA Astrophysics Data System (ADS)

    Malkusch, W.; Rehn, B.; Bruch, J.

    1995-02-01

    Laser processing of different materials may produce toxic fumes. In preventive occupational medicine it is necessary to evaluate valid hygienic standards for work places. The basis for such hygienic standards is the classification of laser fumes by their fibrogenic, emphysematous, immunological or other harmful potencies in biological assay systems. This paper is part of a European project on laser safety. Our part in this project is the development of a method for the investigation of lung responses using in vitro cell assays. The appropriate laser fume samples will be supplied by other groups in this European project. In contrast to the cell assays usually used in risk assessment, our method is based on isolated target cells in the lung, such as alveolar macrophages. The test criteria are mediator release, surfactant reactions, release of reactive oxygen species and cell proliferation. As demonstrated in the lung response to other dusts (minerals, fibres etc) these parameters are medically relevant factors in the pathogenic alveolar dust response. The paper gives basic information about the method using lung cell assays and the results of known substances, in comparison with a dust generated by laser processing.

  7. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety risk analysis for decontamination and decommissioning projects.

  8. Statin use and risk of fracture: a meta-analysis

    PubMed Central

    Jin, Shaolin; Jiang, Jianping; Bai, Pengcheng; Zhang, Mei; Tong, Xujun; Wang, Hui; Lu, Youqun

    2015-01-01

    This meta-analysis investigates the association between statin use and fracture risk. Two reviewers independently searched six databases: PubMed, Cochrane Library, Ovid, Embase, China National Knowledge Infrastructure (CNKI) and Wanfang. Studies retrieved from the database searches were screened using stringent inclusion and exclusion criteria. A total of 17 studies, published between 2000 and 2014, were included in this meta-analysis. The results suggested that statin use was associated with a decreased risk of fracture (OR=0.80; 95% CI, 0.73-0.88; P < 0.00001). In the subgroup analysis by study design, statin use was significantly associated with a decreased risk of fracture in both case-control studies (OR=0.67; 95% CI, 0.55-0.87; P < 0.0001) and cohort studies (OR=0.86; 95% CI, 0.77-0.97; P=0.02). In the female subgroup analysis, statin users showed decreased fracture risk (OR=0.76; 95% CI, 0.63-0.92; P=0.005). In the subgroup analysis by duration of follow-up, studies with both long and short follow-up showed a decreased risk of fracture (OR=0.67; 95% CI, 0.54-0.82; P=0.001 and OR=0.85; 95% CI, 0.74-0.96; P=0.01, respectively). Studies with large and small sample sizes likewise showed a decreased risk of fracture (OR=0.85; 95% CI, 0.77-0.94; P=0.002 and OR=0.65; 95% CI, 0.54-0.78; P < 0.0001, respectively). In conclusion, this meta-analysis suggests a significant association between statin use and decreased fracture risk. PMID:26221409

  9. Contract Negotiations Supported Through Risk Analysis

    NASA Astrophysics Data System (ADS)

    Rodrigues, Sérgio A.; Vaz, Marco A.; Souza, Jano M.

    Many clients view software as a commodity, so it is critical that IT sellers know how to build value into their offerings to differentiate their services from all the others. Clients sometimes refuse to contract software development due to a lack of technical understanding, or simply because they are afraid of IT contractual commitments. IT negotiators who recognize the importance of this issue, and the reason why it is a problem, will be able to work toward the commercial terms they want. This chapter therefore aims to stimulate IT professionals to improve their negotiation skills, and presents a computational tool that supports managers in getting the best out of software negotiations through the identification of contract risks.

  10. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping have gained attention. Regional to national risk assessments are needed, for example, for national risk policy development, for large-scale disaster management planning, and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have received little scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season, and flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role in agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is presented, together with the crop type most at risk in a specific region. The risk maps may provide guidance for coordinated, federal-state-wide designation of retention areas.
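
    The temporal superposition at the heart of this analysis reduces to a simple expected-value calculation. The sketch below combines an assumed monthly flood probability with an assumed crop-susceptibility calendar to produce an expected annual crop loss; all monthly values are invented.

    ```python
    import numpy as np

    # Seasonal superposition of flood probability and crop susceptibility,
    # following the idea above. All monthly values are invented for illustration.
    p_flood = np.array([.02, .03, .06, .08, .05, .04, .03, .02, .02, .03, .04, .03])
    crop_susceptibility = np.array([0, 0, .2, .5, .8, 1.0, 1.0, .7, .3, 0, 0, 0])
    exposed_value = 5.0e6   # value of the standing crop at full susceptibility (EUR)

    # Expected annual loss = sum over months of P(flood) x loss given flooding.
    monthly_loss = p_flood * crop_susceptibility * exposed_value
    print("expected loss by month (kEUR):", (monthly_loss / 1e3).round(0))
    print(f"expected annual loss: {monthly_loss.sum()/1e3:.0f} kEUR")
    ```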

  11. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors, a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming. PMID:22720936
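
    The risk measure used here, the standard deviation of a detrended series, can be approximated with a simple linear detrend. The sketch below applies this simplified stand-in (not the paper's error-component method) to synthetic gross-margin series for a conventional and an organic farm.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    years = np.arange(2001, 2008)

    # Synthetic gross margin per cow for one conventional and one organic farm
    # (illustrative stand-ins for the panel data described above).
    conventional = 1200 + 15 * (years - 2001) + rng.normal(0, 40, len(years))
    organic = 1400 + 20 * (years - 2001) + rng.normal(0, 80, len(years))

    def detrended_sd(series: np.ndarray, t: np.ndarray) -> float:
        """Remove a linear time trend and return the residual standard deviation,
        a simplified stand-in for the implicit-detrending risk measure."""
        slope, intercept = np.polyfit(t, series, 1)
        residuals = series - (intercept + slope * t)
        return float(residuals.std(ddof=1))

    print(f"conventional risk (detrended SD): {detrended_sd(conventional, years):.1f}")
    print(f"organic risk (detrended SD):      {detrended_sd(organic, years):.1f}")
    ```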

  12. Use of a risk assessment method to improve the safety of negative pressure wound therapy.

    PubMed

    Lelong, Anne-Sophie; Martelli, Nicolas; Bonan, Brigitte; Prognon, Patrice; Pineau, Judith

    2014-06-01

    To improve the safety of negative pressure wound therapy (NPWT), a working group of nurses, hospital pharmacists, physicians and hospital managers performed a risk analysis of the NPWT care process. The failure modes, effects and criticality analysis (FMECA) method was used for this analysis. Failure modes and their consequences were defined and classified as a function of their criticality to identify priority actions for improvement. In contrast to classical FMECA, the criticality index (CI) of each consequence was calculated by multiplying the occurrence, severity and detection scores. We identified 13 failure modes, leading to 20 different consequences. The overall CI of the consequences was initially 712, falling to 357 after corrective measures were implemented. The major improvements proposed included the establishment of 6-monthly training cycles for nurses, physicians and surgeons and the introduction of computerised prescription for NPWT. The FMECA method also made it possible to prioritise actions as a function of the criticality ranking of consequences, and was easily understood and used by the working group. This study is, to our knowledge, the first to use the FMECA method to improve the safety of NPWT. PMID:22931525
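
    The criticality calculation is easy to make concrete. The sketch below scores a few invented NPWT failure consequences on 1-10 occurrence, severity and detection scales and ranks them by CI = occurrence x severity x detection, mirroring the scheme described above; the study's actual failure modes and scores are not reproduced here.

    ```python
    from dataclasses import dataclass

    # Minimal FMECA-style criticality ranking: CI = occurrence x severity x
    # detection. Failure consequences and scores are invented for illustration.
    @dataclass
    class Consequence:
        name: str
        occurrence: int   # 1 (rare) .. 10 (frequent)
        severity: int     # 1 (minor) .. 10 (catastrophic)
        detection: int    # 1 (always detected) .. 10 (undetectable)

        @property
        def criticality(self) -> int:
            return self.occurrence * self.severity * self.detection

    consequences = [
        Consequence("wrong pressure setting applied", 4, 7, 3),
        Consequence("dressing leak goes unnoticed", 6, 5, 5),
        Consequence("prescription transcription error", 3, 8, 4),
    ]

    # Rank by criticality to prioritize corrective actions.
    for c in sorted(consequences, key=lambda c: c.criticality, reverse=True):
        print(f"CI={c.criticality:4d}  {c.name}")
    print("total CI:", sum(c.criticality for c in consequences))
    ```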

  13. Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented as a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, expressed in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes posing the highest potential risks to the final product, rather than making decisions based on the large space of component and failure-mode data. The mathematics of the proposed method is explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
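
    A minimal version of the matrix manipulation described above can be sketched as follows: an invented component-by-failure-mode incidence matrix is centered and factored with an SVD (the core of PCA), and similarities between components are then measured in the reduced coordinate system. None of the actual accident-report data are used.

    ```python
    import numpy as np

    # Illustrative component x failure-mode incidence matrix (counts of how often
    # each failure mode was reported for each component; values are invented).
    # Columns: fatigue, wear, leak, short.
    components = ["rotor", "gearbox", "hydraulics", "electrical"]
    m = np.array([
        [9, 2, 0, 0],
        [4, 7, 1, 0],
        [0, 3, 8, 0],
        [0, 0, 1, 6],
    ], dtype=float)

    # PCA-style decomposition: center, SVD, keep the two leading components
    # to obtain a low-dimensional representation of the failure data.
    mc = m - m.mean(axis=0)
    u, s, vt = np.linalg.svd(mc, full_matrices=False)
    coords = u[:, :2] * s[:2]          # components in the reduced coordinate system

    # Cosine similarity in the reduced space highlights components that
    # share failure behaviour.
    unit = coords / np.linalg.norm(coords, axis=1, keepdims=True)
    sim = unit @ unit.T
    for name, row in zip(components, sim):
        print(f"{name:11s}", np.round(row, 2))
    ```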

  14. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology is proposed for risk analysis of dust explosion scenarios based on Bayesian networks. The methodology also benefits from a bow-tie diagram to better represent the logical relationships among the contributing factors and consequences of dust explosions. The risks of dust explosion scenarios are evaluated taking into account common-cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating, and thus for learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures. PMID:25264172
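
    A Bayesian-network treatment of a dust explosion scenario can be illustrated at toy scale. The sketch below hand-rolls exact inference by enumeration over an invented two-root-cause network and performs the kind of diagnostic update described above; the structure and probabilities are assumptions, not values from the study.

    ```python
    from itertools import product

    # Minimal discrete Bayesian network for a dust-explosion scenario
    # (structure and probabilities are invented for illustration).
    p_dust = {True: 0.2, False: 0.8}          # explosible dust cloud present
    p_ignition = {True: 0.1, False: 0.9}      # ignition source present
    # P(explosion | dust, ignition); no dust cloud -> no dust explosion.
    p_expl = {(True, True): 0.7, (True, False): 0.01,
              (False, True): 0.0, (False, False): 0.0}

    def joint(d, i, e):
        pe = p_expl[(d, i)]
        return p_dust[d] * p_ignition[i] * (pe if e else 1 - pe)

    # Prior probability of an explosion (predictive analysis).
    p_e = sum(joint(d, i, True) for d, i in product([True, False], repeat=2))
    print(f"P(explosion) = {p_e:.4f}")

    # Diagnostic analysis: update the root events given that an explosion occurred.
    p_d_given_e = sum(joint(True, i, True) for i in [True, False]) / p_e
    p_i_given_e = sum(joint(d, True, True) for d in [True, False]) / p_e
    print(f"P(dust cloud | explosion) = {p_d_given_e:.3f}")  # 1.0 by construction
    print(f"P(ignition   | explosion) = {p_i_given_e:.3f}")
    ```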

  15. Use of Monte Carlo methods in environmental risk assessments at the INEL: Applications and issues

    SciTech Connect

    Harris, G.; Van Horn, R.

    1996-06-01

    The EPA is increasingly considering the use of probabilistic risk assessment techniques as an alternative or refinement of the current point estimate of risk. This report provides an overview of the probabilistic technique called Monte Carlo Analysis. Advantages and disadvantages of implementing a Monte Carlo analysis over a point estimate analysis for environmental risk assessment are discussed. The general methodology is provided along with an example of its implementation. A phased approach to risk analysis that allows iterative refinement of the risk estimates is recommended for use at the INEL.
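
    The difference between the two approaches is easy to demonstrate. The sketch below computes a point estimate of risk from single parameter values, then propagates assumed parameter distributions through the same multiplicative risk equation by Monte Carlo; the equation and all distributions are illustrative, not the INEL assessment.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    # Point-estimate risk: risk = concentration x intake rate x slope factor.
    # Parameter values and distributions are illustrative only.
    c_pt, ir_pt, sf_pt = 0.5, 0.02, 1.5e-2
    print(f"point estimate of risk: {c_pt * ir_pt * sf_pt:.2e}")

    # Monte Carlo alternative: propagate parameter distributions instead of
    # single values, as in a probabilistic risk assessment.
    c = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)    # mg/L
    ir = rng.triangular(0.01, 0.02, 0.04, size=n)             # L/(kg*day)
    sf = rng.lognormal(mean=np.log(1.5e-2), sigma=0.3, size=n)

    risk = c * ir * sf
    print(f"mean risk:   {risk.mean():.2e}")
    print(f"95th pctile: {np.percentile(risk, 95):.2e}")
    ```

    The percentile output is what distinguishes the probabilistic result from the single point value: it shows how much of the risk distribution the point estimate does or does not cover.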

  16. TNF -308 G/A Polymorphism and Risk of Acne Vulgaris: A Meta-Analysis

    PubMed Central

    Yang, Jian-Kang; Wu, Wen-Juan; Qi, Jue; He, Li; Zhang, Ya-Ping

    2014-01-01

    Background The -308 G/A polymorphism in the tumor necrosis factor (TNF) gene has been implicated in the risk of acne vulgaris, but the results are inconclusive. The present meta-analysis aimed to investigate the overall association between the -308 G/A polymorphism and acne vulgaris risk. Methods We searched PubMed, Embase, Web of Science and CNKI for studies evaluating the association between the -308 G/A gene polymorphism and acne vulgaris risk. Data were extracted and statistical analysis was performed using STATA 12.0 software. Results A total of five publications involving 1553 subjects (728 acne vulgaris cases and 825 controls) were included in this meta-analysis. Combined analysis revealed a significant association between this polymorphism and acne vulgaris risk under the recessive model (OR = 2.73, 95% CI: 1.37–5.44, p = 0.004 for AA vs. AG + GG). Subgroup analysis by ethnicity showed that the acne vulgaris risk associated with the -308 G/A gene polymorphism was significantly elevated among Caucasians under the recessive model (OR = 2.34, 95% CI: 1.13–4.86, p = 0.023). Conclusion This meta-analysis suggests that the -308 G/A polymorphism in the TNF gene contributes to acne vulgaris risk, especially in Caucasian populations. Further studies among populations of different ethnicities are needed to validate these findings. PMID:24498378

  17. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past few years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed in one currency. This allows the comparison of adverse and beneficial effects to be both qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume. PMID:21679738

  18. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value. PMID:27386264

  19. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in nonlife insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
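
    The limited fluctuation approach mentioned above can be sketched in a few lines. The snippet below computes a full-credibility standard for Poisson claim frequency, the square-root partial-credibility factor, and a credibility-weighted frequency estimate; the confidence level, tolerance and claim counts are invented.

    ```python
    from statistics import NormalDist
    import math

    # Limited-fluctuation (full) credibility for claim frequency.
    # Numbers below are illustrative assumptions, not the paper's data.
    p_conf, k = 0.90, 0.05        # be within +/-5% of the true frequency 90% of the time
    z = NormalDist().inv_cdf(1 - (1 - p_conf) / 2)

    # Full-credibility standard for a Poisson claim count: expected claims needed.
    n_full = (z / k) ** 2
    print(f"full-credibility standard: {n_full:.0f} expected claims")

    # Partial credibility (square-root rule) for a risk class with n observed claims.
    n = 500
    cred_z = min(1.0, math.sqrt(n / n_full))

    class_mean, portfolio_mean = 0.12, 0.10   # class vs collective claim frequency
    estimate = cred_z * class_mean + (1 - cred_z) * portfolio_mean
    print(f"Z = {cred_z:.2f}, credibility-weighted frequency = {estimate:.3f}")
    ```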

  20. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are being developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of a specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, the modeling approaches, and the incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base, complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this is only a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  1. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  2. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  3. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  4. 12 CFR 3.210 - Standardized measurement method for specific risk

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    Banks and Banking; Comptroller of the Currency, Department of the Treasury; Capital Adequacy Standards; Risk-Weighted Assets - Market Risk. § 3.210 Standardized measurement method for specific risk. (a) General requirement. A national bank or Federal savings association must calculate...

  5. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilot experience at the time they were incorporated into the standards documents. As a result, some of these standards may be overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published evidence for the estimation of the safety level provided by the existing OLS standards; moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. This study therefore collects actual flight path data using information provided by air traffic control radars and constructs a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful in estimating safe and efficient standard dimensions for the OLS and in assessing the risk that objects pose to aircraft operations around airports. In order to assess the existing standards and show the application of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.
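
    The deviation-probability idea can be made concrete with a distribution fit. The sketch below fits a normal distribution to simulated lateral deviations and computes the probability of penetrating a surface at an assumed lateral offset; the data and the offset are invented, not radar tracks from the airports named above.

    ```python
    import numpy as np
    from statistics import NormalDist

    # Simulated lateral deviations from the intended path (metres); a real
    # study would use radar-track data instead.
    rng = np.random.default_rng(3)
    deviations_m = rng.normal(0, 45, size=5000)

    mu, sigma = deviations_m.mean(), deviations_m.std(ddof=1)
    ols_offset_m = 150.0   # assumed lateral distance to the protecting surface

    # Probability that a single operation deviates beyond the surface (two-sided).
    dist = NormalDist(mu, sigma)
    p_exceed = dist.cdf(-ols_offset_m) + (1 - dist.cdf(ols_offset_m))
    print(f"fitted mu={mu:.1f} m, sigma={sigma:.1f} m")
    print(f"P(deviation beyond {ols_offset_m:.0f} m) = {p_exceed:.2e} per operation")
    ```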

  6. Risk analysis. HIV / AIDS country profile: Mozambique.

    PubMed

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system. PMID:12320532

  7. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
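
    The abstract's support/refute weighting can be illustrated with a toy scoring rule. In the sketch below, each piece of evidence carries a weight and a direction, and the weighted sum gauges support for the hypothesis; this scoring rule is an invented illustration, not the patented method.

    ```python
    from dataclasses import dataclass

    # Toy weighted-evidence scoring: indicators link evidence to a hypothesis,
    # each association carries a weight, and the signed weighted sum gauges
    # how well supported the hypothesis is. All names and values are invented.
    @dataclass
    class Evidence:
        description: str
        supports: bool    # True = supports the hypothesis, False = refutes it
        weight: float     # strength of the association, 0..1

    hypothesis = "component X caused the outage"
    evidence = [
        Evidence("error logs point to component X", True, 0.8),
        Evidence("X passed its last self-test", False, 0.4),
        Evidence("failure signature matches X's known fault mode", True, 0.6),
    ]

    score = sum(e.weight if e.supports else -e.weight for e in evidence)
    max_score = sum(e.weight for e in evidence)
    print(f"hypothesis: {hypothesis}")
    print(f"support score: {score:+.2f} (range -{max_score:.2f}..+{max_score:.2f})")
    ```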

  8. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  9. Risk Analysis Methodology for Kistler's K-1 Reusable Launch Vehicle

    NASA Astrophysics Data System (ADS)

    Birkeland, Paul W.

    2002-01-01

    Missile risk analysis methodologies were originally developed in the 1940s as the military experimented with intercontinental ballistic missile (ICBM) technology. As the range of these missiles increased, it became apparent that some means of assessing the risk posed to neighboring populations was necessary to gauge the relative safety of a given test. There were many unknowns at the time, and technology was unpredictable at best. Risk analysis itself was in its infancy. Uncertainties in technology and methodology led to an ongoing bias toward conservative assumptions to adequately bound the problem. This methodology ultimately became the Casualty Expectation Analysis that is used to license Expendable Launch Vehicles (ELVs). A different risk analysis approach was adopted by the commercial aviation industry in the 1950s. At the time, commercial aviation technology was more firmly in hand than ICBM technology. Consequently commercial aviation risk analysis focused more closely on the hardware characteristics. Over the years, this approach has enabled the advantages of technological and safety advances in commercial aviation hardware to manifest themselves in greater capabilities and opportunities. The Boeing 777, for example, received approval for trans-oceanic operations "out of the box," where all previous aircraft were required, at the very least, to demonstrate operations over thousands of hours before being granted such approval. This "out of the box" approval is likely to become standard for all subsequent designs. In short, the commercial aircraft approach to risk analysis created a more flexible environment for industry evolution and growth. In contrast, the continued use of the Casualty Expectation Analysis by the launch industry is likely to hinder industry maturation. It likely will cause any safety and reliability gains incorporated into RLV design to be masked by the conservative assumptions made to "bound the problem." Consequently, for the launch

  10. The impact of communicating genetic risks of disease on risk-reducing health behaviour: systematic review with meta-analysis

    PubMed Central

    Hollands, Gareth J; French, David P; Griffin, Simon J; Prevost, A Toby; Sutton, Stephen; King, Sarah

    2016-01-01

    Objective To assess the impact of communicating DNA based disease risk estimates on risk-reducing health behaviours and motivation to engage in such behaviours. Design Systematic review with meta-analysis, using Cochrane methods. Data sources Medline, Embase, PsycINFO, CINAHL, and the Cochrane Central Register of Controlled Trials up to 25 February 2015. Backward and forward citation searches were also conducted. Study selection Randomised and quasi-randomised controlled trials involving adults in which one group received personalised DNA based estimates of disease risk for conditions where risk could be reduced by behaviour change. Eligible studies included a measure of risk-reducing behaviour. Results We examined 10,515 abstracts and included 18 studies that reported on seven behavioural outcomes, including smoking cessation (six studies; n=2663), diet (seven studies; n=1784), and physical activity (six studies; n=1704). Meta-analysis revealed no significant effects of communicating DNA based risk estimates on smoking cessation (odds ratio 0.92, 95% confidence interval 0.63 to 1.35, P=0.67), diet (standardised mean difference 0.12, 95% confidence interval −0.00 to 0.24, P=0.05), or physical activity (standardised mean difference −0.03, 95% confidence interval −0.13 to 0.08, P=0.62). There were also no effects on any other behaviours (alcohol use, medication use, sun protection behaviours, and attendance at screening or behavioural support programmes) or on motivation to change behaviour, and no adverse effects, such as depression and anxiety. Subgroup analyses provided no clear evidence that communication of a risk-conferring genotype affected behaviour more than communication of the absence of such a genotype. However, studies were predominantly at high or unclear risk of bias, and evidence was typically of low quality. Conclusions The expectation that communicating DNA based risk estimates changes behaviour is not supported by existing evidence.

  11. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  12. 2007 Wholesale Power Rate Case Initial Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2005-11-01

    The Federal Columbia River Power System (FCRPS), operated on behalf of the ratepayers of the PNW by BPA and other Federal agencies, faces many uncertainties during the FY 2007-2009 rate period. Among these uncertainties, the largest revolve around hydro conditions, market prices and river operations for fish recovery. In order to provide a high probability of making its U.S. Treasury payments, BPA performs a Risk Analysis as part of its rate-making process. In this Risk Analysis, BPA identifies key risks, models their relationships, and then analyzes their impacts on net revenues (total revenues less expenses). BPA subsequently evaluates in the ToolKit Model the Treasury Payment Probability (TPP) resulting from the rates, risks, and risk mitigation measures described here and in the Wholesale Power Rate Development Study (WPRDS). If the TPP falls short of BPA's standard, additional risk mitigation revenues, such as PNRR and CRAC revenues, are incorporated in the modeling in ToolKit until the TPP standard is met. Increased wholesale market price volatility and six years of drought have significantly changed the profile of risk and uncertainty facing BPA and its stakeholders. These present new challenges for BPA in its effort to keep its power rates as low as possible while fully meeting its obligations to the U.S. Treasury. As a result, the risk that BPA will not receive the level of secondary revenues credited to power rates before those funds are received is greater. In addition to market price volatility, BPA also faces uncertainty around the financial impacts of operations for fish programs in FY 2006 and in the FY 2007-2009 rate period. A new Biological Opinion or possible court-ordered change to river operations in FY 2006 through FY 2009 may reduce BPA's net revenues included in the Initial Proposal. Finally, the FY 2007-2009 risk analysis includes new operational risks as well as a more comprehensive analysis of non-operating risks. Both the operational

  13. [Establishment of Method for Health Risk Assessment of Pollutants from Fixed Sources].

    PubMed

    Chen, Qiang; Wu, Huan-bo

    2016-05-15

    A health risk assessment method for pollutants from fixed sources was developed by applying the AERMOD model to health risk assessment. The method can directly forecast the health risks posed by toxic pollutants from a source via a given exposure pathway. Using the established method, in combination with source data and the traditional health risk assessment method, as well as measured data on PAHs in inhalable particulate matter (PM₁₀) in Lanzhou, the health risks of polycyclic aromatic hydrocarbons (PAHs) and benzo[a]pyrene (BaP) in PM₁₀ from three thermal power plants, and the health risks of PAHs and BaP in PM₁₀ at the receptor point via inhalation exposure, were calculated for heating and non-heating seasons. The contribution rates of the health risk caused by the three thermal power plants to the health risk at the receptor point were then calculated. The results showed that the contribution rates were not associated with sex or age, but were associated with time period and risk type. The contribution rates in the non-heating seasons were greater than those in the heating seasons, and the contribution rates of the carcinogenic risk index were greater than those of the cancer risk value. The reliability of the established method was validated by comparison with the traditional method. The method is applicable to health risk assessment of toxic pollutants from any fixed source and to environmental risk assessment within environmental impact assessment. PMID:27506015

  14. Population-standardized genetic risk score: the SNP-based method of choice for inherited risk assessment of prostate cancer.

    PubMed

    Conran, Carly A; Na, Rong; Chen, Haitao; Jiang, Deke; Lin, Xiaoling; Zheng, S Lilly; Brendler, Charles B; Xu, Jianfeng

    2016-01-01

    Several different approaches are available to clinicians for determining prostate cancer (PCa) risk. The clinical validity of various PCa risk assessment methods utilizing single nucleotide polymorphisms (SNPs) has been established; however, these SNP-based methods have not been compared. The objective of this study was to compare the three most commonly used SNP-based methods for PCa risk assessment. Participants were men (n = 1654) enrolled in a prospective study of PCa development. Genotypes of 59 PCa risk-associated SNPs were available in this cohort. Three methods of calculating SNP-based genetic risk scores (GRSs) were used to evaluate individual disease risk: risk allele count (GRS-RAC), weighted risk allele count (GRS-wRAC), and population-standardized genetic risk score (GRS-PS). Mean GRSs were calculated, and performances were compared using area under the receiver operating characteristic curve (AUC) and positive predictive value (PPV). All SNP-based methods were found to be independently associated with PCa (all P < 0.05; hence their clinical validity). The mean GRSs in men with or without PCa using GRS-RAC were 55.15 and 53.46, respectively, using GRS-wRAC were 7.42 and 6.97, respectively, and using GRS-PS were 1.12 and 0.84, respectively (all P < 0.05 for differences between patients with or without PCa). All three SNP-based methods performed similarly in discriminating PCa from non-PCa based on AUC and in predicting PCa risk based on PPV (all P > 0.05 for comparisons between the three methods), and all three SNP-based methods had a significantly higher AUC than family history (all P < 0.05). Results from this study suggest that while the three most commonly used SNP-based methods performed similarly in discriminating PCa from non-PCa at the population level, GRS-PS is the method of choice for risk assessment at the individual level because its value (where 1.0 represents average population risk) can be easily interpreted regardless

  16. The risk of kidney stones following bariatric surgery: a systematic review and meta-analysis.

    PubMed

    Thongprayoon, Charat; Cheungpasitporn, Wisit; Vijayvargiya, Priya; Anthanont, Pimjai; Erickson, Stephen B

    2016-04-01

    Background With the rising prevalence of morbid obesity, the number of bariatric surgeries performed each year has been increasing worldwide. The objective of this meta-analysis was to assess the risk of kidney stones following bariatric surgery. Methods A literature search was performed using MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews from inception through July 2015. Only studies reporting relative risks, odds ratios or hazard ratios (HRs) to compare the risk of kidney stones in patients who underwent bariatric surgery versus no surgery were included. Pooled risk ratios (RR) and 95% confidence intervals (CI) were calculated using a random-effect, generic inverse variance method. Results Four studies (one randomized controlled trial and three cohort studies) with 11,348 patients were included in the analysis to assess the risk of kidney stones following bariatric surgery. The pooled RR of kidney stones in patients undergoing bariatric surgery was 1.22 (95% CI, 0.63-2.35). Subgroup analysis by type of bariatric surgery demonstrated an increased risk of kidney stones in patients following Roux-en-Y gastric bypass (RYGB), with a pooled RR of 1.73 (95% CI, 1.30-2.30), and a decreased risk of kidney stones in patients following restrictive procedures including laparoscopic banding or sleeve gastrectomy, with a pooled RR of 0.37 (95% CI, 0.16-0.85). Conclusions Our meta-analysis demonstrates an association between RYGB and increased risk of kidney stones. Restrictive bariatric surgery, on the other hand, may decrease kidney stone risk. A future study with long-term follow-up data is needed to confirm this potential benefit of restrictive bariatric surgery. PMID:26803902

  17. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g., dams, dikes, and check-dams), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives lead to risk reduction under different future scenarios. The SDSS was developed with open source software and follows open standards for code, data formats and service interfaces. The architecture of the system is modular: the various parts of the system are loosely coupled, extensible, flexible, web-based and use standards for interoperability. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  18. Risk assessment methods for cardiac surgery and intervention.

    PubMed

    Thalji, Nassir M; Suri, Rakesh M; Greason, Kevin L; Schaff, Hartzell V

    2014-12-01

    Surgical risk models estimate operative outcomes while controlling for heterogeneity in 'case mix' within and between institutions. In cardiac surgery, risk models are used for patient counselling, surgical decision-making, clinical research, quality assurance and improvement, and financial reimbursement. Importantly, risk models are only as good as the databases from which they are derived; physicians and investigators should, therefore, be aware of the shortcomings of the clinical and administrative databases used for modelling risk estimates. The most frequently modelled outcome in cardiac surgery is 30-day mortality. However, results of randomized trials comparing conventional surgery with transcatheter aortic valve implantation (TAVI) indicate attrition of surgical patients at 2-4 months postoperatively, suggesting that 3-month survival or mortality might be an appropriate procedural end point worth modelling. Risk models are increasingly used to identify patients who might be better suited for TAVI. However, the appropriateness of available statistical models in this application is controversial, particularly given the tendency of risk models to misestimate operative mortality in high-risk patient subsets. Incorporation of new risk factors (such as previous mediastinal radiation, liver failure, and frailty) into future surgical or interventional risk-prediction tools might enhance model performance, and thereby optimize patient selection for TAVI. PMID:25245832

  19. A novel risk-based analysis for the production system under epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Khalaj, Mehran; Khalaj, Fereshteh; Khalaj, Amineh

    2013-11-01

    Risk analysis of a production system performed without actual and appropriate data leads to wrong prediction of system parameters and wrong decision-making. Under uncertainty, there are no appropriate measures for decision-making. Under epistemic uncertainty in particular, we are confronted by a lack of data; therefore, in calculating the system risk, we encounter vagueness and must use methods that remain efficient for decision-making. In this research, using the Dempster-Shafer method and a risk assessment diagram, the researchers developed a better method of calculating tool failure risk. Traditional statistical methods for characterizing and evaluating systems are not always appropriate, especially when insufficient data are available. The goal of this research was to present a more modern, applicable method for real-world organizations. The findings of this research were applied in a case study, and an appropriate framework and constraints for tool risk were provided. The research presents a promising concept for the calculation of production system risk, and its results show that under uncertainty, or in the case of a lack of knowledge, the selection of an appropriate method will facilitate the decision-making process.
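
    Dempster-Shafer evidence theory, the core of the method above, represents epistemic uncertainty by assigning mass to sets of outcomes rather than to single outcomes, and combines independent evidence sources with Dempster's rule. A minimal sketch with hypothetical masses for a single tool:

      from itertools import product

      # Frame of discernment for a tool: it either fails (F) or works (W).
      # Basic probability assignments from two evidence sources; the mass on
      # frozenset({'F', 'W'}) encodes ignorance (epistemic uncertainty).
      m1 = {frozenset({'F'}): 0.5, frozenset({'W'}): 0.2, frozenset({'F', 'W'}): 0.3}
      m2 = {frozenset({'F'}): 0.4, frozenset({'W'}): 0.4, frozenset({'F', 'W'}): 0.2}

      def combine(m1, m2):
          """Dempster's rule: conjunctive combination, renormalized by 1 - conflict."""
          combined, conflict = {}, 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb
          return {s: v / (1 - conflict) for s, v in combined.items()}

      m12 = combine(m1, m2)
      # Belief and plausibility bracket the failure risk instead of forcing a
      # single point value when data are scarce.
      bel_f = m12[frozenset({'F'})]
      pl_f = sum(v for s, v in m12.items() if 'F' in s)
      print(f"failure risk in [{bel_f:.3f}, {pl_f:.3f}]")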

  20. 12 CFR 327.9 - Assessment risk categories and pricing methods.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 12 Banks and Banking 4 2011-01-01 2011-01-01 false Assessment risk categories and pricing methods. 327.9 Section 327.9 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY ASSESSMENTS In General § 327.9 Assessment risk categories and pricing methods. (a) Risk Categories.—Each insured...

  1. Influence of dorsolateral prefrontal cortex and ventral striatum on risk avoidance in addiction: a mediation analysis

    PubMed Central

    Yamamoto, Dorothy J.; Woo, Choong-Wan; Wager, Tor D.; Regner, Michael F.; Tanabe, Jody

    2015-01-01

    Background Alterations in frontal and striatal function are hypothesized to underlie risky decision-making in drug users, but how these regions interact to affect behavior is incompletely understood. We used mediation analysis to investigate how prefrontal cortex and ventral striatum together influence risk avoidance in abstinent drug users. Method Thirty-seven abstinent substance-dependent individuals (SDI) and 43 controls underwent fMRI while performing a decision-making task involving risk and reward. Analyses of a priori regions-of-interest tested whether activity in dorsolateral prefrontal cortex (DLPFC) and ventral striatum (VST) explained group differences in risk avoidance. Whole-brain analysis was conducted to identify brain regions influencing the negative VST-risk avoidance relationship. Results Right DLPFC (RDLPFC) positively mediated the group-risk avoidance relationship (p < 0.05); RDLPFC activity was higher in SDI and predicted higher risk avoidance across groups, controlling for SDI vs. controls. Conversely, VST activity negatively influenced risk avoidance (p < 0.05); it was higher in SDI, and predicted lower risk avoidance. Whole-brain analysis revealed that, across group, RDLPFC and left temporal-parietal junction positively (p ≤ 0.001) while right thalamus and left middle frontal gyrus negatively (p < 0.005) mediated the VST activity-risk avoidance relationship. Conclusion RDLPFC activity mediated less risky decision-making while VST mediated more risky decision-making across drug users and controls. These results suggest a dual pathway underlying decision-making, which, if imbalanced, may adversely influence choices involving risk. Modeling contributions of multiple brain systems to behavior through mediation analysis could lead to a better understanding of mechanisms of behavior and suggest neuromodulatory treatments for addiction. PMID:25736619
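
    The core logic of a single-mediator analysis can be sketched with the classic product-of-coefficients approach: path a (group to mediator), path b (mediator to outcome, controlling for group), and the indirect effect a*b. This is a simplified illustration on simulated data, not the study's multilevel fMRI pipeline.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 200
      group = rng.integers(0, 2, n).astype(float)        # 0 = control, 1 = SDI
      dlpfc = 0.8 * group + rng.normal(size=n)           # mediator: brain activity
      avoid = 0.5 * dlpfc + 0.1 * group + rng.normal(size=n)  # outcome: risk avoidance

      def ols(X, y):
          # Least-squares fit with an intercept column prepended.
          X = np.column_stack([np.ones(len(y))] + list(X))
          return np.linalg.lstsq(X, y, rcond=None)[0]

      a = ols([group], dlpfc)[1]               # path a: group -> mediator
      b = ols([group, dlpfc], avoid)[2]        # path b: mediator -> outcome | group
      c_prime = ols([group, dlpfc], avoid)[1]  # direct effect of group
      print(f"indirect (mediated) effect a*b = {a * b:.3f}, "
            f"direct effect = {c_prime:.3f}")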

  2. Assessment of ecological risks at former landfill site using TRIAD procedure and multicriteria analysis.

    PubMed

    Sorvari, Jaana; Schultz, Eija; Haimi, Jari

    2013-02-01

    Old industrial landfills are important sources of environmental contamination in Europe, including Finland. In this study, we demonstrated the combination of the TRIAD procedure, multicriteria decision analysis (MCDA), and statistical Monte Carlo analysis for assessing the risks to terrestrial biota in a former landfill site contaminated by petroleum hydrocarbons (PHCs) and metals. First, we generated hazard quotients by dividing the concentrations of metals and PHCs in soil by the corresponding risk-based ecological benchmarks. Then we conducted ecotoxicity tests using five plant species, earthworms, and potworms, and determined the abundance and diversity of soil invertebrates from additional samples. We aggregated the results in accordance with the methods used in the TRIAD procedure, rated the assessment methods based on their performance against specific criteria, and weighted the criteria using two alternative weighting techniques to produce performance scores for each method. We faced problems in using the TRIAD procedure; for example, the results from the animal counts had to be excluded from the calculation of the integrated risk estimates (IREs) because our reference soil sample showed the lowest biodiversity and abundance of soil animals. In addition, hormesis hampered the use of the results from the ecotoxicity tests. The final probabilistic IREs imply significant risks at all sampling locations. Although linking MCDA with TRIAD provided a useful means to study and consider the performance of the alternative methods in predicting ecological risks, some of the uncertainties involved remained outside the quantitative analysis. PMID:22762796
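
    The first line of evidence above reduces to a simple ratio: hazard quotient = measured soil concentration / ecological benchmark, with HQ > 1 flagging potential risk. The sketch below also shows one plausible way to scale and weight lines of evidence into an integrated estimate; the concentrations, benchmarks, scaling function and weights are all illustrative, not the paper's exact scheme.

      # Chemistry line of evidence: hazard quotient = concentration / benchmark.
      # Concentrations (mg/kg) and benchmarks are illustrative values only.
      soil = {"zinc": 420.0, "lead": 150.0, "PHC": 900.0}
      benchmark = {"zinc": 200.0, "lead": 100.0, "PHC": 300.0}
      hq = {c: soil[c] / benchmark[c] for c in soil}

      # Scale each line of evidence to [0, 1] risk and combine with performance
      # weights (the scaling and weights here are arbitrary illustrations;
      # TRIAD implementations use method-specific schemes).
      def scale(x):
          return max(0.0, min(1.0, 1.0 - 1.0 / x))  # HQ <= 1 -> 0, large HQ -> 1

      lines = {"chemistry": max(scale(q) for q in hq.values()),
               "ecotoxicity": 0.55,                 # from bioassays (hypothetical)
               "ecology": 0.40}                     # from field surveys (hypothetical)
      weights = {"chemistry": 0.3, "ecotoxicity": 0.4, "ecology": 0.3}
      ire = sum(weights[k] * lines[k] for k in lines)  # integrated risk estimate
      print({c: round(q, 2) for c, q in hq.items()}, round(ire, 2))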

  3. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
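
    The idea in the abstract can be sketched as follows: fit a classical least squares model with the calibrated pure-component spectra, then augment the spectral basis with the shape of a non-calibrated effect (here, a baseline drift) so that it no longer biases the component estimates. The spectra are synthetic, and the subsequent inverse multivariate analysis step of the full hybrid method is omitted for brevity.

      import numpy as np

      rng = np.random.default_rng(1)
      wavelengths = np.linspace(0, 1, 200)

      # Pure-component spectral shapes used in the original CLS calibration.
      k1 = np.exp(-((wavelengths - 0.3) ** 2) / 0.002)
      k2 = np.exp(-((wavelengths - 0.6) ** 2) / 0.002)
      K = np.column_stack([k1, k2])

      # The measured mixture also contains a drift shape absent from calibration.
      drift = wavelengths                        # e.g., a sloping baseline
      spectrum = 0.7 * k1 + 0.2 * k2 + 0.5 * drift + rng.normal(0, 0.01, 200)

      # Plain CLS: concentration estimates biased because drift is unmodeled.
      c_plain, *_ = np.linalg.lstsq(K, spectrum, rcond=None)

      # Hybrid step: augment the basis with the known drift shape and re-fit;
      # the extra coefficient soaks up the variation the drift explains.
      K_aug = np.column_stack([K, drift])
      c_hybrid, *_ = np.linalg.lstsq(K_aug, spectrum, rcond=None)
      print("plain CLS:", c_plain.round(3), "augmented:", c_hybrid[:2].round(3))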

  4. The Use of Object-Oriented Analysis Methods in Surety Analysis

    SciTech Connect

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  5. Heavy Metal Risk Management: Case Analysis

    PubMed Central

    Kim, Ji Ae; Lee, Seung Ha; Choi, Seung Hyun; Jung, Ki Kyung; Park, Mi Sun; Jeong, Ji Yoon; Hwang, Myung Sil; Yoon, Hae Jung; Choi, Dal Woong

    2012-01-01

    To prepare measures for practical policy utilization and the control of heavy metals, the institutions responsible for hazard control in each country, the current state of control in each country, and the current state of control for each heavy metal were examined. Hazard control cases for heavy metals in various countries were compared and analyzed. In certain countries (e.g., the U.S., the U.K., and Japan), hazardous substances found in foods (e.g., arsenic, lead, cadmium, and mercury) are controlled. In addition, the Joint FAO/WHO Expert Committee on Food Additives (JECFA) recommends calculating the provisional tolerable weekly intake (PTWI) of individual heavy metals, instead of the acceptable daily intake (ADI), to compare their pollution levels while accounting for their toxicity as accumulated in the human body. In Korea, exposure assessments have been conducted, and in other countries, hazardous substances are controlled by various governing bodies. As such, in Korea and other countries, diverse food heavy metal monitoring and human exposure assessments are conducted, and reduction measures are prepared accordingly. To reduce the danger of hazardous substances, many countries provide leaflets and guidelines, develop hazardous heavy metal intake recommendations, and take necessary actions. Hazard control case analyses can assist in securing consumer safety by establishing systematic and reliable hazard control methods. PMID:24278603
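
    A PTWI comparison is simple arithmetic: sum the weekly intake of a metal across foods, divide by body weight, and compare with the tolerable limit. All numbers below are illustrative placeholders, not regulatory values.

      # Estimated weekly intake of a heavy metal versus a tolerable limit.
      # Concentrations (mg/kg), weekly consumption (kg), body weight and the
      # PTWI value are all hypothetical, not regulatory figures.
      foods = {"rice": (0.08, 1.4), "fish": (0.30, 0.5)}  # conc mg/kg, kg/week
      body_weight_kg = 60.0
      ptwi_mg_per_kg_bw = 0.004

      weekly_intake = sum(conc * amount for conc, amount in foods.values())
      intake_per_kg_bw = weekly_intake / body_weight_kg
      print(f"{intake_per_kg_bw:.4f} mg/kg bw/week "
            f"({100 * intake_per_kg_bw / ptwi_mg_per_kg_bw:.0f}% of PTWI)")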

  6. Developing New Tools and Methods for Risk Assessment

    EPA Science Inventory

    Traditionally, risk assessment for environmental chemicals is based upon epidemiological and/or animal toxicity data. Since the release of the National Academy of Sciences Toxicity in the 21st Century: A Vision and a Strategy (2007) and Science and Decisions: Advancing Risk Asses...

  7. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  8. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value, and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi-profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agendas for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management, but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  9. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: (i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish (ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., ComBase and the Microbial Responses Viewer), or introduced into user-friendly software
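
    The link between an FSO and process design is commonly written as the ICMSF inequality H0 - ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, ΣR the total of reductions, and ΣI the total of increases, all in log10 cfu/g. A sketch with hypothetical values:

      # ICMSF-style check that a process design meets a Food Safety Objective:
      #   H0 - sum(R) + sum(I) <= FSO   (all terms in log10 cfu/g).
      # All values below are hypothetical.
      h0 = 3.0                    # initial hazard level
      reductions = [5.0]          # e.g., a cooking step
      increases = [0.5, 0.3]      # e.g., recontamination, growth in storage
      fso = -2.0                  # maximum level at the time of consumption

      level_at_consumption = h0 - sum(reductions) + sum(increases)
      print(level_at_consumption,
            "meets FSO" if level_at_consumption <= fso else "fails FSO")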

  10. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: (i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish (ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., ComBase and the Microbial Responses Viewer), or introduced into user

  11. Rocky Flats Plant Live-Fire Range Risk Analysis Report

    SciTech Connect

    Nicolosi, S.L.; Rodriguez, M.A.

    1994-04-01

    The objective of the Live-Fire Range Risk Analysis Report (RAR) is to provide an authorization basis for operation as required by DOE 5480.16. The existing Live-Fire Range does not have a safety analysis-related authorization basis. EG&G Rocky Flats, Inc. has worked with DOE and its representatives to develop a format and content description for development of an RAR for the Live-Fire Range. Development of the RAR is closely aligned with development of the design for a baffle system to control risks from errant projectiles. DOE 5480.16 requires either an RAR or a safety analysis report (SAR) for live-fire ranges. An RAR rather than a SAR was selected in order to gain flexibility to more closely address the safety analysis and conduct of operation needs for a live-fire range in a cost-effective manner.

  12. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-01

    Over the past half century, a surprising number of major pollution incidents have occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at the watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds and stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple "source-pathway-target" routes in the WTPRA. The previous approach is adapted using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., the Qingshui River and the Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  13. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents have occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at the watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds and stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple "source-pathway-target" routes in the WTPRA. The previous approach is adapted using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., the Qingshui River and the Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  14. Radiation-related cancer risks from CT colonography screening: a risk-benefit analysis

    PubMed Central

    de González, Amy Berrington; Kim, Kwang Pyo; Knudsen, Amy B.; Lansdorp-Vogelaar, Iris; Rutter, Carolyn M.; Smith-Bindman, Rebecca; Yee, Judy; Kuntz, Karen M.; van Ballegooijen, Marjolein; Zauber, Ann G.; Berg, Christine D.

    2012-01-01

    Objective The purpose of this study was to estimate the ratio of cancers prevented to induced (benefit-risk ratio) for CT colonography screening every five years from age 50 to 80. Materials and methods Radiation-related cancer risk was estimated using risk projection models based on the National Research Council's BEIR VII committee report and screening protocols from the American College of Radiology Imaging Network's National CT Colonography Trial. Uncertainty limits (ULs) were estimated using Monte Carlo simulation methods. Comparative modelling with three colorectal cancer microsimulation models was used to estimate the potential reduction in colorectal cancer cases and deaths. Results The estimated mean effective dose per CT colonography screen was 8 mSv for females and 7 mSv for males. The estimated number of radiation-related cancers from CT colonography screening every 5 years from age 50 to 80 was 150 cases/100,000 individuals (95% UL, 80-280) for males and females. The estimated number of colorectal cancers prevented by CT colonography every 5 years from age 50 to 80 ranged across the three microsimulation models from 3580 to 5190/100,000, yielding a benefit-risk ratio that varied from 24:1 (95% UL, 13:1-45:1) to 35:1 (95% UL, 19:1-65:1). The benefit-risk ratio for cancer deaths was even higher than the ratio for cancer cases. Inclusion of radiation-related cancer risks from CT scans following up extracolonic findings did not materially alter the results. Conclusions Concerns have been raised about recommending CT colonography as a routine screening tool because of the potential harms, including the radiation risks. Based on these models, the benefits from CT colonography screening every five years from age 50 to 80 clearly outweigh the radiation risks. PMID:21427330

  15. Towards secure virtual directories : a risk analysis framework.

    SciTech Connect

    Claycomb, William R.

    2010-07-01

    Directory services are used by almost every enterprise computing environment to provide data concerning users, computers, contacts, and other objects. Virtual directories are components that provide directory services in a highly customized manner. Unfortunately, though the use of virtual directory services is widespread, an analysis of the risks posed by their unique position and architecture has not been completed. We present a detailed analysis of six attacks on virtual directory services, including steps for detection and prevention. We also describe various categories of attack risks, and discuss what is necessary to launch an attack on virtual directories. Finally, we present a framework for analyzing risks to individual enterprise computing virtual directory instances. We show how to apply this framework to an example implementation, and discuss the benefits of doing so.

  16. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis; in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for the analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on the surface or subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  17. Risk Factors for the Perpetration of Child Sexual Abuse: A Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Whitaker, Daniel J.; Le, Brenda; Hanson, R. Karl; Baker, Charlene K.; McMahon, Pam M.; Ryan, Gail; Klein, Alisa; Rice, Deborah Donovan

    2008-01-01

    Objectives: Since the late 1980s, there has been a strong theoretical focus on psychological and social influences of perpetration of child sexual abuse. This paper presents the results of a review and meta-analysis of studies examining risk factors for perpetration of child sexual abuse published since 1990. Method: Eighty-nine studies published…

  18. School Health Promotion Policies and Adolescent Risk Behaviors in Israel: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Tesler, Riki; Harel-Fisch, Yossi; Baron-Epel, Orna

    2016-01-01

    Background: Health promotion policies targeting risk-taking behaviors are being implemented across schools in Israel. This study identified the most effective components of these policies influencing cigarette smoking and alcohol consumption among adolescents. Methods: Logistic hierarchical linear model (HLM) analysis of data for 5279 students in…

  19. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  20. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which are, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.
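
    As an illustration of the Bayesian treatment the authors favour, a conjugate Gamma prior on a component failure rate can be updated with observed operating experience in closed form; all numbers below are hypothetical.

      from scipy import stats

      # Gamma prior on a failure rate lambda (events per hour), updated with
      # observed data: n failures in t operating hours. Conjugacy gives a
      # Gamma(alpha + n, beta + t) posterior. All numbers are hypothetical.
      alpha, beta = 0.5, 1.0e4     # vague prior centred near 5e-5 per hour
      n, t = 2, 6.0e4              # observed: 2 failures in 60,000 hours

      posterior = stats.gamma(a=alpha + n, scale=1.0 / (beta + t))
      print(f"posterior mean = {posterior.mean():.2e} /h, "
            f"90% interval = ({posterior.ppf(0.05):.2e}, "
            f"{posterior.ppf(0.95):.2e})")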

  1. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point for addressing this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  2. Standardised survey method for identifying catchment risks to water quality.

    PubMed

    Baker, D L; Ferguson, C M; Chier, P; Warnecke, M; Watkinson, A

    2016-06-01

    This paper describes the development and application of a systematic methodology to identify and quantify risks in drinking water and recreational catchments. The methodology assesses microbial and chemical contaminants from both diffuse and point sources within a catchment, using Escherichia coli, protozoan pathogens and chemicals (including fuel and pesticides) as index contaminants. Hazard source information is gathered through a defined sanitary survey process involving a software tool which groups hazards into six types: sewage infrastructure, on-site sewage systems, industrial, stormwater, agriculture and recreational sites. The survey estimates the likelihood of each site affecting catchment water quality, and the potential consequences, enabling the calculation of risk for individual sites. These risks are integrated to calculate a cumulative risk for each sub-catchment and the whole catchment. The cumulative risk process accounts for the proportion of potential input sources surveyed and for the transfer of contaminants from upstream to downstream sub-catchments. The output risk matrices show the relative risk sources for each of the index contaminants, highlighting those with the greatest impact on water quality at a sub-catchment and catchment level. Verification of the sanitary survey assessments and prioritisation is achieved by comparison with water quality data and microbial source tracking. PMID:27280603
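
    The survey's core arithmetic, as described, is a likelihood-consequence product per site, summed and scaled by survey coverage. A toy sketch, with an invented 1-5 scoring scheme rather than the published tool's:

      # Each surveyed hazard site gets likelihood and consequence scores (1-5).
      # Site risk = likelihood * consequence; the sub-catchment estimate is
      # scaled up by the fraction of potential sources actually surveyed.
      # The sites and scores below are hypothetical.
      sites = [("on-site sewage system", 4, 3),
               ("stormwater outfall", 3, 2),
               ("grazing paddock", 5, 2)]
      fraction_surveyed = 0.75

      site_risks = {name: likelihood * consequence
                    for name, likelihood, consequence in sites}
      cumulative = sum(site_risks.values()) / fraction_surveyed
      print(site_risks, round(cumulative, 1))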

  3. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper through a representative case study regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined, by means of a specifically developed code, in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
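
    Seismic fragility curves of the kind used above are commonly modelled as lognormal functions of ground motion; combining one with a discretized hazard curve yields an annual failure probability. The median, dispersion and hazard values below are hypothetical, not the paper's fitted curves.

      import math

      def fragility(pga, median=0.7, beta=0.5):
          """Lognormal fragility: P(loss of containment | PGA), a common
          empirical form; median (g) and beta are hypothetical values."""
          return 0.5 * (1 + math.erf(math.log(pga / median) / (beta * math.sqrt(2))))

      # Crude convolution with a discretized hazard curve: annual probability
      # of exceeding each PGA level (hypothetical values for a test site).
      hazard = [(0.2, 1e-2), (0.4, 3e-3), (0.6, 1e-3), (0.8, 3e-4), (1.0, 1e-4)]
      annual_p_failure = 0.0
      for i, (pga, p_exceed) in enumerate(hazard):
          p_next = hazard[i + 1][1] if i + 1 < len(hazard) else 0.0
          annual_p_failure += (p_exceed - p_next) * fragility(pga)
      print(f"annual probability of seismic loss of containment ~ {annual_p_failure:.1e}")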

  4. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and characterises their vulnerabilities quantitatively. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications with regard to different case study sites are presented in order to illustrate the platform's potential.

  5. Safety risk analysis of an innovative environmental technology.

    PubMed

    Parnell, G S; Frimpon, M; Barnes, J; Kloeber, J M; Deckro, R E; Jackson, J A

    2001-02-01

    The authors describe a decision and risk analysis performed for the cleanup of a large Department of Energy mixed-waste subsurface disposal area governed by the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). In a previous study, the authors worked with the site decision makers, state regulators, and U.S. Environmental Protection Agency regional regulators to develop a CERCLA-based multiobjective decision analysis value model, and used the model to perform a screening analysis of 28 remedial alternatives. The analysis results identified an innovative technology, in situ vitrification, with high effectiveness versus cost. Since this technology had not been used on this scale before, the major uncertainties were contaminant migration and pressure buildup. Pressure buildup was a safety concern due to the potential risks to worker safety. With the help of environmental technology experts, changes to the remedial alternative were identified to mitigate the concerns about contaminant migration and pressure buildup. The analysis results showed that the probability of an event posing a risk to worker safety had been significantly reduced. Based on these results, site decision makers have refocused their test program to examine in situ vitrification and have continued the use of the CERCLA-based decision analysis methodology to analyze remedial alternatives. PMID:11332543

  6. Use of labour induction and risk of cesarean delivery: a systematic review and meta-analysis

    PubMed Central

    Mishanina, Ekaterina; Rogozinska, Ewelina; Thatthi, Tej; Uddin-Khan, Rehan; Khan, Khalid S.; Meads, Catherine

    2014-01-01

    Background: Induction of labour is common, and cesarean delivery is regarded as its major complication. We conducted a systematic review and meta-analysis to investigate whether the risk of cesarean delivery is higher or lower following labour induction compared with expectant management. Methods: We searched 6 electronic databases for relevant articles published through April 2012 to identify randomized controlled trials (RCTs) in which labour induction was compared with placebo or expectant management among women with a viable singleton pregnancy. We assessed risk of bias and obtained data on rates of cesarean delivery. We used regression analysis techniques to explore the effect of patient characteristics, induction methods and study quality on risk of cesarean delivery. Results: We identified 157 eligible RCTs (n = 31 085). Overall, the risk of cesarean delivery was 12% lower with labour induction than with expectant management (pooled relative risk [RR] 0.88, 95% confidence interval [CI] 0.84–0.93; I2 = 0%). The effect was significant in term and post-term gestations but not in preterm gestations. Meta-regression analysis showed that initial cervical score, indication for induction and method of induction did not alter the main result. There was a reduced risk of fetal death (RR 0.50, 95% CI 0.25–0.99; I2 = 0%) and admission to a neonatal intensive care unit (RR 0.86, 95% CI 0.79–0.94), and no impact on maternal death (RR 1.00, 95% CI 0.10–9.57; I2 = 0%) with labour induction. Interpretation: The risk of cesarean delivery was lower among women whose labour was induced than among those managed expectantly in term and post-term gestations. There were benefits for the fetus and no increased risk of maternal death. PMID:24778358

  7. Methods for assessing uncertainty in fundamental assumptions and associated models for cancer risk assessment.

    PubMed

    Small, Mitchell J

    2008-10-01

    The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure. PMID:18844862
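
    The distributional approach described above can be sketched by enumerating alternative model components, assigning each a subjective probability, and propagating every root-to-leaf combination of the probability tree into a weighted set of risk estimates. The components, probabilities and multipliers below are hypothetical stand-ins, not values from the formaldehyde application.

      from itertools import product

      # Alternative choices for two uncertain model components, each with a
      # subjective probability and a multiplier on the unit-risk estimate.
      # Probabilities and multipliers are hypothetical.
      dose_response = [("linear", 0.6, 1.0), ("sublinear", 0.4, 0.2)]
      extrapolation = [("body-weight", 0.5, 1.0), ("surface-area", 0.5, 3.0)]
      base_unit_risk = 1e-5   # risk per unit exposure under the reference branch

      # Each root-to-leaf path of the probability tree yields a candidate risk
      # estimate whose probability is the product of its branch probabilities.
      distribution = []
      for (dr, p1, m1), (ex, p2, m2) in product(dose_response, extrapolation):
          distribution.append((p1 * p2, base_unit_risk * m1 * m2, f"{dr}/{ex}"))

      mean_risk = sum(p * r for p, r, _ in distribution)
      for p, r, label in sorted(distribution, reverse=True):
          print(f"P={p:.2f}  risk={r:.1e}  ({label})")
      print(f"probability-weighted mean unit risk = {mean_risk:.1e}")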

  8. Transcatheter versus surgical aortic valve replacement in intermediate risk patients: a meta-analysis

    PubMed Central

    Misenheimer, Jacob A.; Jones, Wesley; Bahekar, Amol; Caughey, Melissa; Ramm, Cassandra J.; Caranasos, Thomas G.; Yeung, Michael; Vavalle, John P.

    2016-01-01

    Background Transcatheter aortic valve replacement (TAVR) has been approved for the treatment of severe symptomatic aortic stenosis in patients at high or prohibitive surgical risk. Prospective studies examining the benefits of TAVR in intermediate risk patients are ongoing. Other smaller studies including lower risk patients have been conducted, but further meta-analysis of these studies is required to draw broader comparisons. Methods A Medline search was conducted using standard methodology to search for clinical trials and observational studies including intermediate risk patients. We limited our meta-analysis to studies matching patient populations by propensity scores or randomization and examined clinical outcomes between TAVR and surgical aortic valve replacement (SAVR). Results Analysis of the TAVR and SAVR cohorts revealed no significant differences in the outcomes of 30-day [OR (95% CI): 0.85 (0.57, 1.26)] or 1-year mortality [OR (95% CI): 0.96 (0.75, 1.23)]. A trend towards benefit with TAVR was noted in terms of neurological events and myocardial infarction (MI), without statistical significance. A statistically significant decrease in the risk of post-procedural acute renal failure in the TAVR group [OR (95% CI): 0.52 (0.27, 0.99)] was observed, but so was a significantly higher rate of pacemaker implantation for the TAVR group [OR (95% CI): 6.51 (3.23, 13.12)]. Conclusions We conclude that in intermediate risk patients undergoing aortic valve replacement, the risks of mortality, neurological outcomes, and MI do not appear to be significantly different between TAVR and SAVR. However, there appears to be a significant reduction in the risk of acute renal failure at the expense of an increased risk of requiring a permanent pacemaker in low and intermediate risk patients undergoing TAVR compared to SAVR. PMID:27280087

  9. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  10. Risk analysis systems for veterinary biologicals: a regulator's tool box.

    PubMed

    Osborne, C G; McElvaine, M D; Ahl, A S; Glosser, J W

    1995-12-01

    Recent advances in biology and technology have significantly improved our ability to produce veterinary biologicals of high purity, efficacy and safety, virtually anywhere in the world. At the same time, increasing trade and comprehensive trade agreements, such as the Uruguay Round of the General Agreement on Tariffs and Trade (GATT: now the World Trade Organisation [WTO]), have put pressure on governments to use scientific principles in the regulation of trade for a wide range of products, including veterinary biologicals. In many cases, however, nations have been reluctant to allow the movement of veterinary biologicals, due to the perceived threat of importing an exotic disease. This paper discusses the history of risk analysis as a decision support tool and provides examples of how this tool may be used in a science-based regulatory system for veterinary biologicals. A wide variety of tools are described, including qualitative, semi-quantitative and quantitative methods, most with a long history of use in engineering and the health and environmental sciences. PMID:8639961

  11. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  12. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  13. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  14. Smoking, Radiotherapy, Diabetes and Osteoporosis as Risk Factors for Dental Implant Failure: A Meta-Analysis

    PubMed Central

    Chen, Hui; Liu, Nizhou; Xu, Xinchen; Qu, Xinhua; Lu, Eryi

    2013-01-01

    Background There are conflicting reports as to the association between smoking, radiotherapy, diabetes and osteoporosis and the risk of dental implant failure. We undertook a meta-analysis to evaluate the association between smoking, radiotherapy, diabetes and osteoporosis and the risk of dental implant failure. Methods A comprehensive search of MEDLINE and EMBASE, up to January 2013, was conducted to identify potential studies. References of relevant studies were also searched. Screening, data extraction and quality assessment were conducted independently and in duplicate. A random-effects meta-analysis was used to pool estimates of relative risks (RRs) with 95% confidence intervals (CIs). Results A total of 51 studies were identified in this meta-analysis, with more than 40,000 dental implants placed under risk-threatening conditions. The pooled RRs showed a direct association between smoking (n = 33; RR = 1.92; 95% CI, 1.67–2.21) and radiotherapy (n = 16; RR = 2.28; 95% CI, 1.49–3.51) and the risk of dental implant failure, whereas no significant impact of diabetes (n = 5; RR = 0.90; 95% CI, 0.62–1.32) on the risk of dental implant failure was found. The influence of osteoporosis on the risk of dental implant failure was direct but not significant (n = 4; RR = 1.09; 95% CI, 0.79–1.52). The subgroup analysis indicated no influence of study design, geographical location, length of follow-up, sample size, or mean age of recruited patients. Conclusions Smoking and radiotherapy were associated with an increased risk of dental implant failure. The relationships between diabetes and osteoporosis and the risk of implant failure warrant further study. PMID:23940794

  15. 12 CFR 324.210 - Standardized measurement method for specific risk.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... comprehensive understanding for each securitization position by: (i) Conducting an analysis of the risk... risk. 324.210 Section 324.210 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION REGULATIONS AND STATEMENTS OF GENERAL POLICY CAPITAL ADEQUACY OF FDIC-SUPERVISED INSTITUTIONS Risk-Weighted...

  16. NOA: a novel Network Ontology Analysis method

    PubMed Central

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-01-01

    Gene ontology analysis has become a popular and important tool in bioinformatics studies, and current ontology analyses are mainly conducted on an individual gene or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) method to perform gene ontology enrichment analysis on biological networks. Specifically, NOA first defines link ontology that assigns functions to interactions based on the known annotations of the joint genes via optimizing two novel indexes, 'Coverage' and 'Diversity'. Then, NOA generates two alternative reference sets to statistically rank the enriched functional terms for a given biological network. We compare NOA with traditional enrichment analysis methods on several biological networks, and find that: (i) NOA can capture the change of functions not only in dynamic transcription regulatory networks but also in rewiring protein interaction networks, while the traditional methods cannot, and (ii) NOA can find more relevant and specific functions than traditional methods in different types of static networks. Furthermore, a freely accessible web server for NOA has been developed at http://www.aporc.org/noa/. PMID:21543451

  17. Tropical cyclone risk analysis: a decisive role of its track

    NASA Astrophysics Data System (ADS)

    Chelsea Nam, C.; Park, Doo-Sun R.; Ho, Chang-Hoi

    2016-04-01

    The tracks of 85 tropical cyclones (TCs) that made landfall in South Korea during the period 1979-2010 are classified into four clusters using a fuzzy c-means clustering method. The four clusters are characterized as 1) east-short, 2) east-long, 3) west-long, and 4) west-short, based on the moving routes around the Korean Peninsula. We conducted a risk comparison analysis for these four clusters regarding their hazards, exposure, and damages. Here, hazard parameters are calculated independently from two different sources, one from the best-track data (BT) and the other from the 60 weather stations over the country (WS). The results show distinct characteristics of the four clusters in terms of the hazard parameters and economic losses (EL), suggesting that there is a clear track dependency in the overall TC risk. It appears that whether an "effective collision" occurred outweighs the intensity of the TC per se. The EL ranking did not agree with the BT parameters (maximum wind speed, central pressure, or storm radius), but matched the WS parameters (especially daily accumulated rainfall and TC-influenced period). The west-approaching TCs (i.e. the west-long and west-short clusters) generally recorded larger EL than the east-approaching TCs (i.e. the east-short and east-long clusters), although the east-long cluster is the strongest from the BT point of view. This can be explained through the spatial distribution of the WS parameters and the corresponding regional EL maps. West-approaching TCs brought heavy rainfall to the southern regions, aided by the topographic effect along their tracks and by their extended stay on the Korean Peninsula during extratropical transition, conditions not afforded to the east-approaching TCs. On the other hand, some regions had EL not directly proportional to the hazards, which is partly attributed to spatial disparities in wealth and vulnerability. Correlation analysis also revealed the importance of rainfall; daily
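
    The clustering step can be sketched generically. Below is a minimal fuzzy c-means implementation in numpy following the standard Bezdek update rules; the feature matrix standing in for the interpolated TC track coordinates is a placeholder, not the study's data or configuration.

```python
import numpy as np

def fuzzy_cmeans(X, c=4, m=2.0, n_iter=100, seed=0):
    """Standard fuzzy c-means: alternate center and membership updates."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # membership matrix (n x c)
    for _ in range(n_iter):
        Um = U ** m
        centers = Um.T @ X / Um.sum(axis=0)[:, None]    # weighted cluster centers
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # u_ik = d_ik^(-2/(m-1)) / sum_j d_ij^(-2/(m-1))
        U = 1.0 / (d ** (2/(m-1)) * np.sum(d ** (-2/(m-1)), axis=1, keepdims=True))
    return centers, U

X = np.random.default_rng(1).normal(size=(85, 2))       # placeholder track features
centers, U = fuzzy_cmeans(X, c=4)
labels = U.argmax(axis=1)                               # hard assignment per TC
```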

  18. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with the contextual risk factors in southwest China, which is the most endemic area in China. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to analyze reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed, and a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high-risk area. Meanwhile, the Bayesian hierarchical spatial model in 2006 demonstrated a statistically significant association between JE and the agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect for climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for the precipitation, while the regression method found a one-month lag for temperature. The results show that the high-risk area in the east part of southwest China may be connected to the agricultural and climatic factors. The routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769
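
    The lag-selection idea can be illustrated in a few lines: compute the Spearman correlation between incidence and each lagged climatic series, and take the lag with the strongest association. The monthly series below are synthetic, constructed with a built-in two-month lag for demonstration.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 50.0, size=72)                   # synthetic monthly rainfall
je = np.roll(precip, 2) * 0.01 + rng.normal(0, 1, 72)    # incidence, ~2-month lag

# Correlate incidence at month t with precipitation at month t - lag
for lag in range(0, 4):
    rho, p = spearmanr(je[lag:], precip[:len(precip) - lag])
    print(f"lag {lag} months: rho={rho:.2f}, p={p:.3f}")
```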

  19. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications of the application of GC and HPLC for analysis of triazine herbicide residues in various samples. PMID:25849823

  20. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After the 1999 Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan and numerous landslide and debris flow events, some with tremendous fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection of 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and conducted on 106 creeks in the 30 villages with a debris flow hazard history. Information on and the value of several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, with a vulnerability model applied to each element at risk. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5-, 10-, 25-, 50-, 100- and 200-year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. When taking annual exceedance probability into account, the annual total risk of each creek was calculated, and the results were displayed on a debris flow risk map. The number of fatalities and their frequency were calculated, and the F-N curves of the 106 creeks were provided. For the F-N curves, an individual risk to life per year of 1.0E-04 and a slope of 1, which matched international standards, were considered acceptable risk criteria. Applying the results of the 106 creeks onto the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonably Practicable) and
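
    The annualization step described above can be sketched as follows: convert each scenario's return period to an annual exceedance probability, difference the probabilities to get per-bin event probabilities, and take the expected loss. The per-scenario losses below are invented, and the study's vulnerability models are not reproduced.

```python
import numpy as np

return_periods = np.array([5, 10, 25, 50, 100, 200])   # years
losses = np.array([0.1, 0.4, 1.2, 2.5, 4.0, 6.0])      # hypothetical loss per event
p_exc = 1.0 / return_periods                           # annual exceedance probability

# Probability that a year's largest event falls in each magnitude bin:
# difference of exceedance probabilities; the largest bin keeps its exceedance.
p_bin = np.append(p_exc[:-1] - p_exc[1:], p_exc[-1])
annual_risk = np.sum(p_bin * losses)
print(f"annual expected loss: {annual_risk:.3f}")
```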

  1. Intercellular Adhesion Molecule-1 (ICAM-1) Polymorphisms and Cancer Risk: A Meta-Analysis

    PubMed Central

    CHENG, Daye; LIANG, Bin

    2015-01-01

    Background: The Intercellular adhesion molecule-1 (ICAM-1) Lys469Glu (K469E) polymorphism and Gly241Arg (G241R) polymorphism might play important roles in cancer development and progression. However, the results of previous studies are inconsistent. The aim of this study was to evaluate the association between ICAM-1 K469E and G241R polymorphisms and the risk of cancer by meta-analysis. Methods: A comprehensive literature search (last search updated in November 2013) was conducted to identify case-control studies that investigated the association between ICAM-1 K469E and G241R polymorphisms and cancer risk. Results: A total of 18 case-control studies for ICAM-1 polymorphisms were included in the meta-analysis, including 4,844 cancer cases and 5,618 healthy controls. For the K469E polymorphism, no significant association was found with cancer risk overall. However, subgroup analysis by ethnicity revealed an association with cancer risk for one genetic comparison (GG vs. AA) in the Asian subgroup and for two genetic models (GG+GA vs. AA and GA vs. AA) in the European subgroup. The G241R polymorphism was significantly associated with cancer risk in the overall analysis, and subgroup analysis by ethnicity showed that it was significantly associated with cancer risk in the European subgroup. Conclusion: The ICAM-1 G241R polymorphism might be associated with cancer risk, especially in European populations, but the results do not support the ICAM-1 K469E polymorphism as a risk factor for cancer. PMID:26284202

  2. Mudflow Hazards in the Georgian Caucasus - Using Participatory Methods to Investigate Disaster Risk

    NASA Astrophysics Data System (ADS)

    Spanu, Valentina; McCall, Michael; Gaprindashvili, George

    2014-05-01

    /Management (DRR and DRM). Participatory surveys (and participatory monitoring) elicit local people's knowledge about the specifics of the hazard concerning frequency, timing, warning signals, rates of flow, spatial extent, etc. And significantly, only this local knowledge from informants can reveal essential information about different vulnerabilities of people and places, and about any coping or adjustment mechanisms that local people have. The participatory methods employed in Mleta included historical discussions with key informants, village social transects, participatory mapping with children, semi-structured interviews with inhabitants, and VCA (Vulnerability & Capacity Analysis). The geomorphological map, produced on the basis of the local geology, was realized with ArcGIS; this allowed the assessment of the areas at risk and the production of the corresponding maps. We adapted and tested the software programme CyberTracker, a digital-device method of field data collection, as a survey tool. Google Earth, OpenStreetMap, Virtual Earth and Ilwis were used for data processing.

  3. Simplified method for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1983-01-01

    A simplified inelastic analysis computer program was developed for predicting the stress-strain history of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a simulated plasticity hardening model. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, and different materials and plasticity models. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.

  4. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature, pressure, and torque; (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  5. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  6. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
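
    The elicitation-to-distribution step can be sketched under one common assumption: each expert 5th/95th percentile assessment is mapped to a lognormal distribution and propagated by Monte Carlo. The elicited ranges and the benefit index below are invented for illustration, not the study's assessments.

```python
import numpy as np

def lognormal_from_percentiles(p5, p95):
    """Fit lognormal parameters so that p5 and p95 are the 5th/95th percentiles."""
    z = 1.645  # standard normal 95th percentile
    mu = (np.log(p5) + np.log(p95)) / 2
    sigma = (np.log(p95) - np.log(p5)) / (2 * z)
    return mu, sigma

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical elicited ranges: per-vehicle cost (USD) and a capacity-gain factor
mu_c, sd_c = lognormal_from_percentiles(2_000, 8_000)
mu_g, sd_g = lognormal_from_percentiles(1.5, 3.0)
cost = rng.lognormal(mu_c, sd_c, n)
gain = rng.lognormal(mu_g, sd_g, n)

net_benefit = gain * 1_000 - cost   # toy benefit index per vehicle
print(np.percentile(net_benefit, [5, 50, 95]))
```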

  7. A Renormalisation Group Method. IV. Stability Analysis

    NASA Astrophysics Data System (ADS)

    Brydges, David C.; Slade, Gordon

    2015-05-01

    This paper is the fourth in a series devoted to the development of a rigorous renormalisation group method for lattice field theories involving boson fields, fermion fields, or both. The third paper in the series presents a perturbative analysis of a supersymmetric field theory which represents the continuous-time weakly self-avoiding walk on Z^4. We now present an analysis of the relevant interaction functional of the supersymmetric field theory, which permits a nonperturbative analysis to be carried out in the critical dimension 4. The results in this paper include: proof of stability of the interaction, estimates which enable control of Gaussian expectations involving both boson and fermion fields, estimates which bound the errors in the perturbative analysis, and a crucial contraction estimate to handle irrelevant directions in the flow of the renormalisation group. These results are essential for the analysis of the general renormalisation group step in the fifth paper in the series.

  8. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster will have the same name but with *.png instead of *.tif as the file ending. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
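
    The three combination rules named above (product, sum, minimum) reduce to simple array operations. A minimal sketch with random arrays standing in for the project's scaled risk-factor rasters:

```python
import numpy as np

rng = np.random.default_rng(0)
# Four scaled risk factors on a common grid, each in [0, 1] after scaling
reservoir, thermal, seismicity, utilization = rng.random((4, 100, 100))

stack = np.stack([reservoir, thermal, seismicity, utilization])
combined_product = stack.prod(axis=0)   # penalizes any weak factor strongly
combined_sum = stack.sum(axis=0)        # compensatory aggregation
combined_min = stack.min(axis=0)        # limited by the weakest factor
```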

  9. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
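
    The trade-off described above can be sketched as a grid search over height and crown width, minimizing annualized construction cost plus expected annual damage. All fragility and cost functions below are invented stand-ins for the study's levee fragility curves and economics.

```python
import numpy as np

heights = np.linspace(2.0, 8.0, 25)          # candidate levee heights (m)
widths = np.linspace(3.0, 15.0, 25)          # candidate crown widths (m)
H, W = np.meshgrid(heights, widths, indexing="ij")

damage = 50e6                                # toy damage if the levee fails (USD)
p_overtop = np.exp(-H / 1.5)                 # toy annual overtopping probability
p_seep = 0.05 * np.exp(-W / 4.0)             # toy through-seepage fragility
p_fail = p_overtop + (1 - p_overtop) * p_seep

annualized_cost = 1e5 * H + 4e4 * W          # toy annualized construction cost
total = annualized_cost + p_fail * damage    # annual expected total cost

i, j = np.unravel_index(total.argmin(), total.shape)
print(f"optimal height {heights[i]:.1f} m, crown width {widths[j]:.1f} m")
```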

  10. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.

  11. RADTRAN 5: A computer code for transportation risk analysis

    SciTech Connect

    Neuhauser, K. S.; Kanipe, F. L.

    1991-01-01

    RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.

  12. Adversarial Risk Analysis for Urban Security Resource Allocation.

    PubMed

    Gil, César; Rios Insua, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis (ARA) provides a framework to deal with risks originating from intentional actions of adversaries. We show how ARA may be used to allocate security resources in the protection of urban spaces. We take into account the spatial structure and consider both proactive and reactive measures, aiming both to reduce criminality and to recover from it as well as possible, should it occur. We deal with the problem by deploying an ARA model over each spatial unit, coordinating the models through resource constraints, value aggregation, and proximity. We illustrate our approach with an example that uncovers several relevant policy issues. PMID:26927388

  13. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objective for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support the development of integrated solutions that prevent unwanted outcomes (implementable approaches that minimize mission resources such as mass, power, and crew time), and to support the development of tools for autonomy, needed for exploration (assessing and maintaining resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  14. Determinants of contraceptive method among young women at risk for unintended pregnancy and sexually transmitted infections.

    PubMed

    Raine, Tina; Minnis, Alexandra M; Padian, Nancy S

    2003-07-01

    The objective of this study was to examine the relationship between contraceptive method choice, sexual risk and various demographic and social factors. Data were collected on 378 women aged 15-24 years, recruited from health clinics and through community outreach in Northern California. Logistic regression analysis was used to estimate the association of predictors with contraceptive method used at last sex. Asian and Latina women were less likely to use any method. Women who were raised with a religion, or thought they were infertile, were also less likely to use any method. Women with multiple partners were generally less likely to use any method, but were more likely to use barrier methods when they did use one. Few women (7%) were dual method users. Women appear to act in a rational fashion within their own social context and may use no methods at all or use methods that are less effective for pregnancy prevention but offer more protection from sexually transmitted infections. PMID:12878282

  15. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111
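
    The fixed-effects combination step can be sketched directly: pool the patient-specific log cumulative hazard estimates across studies with inverse-variance weights, then convert back to an event probability. The per-study estimates below are invented for illustration.

```python
import numpy as np

log_H = np.array([-2.10, -1.85, -2.40])   # per-study log cumulative hazard estimates
se = np.array([0.30, 0.25, 0.45])         # their standard errors

w = 1.0 / se**2                           # precision (inverse-variance) weights
log_H_pool = np.sum(w * log_H) / np.sum(w)
se_pool = np.sqrt(1.0 / np.sum(w))

# Survival S = exp(-H), so the event probability by the horizon is 1 - exp(-H)
risk = 1.0 - np.exp(-np.exp(log_H_pool))
print(f"combined risk estimate: {risk:.3f} (SE of log H: {se_pool:.3f})")
```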

  16. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 2 2011-10-01 2011-10-01 false Rail Risk Analysis Factors D Appendix D to Part... REQUIREMENTS, AND SECURITY PLANS Pt. 172, App. D Appendix D to Part 172—Rail Risk Analysis Factors A. This... safety and security risk analyses required by § 172.820. The risk analysis to be performed may...

  17. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Rail Risk Analysis Factors D Appendix D to Part... REQUIREMENTS, AND SECURITY PLANS Pt. 172, App. D Appendix D to Part 172—Rail Risk Analysis Factors A. This... safety and security risk analyses required by § 172.820. The risk analysis to be performed may...

  18. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 2 2012-10-01 2012-10-01 false Rail Risk Analysis Factors D Appendix D to Part... REQUIREMENTS, AND SECURITY PLANS Pt. 172, App. D Appendix D to Part 172—Rail Risk Analysis Factors A. This... safety and security risk analyses required by § 172.820. The risk analysis to be performed may...

  19. Hybrid Safety Analysis Using Functional and Risk Decompositions

    SciTech Connect

    COOPER,J. ARLIN; JOHNSON,ALICE J.; WERNER,PAUL W.

    2000-07-15

    Safety analysis of complex systems depends on decomposing the systems into manageable subsystems, from which analysis can be rolled back up to the system level. The authors have found that there is no single best way to decompose; in fact hybrid combinations of decompositions are generally necessary to achieve optimum results. They are currently using two backbone coordinated decompositions--functional and risk, supplemented by other types, such as organizational. An objective is to derive metrics that can be used to efficiently and accurately aggregate information through analysis, to contribute toward assessing system safety, and to contribute information necessary for defensible decisions.

  20. Space flight risk data collection and analysis project: Risk and reliability database

    NASA Astrophysics Data System (ADS)

    1994-05-01

    The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment - reliability - availability - maintainability - supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command -ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.

  1. Space flight risk data collection and analysis project: Risk and reliability database

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment - reliability - availability - maintainability - supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command -ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.

  2. Association among Dietary Flavonoids, Flavonoid Subclasses and Ovarian Cancer Risk: A Meta-Analysis

    PubMed Central

    You, Ruxu; Yang, Yu; Liao, Jing; Chen, Dongsheng; Yu, Lixiu

    2016-01-01

    Background Previous studies have indicated that intake of dietary flavonoids or flavonoid subclasses is associated with ovarian cancer risk, but have presented controversial results. Therefore, we conducted a meta-analysis to derive a more precise estimation of these associations. Methods We performed a search in PubMed, Google Scholar and ISI Web of Science from their inception to April 25, 2015 to select studies on the association among dietary flavonoids, flavonoid subclasses and ovarian cancer risk. The information was extracted by two independent authors. We assessed the heterogeneity, sensitivity, publication bias and quality of the articles. A random-effects model was used to calculate the pooled risk estimates. Results Five cohort studies and seven case-control studies were included in the final meta-analysis. We observed that intake of dietary flavonoids can decrease ovarian cancer risk, which was demonstrated by the pooled RR (RR = 0.82, 95% CI = 0.68–0.98). In a subgroup analysis by flavonoid subtypes, the ovarian cancer risk was also decreased for isoflavones (RR = 0.67, 95% CI = 0.50–0.92) and flavonols (RR = 0.68, 95% CI = 0.58–0.80). There was no compelling evidence that consumption of flavones (RR = 0.86, 95% CI = 0.71–1.03) decreases ovarian cancer risk, which partly explains the sources of heterogeneity. The sensitivity analysis indicated stable results, and no publication bias was observed based on the results of funnel plot analysis and Egger’s test (p = 0.26). Conclusions This meta-analysis suggested that consumption of dietary flavonoids and the isoflavone and flavonol subtypes is associated with a reduced risk of ovarian cancer, whereas no protective effect was observed for flavones. Nevertheless, further investigations in larger populations covering more flavonoid subclasses are warranted. PMID:26960146

  3. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. PMID:26518501

  4. Review of correlation methods for evaluating finite element simulations of impact injury risk.

    PubMed

    Wang, Qian; Gabler, Hampton C

    2008-01-01

    Finite element models have been used to understand human injury responses in various crash configurations. Most of the model validations were limited to qualitative descriptions. Quantitative analysis was needed for the validation of finite element models against experimental results. The purpose of this study is to compare the existing correlation techniques and to determine the best method to use for evaluating finite element simulations of impact injury risk in vehicle crashes. Five correlation methods in the literature were reviewed for systematic comparisons between simulations and tests. A full frontal impact test of a 1997 Geo Metro was simulated. The finite element model of a 1997 Geo Metro was obtained from NCAC finite element model archive. The acceleration and velocity responses of the vehicle seat were extracted from the simulation and compared to the test data. Evaluations of the validation methods were based on the analysis results compared to the suggested criteria. Performance of the different methods showed that the Comprehensive Error Factor method was the best overall correlation method, and therefore was recommended for assessing occupant injury potentials in vehicle accidents. PMID:19141927
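
    One widely used magnitude/phase correlation metric, due to Sprague and Geers, illustrates the flavor of such test-versus-simulation comparisons; the paper's Comprehensive Error Factor may be defined differently, so treat this as a representative sketch rather than the recommended metric itself. The signals below are toy crash pulses.

```python
import numpy as np

def sprague_geers(test, sim):
    """Sprague & Geers magnitude (M), phase (P), and combined (C) error factors."""
    stt = np.sum(test * test)
    sss = np.sum(sim * sim)
    sts = np.sum(test * sim)
    magnitude = np.sqrt(sss / stt) - 1.0
    phase = np.arccos(np.clip(sts / np.sqrt(stt * sss), -1.0, 1.0)) / np.pi
    combined = np.sqrt(magnitude**2 + phase**2)
    return magnitude, phase, combined

t = np.linspace(0, 0.15, 500)                        # time (s)
test = -20 * np.sin(2 * np.pi * t / 0.15)            # toy test acceleration (g)
sim = -22 * np.sin(2 * np.pi * (t - 0.005) / 0.15)   # toy simulation, shifted/scaled
print(sprague_geers(test, sim))
```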

  5. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk

  6. Prenatal, Perinatal and Neonatal Risk Factors for Intellectual Disability: A Systemic Review and Meta-Analysis

    PubMed Central

    Qu, Yi; Mu, Dezhi

    2016-01-01

    Background The etiology of non-genetic intellectual disability (ID) is not fully known, and we aimed to identify the prenatal, perinatal and neonatal risk factors for ID. Method PubMed and Embase databases were searched for studies that examined the association between pre-, peri- and neonatal factors and ID risk (keywords “intellectual disability” or “mental retardation” or “ID” or “MR” in combination with “prenatal” or “pregnancy” or “obstetric” or “perinatal” or “neonatal”). The last search was updated on September 15, 2015. Summary effect estimates (pooled odds ratios) were calculated for each risk factor using random effects models, with tests for heterogeneity and publication bias. Results Seventeen studies with 55,344 patients and 5,723,749 control individuals were eligible for inclusion in our analysis, and 16 potential risk factors were analyzed. Ten prenatal factors (advanced maternal age, maternal black race, low maternal education, third or more parity, maternal alcohol use, maternal tobacco use, maternal diabetes, maternal hypertension, maternal epilepsy and maternal asthma), one perinatal factor (preterm birth) and two neonatal factors (male sex and low birth weight) were significantly associated with increased risk of ID. Conclusion This systemic review and meta-analysis provides a comprehensive evidence-based assessment of the risk factors for ID. Future studies are encouraged to focus on perinatal and neonatal risk factors and the combined effects of multiple factors. PMID:27110944

  7. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  8. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  9. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  10. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M.; Ng, Esmond G.

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  11. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  12. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  13. Evaluation of association methods for analysing modifiers of disease risk in carriers of high-risk mutations.

    PubMed

    Barnes, Daniel R; Lee, Andrew; Easton, Douglas F; Antoniou, Antonis C

    2012-04-01

    There is considerable evidence indicating that disease risk in carriers of high-risk mutations (e.g. BRCA1 and BRCA2) varies by other genetic factors. Such mutations tend to be rare in the population and studies of genetic modifiers of risk have focused on sampling mutation carriers through clinical genetics centres. Genetic testing targets affected individuals from high-risk families, making ascertainment of mutation carriers non-random with respect to disease phenotype. Standard analytical methods can lead to biased estimates of associations. Methods proposed to address this problem include a weighted-cohort (WC) and retrospective likelihood (RL) approach. Their performance has not been evaluated systematically. We evaluate these methods by simulation and extend the RL to analysing associations of two diseases simultaneously (competing risks RL, CRRL). The standard cohort approach (Cox regression) yielded the most biased risk ratio (RR) estimates (relative bias, RB: -25% to -17%) and had the lowest power. The WC and RL approaches provided similar RR estimates, were least biased (RB: -2.6% to 2.5%), and had the lowest mean-squared errors. The RL method generally had more power than WC. When analysing associations with two diseases, ignoring a potential association with one disease leads to inflated type I errors for inferences with respect to the second disease and biased RR estimates. The CRRL generally gave unbiased RR estimates for both disease risks and had correct nominal type I errors. These methods are illustrated by analyses of genetic modifiers of breast and ovarian cancer risk for BRCA1 and BRCA2 mutation carriers. PMID:22714938
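
    The weighted-cohort idea can be sketched as a weighted Cox regression, here assuming the lifelines package and fully synthetic data; the weights below are arbitrary placeholders, whereas the actual method derives them from the ascertainment scheme so that the weighted sample mimics a randomly sampled cohort.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "genotype": rng.integers(0, 2, n),   # candidate modifier (0/1)
    "T": rng.exponential(10, n),         # follow-up time (years)
    "E": rng.integers(0, 2, n),          # event indicator
})
# Placeholder sampling weights: affected carriers are over-represented in
# clinic-based samples, so events are down-weighted here (illustrative only).
df["w"] = np.where(df["E"] == 1, 0.5, 1.5)

cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E", weights_col="w", robust=True)
cph.print_summary()
```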

  14. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

  15. Exploring Mexican adolescents' perceptions of environmental health risks: a photographic approach to risk analysis.

    PubMed

    Börner, Susanne; Albino, Juan Carlos Torrico; Caraveo, Luz María Nieto; Tejeda, Ana Cristina Cubillas

    2015-05-01

    The objective of this study was to explore Mexican adolescents' perceptions of environmental health risks in contaminated urban areas, and to test the environmental photography technique as a research tool for engaging adolescents in community-based health research. The study was conducted with 74 adolescents from two communities in the city of San Luis Potosi, Mexico. Participants were provided with disposable cameras and asked to take photographs of elements and situations which they believed affected their personal health both at home and outside their homes. They were also asked to describe each photograph in writing. Photographs and written explanations were analyzed by using quantitative and qualitative content analysis. Risk perception plays a crucial role in the development of Risk Communication Programs (RCPs) aimed at the improvement of community health. The photography technique opens up a promising field for environmental health research since it affords a realistic and concise impression of the perceived risks. Adolescents in both communities perceived different environmental health risks as detrimental to their well-being, e.g. waste, air pollution, and lack of hygiene. Yet, some knowledge gaps remain which need to be addressed. PMID:26017963

  16. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
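
    One smoothing-based sensitivity measure is easy to sketch: smooth the output against each input separately and report the variance explained. This single-predictor LOWESS version is a simplification of the stepwise multi-predictor procedures described above; the test function is invented.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
n = 500
x1, x2, x3 = rng.uniform(-1, 1, (3, n))
# Toy model: strong nonlinear effect of x1, weak effect of x2, none of x3
y = np.sin(np.pi * x1) + 0.3 * x2**2 + 0.05 * rng.normal(size=n)

for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
    fit = lowess(y, x, frac=0.4, return_sorted=False)   # smoothed E[y|x]
    r2 = 1.0 - np.var(y - fit) / np.var(y)              # variance explained
    print(f"{name}: variance explained = {r2:.2f}")
```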

  17. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods are historically preferred to the deterministic ones. Presently, a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than the standard flood frequency analysis. Another interesting result is that the differences between the various extreme flood quantile estimates of the compared methods increase with the return period, staying relatively moderate up to 100-year return levels. Results and discussions are here illustrated throughout with the example
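
    The standard flood frequency analysis in family (i) can be sketched with scipy: fit Gumbel and GEV distributions to a series of annual maxima and compare return levels. The annual maximum flows below are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic 60-year annual maximum flow series (m^3/s)
ams = stats.gumbel_r.rvs(loc=300, scale=80, size=60, random_state=rng)

T = 100.0             # return period (years)
p = 1.0 - 1.0 / T     # corresponding non-exceedance probability

gum = stats.gumbel_r.fit(ams)       # (loc, scale)
gev = stats.genextreme.fit(ams)     # (shape, loc, scale)
print(f"Gumbel 100-yr flood: {stats.gumbel_r.ppf(p, *gum):.0f} m^3/s")
print(f"GEV    100-yr flood: {stats.genextreme.ppf(p, *gev):.0f} m^3/s")
```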

  18. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high speed air drops for parachute development, sled tests for component and system level studies, multiple stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, these same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications to many of Sandia's current operations. 14 refs., 1 tab.

  19. Risk factors for rape re-victimisation: a retrospective analysis.

    PubMed

    Lurie, S; Boaz, M; Golan, A

    2013-11-01

    Sexual re-victimisation refers to a pattern in which the sexual assault victim has an increased risk of subsequent victimisation relative to an individual who was never victimised. The purpose of our study was to identify risk factors for a second rape, the severest form of sexual re-victimisation. All rape victims treated at the First Regional Israeli Center for Sexual Assault Victims between October 2000 and July 2010 were included in this retrospective analysis. We compared characteristics of 53 rape victims who were victimised twice to those of 1,939 rape victims who were victimised once. We identified several risk factors for a second rape, which can be used in prevention programmes. These are: psychiatric background, history of social services involvement, adulthood, non-virginity and minority ethnicity. PMID:24219731

  20. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case studies, based on real data, such as reactor core inventory and estimations from the known accidents, for a number of typical meteorological conditions and different accident scenarios. Then, using these simulations and the population database as input data, the GIS analysis reveals administrative units at the highest risk with respect to the mean individual and collective doses received by the population. Then, two alternative methods were suggested to assess a probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples) based on social-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The two latter probabilities were calculated with atmospheric trajectory models and statistical methods over many years of data. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow the development of a common methodological approach for complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation etc. The approach was capable of creating risk/vulnerability maps of the Nordic countries and of revealing the most vulnerable provinces with respect to the radiation risk sites.

  1. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level. PMID:26202064
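
    The quantitative step can be sketched as a simple Monte Carlo: sample the process parameters from their uncertainty distributions and estimate the probability that a critical quality attribute misses its specification. The response surface, parameter distributions, and specification below are invented, not the paper's ciprofloxacin model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical parameter uncertainty around a candidate operating point
compression_force = rng.normal(12.0, 1.0, n)     # kN
binder_frac = rng.normal(0.030, 0.004, n)        # binder mass fraction

# Toy response surface for dissolution at 30 min (% released)
dissolution = (95 - 1.8 * (compression_force - 10)
               + 250 * (binder_frac - 0.02)
               + rng.normal(0, 2.0, n))

p_fail = np.mean(dissolution < 80.0)             # toy spec: >= 80% at 30 min
print(f"estimated probability of failure: {p_fail:.4f}")
```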

  2. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) Reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
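
    The event-tree-with-uncertainty calculation can be sketched by sampling each event's success probability from a distribution and multiplying along the sequence. The event list and beta parameters below are illustrative only, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical (alpha, beta) parameters of a beta distribution per event
events = {
    "launch": (98, 2),
    "mars_landing": (90, 10),
    "sample_acquisition": (95, 5),
    "ascent_and_rendezvous": (85, 15),
    "earth_return": (92, 8),
}

p_success = np.ones(n)
for a, b in events.values():
    p_success *= rng.beta(a, b, n)   # multiply conditional success probabilities

print(f"mission success: median {np.median(p_success):.2f}, "
      f"5th-95th pct {np.percentile(p_success, [5, 95])}")
```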

  3. Dietary intake and urinary level of cadmium and breast cancer risk: A meta-analysis.

    PubMed

    Lin, Jinbo; Zhang, Fang; Lei, Yixiong

    2016-06-01

    Cadmium, a human carcinogenic heavy metal, has been reported to be associated with breast cancer risk; however, the results from the epidemiological studies are not always consistent. The objective of this study was to quantitatively summarize the current evidence for the relationship between cadmium exposure and breast cancer risk using meta-analysis methods. Six studies determining the dietary cadmium intake level and five studies evaluating the urinary cadmium level were identified in a systematic search of MEDLINE and PubMed databases, and the associations between these levels and breast cancer risk were analysed. The pooled estimates under the random-effects model suggested that higher urinary cadmium levels were associated with an increased risk for breast cancer (highest versus lowest quantile, pooled odds ratio [OR]=2.24, 95% confidence interval [95%CI]=1.49-3.35) and that a 1 μg/g creatinine increase in urinary cadmium led to a 2.02-fold increase in breast cancer risk (pooled OR=2.02, 95%CI=1.34-3.03); however, pooled estimates for dietary cadmium intake found no significant association between cadmium exposure and breast cancer risk (highest versus lowest quantile, pooled relative risk [RR]=1.01, 95%CI=0.89-1.15). These results suggest that cadmium exposure may lead to an increased risk of breast cancer, and that urinary cadmium levels can serve as a reliable biomarker for long-term cadmium exposure and may predict breast cancer risk. PMID:27085960
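
    Random-effects pooling of the kind used above is typically the DerSimonian-Laird method: per-study log odds ratios are weighted by the inverse of the within-study variance plus an estimated between-study variance. A generic sketch with made-up study values (not the studies pooled in this paper):

    ```python
    import numpy as np

    # Per-study odds ratios and 95% CIs -- invented for illustration.
    or_est  = np.array([1.8, 2.5, 1.4, 3.1, 2.0])
    ci_low  = np.array([1.0, 1.2, 0.8, 1.5, 1.1])
    ci_high = np.array([3.2, 5.2, 2.5, 6.4, 3.6])

    y  = np.log(or_est)                                    # work on the log scale
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from CI width
    w  = 1.0 / se**2                                       # fixed-effect weights

    # DerSimonian-Laird between-study variance tau^2.
    y_fixed = np.sum(w * y) / np.sum(w)
    Q  = np.sum(w * (y - y_fixed) ** 2)
    df = len(y) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    # Random-effects pooled estimate.
    w_re  = 1.0 / (se**2 + tau2)
    y_re  = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))

    print(f"pooled OR = {np.exp(y_re):.2f} "
          f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f})")
    ```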

  4. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include the definition of the fairways, the definition of the US Census Places, the centers of the raster cells, and the locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
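
    The three combination rules named above (product, sum, minimum) are straightforward to apply to co-registered rasters. The sketch below uses plain arrays as stand-ins for the project's GeoTIFFs and assumes the factors have already been scaled to a common 0-5 range with higher values more favorable; consult the repository linked above for the actual processing.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    shape = (200, 300)  # stand-in raster grid

    # Four scaled risk factors (assumed 0-5, higher = more favorable),
    # mirroring reservoir quality, thermal resource, induced-seismicity
    # potential, and utilization.
    factors = rng.uniform(0.0, 5.0, size=(4, *shape))

    combined_product = np.prod(factors, axis=0)  # strongly penalizes any weak factor
    combined_sum     = np.sum(factors, axis=0)   # lets strong factors compensate
    combined_min     = np.min(factors, axis=0)   # "weakest link" rule

    for name, arr in [("product", combined_product),
                      ("sum", combined_sum),
                      ("minimum", combined_min)]:
        print(f"{name:8s}: best cell value = {arr.max():.2f}")
    ```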

  5. Metabolic Syndrome Is Associated with Increased Breast Cancer Risk: A Systematic Review with Meta-Analysis

    PubMed Central

    Bhandari, Ruchi; Kelley, George A.; Hartley, Tara A.; Rockett, Ian R. H.

    2014-01-01

    Background. Although individual metabolic risk factors are reported to be associated with breast cancer risk, controversy surrounds risk of breast cancer from metabolic syndrome (MS). We report the first systematic review and meta-analysis of the association between MS and breast cancer risk in all adult females. Methods. Studies were retrieved by searching four electronic reference databases [PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Web of Science, and ProQuest through June 30, 2012] and cross-referencing retrieved articles. Eligible for inclusion were longitudinal studies reporting associations between MS and breast cancer risk among females aged 18 years and older. Relative risks and 95% confidence intervals were calculated for each study and pooled using random-effects models. Publication bias was assessed quantitatively (Trim and Fill) and qualitatively (funnel plots). Heterogeneity was examined using Q and I2 statistics. Results. Representing nine independent cohorts and 97,277 adult females, eight studies met the inclusion criteria. A modest, positive association was observed between MS and breast cancer risk (RR: 1.47, 95% CI, 1.15–1.87; z = 3.13; p = 0.002; Q = 26.28, p = 0.001; I2 = 69.55%). No publication bias was observed. Conclusions. MS is associated with increased breast cancer risk in adult women. PMID:25653879
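
    The Q and I² statistics reported above follow directly from the per-study weights. A generic sketch with invented study values (not the eight studies pooled here):

    ```python
    import numpy as np

    # Per-study log relative risks and standard errors -- invented values.
    log_rr = np.array([0.41, 0.22, 0.58, 0.10, 0.47, 0.35, 0.29, 0.52])
    se     = np.array([0.18, 0.25, 0.20, 0.30, 0.15, 0.22, 0.28, 0.19])

    w = 1.0 / se**2
    pooled = np.sum(w * log_rr) / np.sum(w)    # fixed-effect pooled log RR
    Q  = np.sum(w * (log_rr - pooled) ** 2)    # Cochran's Q
    df = len(log_rr) - 1
    I2 = max(0.0, (Q - df) / Q) * 100          # % of variation due to heterogeneity

    print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.1f}%")
    ```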

  6. Aspirin use and risk of breast cancer: systematic review and meta-analysis of observational studies.

    PubMed

    Zhong, Shanliang; Chen, Lin; Zhang, Xiaohui; Yu, Dandan; Tang, Jinhai; Zhao, Jianhua

    2015-11-01

    Previous studies concerning the association between aspirin use and breast cancer risk have yielded inconsistent results. We aimed to investigate the association by meta-analysis. PubMed and EMBASE were searched for relevant studies. We calculated the summary relative risks (RR) and 95% confidence intervals (CI) using random-effects models. Seventeen cohort studies and 15 case-control studies were included. The overall result showed that aspirin use decreased the risk of breast cancer (RR, 0.90; 95% CI, 0.85-0.95). However, there was evidence of publication bias and heterogeneity, and the association disappeared after correction using the trim-and-fill method. When stratified by study design, a significant benefit for aspirin users was found only in population-based and hospital-based case-control studies, not in cohort or nested case-control studies. Further subgroup analyses showed that aspirin use could decrease the risk of in situ breast tumors or hormone receptor-positive tumors and reduce the risk of breast cancer in postmenopausal women. Aspirin use may not affect the overall risk of breast cancer, but may decrease the risk of in situ breast tumors or hormone receptor-positive tumors and reduce the risk of breast cancer in postmenopausal women. Given the significant between-study heterogeneity and publication bias, confirmation in future studies is essential. PMID:26315555
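
    Trim-and-fill, used above to correct for publication bias, is somewhat involved to implement; a common companion check for the same funnel-plot asymmetry is Egger's regression, sketched below with invented study values (not those of the 32 included studies). A nonzero intercept suggests small-study bias.

    ```python
    import numpy as np

    # Per-study log relative risks and standard errors -- invented values.
    log_rr = np.array([-0.15, -0.05, -0.22, 0.02, -0.30, -0.10, -0.18])
    se     = np.array([ 0.06,  0.09,  0.12, 0.15,  0.20,  0.08,  0.11])

    # Egger's test: regress the standardized effect on precision.
    y = log_rr / se            # standardized effects
    x = 1.0 / se               # precisions
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    # t-statistic for the intercept from the OLS covariance matrix.
    n, k = len(y), 2
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - k)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    t_intercept = beta[0] / np.sqrt(cov[0, 0])
    print(f"Egger intercept = {beta[0]:.3f}, t = {t_intercept:.2f}")
    ```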

  7. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    SciTech Connect

    Teuschler, Linda K.

    2007-09-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures.
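
    The additive joint toxic action referred to in concept (1) is commonly operationalized as a hazard index: the sum of exposure-to-reference-dose ratios across mixture components. A minimal sketch, with invented exposures and reference doses:

    ```python
    # Hazard index under the dose-additivity assumption discussed above:
    # HI = sum over components of (exposure / reference dose).
    # Component names, exposures, and reference doses are illustrative only.
    components = {
        # name: (daily exposure, reference dose), both in mg/kg-day
        "chemical_A": (0.002, 0.01),
        "chemical_B": (0.0005, 0.005),
        "chemical_C": (0.03, 0.10),
    }

    hazard_index = sum(exposure / rfd for exposure, rfd in components.values())
    print(f"HI = {hazard_index:.2f}")  # HI > 1 flags potential concern
    ```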

  8. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas.

    PubMed

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-11-01

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area. PMID:26580644

  9. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas

    PubMed Central

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-01-01

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area. PMID:26580644

  10. Standard methods for land-use planning to determine the effects on societal risk.

    PubMed

    Laheij, G M; Post, J G; Ale, B J

    2000-01-01

    In the Netherlands, the individual risk and the societal risk are used in efforts to reduce the number of people exposed to the effects of an accident. In principle, the societal risk for each new land-use plan should be recalculated. Since this is proving increasingly cumbersome for planning agencies, several methods have been developed for SEVESO establishments and establishments for which in the Netherlands a generic zoning policy is used to determine the effects of new land-use plans on the societal risk. The methods give the uniform population density from a certain distance around the establishment at which the indicative limit for the societal risk is not exceeded. Correction factors are determined for non-uniform population distributions around the establishment, non-continuous residence times and deviating societal risk limits. Using these methods allows decision-making without the necessity of repeating quantified risk analyses for each alternative proposal. PMID:10677665
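
    The Dutch societal-risk test compares an establishment's FN curve against an indicative limit that falls off with the square of the number of fatalities. A minimal sketch of that check follows; the limit constant is commonly quoted as 10^-3 per year for establishments, but treat it and the example FN curve as illustrative assumptions.

    ```python
    # Check an FN curve against an indicative societal-risk limit of the
    # form F(N) <= C / N^2.
    C = 1e-3  # per year; assumed indicative limit constant

    # Hypothetical cumulative frequencies: frequency (per year) of accidents
    # with N or more fatalities.
    fn_curve = {10: 1e-6, 30: 2e-7, 100: 5e-8, 300: 1e-8}

    for n_fatalities, freq in fn_curve.items():
        limit = C / n_fatalities**2
        status = "OK" if freq <= limit else "EXCEEDS"
        print(f"N>={n_fatalities:4d}: F={freq:.1e}, limit={limit:.1e} -> {status}")
    ```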

  11. Risk assessment as an evolved threat detection and analysis process.

    PubMed

    Blanchard, D Caroline; Griebel, Guy; Pobbe, Roger; Blanchard, Robert J

    2011-03-01

    Risk assessment is a pattern of activities involved in detection and analysis of threat stimuli and the situations in which the threat is encountered. It is a core process in the choice of specific defenses, such as flight, freezing, defensive threat and defensive attack, that counter the threat and minimize the danger it poses. This highly adaptive process takes into account important characteristics, such as type and location (including distance from the subject) of the threat, as well as those (e.g. presence of an escape route or hiding place) of the situation, combining them to predict which specific defense is optimal with that particular combination of threat and situation. Risk assessment is particularly associated with ambiguity either of the threat stimulus or of the outcome of available defensive behaviors. It is also crucial in determining that threat is no longer present, permitting a return to normal, nondefensive behavior. Although risk assessment has been described in detail in rodents, it is also a feature of human defensive behavior, particularly in association with ambiguity. Rumination may be a specifically human form of risk assessment, more often expressed by women, and highly associated with anxiety. Risk assessment behaviors respond to drugs effective against generalized anxiety disorder; however, flight, a dominant specific defense in many common situations, shows a pharmacological response profile closer to that of panic disorder. Risk assessment and flight also appear to show some consistent differences in terms of brain regional activation patterns, suggesting a potential biological differentiation of anxiety and fear/panic systems. An especially intriguing possibility is that mirror neurons may respond to some of the same types of situational differences that are analyzed during risk assessment, suggesting an additional functional role for these neurons. PMID:21056591

  12. Biological risk factors for suicidal behaviors: a meta-analysis.

    PubMed

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09-1.81) and suicide death (wOR=1.28; CI: 1.13-1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias-cytokines (wOR=2.87; CI: 1.40-5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01-1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931

  13. MORTALITY RISK VALUATION AND STATED PREFERENCE METHODS: AN EXPLORATORY STUDY

    EPA Science Inventory

    The purposes of this project are: (1) to improve understanding of cognitive processes involved in the valuation of mortality risk reductions that occur in an environmental pollution context, and (2) to translate this understanding into survey language appropriate for future stat...

  14. Engine non-containment: UK risk assessment methods

    NASA Technical Reports Server (NTRS)

    Wallin, J. C.

    1977-01-01

    More realistic guideline data must be developed for use in aircraft design in order to comply with recent changes in British civil airworthiness requirements. Unrealistically pessimistic results were obtained when the methodology developed during the Concorde SST certification program was extended to assess catastrophic risks resulting from uncontained engine rotors.

  15. Plasma prolactin and breast cancer risk: a meta- analysis

    PubMed Central

    Wang, Minghao; Wu, Xiujuan; Chai, Fan; Zhang, Yi; Jiang, Jun

    2016-01-01

    Breast cancer is the most common cancer among women, and its incidence is on a constant rise. Previous studies suggest that higher levels of plasma prolactin are associated with an elevated risk of breast cancer; however, these results are contradictory and inconclusive. PubMed and Medline were used to search for and identify published observational studies that assessed the relationship between plasma prolactin levels and the risk of breast cancer. The pooled relative risks (RRs) with 95% confidence intervals (CIs) were calculated using a fixed-effects or random-effects model. A total of 7 studies were included in our analysis. For the highest versus lowest levels of plasma prolactin, the pooled RR (95% CI) of breast cancer was 1.16 (1.04, 1.29). In subgroup analyses, we found a positive association between plasma prolactin levels and the risk of breast cancer among patients who were postmenopausal, ER+/PR+, or had in situ or invasive carcinoma. However, this positive association was not detected in the premenopausal and ER-/PR- patients. In conclusion, the present study provides evidence supporting a significantly positive association between plasma prolactin levels and the risk of breast cancer. PMID:27184120

  16. Plasma prolactin and breast cancer risk: a meta- analysis.

    PubMed

    Wang, Minghao; Wu, Xiujuan; Chai, Fan; Zhang, Yi; Jiang, Jun

    2016-01-01

    Breast cancer is the most common cancer among women, and its incidence is on a constant rise. Previous studies suggest that higher levels of plasma prolactin are associated with an elevated risk of breast cancer; however, these results are contradictory and inconclusive. PubMed and Medline were used to search for and identify published observational studies that assessed the relationship between plasma prolactin levels and the risk of breast cancer. The pooled relative risks (RRs) with 95% confidence intervals (CIs) were calculated using a fixed-effects or random-effects model. A total of 7 studies were included in our analysis. For the highest versus lowest levels of plasma prolactin, the pooled RR (95% CI) of breast cancer was 1.16 (1.04, 1.29). In subgroup analyses, we found a positive association between plasma prolactin levels and the risk of breast cancer among patients who were postmenopausal, ER(+)/PR(+), or had in situ or invasive carcinoma. However, this positive association was not detected in the premenopausal and ER(-)/PR(-) patients. In conclusion, the present study provides evidence supporting a significantly positive association between plasma prolactin levels and the risk of breast cancer. PMID:27184120

  17. An analysis method for control reconfigurability of linear systems

    NASA Astrophysics Data System (ADS)

    Wang, Dayi; Duan, Wenjie; Liu, Chengrui

    2016-01-01

    The reconfigurability of control systems is further investigated based on the function-objective model (FOM). The establishment of the FOM was published in the authors' earlier paper, solving the problem of whether a system is reconfigurable without losing the desired control objective. Based on the FOM, the importance factor, the risk factor and the kth reconfigurability factor are proposed to evaluate the fault risks of all components and the system's reconfigurability under k faults. These factors show which components should be improved and which faults cannot be tolerated. The analysis results are very useful for enhancing the fault-tolerance performance of control systems by improving system designs. A satellite model is utilized to illustrate the proposed method.

  18. [Radiobiological analysis of cancerogenic risk values in radioepidemiological investigations].

    PubMed

    Rozhdestvenskiĭ, L M

    2008-01-01

    The aim of this article is a critical analysis of the epidemiological approach to estimating radiocarcinogenic risk in the low-level radiation (LLR) region. Such estimates are made with mathematical models that ignore the fundamental difference between the biological action of LLR and that of high-level radiation (HLR). The main formal characteristic of LLR action is a plateau at the beginning of the dose-effect curve for radiogenic risk. Three observations support this: first, the plateau phenomenon recurs across various radiobiological effects, in different tests and biological objects; second, the reciprocal ERR/Sv shows a paradoxical increase as the dose decreases within the plateau region; and third, the curvature at the beginning of the dose-effect curve increases. The presence of a plateau is associated with the presence of a real threshold for radiogenic risk. In addition, analysis of the processes that significantly influence the dynamics of the initial radiation injury to biologically important macromolecules showed that, in the LLR region, processes that decrease or eliminate genome damage predominate. It follows from the above that radiogenic risks in the LLR region must be evaluated separately from those in the HLR region. PMID:18825986

  19. A Utility/Cost Analysis of Breast Cancer Risk Prediction Algorithms

    PubMed Central

    Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.

    2016-01-01

    Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk-prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening. PMID:27335532
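
    The expected-cost comparison described above can be sketched by routing women flagged by the risk model to the enhanced exam and everyone else to the standard exam, then weighting diagnostic-outcome costs by the disease prevalence on each side of the split. All costs, operating points, and the prevalence below are invented for illustration; only the independence assumption mirrors the paper.

    ```python
    # Expected cost per screened woman when a risk-prediction algorithm
    # routes high-risk cases to an enhanced screening exam.
    prevalence = 0.005  # P(cancer) in the screening population (assumed)

    standard = dict(cost=100.0, sens=0.80, spec=0.90)  # FFDM-like (assumed)
    enhanced = dict(cost=600.0, sens=0.95, spec=0.85)  # higher cost/sens, lower spec

    C_FN = 50_000.0  # cost of a missed cancer (assumed)
    C_FP = 1_000.0   # cost of a false-positive workup (assumed)

    def expected_cost(modality, p_cancer):
        p_fn = p_cancer * (1 - modality["sens"])
        p_fp = (1 - p_cancer) * (1 - modality["spec"])
        return modality["cost"] + p_fn * C_FN + p_fp * C_FP

    # Risk model operating point (assumed). Under the independence assumption,
    # diagnostic performance is identical on both sides of the split; only the
    # prevalence differs.
    hit_rate, false_alarm_rate = 0.60, 0.10
    p_flagged = hit_rate * prevalence + false_alarm_rate * (1 - prevalence)
    ppv = hit_rate * prevalence / p_flagged                 # prevalence among flagged
    p_rest = (1 - hit_rate) * prevalence / (1 - p_flagged)  # prevalence among the rest

    cost_with_model = (p_flagged * expected_cost(enhanced, ppv)
                       + (1 - p_flagged) * expected_cost(standard, p_rest))
    cost_all_standard = expected_cost(standard, prevalence)
    print(f"with risk model: {cost_with_model:.2f}, all standard: {cost_all_standard:.2f}")
    ```

    Sweeping the hit-rate and false-alarm rate in this sketch traces out the iso-cost contours the abstract refers to.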

  20. A utility/cost analysis of breast cancer risk prediction algorithms

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.

    2016-03-01

    Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.