Science.gov

Sample records for risk analysis method

  1. Bayes Method Plant Aging Risk Analysis

    Energy Science and Technology Software Center (ESTSC)

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user specifies several alternative hypothesized aging models (i.e. possible trends), along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
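    The Bayesian updating the abstract describes can be sketched as follows. This is a minimal illustration, not the actual DORIAN code: the two candidate aging models, the Poisson likelihood for yearly failure counts, and the data are all hypothetical assumptions.

```python
import math

def poisson_loglik(counts, rates):
    """Log-likelihood of yearly failure counts under yearly Poisson rates."""
    return sum(c * math.log(r) - r - math.lgamma(c + 1)
               for c, r in zip(counts, rates))

def posterior_over_models(counts, models, priors):
    """Update prior model probabilities with observed failure counts (Bayes)."""
    logliks = [poisson_loglik(counts, m(len(counts))) for m in models]
    weights = [p * math.exp(ll) for p, ll in zip(priors, logliks)]
    total = sum(weights)
    return [w / total for w in weights]

# Two candidate aging models: constant failure rate vs. linear increase.
constant = lambda n: [2.0] * n
increasing = lambda n: [1.0 + 0.5 * t for t in range(n)]

counts = [1, 2, 2, 3, 4, 4]   # observed failures per year (hypothetical)
post = posterior_over_models(counts, [constant, increasing], [0.5, 0.5])
```

    With the rising counts above, the posterior shifts toward the increasing-trend model, which is the kind of aging evidence the package is meant to surface.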

  2. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures.

  3. Synthesis of Enterprise and Value-Based Methods for Multiattribute Risk Analysis

    SciTech Connect

    C. Robert Kenley; John W. Collins; John M. Beck; Harold J. Heydt; Chad B. Garcia

    2001-10-01

    This paper describes a method for performing multiattribute decision analysis to prioritize approaches to handling risks during the development and operation of complex socio-technical systems. The method combines risk categorization based on enterprise views, risk prioritization of the categories based on the Analytic Hierarchy Process (AHP), and more standard probability-consequence rating schemes. We also apply value-based testing methods used in software development to prioritize risk-handling approaches. We describe a tool that synthesizes the methods and performs a multiattribute analysis of the technical and programmatic risks on the Next Generation Nuclear Plant (NGNP) enterprise.
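    The AHP step mentioned above derives priority weights for risk categories from a pairwise-comparison matrix. A common sketch uses the row geometric-mean approximation; the Saaty-scale judgments below are illustrative assumptions, not the paper's actual matrix.

```python
import math

def ahp_weights(pairwise):
    """Priority vector via the row geometric-mean approximation of AHP."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Pairwise comparisons of three hypothetical risk categories A, B, C:
# A is moderately more important than B, strongly more important than C.
A = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
weights = ahp_weights(A)
```

    The weights sum to one and preserve the judged ordering; a full AHP implementation would also check the matrix's consistency ratio.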

  4. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluating the effect of partial failures in current Probabilistic Risk Assessments (PRAs). An integrated methodology of thermal-hydraulic analysis and fuzzy-logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis in this approach is used to identify the effect of partial operation of any PRA system function in a plant model. The DMLD is used to simulate system performance under partial failure and to inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full-scope PRA to reduce core damage frequency. An example application of the approach is presented. The partial failure data used in the example are from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  5. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  6. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  7. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of risk analysis for the mergers and acquisitions of Chinese listed firms is provided to demonstrate the feasibility and practicality of the method.

  8. Finding and fixing systems weaknesses: probabilistic methods and applications of engineering risk analysis.

    PubMed

    Paté-Cornell, Elisabeth

    2002-04-01

    Methods of engineering risk analysis are based on a functional analysis of systems and on the probabilities (generally Bayesian) of the events and random variables that affect their performance. These methods allow identification of a system's failure modes, computation of its probability of failure or performance deterioration per time unit or operation, and assessment of the contribution of each component to the probabilities and consequences of failures. The model has been extended to include the human decisions and actions that affect components' performances, and the management factors that affect behaviors and can thus be root causes of system failures. By computing the risk with and without proposed measures, one can then set priorities among different risk management options under resource constraints. In this article, I present briefly the engineering risk analysis method, then several illustrations of risk computations that can be used to identify a system's weaknesses and the most cost-effective way to fix them. The first example concerns the heat shield of the space shuttle orbiter and shows the relative risk contribution of the tiles in different areas of the orbiter's surface. The second application is to patient risk in anesthesia and demonstrates how the engineering risk analysis method can be used in the medical domain to rank the benefits of risk mitigation measures, in that case mostly organizational. The third application is a model of seismic risk analysis and mitigation, with application to the San Francisco Bay Area for the assessment of the costs and benefits of different seismic provisions of building codes. In all three cases, some aspects of the results were not intuitively obvious. The probabilistic risk analysis (PRA) method allowed identification of system weaknesses and the most cost-effective way to fix them. PMID:12022679

  9. Review of Research Trends and Methods in Nano Environmental, Health, and Safety Risk Analysis.

    PubMed

    Erbis, Serkan; Ok, Zeynep; Isaacs, Jacqueline A; Benneyan, James C; Kamarthi, Sagar

    2016-08-01

    Despite the many touted benefits of nanomaterials, concerns remain about their possible environmental, health, and safety (EHS) risks in terms of their toxicity, long-term accumulation effects, or dose-response relationships. Published studies on the EHS risks of nanomaterials have increased significantly over the past decade and a half, with most focused on nanotoxicology. Researchers are still learning about the health consequences of nanomaterials and how to make environmentally responsible decisions regarding their production. This article characterizes the scientific literature on nano-EHS risk analysis to map the state-of-the-art developments in this field and chart future directions. First, an analysis of keyword co-occurrence networks is conducted for the nano-EHS literature published in the past decade to identify the intellectual turning points and research trends in nanorisk analysis studies. The exposure groups targeted in emerging nano-EHS studies are also assessed. Systems engineering methods for risk, safety, uncertainty, and system reliability analysis are reviewed, followed by detailed descriptions of how these methods are applied to analyze nanomaterial EHS risks. Finally, the trends, methods, future directions, and opportunities of systems engineering methods in nano-EHS research are discussed. The analysis of nano-EHS literature presented in this article provides important insights on risk assessment and risk management tools associated with nanotechnology, nanomanufacturing, and nano-enabled products. PMID:26882074

  10. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    SciTech Connect

    2007-09-30

    RAMPART(TM), Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART(TM) has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART(TM) has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART(TM) user interface elicits information from the user about the building. The RAMPART(TM) expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART(TM) database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART(TM) algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART(TM) considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  11. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    Energy Science and Technology Software Center (ESTSC)

    2007-09-30

    RAMPART(TM), Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART(TM) has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART(TM) has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART(TM) user interface elicits information from the user about the building. The RAMPART(TM) expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART(TM) database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART(TM) algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART(TM) considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  12. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

    NASA Astrophysics Data System (ADS)

    Tao, Kunwang; Wang, Liang; Qian, Xinlin

    2015-04-01

    Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate their effects is to evaluate landslide risk and take measures in advance to avoid losses. Because many factors must be considered, landslide risk assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment is proposed, carried out in three steps: first, GIS technology is used to divide the study area into analysis grids that serve as the base analysis units; second, the available information, such as slope, lithology, faults, and land use, is taken as the set of evaluation factors; finally, based on years of observed landslide data, the risk assessment is realized with a multi-factor constrained assessment model in which the weight of every factor is determined by an information model. The region of Gongliu, located in the Xinjiang Ili River basin at altitudes of 600 to 4000 meters, was selected as the experimental area; the terrain is long from east to west and narrow from north to south, and its abundant rainfall causes frequent landslides. By selecting a 500 m x 500 m analysis grid covering the whole study area, a comprehensive assessment of landslide risk in this region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. Statistically, 94.04% of landslide hazard points fall in the high and relatively high risk areas, 4.64% in the low risk zone, and 1.32% in the lowest risk zone. The results indicate a high probability of landslides in the zones assessed as high risk.
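    The information-model weighting mentioned in the abstract is commonly the information-value method, in which each factor class scores I = ln((Ni/N)/(Si/S)), comparing the class's share of landslides with its share of area. The sketch below uses toy counts, not the Gongliu data.

```python
import math

def information_value(landslides_in_class, class_cells,
                      landslides_total, cells_total):
    """I = ln((Ni/N) / (Si/S)); positive values mark landslide-prone classes."""
    return math.log((landslides_in_class / landslides_total) /
                    (class_cells / cells_total))

# Example: a steep-slope class covers 10% of grid cells but hosts 30% of
# observed landslides (hypothetical counts).
iv_steep = information_value(30, 100, 100, 1000)

# A grid cell's susceptibility score is the sum of the information values
# of the factor classes (slope, lithology, fault distance, ...) it falls in.
```

    Cells whose summed score is high end up in the high-risk classes of the hazard map.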

  13. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule-base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of a Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPNs and yield a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
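    A minimal sketch of a fuzzy RPN with triangular fuzzy numbers (a, m, b) and centroid defuzzification is shown below. The linguistic terms and their fuzzy numbers are illustrative assumptions; the paper's actual system uses a fuzzy rule base rather than direct multiplication.

```python
def tfn_mul(x, y):
    """Approximate product of two triangular fuzzy numbers (a, m, b)."""
    return (x[0] * y[0], x[1] * y[1], x[2] * y[2])

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(tfn) / 3.0

def fuzzy_rpn(S, O, D):
    """Fuzzy risk priority number: defuzzified product of S, O and D."""
    return centroid(tfn_mul(tfn_mul(S, O), D))

# "High" severity, "moderate" occurrence, "low" detectability as TFNs.
rpn = fuzzy_rpn((7, 8, 9), (4, 5, 6), (2, 3, 4))
crisp = 8 * 5 * 3   # classical crisp RPN using only the modal ratings
```

    The fuzzy score carries the spread of the linguistic judgments, which is how two failure modes with identical crisp RPNs can still be distinguished.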

  14. Genotype relative risks: methods for design and analysis of candidate-gene association studies.

    PubMed Central

    Schaid, D J; Sommer, S S

    1993-01-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. We present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which we condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. The purpose of tier 1 is to detect the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. The purpose of tier 2 is to test for association of the abnormal variant with disease, such as by the likelihood methods presented. The purpose of tier 3 is to confirm positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible.
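    The two genotype relative risks can be illustrated with a simple moment-style estimate under Hardy-Weinberg equilibrium: compare each genotype's case count with its expected HWE frequency. This is a hedged sketch with toy numbers; the paper's likelihood machinery (including conditioning on parental genotypes) is more involved.

```python
def genotype_relative_risks(n2, n1, n0, q):
    """Relative risks of AA and Aa vs. aa from case genotype counts,
    assuming HWE with risk-allele frequency q (illustrative estimator)."""
    p = 1.0 - q
    r0 = n0 / (p * p)                # cases per unit HWE frequency, aa
    psi2 = (n2 / (q * q)) / r0       # two copies vs. none
    psi1 = (n1 / (2 * q * p)) / r0   # one copy vs. none
    return psi2, psi1

# Hypothetical case genotype counts (AA, Aa, aa) with allele frequency 0.3.
psi2, psi1 = genotype_relative_risks(n2=20, n1=60, n0=120, q=0.3)
```

    When case counts are exactly proportional to the HWE genotype frequencies, both relative risks are 1, i.e., no association.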

  15. Genotype relative risks: Methods for design and analysis of candidate-gene association studies

    SciTech Connect

    Shaid, D.J.; Sommer, S.S.

    1993-11-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. The authors present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which the authors condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. Tier 1 detects the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. Tier 2 tests for association of the abnormal variant with disease, such as by the likelihood methods presented. Tier 3 confirms positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible. 19 refs., 2 figs., 2 tabs.

  16. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  17. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies demonstrating the methods. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
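    The kind of calculation such a program automates can be sketched with the standard time-average unavailability of a periodically tested standby component, q ≈ λT/2 + τ/T, and a risk-importance scaling. The formula is textbook PRA practice; the numbers and the simple linear importance model are illustrative assumptions, not SOCRATES internals.

```python
def avg_unavailability(lam, T, tau=0.0):
    """Time-average unavailability: lam = standby failure rate (/h),
    T = surveillance test interval (h), tau = test downtime (h)."""
    return lam * T / 2.0 + (tau / T if T > 0 else 0.0)

def risk_delta(importance, lam, T_old, T_new):
    """Change in plant risk from an STI change, via a linear risk
    importance coefficient for the component (illustrative model)."""
    return importance * (avg_unavailability(lam, T_new) -
                         avg_unavailability(lam, T_old))

# Extending a monthly test interval to two months roughly doubles the
# lam*T/2 contribution (hypothetical rates and importance).
dq = risk_delta(importance=1e-3, lam=1e-5, T_old=720.0, T_new=1440.0)
```

    A positive dq quantifies the risk cost that must be weighed against the savings from less frequent testing.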

  18. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability. PMID:26310705

  19. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  20. Uncertainty analysis in regulatory programs: Application factors versus probabilistic methods in ecological risk assessments of chemicals

    SciTech Connect

    Moore, D.R.J.; Elliot, B.

    1995-12-31

    In assessments of toxic chemicals, sources of uncertainty may be dealt with by two basic approaches: application factors and probabilistic methods. In regulatory programs, the most common approach is to calculate a quotient by dividing the predicted environmental concentration (PEC) by the predicted no effects concentration (PNEC). PNECs are usually derived from laboratory bioassays, thus requiring the use of application factors to account for uncertainty introduced by the extrapolation from the laboratory to the field, and from measurement to assessment endpoints. Using this approach, often with worst-case assumptions about exposure and species sensitivities, the hope is that chemicals with a quotient of less than one will have a very low probability of causing adverse ecological effects. This approach has received widespread criticism recently, particularly because it tends to be overly conservative and does not adequately estimate the magnitude and probability of causing adverse effects. On the plus side, application factors are simple to use, accepted worldwide, and may be used with limited effects data in a quotient calculation. The alternative approach is to use probabilistic methods such as Monte Carlo simulation, Bayes' theorem or other techniques to estimate risk. Such methods often have rigorous statistical assumptions and may have large data requirements. Stating an effect in probabilistic terms, however, forces the identification of sources of uncertainty and quantification of their impact on risk estimation. In this presentation the authors discuss the advantages and disadvantages of using application factors and probabilistic methods in dealing with uncertainty in ecological risk assessments of chemicals. Based on this analysis, recommendations are presented to assist in choosing the appropriate approach for different types of regulatory programs dealing with toxic chemicals.
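    The contrast between the two approaches can be sketched as follows: a point-estimate quotient with an application factor versus a Monte Carlo estimate of the probability that exposure exceeds the no-effect level. All distributions, spreads, and factors below are illustrative assumptions, not regulatory values.

```python
import math
import random

random.seed(42)

PEC_MEDIAN, PEC_SPREAD = 2.0, 0.5      # lognormal exposure (mg/L, assumed)
PNEC_LAB, APP_FACTOR = 100.0, 100.0    # lab endpoint and application factor

# 1) Regulatory quotient with point estimates: a value > 1 flags concern.
quotient = PEC_MEDIAN / (PNEC_LAB / APP_FACTOR)

# 2) Probabilistic alternative: estimate P(PEC > PNEC), treating the
#    effective PNEC as uncertain too (lognormal, assumed spread 0.8).
n = 100_000
exceedances = sum(
    random.lognormvariate(math.log(PEC_MEDIAN), PEC_SPREAD)
    > random.lognormvariate(math.log(PNEC_LAB / APP_FACTOR), 0.8)
    for _ in range(n)
)
p_exceed = exceedances / n
```

    The quotient gives a single pass/fail number, while the simulated exceedance probability makes the magnitude of the risk, and the contribution of each uncertain input, explicit.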

  1. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667
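    The scenario-tree logic at the core of import risk analysis multiplies the step probabilities along a release-exposure-establishment pathway and, as the abstract argues, must scale with trade volume. A minimal sketch with hypothetical probabilities:

```python
def pathway_risk(p_entry, p_exposure, p_establishment, consignments):
    """Expected establishment events per year for one pathway of a
    scenario tree (all probabilities hypothetical, per consignment)."""
    per_consignment = p_entry * p_exposure * p_establishment
    return consignments * per_consignment

risk = pathway_risk(p_entry=1e-3, p_exposure=0.05, p_establishment=0.2,
                    consignments=10_000)
# Volume of trade matters: halving consignments halves expected events,
# which is why an IRA that omits trade volume cannot judge equivalence.
```

    A full quantitative IRA would sum such terms over all pathways and propagate uncertainty in each step probability rather than using point values.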

  2. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

    Combined effects of socio-economic, environmental, technological and political factors impact fresh cooling water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from the operational and regulatory perspective of thermoelectric power systems. A systematic approach to characterizing a stream environment in relation to extreme drought occurrence, duration and deficit volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water supply risk analysis and projection methods from the operation and management perspectives of thermoelectric power systems. The study defines thermoelectric drought as a shortage of cooling water, due to stressed supply or water temperatures beyond operable limits, for an extended period of time, requiring power plants to reduce production or completely shut down. It presents a thermoelectric drought risk characterization framework that considers the heat content and water quantity facets of adequate water availability for uninterrupted operation of such plants and the safety of their surroundings. In addition, it outlines mechanisms to identify the rate of occurrence of such droughts and to stochastically quantify subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound Nonhomogeneous Poisson Process.
This study also demonstrates how
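
    The compound nonhomogeneous Poisson process mentioned above can be sketched numerically. The snippet below is a minimal illustration, not the study's actual model: the seasonal intensity function and the lognormal loss distribution are invented for demonstration, and event times are generated by Lewis-Shedler thinning.

    ```python
    import math
    import random

    def simulate_compound_nhpp(intensity, lam_max, horizon, loss_sampler, rng):
        """Thinning (Lewis-Shedler) simulation of a compound nonhomogeneous
        Poisson process: event times follow the time-varying intensity,
        and each accepted event carries a random loss."""
        t, total_loss, events = 0.0, 0.0, []
        while True:
            t += rng.expovariate(lam_max)               # candidate from rate lam_max
            if t > horizon:
                break
            if rng.random() <= intensity(t) / lam_max:  # accept with prob intensity(t)/lam_max
                loss = loss_sampler(rng)
                events.append((t, loss))
                total_loss += loss
        return events, total_loss

    rng = random.Random(42)
    # Hypothetical seasonal drought intensity (events/year), peaking mid-year.
    intensity = lambda t: 0.8 + 0.6 * math.sin(2 * math.pi * t) ** 2
    events, total = simulate_compound_nhpp(intensity, 1.4, 20.0,
                                           lambda r: r.lognormvariate(0, 0.5), rng)
    print(len(events), round(total, 2))
    ```

    Here `lam_max` is an upper bound on the intensity; the expected number of events over the horizon is the integral of the intensity, and the total loss is the compound sum over events.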

  3. Risk analysis methods and techniques for veterinary biologicals used in Australia.

    PubMed

    Owusu, J

    1995-12-01

    Advances in modern science and technology and the globalisation of the veterinary manufacturing industry, coupled with the relaxation of trade restrictions by the General Agreement on Tariffs and Trade treaty on sanitary and phytosanitary measures, call for an international approach to standards of acceptable risk and risk analysis methodology. In Australia, different elements of risk analysis are undertaken by different agencies. The agencies employ screening risk assessment, which uses simple worst-case scenarios and conservative data to set priorities and identify issues of limited risk. The approach is multi-factorial, assessing risk to public health, animals and the environment. The major components of the analysis process are risk assessment, risk management, and procedures for communicating and monitoring risk. The author advocates the possible use of quantitative risk assessment, based on acceptable international standards, in making international trade decisions. In the absence of acceptable international standards, it is proposed that countries adopt mutual recognition of comparable standards and specifications employed by national agencies. PMID:8639948

  4. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to the NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.

  5. [Working tasks with upper limbs repetitive movements: analysis of different methods for risk assessment].

    PubMed

    Occhipinti, E

    2008-01-01

    A review of different methods for the risk assessment of upper limb repetitive movements is carried out, mainly referring to a recent ISO standard (ISO 11228-3). This standard establishes ergonomic recommendations for tasks involving manual handling of low loads at high frequency (repetitive work). It is a "voluntary" standard and provides information for all professionals involved in occupational prevention as well as in job and product design. It refers to a four-step approach, involving both risk assessment and risk reduction (hazard identification, risk estimation, risk evaluation and risk reduction). General reference is made to a general model reported in a Consensus Document published by the IEA Technical Committee "Musculoskeletal Disorders", with the endorsement of ICOH. Apart from risk identification, the standard addresses and suggests several methods for a simple risk estimation (i.e. Plibel, Osha Checklist, Upper Limb Expert Tool, Qec, Checklist Ocra). If the risk estimation falls in the 'yellow' or 'red' zone, or if the job is composed of two or more repetitive tasks, a more detailed risk assessment is recommended. For a detailed risk assessment, the OCRA method is suggested as "preferred"; however, other methods (STRAIN INDEX; HAL-TLV-ACGIH) can also be used. The applicative limits of the methods mentioned above, considering the purposes of the standard, are briefly discussed together with their recent scientific updates and applicative perspectives. The standard, with the suggested risk assessment procedures and methods, represents a useful tool for all OSH operators involved in the application of European and National legislation regarding the prevention of UL WMSDs. PMID:19288787

  6. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    PubMed

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently, and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot plant comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity x Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various different tools, such as HAZOP, FMEA, etc.; our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis, therefore, focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were, thus, able to identify a number of critical factors such as the incoming gas lines and the ignition electrode. PMID:17624666

  7. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a human servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) that O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in Hubble's estimated service life, the panel opted for the human mission to save one of the major achievements of the American space program, in the words of Louis J. Lanzerotti, its chairman.

  8. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and which relates these elements to the domains of experts and decision makers, and to the fact-based or value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). PMID:24919396

  9. Does simplifying transport and exposure yield reliable results? An analysis of four risk assessment methods.

    PubMed

    Zhang, Q; Crittenden, J C; Mihelcic, J R

    2001-03-15

    Four approaches for predicting the risk of chemicals to humans and fish under different scenarios were compared to investigate whether it is appropriate to simplify risk evaluations in situations where an individual is making environmentally conscious manufacturing decisions or interpreting toxics release inventory (TRI) data: (1) the relative risk method, which compares only a chemical's relative toxicity; (2) the toxicity persistence method, which considers a chemical's relative toxicity and persistence; (3) the partitioning, persistence and toxicity method, which considers a chemical's equilibrium partitioning to air, land, water, and sediment, its persistence in each medium, and its relative toxicity; and (4) the detailed chemical fate and toxicity method, which considers the chemical's relative toxicity and realistic attenuation mechanisms such as advection, mass transfer and reaction in air, land, water, and sediment. In all four methods, the magnitude of the risk was estimated by comparing the risk of the chemical's release to that of a reference chemical. Three comparative scenarios were selected to evaluate the four approaches for making pollution prevention decisions: (1) evaluation of nine dry cleaning solvents, (2) evaluation of four reaction pathways to produce glycerine, and (3) comparison of risks for the chemical manufacturing and petroleum industries. In all three situations, it was concluded that ignoring or simplifying exposure calculations is not appropriate, except in cases where either the toxicity was very great or when comparing chemicals with similar fate. When the toxicity is low to moderate and comparable between chemicals, the chemicals' fate influences the results; therefore, we recommend using a detailed chemical fate and toxicity method because the fate of chemicals in the environment is assessed with consideration of more realistic attenuation mechanisms than in the other three methods. In addition, our study shows that evaluating the risk associated
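
    As an illustration of the simplest of the four approaches, the sketch below implements a toy version of the relative risk method (method 1), which weights releases by toxicity relative to a reference chemical and ignores fate and exposure entirely. The chemical names, LC50 values, and release masses are invented; the abstract's own conclusion is that this kind of simplification is usually not appropriate.

    ```python
    # Hypothetical toxicity benchmarks (lower LC50 = more toxic) and releases.
    lc50 = {"chem_A": 2.0, "chem_B": 20.0, "reference": 10.0}  # mg/L
    release = {"chem_A": 100.0, "chem_B": 100.0}               # kg released

    def relative_risk(chem):
        """Relative risk method: mass released weighted by toxicity
        relative to the reference chemical (fate/exposure ignored)."""
        return release[chem] * (lc50["reference"] / lc50[chem])

    scores = {c: relative_risk(c) for c in release}
    print(scores)  # chem_A scores 10x chem_B despite equal releases
    ```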

  10. FOOD RISK ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  11. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  12. A method for risk analysis across governance systems: a Great Barrier Reef case study

    NASA Astrophysics Data System (ADS)

    Dale, Allan; Vella, Karen; Pressey, Robert L.; Brodie, Jon; Yorkston, Hugh; Potts, Ruth

    2013-03-01

    Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint those domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. This method is then tested through its preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the north eastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for the Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for use in testing and reviewing the application of improved tools for governance risk assessment.

  13. Method for improved prediction of bone fracture risk using bone mineral density in structural analysis

    NASA Technical Reports Server (NTRS)

    Cann, Christopher E. (Inventor); Faulkner, Kenneth G. (Inventor)

    1992-01-01

    A non-invasive in-vivo method of analyzing a bone for fracture risk includes obtaining data from the bone, such as by computed tomography or projection imaging, which data represents a measure of bone material characteristics such as bone mineral density. The distribution of the bone material characteristics is used to generate a finite element method (FEM) mesh from which the load capability of the bone can be determined. In determining load capability, the bone is mathematically compressed, and stress, strain, force, and force/area versus bone material characteristics are determined.

  14. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.; Mutlu, S.

    2015-06-01

    A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide hazards must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these hazards separately and then combining the analyses ensures a more realistic risk evaluation according to population density than analyzing several risks based on a single parameter. In this study, an integrated seismic risk analysis of central Eskişehir was performed based on two earthquake-related parameters, liquefaction and amplification. The analysis used a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geological and topographical conditions of the region. According to the integrated seismic risk analysis of the Eskişehir residential area, the populated area is found to be generally at medium to high risk during a potential earthquake.
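
    A simple weighting scheme of the kind described can be sketched as follows. The grid cells, scores, and equal weights below are hypothetical illustrations, not values from the Eskişehir study: each cell's hazard layers are combined by weighted sum, then scaled by population density to reflect exposure.

    ```python
    # Hypothetical grid cells: (liquefaction, amplification, population density),
    # each score already normalized to [0, 1].
    cells = {
        "A1": (0.8, 0.6, 0.9),
        "B2": (0.3, 0.7, 0.4),
        "C3": (0.1, 0.2, 0.2),
    }

    W_LIQ, W_AMP = 0.5, 0.5   # assumed equal weights for the two hazard layers

    def integrated_risk(liq, amp, pop):
        """Simple weighting: combine hazard layers, then scale by exposure."""
        hazard = W_LIQ * liq + W_AMP * amp
        return hazard * pop

    scores = {cell: integrated_risk(*v) for cell, v in cells.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    print(ranked)  # highest-risk cell first
    ```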

  15. Recruitment Methods and Show Rates to a Prostate Cancer Early Detection Program for High-Risk Men: A Comprehensive Analysis

    PubMed Central

    Giri, Veda N.; Coups, Elliot J.; Ruth, Karen; Goplerud, Julia; Raysor, Susan; Kim, Taylor Y.; Bagden, Loretta; Mastalski, Kathleen; Zakrzewski, Debra; Leimkuhler, Suzanne; Watkins-Bruner, Deborah

    2009-01-01

    Purpose Men with a family history (FH) of prostate cancer (PCA) and African American (AA) men are at higher risk for PCA. Recruitment and retention of these high-risk men into early detection programs has been challenging. We report a comprehensive analysis of recruitment methods, show rates, and participant factors from the Prostate Cancer Risk Assessment Program (PRAP), which is a prospective, longitudinal PCA screening study. Materials and Methods Men 35–69 years are eligible if they have a FH of PCA, are AA, or have a BRCA1/2 mutation. Recruitment methods were analyzed with respect to participant demographics and show rates to the first PRAP appointment using standard statistical methods. Results Out of 707 men recruited, 64.9% showed to the initial PRAP appointment. More individuals were recruited via radio than from referral or other methods (χ2 = 298.13, p < .0001). Men recruited via radio were more likely to be AA (p<0.001), less educated (p=0.003), not married or partnered (p=0.007), and have no FH of PCA (p<0.001). Men recruited via referrals had higher incomes (p=0.007). Men recruited via referral were more likely to attend their initial PRAP visit than those recruited by radio or other methods (χ2 = 27.08, p < .0001). Conclusions This comprehensive analysis finds that radio leads to higher recruitment of AA men with lower socioeconomic status. However, these are the high-risk men that have lower show rates for PCA screening. Targeted motivational measures need to be studied to improve show rates for PCA risk assessment for these high-risk men. PMID:19758657

  16. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity for both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.

  17. An Improved Breast Epithelial Sampling Method for Molecular Profiling and Biomarker Analysis in Women at Risk for Breast Cancer

    PubMed Central

    Danforth, David N; Warner, Andrew C; Wangsa, Darawalee; Ried, Thomas; Duelli, Dominik; Filie, Armando C; Prindiville, Sheila A

    2015-01-01

    BACKGROUND There is a strong need to define the molecular changes in normal at-risk breast epithelium to identify biomarkers and new targets for breast cancer prevention and to develop a molecular signature for risk assessment. Improved methods of breast epithelial sampling are needed to promote whole-genome molecular profiling, increase ductal epithelial cell yield, and reduce sample cell heterogeneity. METHODS We developed an improved method of breast ductal sampling with ductal lavage through a 22-gauge catheter and collection of ductal samples with a microaspirator. Women at normal risk or increased risk for breast cancer were studied. Ductal epithelial samples were analyzed for cytopathologic changes, cellular yield, epithelial cell purity, quality and quantity of DNA and RNA, and use in multiple downstream molecular applications. RESULTS We studied 50 subjects, including 40 subjects at normal risk for breast cancer and 37 subjects with non-nipple aspirate fluid-yielding ducts. This method provided multiple 1.0 mL samples of high ductal epithelial cell content (median ≥8 samples per subject of ≥5,000 cells per sample) with 80%–100% epithelial cell purity. Extraction of a single intact ductal sample (fluid and cells) or the separate frozen cellular component provided DNA and RNA for multiple downstream studies, including quantitative reverse transcription- polymerase chain reaction (PCR) for microRNA, quantitative PCR for the human telomerase reverse transcriptase gene, whole-genome DNA amplification, and array comparative genomic hybridization analysis. CONCLUSION An improved breast epithelial sampling method has been developed, which should significantly expand the acquisition and biomarker analysis of breast ductal epithelium in women at risk for breast cancer. PMID:26078587

  18. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  19. A survey of probabilistic methods used in reliability, risk and uncertainty analysis: Analytical techniques 1

    SciTech Connect

    Robinson, D.G.

    1998-06-01

    This report provides an introduction to the various probabilistic methods developed roughly between 1956-1985 for performing reliability or probabilistic uncertainty analysis on complex systems. This exposition does not include the traditional reliability methods (e.g. parallel-series systems, etc.) that might be found in the many reliability texts and reference materials. Rather, the report centers on the relatively new, and certainly less well known across the engineering community, analytical techniques. Discussion of the analytical methods has been broken into two reports. This particular report is limited to those methods developed between 1956-1985. While a bit dated, the methods described in the later portions of this report still dominate the literature and provide a necessary technical foundation for more current research. A second report (Analytical Techniques 2) addresses methods developed since 1985. The flow of this report roughly follows the historical development of the various methods, so each new technique builds on the discussion of strengths and weaknesses of previous techniques. To facilitate the understanding of the various methods discussed, a simple 2-dimensional problem is used throughout the report. The problem is used for discussion purposes only; conclusions regarding the applicability and efficiency of particular methods are based on secondary analyses and a number of years of experience by the author. This document should be considered a living document in the sense that as new methods or variations of existing methods are developed, the document and references will be updated to reflect the current state of the literature as much as possible. For those scientists and engineers already familiar with these methods, the discussion will at times become rather obvious. However, the goal of this effort is to provide a common basis for future discussions and, as such, will hopefully be useful to those more intimate with

  20. Assessing the Risk of Secondary Transfer Via Fingerprint Brush Contamination Using Enhanced Sensitivity DNA Analysis Methods.

    PubMed

    Bolivar, Paula-Andrea; Tracey, Martin; McCord, Bruce

    2016-01-01

    Experiments were performed to determine the extent of cross-contamination of DNA resulting from secondary transfer due to fingerprint brushes used on multiple items of evidence. Analysis of both standard and low copy number (LCN) STR was performed. Two different procedures were used to enhance sensitivity, post-PCR cleanup and increased cycle number. Under standard STR typing procedures, some additional alleles were produced that were not present in the controls or blanks; however, there was insufficient data to include the contaminant donor as a contributor. Inclusion of the contaminant donor did occur for one sample using post-PCR cleanup. Detection of the contaminant donor occurred for every replicate of the 31 cycle amplifications; however, using LCN interpretation recommendations for consensus profiles, only one sample would include the contaminant donor. Our results indicate that detection of secondary transfer of DNA can occur through fingerprint brush contamination and is enhanced using LCN-DNA methods. PMID:26300550

  1. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selecting the most appropriate technique.

  2. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis for a decentralized energy system of the North was carried out. Based on an analysis of the damage caused by accidents at energy systems, the structure of the risk indicators was selected, and a method for determining North energy system risk was proposed.

  3. Methods for Multitemporal Analysis of Satellite Data Aimed at Environmental Risk Monitoring

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Scognamiglio, A.

    2012-08-01

    In recent years the topic of environmental monitoring has gained particular importance, also because of the reduced short-term stability and predictability of climatic events. Facing this situation, often in terms of emergency, involves high and unpredictable costs for public agencies. Preventing damage caused by natural disasters is not only a matter of weather forecasts; it requires constant monitoring and control of human activity on the territory. In practice, the problem is not knowing if and when an event will affect a given area, but recognizing the possible damage if the event happened, adopting adequate measures to reduce it to a minimum, and providing the necessary tools for timely intervention. On the other hand, surveying technologies should be as accurate and updatable as possible in order to guarantee high standards, which involves the analysis of a great amount of data. The management of such data requires integration and calculation systems with specialized software and fast, reliable connection and communication networks. To meet these requirements, current satellite technology, with recurrent data acquisition for the timely generation of cartographic products that are updated and coherent with the territorial investigation, offers the possibility of filling the temporal gap between the need for urgent information and official reference information. Among evolved image processing techniques, change detection analysis is useful for identifying temporal environmental variations, contributing to reduce user intervention through process automation and progressively improving the qualitative and quantitative accuracy of results. The research investigates automatic methods for detecting land cover transformations by means of "change detection" techniques executable on satellite data that are heterogeneous in spatial and spectral resolution, with homogenization and registration in a unique

  4. Risk/Stress Analysis.

    ERIC Educational Resources Information Center

    Schwerdtfeger, Don; Howell, Richard E.

    1986-01-01

    Identifies stress as a definite health hazard and risk factor involved in a variety of health situations. Proposes that stress identification efforts be considered in environmental analysis so that a more complete approach to risk assessment and management and health hazard prevention can occur. (ML)

  5. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to the Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
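
    The Mahalanobis-distance combination described above can be illustrated with a small sketch. The consequence dimensions, means, and covariance values below are invented for demonstration; the second call shows the reduction to variance-normalized Euclidean distance when the dimensions are uncorrelated.

    ```python
    import math

    def mahalanobis_2d(x, mean, cov):
        """Mahalanobis distance for a 2-D consequence vector,
        with the 2x2 covariance matrix inverted in closed form."""
        dx, dy = x[0] - mean[0], x[1] - mean[1]
        a, b = cov[0]
        c, d = cov[1]
        det = a * d - b * c
        inv = ((d / det, -b / det), (-c / det, a / det))
        q = (dx * (inv[0][0] * dx + inv[0][1] * dy)
             + dy * (inv[1][0] * dx + inv[1][1] * dy))
        return math.sqrt(q)

    # Hypothetical cost/schedule consequence scores with positive covariance.
    x, mean = (4.0, 3.0), (1.0, 1.0)
    cov = ((1.0, 0.6), (0.6, 1.0))
    d_corr = mahalanobis_2d(x, mean, cov)

    # With a diagonal (uncorrelated) covariance, the same formula reduces
    # to Euclidean distance normalized by each dimension's variance.
    cov_diag = ((1.0, 0.0), (0.0, 1.0))
    d_indep = mahalanobis_2d(x, mean, cov_diag)
    print(round(d_corr, 3), round(d_indep, 3))
    ```

    With positive covariance the combined score is smaller than the independent case, because jointly high consequences are partly explained by their correlation.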

  6. Applying Data Envelopment Analysis to Preventive Medicine: A Novel Method for Constructing a Personalized Risk Model of Obesity

    PubMed Central

    Narimatsu, Hiroto; Nakata, Yoshinori; Nakamura, Sho; Sato, Hidenori; Sho, Ri; Otani, Katsumi; Kawasaki, Ryo; Kubota, Isao; Ueno, Yoshiyuki; Kato, Takeo; Yamashita, Hidetoshi; Fukao, Akira; Kayama, Takamasa

    2015-01-01

    Data envelopment analysis (DEA) is a method of operations research that has not yet been applied in the field of obesity research. However, DEA might be used to evaluate individuals’ susceptibility to obesity, which could help establish effective risk models for the onset of obesity. Therefore, we conducted this study to evaluate the feasibility of applying DEA to predict obesity, by calculating efficiency scores and evaluating the usefulness of risk models. In this study, we evaluated data from the Takahata study, which was a population-based cohort study (with a follow-up study) of Japanese people who are >40 years old. For our analysis, we used the input-oriented Charnes-Cooper-Rhodes model of DEA, and defined the decision-making units (DMUs) as individual subjects. The inputs were defined as (1) exercise (measured as calories expended) and (2) the inverse of food intake (measured as calories ingested). The output was defined as the inverse of body mass index (BMI). Using the β coefficients for the participants’ single nucleotide polymorphisms, we then calculated their genetic predisposition score (GPS). Both efficiency scores and GPS were available for 1,620 participants from the baseline survey, and for 708 participants from the follow-up survey. To compare the strengths of the associations, we used models of multiple linear regressions. To evaluate the effects of genetic factors and efficiency score on body mass index (BMI), we used multiple linear regression analysis, with BMI as the dependent variable, GPS and efficiency scores as the explanatory variables, and several demographic controls, including age and sex. Our results indicated that all factors were statistically significant (p < 0.05), with an adjusted R2 value of 0.66. Therefore, it is possible to use DEA to predict environmentally driven obesity, and thus to establish a well-fitted model for risk of obesity. PMID:25973987

  7. Application of a risk analysis method to different technologies for producing a monoclonal antibody employed in hepatitis B vaccine manufacturing.

    PubMed

    Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams

    2012-03-01

    The CB.Hep-1 monoclonal antibody (mAb) is used in the manufacture of a recombinant Hepatitis B vaccine, which is included in a worldwide vaccination program against Hepatitis B disease. The use of this mAb as an immunoligand constitutes one of the most efficient steps of the active pharmaceutical ingredient purification process. Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The severity and occurrence analysis showed that very severe risks accounted for 31.0-38.7% of all risks, and that the large majority of risks (61.9-83.3%) had a very low occurrence level in all assessed technologies. Finally, the additive Risk Priority Number, in descending order, was: transgenic plants (2636), ascites (2577), transgenic animals (2046), and hollow fiber bioreactors (1654), which corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. PMID:22285820
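
    The ranking step of such an FMEA is simple arithmetic: each failure mode receives a Risk Priority Number (RPN = severity × occurrence × detectability), and a technology's additive RPN is the sum over its failure modes. A sketch reproducing only the ranking logic; the scores below are invented, not the study's data:

```python
# Hypothetical severity/occurrence/detectability scores (1-10 scale);
# the tuples are illustrative, not values from the study.
failure_modes = {
    "transgenic plants":        [(9, 3, 7), (7, 3, 5), (8, 2, 6)],
    "ascites":                  [(8, 3, 6), (7, 2, 7), (6, 4, 4)],
    "transgenic animals":       [(7, 2, 6), (6, 3, 4), (5, 2, 5)],
    "hollow fiber bioreactors": [(6, 2, 5), (5, 2, 4), (4, 3, 3)],
}

def additive_rpn(modes):
    # RPN of one failure mode = severity * occurrence * detectability;
    # the technology-level figure is the sum over all its failure modes.
    return sum(s * o * d for s, o, d in modes)

ranking = sorted(failure_modes,
                 key=lambda tech: additive_rpn(failure_modes[tech]),
                 reverse=True)
```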

  8. Analysis of the LaSalle Unit 2 nuclear power plant: Risk Methods Integration and Evaluation Program (RMIEP). Volume 8, Seismic analysis

    SciTech Connect

    Wells, J.E.; Lappa, D.A.; Bernreuter, D.L.; Chen, J.C.; Chuang, T.Y.; Johnson, J.J.; Campbell, R.D.; Hashimoto, P.S.; Maslenikov, O.R.; Tiong, L.W.; Ravindra, M.K.; Kincaid, R.H.; Sues, R.H.; Putcha, C.S.

    1993-11-01

    This report describes the methodology used and the results obtained from the application of a simplified seismic risk methodology to the LaSalle County Nuclear Generating Station Unit 2. This study is part of the Level I analysis being performed by the Risk Methods Integration and Evaluation Program (RMIEP). Using the RMIEP-developed event and fault trees, the analysis resulted in a seismically induced core damage frequency point estimate of 6.0E-7/yr. This result, combined with the component importance analysis, indicated that system failures were dominated by random events. The dominant components included diesel generator failures (failure to swing, failure to start, failure to run after starting) and the condensate storage tank.

  9. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
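
    One common way to encode an expert's subjective estimate (not necessarily one of the handbook's five distribution types) is the three-point PERT approximation: the expert supplies low, most-likely, and high values, from which a mean and standard deviation follow. A sketch with invented schedule activities:

```python
def pert_moments(low, mode, high):
    """Classic PERT/beta approximation of a subjective three-point estimate."""
    mean = (low + 4 * mode + high) / 6.0   # most-likely value weighted 4x
    std = (high - low) / 6.0               # spread of the elicited range
    return mean, std

# Invented schedule-risk elicitation, durations in months:
estimates = {"design": (3, 4, 8), "integration": (2, 3, 4)}
encoded = {task: pert_moments(*e) for task, e in estimates.items()}
```

The skewed "design" estimate pulls its mean (4.5) above the most-likely value, which is exactly the asymmetry a simple best-guess schedule would miss.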

  10. Recasting risk analysis methods in terms of object-oriented modeling techniques

    SciTech Connect

    Wyss, G.D.; Craft, R.L.; Vandewart, R.L.; Funkhouser, D.R.

    1998-08-01

    For more than two decades, risk analysts have relied on powerful logic-based models to perform their analyses. However, the applicability of these models has been limited because they can be complex and expensive to develop. Analysts must frequently start from scratch when analyzing a new (but similar) system because the understanding of how the system works exists only in the mind of the analyst and is only incompletely instantiated in the actual logic model. This paper introduces the notion of using explicit object-oriented system models, such as those embodied in computer-aided software engineering (CASE) tools, to document the analyst's understanding of the system and appropriately capture how the system works. It also shows that from these models, standard assessment products, such as fault trees and event trees, can be automatically derived.
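
    A fault tree derived from such a system model is a gate/event hierarchy whose top-event probability can be evaluated recursively, assuming independent basic events (a simplification). A minimal evaluator, not tied to any particular CASE tool; the example tree is invented:

```python
def top_event_probability(node, p_basic):
    """Evaluate a fault tree given independent basic-event probabilities.

    A node is either a basic-event name (str), or a tuple
    ("AND" | "OR", [child, ...]).
    """
    if isinstance(node, str):
        return p_basic[node]
    gate, children = node
    probs = [top_event_probability(c, p_basic) for c in children]
    if gate == "AND":
        out = 1.0
        for p in probs:
            out *= p                # all children must fail
        return out
    out = 1.0
    for p in probs:
        out *= 1.0 - p              # OR: 1 - product of complements
    return 1.0 - out

# Hypothetical top event: cooling is lost if the pump fails
# OR both power feeds fail.
tree = ("OR", ["pump", ("AND", ["feed_a", "feed_b"])])
p_top = top_event_probability(
    tree, {"pump": 1e-3, "feed_a": 1e-2, "feed_b": 1e-2})
```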

  11. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". Its bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report presents new methods based on the following models: synergetic, physical, and computational, and concentrates on four approaches. The first is the synergetic approach. The synergetic approach to problems of self-controlled synthesis of structures and the creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered in relation to current problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex damage characterizations, using a physical model of the TPS and a predicted level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering both deterministic and stochastic factors. The last approach concerns experimental results on the temperature distribution over the surface of a 150 x 150 x 20 mm honeycomb sandwich panel during diffusion welding in vacuum. Equipment that provides alignment of temperature fields in a product for the formation of equal-strength welded joints is

  12. Designing Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% is due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies paid a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims. To address such problems, modern tools such as GIS and Remote Sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from a SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification with image segmentation on a SPOT 5 image dated 2010. All produced data were used as inputs for the Multi Criteria Decision Analysis (MCDA) part of the software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as an input for the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly
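
    The MCDA-AHP weighting step turns a pairwise comparison matrix of the five criteria into priority weights. A sketch using the row geometric-mean method; for illustration the matrix is generated from assumed weights (so it is perfectly consistent) and is not the study's matrix:

```python
import math

criteria = ["slope", "aspect", "elevation", "geology", "land use"]
w_true = [0.40, 0.05, 0.20, 0.10, 0.25]            # assumed, illustrative only
# A perfectly consistent Saaty matrix has entries a_ij = w_i / w_j.
A = [[wi / wj for wj in w_true] for wi in w_true]

def ahp_weights(A):
    """Priority vector via the row geometric-mean method."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w, random_index=1.12):    # Saaty RI for n = 5
    n = len(A)
    # lambda_max estimated by averaging (A w)_i / w_i over the rows.
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i]
              for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return ci / random_index

w = ahp_weights(A)
cr = consistency_ratio(A, w)     # < 0.1 is the usual acceptance threshold
```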

  13. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method for evaluating such risks. In this paper, an integrated model based on k-means clustering analysis and set pair analysis was established to evaluate the risks associated with water pollution in source water areas, with the weights of indicators determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which China's key source water area, the Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results identified eleven sources with relatively high risk values. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern regions close to the main stream and reservoir were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV), and three at the highest risk level (i.e., level V). Risks from industrial discharge were also higher than those from the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. PMID:27016678
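
    The entropy weight method used here to weight the indicators can be stated in a few lines: indicators whose values are more dispersed across sources carry more information and receive larger weights. A sketch with made-up indicator data:

```python
import math

def entropy_weights(matrix):
    """Objective indicator weights via the entropy weight method.

    matrix[i][j] = non-negative value of indicator j for source i.
    """
    n = len(matrix)
    m = len(matrix[0])
    divergence = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # Shannon entropy normalised to [0, 1]; p*log(p) -> 0 as p -> 0.
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergence.append(1.0 - e)       # degree of divergence of indicator j
    s = sum(divergence)
    return [d / s for d in divergence]

# Made-up data: 4 pollution sources x 2 indicators. The first indicator is
# identical across sources, so it should carry (almost) no weight.
data = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 10.0]]
weights = entropy_weights(data)
```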

  14. Targeted assets risk analysis.

    PubMed

    Bouwsema, Barry

    2013-01-01

    Risk assessments based on the consolidated risk assessment process, as described by Public Safety Canada and the Centre for Security Science, utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians. PMID:23615063
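
    The CARVER component of this process is essentially a scoring matrix: each candidate asset is rated (here 1-10) on Criticality, Accessibility, Recuperability, Vulnerability, Effect, and Recognizability, and the totals rank the targets. A sketch with invented assets and scores:

```python
# Invented assets and CARVER scores (1-10 each); illustrative only.
CARVER_FACTORS = ("criticality", "accessibility", "recuperability",
                  "vulnerability", "effect", "recognizability")

assets = {
    "water treatment plant": (9, 4, 7, 5, 8, 6),
    "regional data centre":  (7, 6, 5, 6, 6, 4),
    "road bridge":           (5, 8, 4, 7, 4, 9),
}

def carver_score(scores):
    # Higher total = more attractive target = higher protection priority.
    return sum(scores)

priority = sorted(assets, key=lambda a: carver_score(assets[a]), reverse=True)
```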

  15. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  16. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study

    PubMed Central

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F.; Vehik, Kendra; Huang, Shuai; Rewers, Marian; Barriga, Katherine; Baxter, Judith; Eisenbarth, George; Frank, Nicole; Gesualdo, Patricia; Hoffman, Michelle; Norris, Jill; Ide, Lisa; Robinson, Jessie; Waugh, Kathleen; She, Jin-Xiong; Schatz, Desmond; Hopkins, Diane; Steed, Leigh; Choate, Angela; Silvis, Katherine; Shankar, Meena; Huang, Yi-Hua; Yang, Ping; Wang, Hong-Jie; Leggett, Jessica; English, Kim; McIndoe, Richard; Dequesada, Angela; Haller, Michael; Anderson, Stephen W.; Ziegler, Anette G.; Boerschmann, Heike; Bonifacio, Ezio; Bunk, Melanie; Försch, Johannes; Henneberger, Lydia; Hummel, Michael; Hummel, Sandra; Joslowski, Gesa; Kersting, Mathilde; Knopff, Annette; Kocher, Nadja; Koletzko, Sibylle; Krause, Stephanie; Lauber, Claudia; Mollenhauer, Ulrike; Peplow, Claudia; Pflüger, Maren; Pöhlmann, Daniela; Ramminger, Claudia; Rash-Sur, Sargol; Roth, Roswith; Schenkel, Julia; Thümer, Leonore; Voit, Katja; Winkler, Christiane; Zwilling, Marina; Simell, Olli G.; Nanto-Salonen, Kirsti; Ilonen, Jorma; Knip, Mikael; Veijola, Riitta; Simell, Tuula; Hyöty, Heikki; Virtanen, Suvi M.; Kronberg-Kippilä, Carina; Torma, Maija; Simell, Barbara; Ruohonen, Eeva; Romo, Minna; Mantymaki, Elina; Schroderus, Heidi; Nyblom, Mia; Stenius, Aino; Lernmark, Åke; Agardh, Daniel; Almgren, Peter; Andersson, Eva; Andrén-Aronsson, Carin; Ask, Maria; Karlsson, Ulla-Marie; Cilio, Corrado; Bremer, Jenny; Ericson-Hallström, Emilie; Gard, Thomas; Gerardsson, Joanna; Gustavsson, Ulrika; Hansson, Gertie; Hansen, Monica; Hyberg, Susanne; Håkansson, Rasmus; Ivarsson, Sten; Johansen, Fredrik; Larsson, Helena; Lernmark, Barbro; Markan, Maria; Massadakis, Theodosia; Melin, Jessica; Månsson-Martinez, Maria; Nilsson, Anita; Nilsson, Emma; Rahmati, Kobra; Rang, Sara; Järvirova, Monica Sedig; Sibthorpe, Sara; Sjöberg, Birgitta; Törn, Carina; Wallin, Anne; Wimar, Åsa; Hagopian, William A.; Yan, Xiang; Killian, Michael; Crouch, 
Claire Cowen; Hay, Kristen M.; Ayres, Stephen; Adams, Carissa; Bratrude, Brandi; Fowler, Greer; Franco, Czarina; Hammar, Carla; Heaney, Diana; Marcus, Patrick; Meyer, Arlene; Mulenga, Denise; Scott, Elizabeth; Skidmore, Jennifer; Small, Erin; Stabbert, Joshua; Stepitova, Viktoria; Becker, Dorothy; Franciscus, Margaret; Dalmagro-Elias Smith, MaryEllen; Daftary, Ashi; Krischer, Jeffrey P.; Abbondondolo, Michael; Ballard, Lori; Brown, Rasheedah; Cuthbertson, David; Eberhard, Christopher; Gowda, Veena; Lee, Hye-Seung; Liu, Shu; Malloy, Jamie; McCarthy, Cristina; McLeod, Wendy; Smith, Laura; Smith, Stephen; Smith, Susan; Uusitalo, Ulla; Yang, Jimin; Akolkar, Beena; Briese, Thomas; Erlich, Henry; Oberste, Steve

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions. PMID:27561809

  17. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions. PMID:27561809
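
    The contrast between the two approaches can be made concrete: a rule selects a subgroup by joint ranges of several risk factors, and the subgroup's event rate is then compared with the population rate. A toy sketch; the factors, cut-offs, and records are invented and are not TEDDY variables:

```python
# Toy records: (maternal_age, clinic_distance_km, withdrew)
records = [
    (22, 80, True), (24, 95, True), (23, 70, False),
    (35, 10, False), (40, 15, False), (38, 60, False),
    (29, 40, True), (31, 20, False), (26, 90, True), (33, 30, False),
]

def rule(record):
    # A rule is a conjunction of *ranges*, not a linear combination:
    # "young mother AND far from the clinic".
    age, dist, _ = record
    return age < 30 and dist > 50

def withdrawal_rate(rows):
    return sum(r[2] for r in rows) / len(rows)

subgroup = [r for r in records if rule(r)]
overall = withdrawal_rate(records)     # population average
in_rule = withdrawal_rate(subgroup)    # rate inside the rule's subgroup
```

An average-effect regression would summarise this data with one coefficient per factor; the rule instead isolates the subgroup whose combined factor ranges drive the elevated rate.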

  18. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  19. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency would be analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate the entropy method performs best in quantifying the distribution of flood duration. Bivariate hydrologic risk would then be generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for the flood volume less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without significant decrease of the bivariate risk for flood peak-duration.
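
    Once the marginal distributions and a copula are chosen, the joint "OR" (either variable exceeds its quantile) and "AND" (both exceed) return periods follow directly from the copula value. A sketch with the one-parameter Gumbel-Hougaard family, a common choice for positively dependent flood variables; the parameter values here are arbitrary, not fitted to the Xiangxi data:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 (theta = 1: independence)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_periods(u, v, theta, mu=1.0):
    """'OR' and 'AND' joint return periods in years.

    u, v: non-exceedance probabilities of the two flood variables;
    mu: mean interarrival time of flood events (years).
    """
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)              # either variable exceeds
    t_and = mu / (1.0 - u - v + c)     # both variables exceed
    return t_or, t_and

# 100-year marginal quantiles for peak and volume, moderate dependence:
t_or, t_and = joint_return_periods(0.99, 0.99, theta=2.0)
```

As expected, the "OR" event is more frequent and the "AND" event rarer than the 100-year marginal events they combine.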

  20. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. 
The development and implementation of risk-assessment techniques will make radiation therapy

  1. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation

  2. Analysis of the LaSalle Unit 2 Nuclear Power Plant, Risk Methods Integration and Evaluation Program (RMIEP)

    SciTech Connect

    Ferrell, W.L. ); Payne, A.C. Jr.; Daniel, S.L. )

    1992-10-01

    This report is a description of the internal flood analysis performed on the LaSalle County Nuclear Generating Station, Unit 2. A more detailed integration with the internal events analysis than in prior flood risk assessments was accomplished. The same system fault trees used for the internal events analysis were also used for the flood analysis, which included modeling of components down to the contact pair level. Subsidiary equations were created to map the effects of pipe failures. All component locations were traced and mapped into the fault trees. The effects of floods were then mapped directly onto the internal plant model and their relative importance was evaluated. A detailed screening analysis was performed which showed that most plant areas had a negligible contribution to the flood-induced core damage frequency. This was influenced strongly by the fact that the LaSalle plant was designed with a high level of concern about the effects of external events such as fire and flood and significant separation was maintained between systems in the original design. Detailed analysis of the remaining flood scenarios identified only two that contributed significantly to risk. The flood analysis resulted in a total (mean) core damage frequency of 3.23E-6 per year.

  3. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways, located within cities and between them, are generally prone to many kinds of natural disaster risks. Natural hazards and disasters that may occur, first from highway design through the construction and operation stages, and later during maintenance and repair, have to be taken into consideration. Assessment of the risks posed by such adverse situations is very important in terms of project design, construction, operation, maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways. However, the assets at risk and the impacts of the events must also be examined and rated in their own right. Through these activities, improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economical and effective solution. Besides supporting measures against the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows: • Natural disasters 1. Meteorological natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.). 2. Geological natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters 1. Transport accidents (traffic accidents), originating from road surface defects (icing

  4. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

    The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach. However, the advantages of quantitative risk assessment are achieved at a greater expenditure of money, time and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources to minimize risk in an expeditious and fiscally responsible manner.
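
    A qualitative assessment of the kind described typically reduces to a likelihood-consequence matrix over ordinal bins. A minimal sketch; the bin names and thresholds below are illustrative and are not taken from DOE/AL 5481.1B:

```python
LIKELIHOOD = ["rare", "unlikely", "possible", "likely", "frequent"]
CONSEQUENCE = ["negligible", "marginal", "critical", "catastrophic"]

def risk_level(likelihood, consequence):
    """Map ordinal likelihood/consequence bins to a qualitative risk level."""
    score = ((LIKELIHOOD.index(likelihood) + 1)
             * (CONSEQUENCE.index(consequence) + 1))   # range 1..20
    if score >= 12:
        return "high"       # needs immediate mitigation
    if score >= 5:
        return "medium"     # mitigation plan required
    return "low"            # accept and monitor

level = risk_level("frequent", "catastrophic")
```

The appeal of the qualitative matrix is exactly this cheapness: ranking hazards needs only ordinal judgments, not the failure-rate data a quantitative assessment would demand.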

  5. [Comparative analysis of two different methods for risk assessment of groundwater pollution: a case study in Beijing plain].

    PubMed

    Wang, Hong-na; He, Jiang-tao; Ma, Wen-jie; Xu, Zhen

    2015-01-01

Groundwater contamination risk assessment has important implications for groundwater contamination prevention planning and for groundwater exploitation potential. Recently, the UN and WP assessment systems have become focuses of international research. In both systems, the assessment framework and indices are drawn from five aspects: intrinsic vulnerability, aquifer storage, groundwater quality, groundwater resource protection zones, and contamination load. However, the five factors are combined in different ways. In order to expound the difference between the UN and WP assessment systems and explain its main causes, both were applied to the Beijing Plain, China, and the resulting risk maps were compared. The results showed that both groundwater contamination risk assessment maps were in accordance with actual conditions and were similar in spatial distribution trends. However, the coverage areas at the same risk level differed quite significantly. It was also revealed that during system construction, the structural hierarchy, the relevant overlaying principles, and the classification method might affect the resulting groundwater contamination risk assessment map. The UN and WP assessment systems are both suitable for groundwater contamination risk assessment of the plain; however, their emphases differ. PMID:25898663

  6. Comparison of nonlinear methods symbolic dynamics, detrended fluctuation, and Poincaré plot analysis in risk stratification in patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Voss, Andreas; Schroeder, Rico; Truebner, Sandra; Goernig, Matthias; Figulla, Hans Reiner; Schirdewan, Alexander

    2007-03-01

Dilated cardiomyopathy (DCM) has an incidence of about 20/100 000 new cases per annum and accounts for nearly 10 000 deaths per year in the United States. Approximately 36% of DCM patients suffer cardiac death within five years after diagnosis. Currently applied methods for early risk prediction in DCM patients are rather insufficient. The objective of this study was to investigate the suitability of the short-term nonlinear methods of symbolic dynamics (STSD), detrended fluctuation analysis (DFA), and Poincaré plot analysis (PPA) for risk stratification in these patients. From 91 DCM patients and 30 healthy subjects (REF), heart rate and blood pressure variability (HRV, BPV), STSD, DFA, and PPA were analyzed. Measures from BPV analysis, DFA, and PPA revealed highly significant differences (p<0.0011) discriminating REF and DCM. For risk stratification in DCM patients, four parameters from BPV analysis, STSD, and PPA revealed significant differences between low and high risk (maximum sensitivity: 90%, specificity: 90%). These results suggest that STSD and PPA are useful nonlinear methods for enhanced risk stratification in DCM patients.
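Poincaré plot analysis of heart rate variability is commonly summarized by the SD1/SD2 descriptors; a small sketch of that standard computation (the RR-interval series below is invented for illustration, not study data):

```python
import math

def poincare_sd(rr):
    """SD1/SD2 descriptors of a Poincaré plot of successive RR intervals (ms).

    SD1 captures short-term variability (spread perpendicular to the identity
    line), SD2 long-term variability (spread along it). Standard formulation:
    SD1^2 = var(x_{n+1} - x_n) / 2,  SD2^2 = var(x_{n+1} + x_n) / 2.
    """
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    sums = [b + a for a, b in zip(rr, rr[1:])]

    def var(v):  # population variance
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    sd1 = math.sqrt(var(diffs) / 2)
    sd2 = math.sqrt(var(sums) / 2)
    return sd1, sd2

# Illustrative RR series (ms) with a slow trend: long-term spread dominates.
rr = [800, 805, 812, 820, 818, 825, 830, 828]
sd1, sd2 = poincare_sd(rr)
```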

  7. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

At the present stage of development of Russian society, the methodology of health risk analysis is in demand at all levels of state management. In conjunction with methods of mathematical modeling, spatial-temporal analysis, and economic tools, risk assessment within situation analysis makes it possible to determine the level of safety of the population, workers, and consumers, and to select priority resources and threat factors as the points where effort should be applied. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazards and dangers. At the implementation stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision stage, it permits priorities to be selected so that efforts are concentrated on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated into the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products, and services. Further tasks include extending the enforcement of the methodology of health risk analysis in the field of assurance of sanitary and epidemiological well-being and workers' health; developing the informational and analytical base with respect to "exposure-response" models for different types and levels of exposure and risk contingents; improving the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  8. Risk Analysis Virtual ENvironment

    Energy Science and Technology Software Center (ESTSC)

    2014-02-10

RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of the probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5, and any generic MOOSE-based applications. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. For accelerating the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and collocation stochastic polynomial chaos are implemented. The same methodologies could be used to solve optimization and uncertainty propagation problems within the RAVEN framework.
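Of the sampling strategies the record names, Latin Hypercube sampling is easy to sketch; the following is a generic illustration of the idea, not RAVEN's implementation:

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin Hypercube sample on the unit hypercube.

    Each dimension's [0, 1) range is split into n_samples equal strata, and
    every stratum in every dimension is sampled exactly once, giving better
    input-space coverage than plain Monte Carlo for the same sample count.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # One point inside each stratum, then shuffle so that rows pair up
        # strata across dimensions at random.
        pts = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(pts)
        columns.append(pts)
    return list(zip(*columns))  # one (x_1, ..., x_d) tuple per sample

pts = latin_hypercube(10, 2)
```

Each unit-cube coordinate would then be mapped through the inverse CDF of the corresponding input parameter's distribution.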

  10. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.

    2014-11-01

A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor-quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide risks must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these risks separately and then combining the analyses is more realistic than producing several hazard maps each based on a single parameter. In this study, an integrated seismic hazard map of central Eskişehir was created from two earthquake-related parameters, liquefaction and amplification, by using a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geologic and topographic conditions of the region. According to the integrated seismic hazard map of the Eskişehir residential area, the area is generally at medium-high risk during a potential earthquake.
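A simple weighting method of this kind can be sketched as a weighted overlay of normalized hazard scores per map cell; the weights, scores, and class cut-offs below are illustrative assumptions, not the values used for Eskişehir:

```python
# Integrated seismic hazard by simple weighted overlay: each cell of the
# study area carries normalized (0-1) liquefaction and amplification scores,
# and the integrated score is their weighted sum. All numbers are
# illustrative assumptions.
weights = {"liquefaction": 0.5, "amplification": 0.5}

def integrated_score(cell):
    return sum(weights[k] * cell[k] for k in weights)

def risk_class(score):
    """Map the combined score onto qualitative classes (cut-offs assumed)."""
    if score < 0.33:
        return "low"
    if score < 0.66:
        return "medium"
    return "high"

cells = [
    {"liquefaction": 0.8, "amplification": 0.7},  # young sediments near fault
    {"liquefaction": 0.2, "amplification": 0.3},  # stiffer ground
]
classes = [risk_class(integrated_score(c)) for c in cells]
```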

  11. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. PMID:26224206
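The added step, turning an ordinal ranking into a ratio scale so that stand-out risks become separable by magnitude, can be sketched as follows (the risk names, scores, and gap criterion are hypothetical, not DHS data):

```python
# After the deliberative method fixes the order, participants score each risk
# relative to the top-ranked risk (= 100), yielding a ratio scale. Stand-out
# risks can then be picked out by magnitude, not just order. All names and
# numbers below are hypothetical.
ordinal = ["terrorism", "narcotics", "invasive species"]
ratio_scores = {"terrorism": 100, "narcotics": 85, "invasive species": 10}

def stand_outs(scores, gap=2.0):
    """Return the top risks, cutting where a score is at least `gap` times
    the next lower score (one possible stand-out criterion)."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    cut = len(ranked)
    for i in range(len(ranked) - 1):
        if scores[ranked[i]] >= gap * scores[ranked[i + 1]]:
            cut = i + 1
            break
    return ranked[:cut]
```

Here an ordinal ranking alone could not distinguish the 100-vs-85 gap from the 85-vs-10 gap; the ratio scale can.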

  12. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study is used to illustrate this approach, with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study
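The kernel density step used to locate risk "hotspots" can be sketched in one function; the coordinates below are invented for illustration (real analyses would use projected map coordinates and a GIS):

```python
import math

def kernel_density(points, grid, bandwidth):
    """Gaussian kernel density surface evaluated at grid locations, from
    mapped risk points (e.g. digitized standing-water or trash sightings)."""
    out = []
    for gx, gy in grid:
        dens = sum(
            math.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * bandwidth ** 2))
            for px, py in points
        ) / (2 * math.pi * bandwidth ** 2 * len(points))
        out.append(dens)
    return out

# Illustrative coordinates: a cluster of risk sightings near (0, 0), with one
# school inside the cluster and one far away.
risks = [(0.1, 0.0), (-0.2, 0.1), (0.0, -0.1), (0.3, 0.2)]
schools = [(0.0, 0.0), (5.0, 5.0)]
density_at_schools = kernel_density(risks, schools, bandwidth=1.0)
```

Comparing the density values at school locations flags which schools sit in risk hotspots and are candidates for intervention.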

  13. Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.

    PubMed

    Price, Bertram; MacNicoll, Michael

    2015-05-01

    A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. PMID:25644783
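The equal-weights allocation, and its claimed agreement with the game-theoretic solution, can be checked directly for two risk factors (the excess-risk values are illustrative):

```python
# Risk apportionment sketch for two interacting risk factors (numbers are
# illustrative): total excess risk = r1 + r2 + interaction, and the
# equal-weights rule splits the interaction term evenly among the factors.
# For two factors this coincides with the Shapley-value allocation the
# abstract refers to.
r1, r2 = 4.0, 2.0        # excess risk of each factor acting alone
total = 10.0             # excess risk with both factors present
interaction = total - r1 - r2

share1 = r1 + interaction / 2
share2 = r2 + interaction / 2

def shapley_two_player(v1, v2, v12):
    """Shapley allocation for two players with coalition values v1, v2, v12:
    each player's share averages its marginal contribution over both join
    orders."""
    phi1 = 0.5 * v1 + 0.5 * (v12 - v2)
    phi2 = 0.5 * v2 + 0.5 * (v12 - v1)
    return phi1, phi2
```

With more than two factors the weighted and game-theoretic allocations need not coincide for arbitrary weights, which is why the choice of weights matters.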

  14. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…) and is oriented to the affected objects such as buildings and infrastructure. The workshop contains the following steps: 1.- Planning of the study and definition of stakeholders 2.- Hazard identification 3.- Risk analysis 4.- Risk assessment 5.- Proposition of mitigation measures 6.- Risk management and cost-benefit analysis. During the process, information related to past events, together with useful concepts, is provided in order to bring up discussion and decision making. The Risk Matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, the results of different groups can be compared and a summarizing report printed out. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible over the internet and will be used for educational purposes at bachelor and master level as well as by external persons dealing with risk analysis.

  15. Analysis of the LaSalle Unit 2 Nuclear Power Plant, Risk Methods Integration and Evaluation Program (RMIEP). Volume 10, Internal flood analysis

    SciTech Connect

    Ferrell, W.L.; Payne, A.C. Jr.; Daniel, S.L.

    1992-10-01

    This report is a description of the internal flood analysis performed on the LaSalle County Nuclear Generating Station, Unit 2. A more detailed integration with the internal events analysis than in prior flood risk assessments was accomplished. The same system fault trees used for the internal events analysis were also used for the flood analysis, which included modeling of components down to the contact pair level. Subsidiary equations were created to map the effects of pipe failures. All component locations were traced and mapped into the fault trees. The effects of floods were then mapped directly onto the internal plant model and their relative importance was evaluated. A detailed screening analysis was performed which showed that most plant areas had a negligible contribution to the flood-induced core damage frequency. This was influenced strongly by the fact that the LaSalle plant was designed with a high level of concern about the effects of external events such as fire and flood and significant separation was maintained between systems in the original design. Detailed analysis of the remaining flood scenarios identified only two that contributed significantly to risk. The flood analysis resulted in a total (mean) core damage frequency of 3.23E-6 per year.

  16. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  17. Budget Risk & Prioritization Analysis Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-12-31

    BRPAtool performs the following: •Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors •Enables analysis of different budget scenarios •Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks •Real-time analysis •Enables managers to determine the multipliers and where funding is best applied •Promotes solid budget defense

  18. Bivariate random effects models for meta-analysis of comparative studies with binary outcomes: methods for the absolute risk difference and relative risk.

    PubMed

    Chu, Haitao; Nie, Lei; Chen, Yong; Huang, Yi; Sun, Wei

    2012-12-01

Multivariate meta-analysis is increasingly utilised in biomedical research to combine data from multiple comparative clinical studies for evaluating drug efficacy and safety profiles. When the probability of the event of interest is rare, or when the individual study sample sizes are small, a substantial proportion of studies may not have any event of interest. Conventional meta-analysis methods either exclude such studies or include them through ad hoc continuity corrections, adding an arbitrary positive value to each cell of the corresponding 2 × 2 tables, which may result in less accurate conclusions. Furthermore, different continuity corrections may result in inconsistent conclusions. In this article, we discuss a bivariate Beta-binomial model derived from the Sarmanov family of bivariate distributions and a bivariate generalised linear mixed effects model for binary clustered data to make valid inferences. These bivariate random effects models use all available data without ad hoc continuity corrections, and naturally account for the potential correlation between treatment (or exposure) and control groups within studies. We then utilise the bivariate random effects models to reanalyse two recent meta-analysis data sets. PMID:21177306
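The sensitivity to ad hoc continuity corrections that motivates these bivariate models can be demonstrated with a toy 2 × 2 table (the counts are invented):

```python
def relative_risk(a, n1, b, n2, cc=0.0):
    """Relative risk from a 2x2 table (a events out of n1 treated, b events
    out of n2 control), with an optional continuity correction `cc` added to
    every cell in the usual ad hoc way."""
    p1 = (a + cc) / (n1 + 2 * cc)
    p2 = (b + cc) / (n2 + 2 * cc)
    return p1 / p2

# A zero-event treatment arm: without a correction the estimate degenerates
# to zero, and different ad hoc corrections give materially different answers.
rr_half = relative_risk(0, 100, 3, 100, cc=0.5)   # the common 0.5 correction
rr_tiny = relative_risk(0, 100, 3, 100, cc=0.01)  # a smaller correction
```

The two corrections disagree by more than an order of magnitude here, which is exactly the inconsistency the bivariate random effects approach avoids by modeling the raw counts directly.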

  19. Reinterpretation of the results of a pooled analysis of dietary carotenoid intake and breast cancer risk by using the interval collapsing method

    PubMed Central

    2016-01-01

    OBJECTIVES: A pooled analysis of 18 prospective cohort studies reported in 2012 for evaluating carotenoid intakes and breast cancer risk defined by estrogen receptor (ER) and progesterone receptor (PR) statuses by using the “highest versus lowest intake” method (HLM). By applying the interval collapsing method (ICM) to maximize the use of the estimated information, we reevaluated the results of the previous analysis in order to reinterpret the inferences made. METHODS: In order to estimate the summary effect size (sES) and its 95% confidence interval (CI), meta-analyses with the random-effects model were conducted for adjusted relative risks and their 95% CI from the second to the fifth interval according to five kinds of carotenoids and ER/PR status. RESULTS: The following new findings were identified: α-Carotene and β-cryptoxanthin have protective effects on overall breast cancer. All five kinds of carotenoids showed protective effects on ER− breast cancer. β-Carotene level increased the risk of ER+ or ER+/PR+ breast cancer. α-Carotene, β-carotene, lutein/zeaxanthin, and lycopene showed a protective effect on ER−/PR+ or ER−/PR− breast cancer. CONCLUSIONS: The new facts support the hypothesis that carotenoids that show anticancer effects with anti-oxygen function might reduce the risk of ER− breast cancer. Based on the new facts, the modification of the effects of α-carotene, β-carotene, and β-cryptoxanthin should be evaluated according to PR and ER statuses. PMID:27283141

  20. Safety analysis, risk assessment, and risk acceptance criteria

    SciTech Connect

    Jamali, K.; Stack, D.W.; Sullivan, L.H.; Sanzo, D.L.

    1997-08-01

This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  1. [The cascade scheme as a methodical platform for analysis of health risks in space flight and partially and fully analog conditions].

    PubMed

    Ushakov, I B; Poliakov, A V; Usov, V M

    2011-01-01

Space anthropoecology, a subsection of human ecology, studies various aspects of physiological, psychological, social, and professional adaptation to the extreme environment of space flight and of human life and work in partially and fully analogous conditions on Earth. Both space flight and simulated extreme conditions are known for high human safety standards and a substantial analytic base that secures on-line analysis of the torrent of information. Management evaluation of, and response to, emerging undesired developments, aimed at curbing their impact on the functioning of the crew-vehicle-environment system and on human health, draws on the complete wealth of knowledge about risks to human health and performance. Spacecrew safety issues are tackled by experts of many specialties, which emphasizes the importance of integral methodical approaches to risk estimation and mitigation, to setting up barriers to adverse trends in human physiology and psychology in challenging conditions, and to minimizing delayed effects on professional longevity and disorders of behavioral reactions. PMID:21970036

  2. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
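A minimal Monte Carlo sketch of probabilistic exposure assessment, using the common average-daily-dose form for ingestion exposure (all distribution parameters below are assumptions for illustration, not values from the paper):

```python
import math
import random

def simulate_exposure(n, seed=0):
    """Monte Carlo probabilistic exposure assessment (illustrative):
    dose = C * IR * EF / BW, with lognormal concentration and intake rate,
    normal body weight, and a fixed exposure frequency. All distribution
    parameters are assumptions for this sketch."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        c = rng.lognormvariate(math.log(2.0), 0.5)    # mg/L in water
        ir = rng.lognormvariate(math.log(1.5), 0.3)   # L/day ingested
        bw = rng.gauss(70.0, 10.0)                    # kg body weight
        doses.append(c * ir * (350 / 365) / bw)       # mg/kg-day
    return doses

doses = sorted(simulate_exposure(5000))
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]   # upper-percentile exposure estimate
```

Reporting the full distribution (here via the median and 95th percentile) rather than a single point estimate is what distinguishes PEA from deterministic exposure assessment.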

  3. Failure risk assessment by analysis and testing

    NASA Technical Reports Server (NTRS)

    Moore, N.; Ebbeler, D.; Creager, M.

    1992-01-01

    The sources of information on which to base an evaluation of reliability or failure risk of an aerospace flight system are (1) experience from tests and flights and (2) engineering analysis. It is rarely feasible to establish high reliability at high confidence by testing aerospace systems or components. Moreover, failure prediction by conventional, deterministic methods of engineering analysis can become arbitrary and subject to serious misinterpretation when uncertain or approximate information is used to establish analysis parameter values and to calibrate the accuracy of engineering models. The limitations of testing to evaluate failure risk are discussed, and a statistical approach which incorporates both engineering analysis and testing is presented.
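One standard way to combine engineering analysis with sparse test data, in the spirit of the statistical approach described (not necessarily the authors' exact model), is a conjugate Beta-binomial update of the failure probability:

```python
# Encode the analysis-based estimate of failure probability as a Beta prior,
# then update it with pass/fail test outcomes. Prior strength reflects how
# much the engineering analysis is trusted; numbers are illustrative.
def beta_update(alpha, beta, failures, trials):
    """Posterior Beta parameters after observing `failures` in `trials`."""
    return alpha + failures, beta + (trials - failures)

# Prior: analysis suggests ~1% failure probability, given weight equivalent
# to 100 pseudo-trials (alpha=1, beta=99). Then 50 tests pass with 0 failures.
a, b = beta_update(1.0, 99.0, failures=0, trials=50)
posterior_mean = a / (a + b)
```

The posterior mean drops below the 1% prior estimate, quantifying how a modest, failure-free test program sharpens an analysis-based reliability claim without requiring the infeasibly large test campaign the abstract warns about.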

  4. At-Risk Youngsters: Methods That Work.

    ERIC Educational Resources Information Center

    Obiakor, Festus E.

    This paper examines problems faced by youngsters at risk of failure in school, and discusses methods for helping them succeed in educational programs. At-risk youngsters confront many problems in school and in mainstream society, and are frequently misidentified, misdiagnosed, and improperly instructed. Problems faced by at-risk youngsters…

  5. Post hoc Analysis for Detecting Individual Rare Variant Risk Associations Using Probit Regression Bayesian Variable Selection Methods in Case-Control Sequencing Studies.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Albright, Lisa Cannon; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham; MacInnis, Robert; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catolona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2016-09-01

    Rare variants (RVs) have been shown to be significant contributors to complex disease risk. By definition, these variants have very low minor allele frequencies and traditional single-marker methods for statistical analysis are underpowered for typical sequencing study sample sizes. Multimarker burden-type approaches attempt to identify aggregation of RVs across case-control status by analyzing relatively small partitions of the genome, such as genes. However, it is generally the case that the aggregative measure would be a mixture of causal and neutral variants, and these omnibus tests do not directly provide any indication of which RVs may be driving a given association. Recently, Bayesian variable selection approaches have been proposed to identify RV associations from a large set of RVs under consideration. Although these approaches have been shown to be powerful at detecting associations at the RV level, there are often computational limitations on the total quantity of RVs under consideration and compromises are necessary for large-scale application. Here, we propose a computationally efficient alternative formulation of this method using a probit regression approach specifically capable of simultaneously analyzing hundreds to thousands of RVs. We evaluate our approach to detect causal variation on simulated data and examine sensitivity and specificity in instances of high RV dimensionality as well as apply it to pathway-level RV analysis results from a prostate cancer (PC) risk case-control sequencing study. Finally, we discuss potential extensions and future directions of this work. PMID:27312771

  6. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful. PMID:11051070
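A multiattribute utility analysis of the kind described can be sketched as a weighted additive utility over protection strategies. All strategies, attributes, weights, and scores below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical single-attribute utilities on a 0-1 scale (1 = best) and weights.
strategies = {
    "sheltering":         {"averted_dose": 0.4, "cost": 0.9, "social_disruption": 0.8},
    "evacuation":         {"averted_dose": 0.9, "cost": 0.2, "social_disruption": 0.1},
    "iodine_prophylaxis": {"averted_dose": 0.6, "cost": 0.7, "social_disruption": 0.7},
}
weights = {"averted_dose": 0.5, "cost": 0.2, "social_disruption": 0.3}

def utility(scores, weights):
    """Additive multiattribute utility: weighted sum of single-attribute utilities."""
    return sum(weights[a] * scores[a] for a in weights)

ranked = sorted(strategies, key=lambda s: utility(strategies[s], weights), reverse=True)
print(ranked)
```

In practice the weights are elicited from the concerned parties, which is where the facilitated, value-focused process described above does its work.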

  7. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  8. GROWING NEED FOR RISK ANALYSIS

    EPA Science Inventory

    Risk analysis has increasingly received attention in environmental decision making. For example, in its May 18, 1993 Combustion Strategy announcement, EPA required that any issuance of a new hazardous waste combustion permit be preceded by the performance of a complete (dir...

  9. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data. PMID:20002893
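A minimal sketch of the defender-attacker decision logic described above, with notional consequence values; the article's actual model also includes post-attack defender actions and uncertainties in attack implementation, detection, and consequences:

```python
# Notional expected consequences for each (defense, attack) pair.
consequence = {
    ("no_investment", "agent_a"): 100, ("no_investment", "agent_b"): 80,
    ("detectors",     "agent_a"):  30, ("detectors",     "agent_b"): 70,
    ("stockpile",     "agent_a"):  60, ("stockpile",     "agent_b"): 20,
}
defenses = ["no_investment", "detectors", "stockpile"]
attacks = ["agent_a", "agent_b"]

def attacker_best_response(d):
    # The intelligent adversary observes the defense and maximizes consequence.
    return max(attacks, key=lambda a: consequence[(d, a)])

def defender_choice():
    # The defender minimizes the consequence of the attacker's best response.
    return min(defenses, key=lambda d: consequence[(d, attacker_best_response(d))])

print(defender_choice())
```

The nested max-inside-min is what distinguishes intelligent-adversary analysis from uncertain-hazard PRA, where the "attacker" is a fixed probability distribution.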

  10. Risk analysis of spent fuel transportation

    SciTech Connect

    Not Available

    1986-01-01

    This book discusses the kinds of judgments that must go into a technical analysis of risk and moves on to the sociopolitical aspects of risk analysis, where the same set of facts can be honestly but differently interpreted. It also outlines the options available in risk management and reviews the courts' involvement with risk analysis.

  11. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis and the use of an illustrative example. Results: Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about
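The element-by-element comparison described above can be sketched as follows; the harm types, likelihoods, and magnitudes are purely illustrative placeholders, not values from the study:

```python
# Each risk profile maps harm type -> (likelihood per exposure, magnitude on a 0-10 scale).
daily_life = {"anxiety": (0.30, 3), "physical_discomfort": (0.20, 2)}
co2_challenge = {"anxiety": (0.25, 3)}  # hypothetical respiratory CO2 challenge profile

def is_minimal(research, comparator):
    """Minimal risk: every research harm type also occurs in the comparator
    (daily life) with at least the same likelihood and magnitude."""
    return all(
        harm in comparator
        and likelihood <= comparator[harm][0]
        and magnitude <= comparator[harm][1]
        for harm, (likelihood, magnitude) in research.items()
    )

print(is_minimal(co2_challenge, daily_life))
```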

  12. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-Ion Cells

    SciTech Connect

    Xia, Yuzhi; Li, Dr. Tianlei; Ren, Prof. Fei; Gao, Yanfei; Wang, Hsin

    2014-01-01

    Recently, a pinch-torsion test was developed for safety testing of Li-ion batteries (Ren et al., J. Power Source, 2013). It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is found that the addition of the torsion component significantly increases the maximum principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly less than in the pure pinch test, dramatically improving the applicability of this method to ultra-thick batteries, which would otherwise require loads in excess of machine capability. It is further found that separator failure occurs in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum principal strain is also examined.

  13. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

    In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. The methodology is flexible in that it can incorporate both a semiparametric approach using the Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development on a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300

  14. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  15. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2014-07-01 2014-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  16. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  17. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2011-07-01 2011-07-01 false Risk analysis. 75.115...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... analysis or VA's Office of Inspector General conducts an independent risk analysis of the data breach....

  18. Draft Waste Management Programmatic Environmental Impact Statement for managing treatment, storage, and disposal of radioactive and hazardous waste. Volume 3, Appendix A: Public response to revised NOI, Appendix B: Environmental restoration, Appendix C, Environmental impact analysis methods, Appendix D, Risk

    SciTech Connect

    1995-08-01

    Volume three contains appendices for the following: Public comments on DOE's proposed revisions to the scope of the waste management programmatic environmental impact statement; Environmental restoration sensitivity analysis; Environmental impacts analysis methods; and Waste management facility human health risk estimates.

  19. Comparison Of Intake Gate Closure Methods At Lower Granite, Little Goose, Lower Monumental, And Mcnary Dams Using Risk-Based Analysis

    SciTech Connect

    Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.; Mara, Neil L.; Phan, Hahn K.; Bardy, David M.; Hollenbeck, Robert E.

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.
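The kind of risk-based comparison described, expected annual damage (event frequency times failure probability times consequence) plus the cost of each alternative, can be sketched as follows. All frequencies, probabilities, and costs are notional stand-ins, not values from the report:

```python
# Notional inputs: demands for emergency closure per year and damage cost on failure.
demand_freq = 0.05   # emergency closure demands per year (assumed)
damage_cost = 50e6   # dollars if the closure fails (assumed)

alternatives = {
    "status_quo":        {"fail_prob": 0.10, "annual_cost": 0.0},
    "nitrogen_backup":   {"fail_prob": 0.04, "annual_cost": 0.2e6},
    "full_modification": {"fail_prob": 0.01, "annual_cost": 0.6e6},
}

def expected_annual_total(alt):
    a = alternatives[alt]
    risk = demand_freq * a["fail_prob"] * damage_cost  # expected annual damage
    return risk + a["annual_cost"]

for name in alternatives:
    print(name, round(expected_annual_total(name)))
```

With numbers like these the comparison is sensitive to the assumed failure probabilities, which is why the report's extensive reliability analysis of the powerhouse systems matters.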

  1. Association analysis of the dopamine D{sub 2} receptor gene in Tourette's syndrome using the haplotype relative risk method

    SciTech Connect

    Noethen, M.M.; Cichon, S.; Propping, P.

    1994-09-15

    Comings et al. have recently reported a highly significant association between Tourette's syndrome (TS) and a restriction fragment length polymorphism (RFLP) of the dopamine D{sub 2} receptor gene (DRD2) locus. The A1 allele of the DRD2 Taq I RFLP was present in 45% of the Tourette patients compared with 25% of controls. We tried to replicate this finding by using the haplotype relative risk (HRR) method for association analysis. This method overcomes a major problem of conventional case-control studies, where undetected ethnic differences between patients and controls may result in a false-positive finding, by using parental alleles not inherited by the proband as control alleles. Sixty-one nuclear families encompassing an affected child and parents were typed for the DRD2 Taq I polymorphism. No significant differences in DRD2 A1 allele frequency were observed between TS probands, sub-populations of probands classified according to tic severity, or parental control alleles. Our data do not support the hypothesis that the DRD2 locus may act as a modifying gene in the expression of the disorder in TS probands. 40 refs., 1 tab.
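The HRR comparison of transmitted versus non-transmitted parental alleles reduces to a 2x2 contingency test, sketched below with hypothetical allele counts (not the study's actual data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts of DRD2 Taq I alleles among transmitted (proband) and
# non-transmitted (control) parental alleles in 61 trios (122 alleles per row).
#                  A1   A2
transmitted     = [30,  92]
non_transmitted = [28,  94]

chi2, p, dof, _ = chi2_contingency(np.array([transmitted, non_transmitted]))
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")
```

Because the "controls" are the non-transmitted alleles of the same parents, population stratification cannot produce a spurious difference between the two rows.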

  2. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of these arenas involves collaborative studies that illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU than in the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper will address the relative progress of new risk assessment methodologies, such as fuzzy logic, within the framework of current risk assessment methods at these two institutes.

  3. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1 (Health, Food & Nutrition), is mission-oriented and aims, among other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 currently supports a number of European collaborative projects in the area of risk analysis. The objectives of these projects range widely: the development and validation of prevention strategies, including the reduction of consumer risks; the development and validation of new modelling approaches; the harmonization of risk assessment principles, methodologies and terminology; the standardization of methods and systems used for the safety evaluation of transgenic food; the provision of tools for the evaluation of human viral contamination of shellfish and for quality control; new methodologies for assessing potential unintended effects of genetically modified foods; the development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; the development of a communication platform for genetically modified organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; the development and validation of new methods for safety testing of transgenic food; the evaluation of the safety and efficacy of iron supplementation in pregnant women; and the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here. PMID:11761126

  4. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    A review of the literature shows several methods for assessing the risk of biomechanical overload of the musculoskeletal system in activities involving repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk, the evaluation of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in occupational medicine to implement a process of validation of these methods. In conclusion, we believe that new methods will be needed in the future, able to analyze and reduce risk already in the design phase of the production process. PMID:25558718

  5. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based at the material behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results demonstrate that the method is mature and that it can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantage in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
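The propagation of primitive-variable scatter to a structural-scale failure probability can be sketched with a simple Monte Carlo stress-strength model; the distributions and parameters below are assumptions for illustration, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000  # Monte Carlo samples

# Primitive variables with scatter (assumed distributions):
strength = rng.lognormal(mean=np.log(400.0), sigma=0.08, size=n)  # MPa
stress   = rng.normal(loc=280.0, scale=25.0, size=n)              # MPa

# Propagate to the structural scale: failure when applied stress exceeds strength.
p_failure = np.mean(stress >= strength)
print(f"estimated failure probability: {p_failure:.4f}")
```

Real applications propagate many more primitive variables through a structural model rather than a closed-form limit state, but the sampling logic is the same.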

  6. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, as expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  7. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
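A minimal sketch of the Stage I/II screening idea, retaining only external events whose bounding risk contribution exceeds a cutoff; the event list, frequencies, conditional probabilities, and cutoff below are all notional:

```python
# Assumed screening criterion: retain events whose bounding contribution
# (hazard frequency x conditional plant-damage probability) exceeds the cutoff.
SCREEN_CUTOFF = 1e-6  # events per year (assumed)

external_events = {
    # name: (hazard frequency per year, conditional damage probability)
    "seismic":          (1e-3, 1e-2),
    "aircraft_impact":  (1e-7, 5e-1),
    "turbine_missile":  (1e-4, 1e-4),
    "chemical_release": (1e-5, 1e-1),
}

def bounding_contribution(freq, cond_prob):
    return freq * cond_prob

retained = sorted(
    name for name, (f, p) in external_events.items()
    if bounding_contribution(f, p) >= SCREEN_CUTOFF
)
print(retained)
```

Events that survive the bounding screen proceed to the detailed Stage III analysis; the rest are documented and dismissed.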

  9. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land-use/land-cover) are different from those of natural terrain, altered interactions between the surface and atmosphere (especially the urban boundary layer, UBL) or between surface and subsurface can affect hydrologic behavior and hence flood hazards. In this research we focus on three main aspects of the urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be mathematically described in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and the energy concept allows these otherwise disconnected contributions to be combined to characterize the impact of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network by tracking energy exchange among them. It can extend our understanding not only of the influence of cities on local climate in rural areas or at larger scales, but also of how cities and nature interact and affect each other.

  10. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that the system component failure rate is the limiting factor in attaining high mission reliability. It is also shown, however, that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
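A simplified version of the multi-stage failure simulation can be sketched as follows, assuming exponential thrust-subsystem failures per burn stage (the stage durations and failure rates are invented, and the actual procedure also simulates retargeting after failures):

```python
import numpy as np

rng = np.random.default_rng(1)
n_missions = 100_000

# Assumed per-stage parameters: (burn duration in days, failure rate per day).
stages = [(120, 1e-3), (200, 8e-4), (90, 1e-3)]

# Exponential failure model per stage; the mission succeeds only if every
# burn stage completes without a thrust-subsystem failure.
stage_fail_p = [1 - np.exp(-rate * dur) for dur, rate in stages]
any_failure = np.zeros(n_missions, dtype=bool)
for p in stage_fail_p:
    any_failure |= rng.random(n_missions) < p

reliability = 1 - any_failure.mean()
print(f"simulated mission reliability ~ {reliability:.3f}")
```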

  11. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing the risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from that signal a sequence of intervals corresponding to the time intervals between heart beats. The long-time structure of fluctuations in the intervals over a period of more than fifteen minutes is analyzed to assess the risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure of variability in the intervals includes computing the power spectrum and fitting it to a power-law dependence on frequency over a selected frequency range, such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure of fluctuations in the intervals are used to assess the risk of an adverse clinical event.
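The power-law fitting step can be sketched as follows on a synthetic 1/f interval series; the sampling rate and record length are assumptions chosen so that the 10^-4 to 10^-2 Hz band is resolvable:

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 0.5    # resampled interval series, samples per second (assumed)
n = 2**17   # ~73 hours of data; long records are needed to reach 1e-4 Hz

# Synthesize a 1/f-like series (beta = 1) in the frequency domain.
freqs = np.fft.rfftfreq(n, d=1 / fs)
amp = np.zeros_like(freqs)
amp[1:] = freqs[1:] ** (-0.5)  # PSD ~ f^-1 means amplitude ~ f^-0.5
phase = rng.uniform(0, 2 * np.pi, size=freqs.size)
x = np.fft.irfft(amp * np.exp(1j * phase), n=n)

# Periodogram and power-law fit over the 1e-4 to 1e-2 Hz band.
psd = np.abs(np.fft.rfft(x)) ** 2 / n
band = (freqs >= 1e-4) & (freqs <= 1e-2)
slope, _ = np.polyfit(np.log10(freqs[band]), np.log10(psd[band]), 1)
print(f"fitted spectral exponent beta ~ {-slope:.2f}")
```

The fitted exponent (here recovering the synthetic beta = 1) is the kind of long-time-structure characteristic the patent uses for risk stratification.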

  12. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses, and those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation at hand. The matrix, with type names in the cells, is introduced in Table 2. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method'. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows.
In Section 2 we provide context for this report

  13. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  14. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  15. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
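The Monte Carlo side of the analysis described above can be sketched in a few lines. This is a hedged illustration, not the authors' Excel template: the cash-flow figures, the distributions, and the trial count are all hypothetical, chosen only to show how sampling uncertain inputs yields a distribution of investment outcomes.

```python
import random
import statistics

def npv(cash_flows, rate):
    """Net present value of a series of yearly cash flows."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_npv(n_trials=10_000, seed=42):
    """Sample uncertain revenue and discount rate; return the NPV distribution."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        revenue = rng.gauss(120.0, 25.0)       # uncertain yearly revenue
        rate = rng.uniform(0.05, 0.12)         # uncertain discount rate
        cash_flows = [-300.0] + [revenue] * 5  # initial outlay, 5 years of revenue
        results.append(npv(cash_flows, rate))
    return results

npvs = simulate_npv()
mean_npv = statistics.mean(npvs)
p_loss = sum(1 for v in npvs if v < 0) / len(npvs)
print(f"mean NPV: {mean_npv:.1f}, P(loss): {p_loss:.2%}")
```

The analytical counterpart in the article propagates the same input ranges through closed-form expressions; the simulation simply replaces that algebra with repeated sampling.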

  16. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also about 0.0001.
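The reliability calculation summarized above can be approximated by sampling. The sketch below is not NASA's code: the multi-factor interaction form, the single factor tuple, and every number are illustrative assumptions, shown only to make the idea of probabilistically evaluated material properties concrete.

```python
import random

def mfim_strength(base, factors, rng):
    """Multi-factor interaction model (sketch): each factor scales the base
    property by ((f_ult - f) / (f_ult - f0)) ** exponent, with f sampled."""
    s = base
    for f0, f_ult, exponent, sigma in factors:
        f = rng.gauss((f0 + f_ult) / 2, sigma)  # uncertain current condition
        f = min(max(f, f0), f_ult - 1e-9)       # keep inside the valid range
        s *= ((f_ult - f) / (f_ult - f0)) ** exponent
    return s

def failure_probability(n=100_000, seed=1):
    """Monte Carlo estimate of P(applied load exceeds degraded strength)."""
    rng = random.Random(seed)
    # hypothetical factor tuple: (reference value, ultimate value, exponent, std dev)
    factors = [(20.0, 600.0, 0.5, 40.0)]        # e.g. a temperature effect
    failures = 0
    for _ in range(n):
        strength = mfim_strength(1000.0, factors, rng)  # MPa, illustrative
        load = rng.gauss(550.0, 80.0)                   # applied stress, MPa
        if load > strength:
            failures += 1
    return failures / n

print(f"estimated failure risk: {failure_probability():.4f}")
```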

  17. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also about 0.0001.

  18. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail has become one of the necessities of everyday life for correspondence between individuals. The important point is that e-mail messages, servers, and clients, and the correspondence exchanged between different people, must have acceptable security so that people can use the technology with confidence. In the information age, many financial and non-financial transactions are carried out electronically and data exchange takes place via the internet, where theft and manipulation of data can impose exorbitant costs in terms of integrity, finance, politics, economics, and culture. E-mail correspondence is no exception, and its security is very important. In the literature we reviewed, no method focused specifically on risk assessment of e-mail systems has been provided. We examine assessment methods for other systems, along with their strengths and weaknesses, and then apply Convery's method for network risk assessment to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  19. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and hence the plant, to stop operating, and it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
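The semi-quantitative ranking in the abstract (e.g. "4C" read as probability category 4, consequence category C) can be illustrated with a toy lookup. The diagonal scoring rule below is a hypothetical simplification of the API 581 5x5 risk matrix, not the standard itself.

```python
def risk_rank(prob_cat, cons_cat):
    """Map a (probability 1-5, consequence A-E) pair to a qualitative risk
    level on a 5x5 matrix, loosely following the API 581 layout."""
    score = prob_cat + "ABCDE".index(cons_cat)  # simple diagonal scoring
    if score <= 3:
        return "low"
    elif score <= 5:
        return "medium"
    elif score <= 7:
        return "medium-high"
    return "high"

print(risk_rank(4, "C"))  # e.g. HP evaporator: probability 4, consequence C -> medium-high
print(risk_rank(3, "C"))  # e.g. HP economizer: probability 3, consequence C -> medium
```

With this toy rule the three cells reported in the abstract (4C, 4C, 3C) reproduce the stated medium-high and medium levels.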

  20. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences but in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. Particularly in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, with risk usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability have, however, generally lagged behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Accounting for the spatio-temporal dynamics of human vulnerability therefore accords with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high earthquake risk, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map and only contemplates the resident population from the census as a proxy for human exposure. 
In the present work we map and analyze the spatio-temporal distribution of

  1. Barriers to uptake among high-risk individuals declining participation in lung cancer screening: a mixed methods analysis of the UK Lung Cancer Screening (UKLS) trial

    PubMed Central

    Ali, Noor; Lifford, Kate J; Carter, Ben; McRonald, Fiona; Yadegarfar, Ghasem; Baldwin, David R; Weller, David; Hansell, David M; Duffy, Stephen W; Field, John K; Brain, Kate

    2015-01-01

    Objective The current study aimed to identify the barriers to participation among high-risk individuals in the UK Lung Cancer Screening (UKLS) pilot trial. Setting The UKLS pilot trial is a randomised controlled trial of low-dose CT (LDCT) screening that has recruited high-risk people using a population approach in the Cambridge and Liverpool areas. Participants High-risk individuals aged 50–75 years were invited to participate in UKLS. Individuals were excluded if an LDCT scan had been performed within the last year, if they were unable to provide consent, or if LDCT screening could not be carried out due to coexisting comorbidities. Outcome measures Statistical associations between individual characteristics and UKLS uptake were examined using multivariable regression modelling. In those who completed a non-participation questionnaire (NPQ), thematic analysis of free-text data was undertaken to identify reasons for not taking part, with subsequent exploratory linkage of key themes to risk factors for non-uptake. Results Comparative data were available from 4061 high-risk individuals who consented to participate in the trial and 2756 who declined participation. Of those declining participation, 748 (27.1%) completed an NPQ. Factors associated with non-uptake included: female gender (OR=0.64, p<0.001), older age (OR=0.73, p<0.001), current smoking (OR=0.70, p<0.001), lower socioeconomic group (OR=0.56, p<0.001) and higher affective risk perception (OR=0.52, p<0.001). Among non-participants who provided a reason, two main themes emerged, reflecting practical and emotional barriers. Smokers were more likely to report emotional barriers to participation. Conclusions A profile of risk factors for non-participation in lung screening has emerged, with underlying reasons largely relating to practical and emotional barriers. Strategies for engaging high-risk, hard-to-reach groups are critical for the equitable uptake of a potential future lung cancer screening programme.

  2. Landsafe: Landing Site Risk Analysis Software Framework

    NASA Astrophysics Data System (ADS)

    Schmidt, R.; Bostelmann, J.; Cornet, Y.; Heipke, C.; Philippe, C.; Poncelet, N.; de Rosa, D.; Vandeloise, Y.

    2012-08-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out, comprising the identification of favorable target areas and the evaluation of the surface conditions in those areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of its different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  3. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. 
Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
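The extreme-risk measure f(4) described above, a conditional expectation over a lower-tail partition of the return distribution, can be sketched as follows. The 5% partitioning point and the normal return distribution are illustrative assumptions, not the article's data.

```python
import random
import statistics

def f4(returns, alpha=0.05):
    """Conditional expectation of the lower tail: the mean of the returns
    at or below the alpha-quantile partitioning point."""
    ordered = sorted(returns)
    k = max(1, int(alpha * len(ordered)))  # number of observations in the tail
    return statistics.mean(ordered[:k])

rng = random.Random(0)
# hypothetical portfolio returns: normal with 7% mean and 15% volatility
portfolio_returns = [rng.gauss(0.07, 0.15) for _ in range(10_000)]
print(f"expected return: {statistics.mean(portfolio_returns):.3f}")
print(f"f4 (mean of worst 5%): {f4(portfolio_returns):.3f}")
```

A multiobjective formulation would then trade off the first number (maximize) against the second (keep from becoming too negative), which is the role the genetic algorithm plays in the article.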

  4. Novel methods to evaluate fracture risk models

    PubMed Central

    Donaldson, M.G.; Cawthon, P. M.; Schousboe, J.T.; Ensrud, K.E.; Lui, L.Y.; Cauley, J.A.; Hillier, T.A.; Taylor, B.C.; Hochberg, M.C.; Bauer, D.C.; Cummings, S.R.

    2013-01-01

    Fracture prediction models help identify individuals at high risk who may benefit from treatment. Area Under the Curve (AUC) is used to compare prediction models. However, the AUC has limitations and may miss important differences between models. Novel reclassification methods quantify how accurately models classify patients who benefit from treatment and the proportion of patients above/below treatment thresholds. We applied two reclassification methods, using the NOF treatment thresholds, to compare two risk models: femoral neck BMD and age (“simple model”) and FRAX (”FRAX model”). The Pepe method classifies based on case/non-case status and examines the proportion of each above and below thresholds. The Cook method examines fracture rates above and below thresholds. We applied these to the Study of Osteoporotic Fractures. There were 6036 (1037 fractures) and 6232 (389 fractures) participants with complete data for major osteoporotic and hip fracture respectively. Both models for major osteoporotic fracture (0.68 vs. 0.69) and hip fracture (0.75 vs. 0.76) had similar AUCs. In contrast, using reclassification methods, each model classified a substantial number of women differently. Using the Pepe method, the FRAX model (vs. simple model), missed treating 70 (7%) cases of major osteoporotic fracture but avoided treating 285 (6%) non-cases. For hip fracture, the FRAX model missed treating 31 (8%) cases but avoided treating 1026 (18%) non-cases. The Cook method (both models, both fracture outcomes) had similar fracture rates above/below the treatment thresholds. Compared with the AUC, new methods provide more detailed information about how models classify patients. PMID:21351143
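The Pepe-style tabulation described above, counting cases a new model would fail to treat and non-cases it would spare from treatment, can be sketched with toy data. The variable names and the six-person data set are hypothetical, not the Study of Osteoporotic Fractures data.

```python
def reclassification(cases, treat_simple, treat_frax):
    """Count how a second model reclassifies people relative to a first,
    separately for cases and non-cases (Pepe-style tabulation)."""
    missed = sum(1 for c, s, f in zip(cases, treat_simple, treat_frax)
                 if c and s and not f)      # cases the new model drops
    spared = sum(1 for c, s, f in zip(cases, treat_simple, treat_frax)
                 if not c and s and not f)  # non-cases no longer treated
    return missed, spared

# toy data: fracture status and above-threshold flags under each model
cases        = [1, 1, 0, 0, 0, 1]
treat_simple = [1, 1, 1, 1, 0, 0]
treat_frax   = [0, 1, 1, 0, 0, 1]
print(reclassification(cases, treat_simple, treat_frax))  # (1, 1)
```

Unlike the AUC, which compresses each model to one number, this tabulation shows directly who changes sides of the treatment threshold.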

  5. RISK ANALYSIS: CASE HISTORY OF PUCCINIA JACEAE ON YELLOW STARTHISTLE

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Risk analysis has five components: Risk awareness, Risk perception, Risk assessment, Risk management, and Risk communication. Using the case with the foreign plant pathogen, Puccinia jaceae, under evaluation for biological control of yellow starthistle (Centaurea solstitialis, YST), approaches and...

  6. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2005-10-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, nuclear forensics and environmental monitoring. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(Tl) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for validating the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  7. Generalized Multicoincidence Analysis Methods

    SciTech Connect

    Warren, Glen A.; Smith, Leon E.; Aalseth, Craig E.; Ellis, J. E.; Valsan, Andrei B.; Mengesha, Wondwosen

    2006-02-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, environmental monitoring, and verification of treaties and agreements. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g. NaI(TI) or HPGe, combined with thin organic scintillator sensors to detect light charged particles. Translating the coincident signatures recorded by these systems, which include beta-gamma, gamma-gamma and beta-gamma-gamma, into the concentration of detectable radionuclides in the sample requires generalized multicoincidence analysis tools. The development and validation of the Coincidence Lookup Library, which currently contains the probabilities of single and coincidence signatures from more than 420 isotopes, is described. Also discussed is a method to calculate the probability of observing a coincidence signature which incorporates true coincidence summing effects. These effects are particularly important for high-geometric-efficiency detection systems. Finally, a process for verifying the integrated analysis software package is demonstrated using GEANT 4 simulations of the prototype detector systems.

  8. [Risk sharing methods in middle income countries].

    PubMed

    Inotai, András; Kaló, Zoltán

    2012-01-01

    The pricing strategy for innovative medicines is based on their therapeutic value in the largest pharmaceutical markets. The cost-effectiveness of new medicines with a value-based ex-factory price is justifiable. Due to international price referencing and parallel trade, the ex-factory price corridor of new medicines has narrowed in recent years. Middle-income countries have less negotiating power to change this narrow pricing corridor, although their fair intention is to buy pharmaceuticals from their scarce public resources at lower prices than higher-income countries pay. Therefore the reimbursement of new medicines at Western-European prices may not be justifiable in Central-Eastern European countries. Confidential pricing agreements (i.e. confidential price discounts, claw-back or rebate schemes) in the lower-income countries of the European Union can alleviate this problem, as prices of new medicines can be adjusted to local purchasing power without influencing the published ex-factory price, and so without affecting the accessibility of these drugs to patients in other countries. In order to control the drug budget, payers in more and more countries tend to apply financial risk-sharing agreements for new medicines, shifting the consequences of potential overspending to pharmaceutical manufacturers. The major paradox of financial risk-sharing schemes is that increased mortality, poor patient persistence, reduced access to healthcare providers, and non-treatment all reduce pharmaceutical spending. Consequently, payers have recently started to apply outcome-based risk-sharing agreements for new medicines to improve the quality of health care provision. Our paper aims to review and assess the published financial and outcome-based risk-sharing methods. The introduction of outcome-based risk-sharing schemes can be a major advancement in the drug reimbursement strategy of payers in middle-income countries. These schemes can help to reduce the medical uncertainty in coverage.

  9. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  10. A comparison of radiological risk assessment methods for environmental restoration

    SciTech Connect

    Dunning, D.E. Jr.; Peterson, J.M.

    1993-09-01

    Evaluation of risks to human health from exposure to ionizing radiation at radioactively contaminated sites is an integral part of the decision-making process for determining the need for remediation and selecting remedial actions that may be required. At sites regulated under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a target risk range of 10^-4 to 10^-6 incremental cancer incidence over a lifetime is specified by the US Environmental Protection Agency (EPA) as generally acceptable, based on the reasonable maximum exposure to any individual under current and future land use scenarios. Two primary methods currently being used in conducting radiological risk assessments at CERCLA sites are compared in this analysis. Under the first method, the radiation dose equivalent (i.e., Sv or rem) to the receptors of interest over the appropriate period of exposure is estimated and multiplied by a risk factor (cancer risk/Sv). Alternatively, incremental cancer risk can be estimated by combining the EPA's cancer slope factors (previously termed potency factors) for radionuclides with estimates of radionuclide intake by ingestion and inhalation, as well as radionuclide concentrations in soil that contribute to external dose. The comparison of the two methods has demonstrated that resulting estimates of lifetime incremental cancer risk under these different methods may differ significantly, even when all other exposure assumptions are held constant, with the magnitude of the discrepancy depending upon the dominant radionuclides and exposure pathways for the site. The basis for these discrepancies, the advantages and disadvantages of each method, and the significance of the discrepant results for environmental restoration decisions are presented.
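The two methods being compared reduce to two one-line calculations. Every number below is hypothetical, chosen only to show how the two routes to an incremental-cancer-risk estimate can diverge; they are not the EPA's published factors.

```python
# Method 1: dose equivalent (Sv) multiplied by a risk factor (cancer risk per Sv).
dose_sv = 0.02               # hypothetical lifetime committed dose, Sv
risk_factor = 5.0e-2         # illustrative cancer risk per Sv
risk_method_1 = dose_sv * risk_factor

# Method 2: an EPA-style slope factor combined with radionuclide intake.
intake_pci = 4.0e4           # hypothetical lifetime intake, pCi
slope_factor = 1.0e-8        # illustrative cancer risk per pCi ingested
risk_method_2 = intake_pci * slope_factor

print(f"method 1 (dose x risk factor):    {risk_method_1:.1e}")
print(f"method 2 (slope factor x intake): {risk_method_2:.1e}")
```

With these made-up inputs the two estimates differ by a factor of 2.5 for the same notional exposure, which is the kind of discrepancy the abstract discusses, since the dose conversion and the slope factor embed different dosimetric assumptions.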

  11. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk-efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. The paper demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  12. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2013-07-01 2013-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  13. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2012-07-01 2012-07-01 false Risk analysis. 75.115 Section 75.115 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS (CONTINUED) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive personal information that is processed...

  14. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  15. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  16. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  17. RISK ASSESSMENT FOR BENEFITS ANALYSIS

    EPA Science Inventory

    Among the important types of information considered in decision making at the U.S. Environmental Protection Agency (EPA) are the outputs of risk assessments and benefit-cost analyses. Risk assessments present estimates of the adverse consequences of exposure to environmental poll...

  18. Nuclear risk analysis of the Ulysses mission

    SciTech Connect

    Bartram, B.W.; Vaughan, F.R. ); Englehart, D.R.W. )

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source-term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDFs) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
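The CCDF construction described above can be sketched directly: sample consequences, then for each threshold count the fraction of samples at or above it. The lognormal consequence model and its parameters are illustrative assumptions, not the mission's source terms.

```python
import random

def ccdf(samples, thresholds):
    """Complementary cumulative distribution: for each threshold x,
    the fraction of sampled consequences >= x."""
    n = len(samples)
    return [sum(1 for s in samples if s >= x) / n for x in thresholds]

rng = random.Random(7)
# hypothetical health-effect consequences drawn from accident-scenario sampling
consequences = [rng.lognormvariate(0.0, 1.5) for _ in range(50_000)]
for x, p in zip([0.1, 1.0, 10.0], ccdf(consequences, [0.1, 1.0, 10.0])):
    print(f"P(consequence >= {x:g}) = {p:.3f}")
```

Because the CCDF is monotonically non-increasing in the threshold, plotting it per sub-phase and summing scenario contributions gives exactly the exceedance curves the report presents.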

  19. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  20. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are physical global hazards with negative environmental and socio-economic impacts at local and regional scales. Technological evolution during the last decades, especially in the field of geoinformatics, has offered new advantages in hydrological modelling. This study seeks to use this technology to quantify flood risk. The study area is an ungauged catchment; using mostly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed unit hydrograph model, a series of outcomes was produced. More specifically, this paper examines the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis used a Digital Elevation Model of 5x5 m pixel size, and the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to the flood. Unit hydrographs are, as is known, useful when there is a lack of data, and in this work, based on the time-area method, a sequence of flood risk assessments was made using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
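The time-area method mentioned above amounts to convolving the rainfall excess series with the catchment's time-area histogram. The sketch below uses hypothetical numbers, not the Kladeos basin data.

```python
def time_area_hydrograph(rainfall_excess, area_histogram):
    """Convolve rainfall excess (m per step) with the time-area histogram
    (m^2 contributing per step) to get discharge per time step (m^3/step)."""
    n = len(rainfall_excess) + len(area_histogram) - 1
    q = [0.0] * n
    for i, r in enumerate(rainfall_excess):
        for j, a in enumerate(area_histogram):
            q[i + j] += r * a  # each rain pulse routed over each isochrone band
    return q

# hypothetical storm: 3 steps of rainfall excess over a 4-step time-area curve
rain = [0.002, 0.005, 0.001]          # metres of excess per time step
areas = [1.0e6, 3.0e6, 2.0e6, 0.5e6]  # m^2 reaching the outlet per step
print([round(v) for v in time_area_hydrograph(rain, areas)])
# -> [2000, 11000, 20000, 14000, 4500, 500]
```

Mass balance holds by construction: the hydrograph volume equals total rainfall excess times total contributing area, a quick sanity check when applying the method to a DEM-derived histogram.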

  1. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant for the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress made in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  2. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
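The kind of uncertainty propagation the prototype performs can be sketched as a plain Monte Carlo over uncertain technical and financial factors, with variability in net returns as the risk measure. The stylized return model, distributions, and cost figures below are illustrative assumptions, not the CCSI tool.

```python
import random
import statistics

random.seed(42)

def net_return(capture_eff, capex_musd, power_price):
    # Hypothetical stylized model: revenue scales with capture efficiency
    # and power price; cost scales with retrofit capital expenditure.
    revenue = 300.0 * capture_eff * power_price / 50.0
    return revenue - 0.1 * capex_musd

draws = []
for _ in range(10_000):
    eff = random.uniform(0.80, 0.95)    # capture efficiency (fraction)
    capex = random.gauss(900.0, 100.0)  # retrofit capital cost ($M)
    price = random.gauss(50.0, 8.0)     # electricity price ($/MWh)
    draws.append(net_return(eff, capex, price))

mean_ret = statistics.mean(draws)  # expected net return
risk = statistics.stdev(draws)     # variability of net returns = risk measure
```

Expanding the model then amounts to adding sampled risk factors and refining their distributions as elicitation data arrive.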

  3. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to timely ask for design revisions or contingency plans before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  4. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico.

    PubMed

    Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L

    2011-05-01

    In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875

  5. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment that is sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  6. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  7. DETERMINING SIGNIFICANT ENDPOINTS FOR ECOLOGICAL RISK ANALYSIS

    EPA Science Inventory

    Risk analyses, both human health and ecological, will be important factors in determining which DOE sites should be cleaned up and in deciding if acceptable performance standards have been met. Risk analysis procedures for humans use the individual as the 'unit' of observation, a...

  8. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  9. Advanced reliability method for fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Wirsching, P. H.

    1984-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) may become extremely difficult or very inefficient. This study suggests using a simple and easily constructed second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
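A minimal sketch of the approach described above: fit a second-degree polynomial to a (stand-in) expensive limit state near the design point, then estimate the probability of failure cheaply on the surrogate. Plain Monte Carlo is used here in place of the Rackwitz-Fiessler fast probability integration, and the limit state and distributions are illustrative, not the paper's fatigue model.

```python
import random

def g_expensive(x1, x2):              # stand-in for a costly computer analysis
    return 3.0 - x1 * x1 - 0.5 * x2   # failure when g <= 0

# Fit q(x1) = a + b*x1 + c*x1^2 exactly through three samples near the
# design point (x2 held at its mean, 0).
pts = [-1.0, 0.0, 1.0]
ys = [g_expensive(p, 0.0) for p in pts]
c = (ys[0] - 2.0 * ys[1] + ys[2]) / 2.0
b = (ys[2] - ys[0]) / 2.0
a = ys[1]

def g_surrogate(x1, x2):
    return a + b * x1 + c * x1 * x1 - 0.5 * x2

# Cheap Monte Carlo on the surrogate (standard-normal design variables).
random.seed(0)
n, fail = 100_000, 0
for _ in range(n):
    x1, x2 = random.gauss(0.0, 1.0), random.gauss(0.0, 1.0)
    if g_surrogate(x1, x2) <= 0.0:
        fail += 1
pf = fail / n  # estimated probability of failure
```

Because the stand-in limit state happens to be quadratic, the surrogate here reproduces it exactly; in the fatigue application the polynomial is only a local approximation around the design point.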

  10. Risk analysis in bioequivalence and biowaiver decisions.

    PubMed

    Kubbinga, Marlies; Langguth, Peter; Barends, Dirk

    2013-07-01

This article evaluates the current biowaiver guidance documents published by the FDA, EU and WHO from a risk-based perspective. The authors introduce the use of a Failure Mode and Effect Analysis (FMEA) risk calculation tool to show that current regulatory documents implicitly limit the risk of bioinequivalence after granting a biowaiver by reducing the incidence, improving the detection and limiting the severity of any unforeseen bioinequivalent product. In addition, the authors use the risk calculation to expose as yet unexplored options for future extension of comparative in vitro tools for biowaivers. PMID:23280474
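An FMEA-style risk calculation of the kind the authors describe scores each failure mode for severity, occurrence, and detection (1 best to 10 worst) and ranks by risk priority number, RPN = severity x occurrence x detection. The failure modes and scores below are hypothetical, not the paper's.

```python
failure_modes = {
    # mode: (severity, occurrence, detection), each scored 1 (best) to 10 (worst)
    "bioinequivalent product reaches market": (9, 2, 4),
    "dissolution test misses formulation change": (6, 3, 5),
    "API solubility misclassified": (7, 2, 2),
}

# Risk priority number for each mode, then rank highest-risk first.
rpn = {mode: s * o * d for mode, (s, o, d) in failure_modes.items()}
ranked = sorted(rpn.items(), key=lambda kv: kv[1], reverse=True)
```

Regulatory safeguards then map onto the three factors: tighter in vitro tests improve detection, BCS-class restrictions reduce occurrence, and indication limits cap severity.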

  11. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complimentary examination of the same...

  12. ANALYSIS OF LAMB MORTALITY USING COMPETING RISKS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A competing risks model was used to describe lamb mortality up to four weeks of age in a composite sheep flock with 8,642 lamb records. Discrete survival methods were applied using sire and animal models. The results indicated that substantial variation exists in the risk of lambs dying from diffe...

  13. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-05-01

Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset), the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bar and featured risk is made.

  14. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset), the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bar and featured risk is made.

  15. Risk based requirements for long term stewardship: A proof-of-principle analysis of an analytic method tested on selected Hanford locations

    SciTech Connect

    Jarvis, T.T.; Andrews, W.B.; Buck, J.W.

    1998-03-01

Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  16. Risk Based Requirements for Long Term Stewardship: A Proof-of-Principle Analysis of an Analytic Method Tested on Selected Hanford Locations

    SciTech Connect

    GM Gelston; JW Buck; LR Huesties; MS Peffers; TB Miley; TT Jarvis; WB Andrews

    1998-12-03

Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  17. Occupational safety and HIV risk among female sex workers in China: A mixed-methods analysis of sex-work harms and mommies

    PubMed Central

    Yi, Huso; Zheng, Tiantian; Wan, Yanhai; Mantell, Joanne E.; Park, Minah; Csete, Joanne

    2013-01-01

    Female sex workers (FSWs) in China are exposed to multiple work-related harms that increase HIV vulnerability. Using mixed-methods, we explored the social-ecological aspects of sexual risk among 348 FSWs in Beijing. Sex-work harms were assessed by property stolen, being underpaid or not paid at all, verbal and sexual abuse, forced drinking; and forced sex more than once. The majority (90%) reported at least one type of harm, 38% received harm protection from ‘mommies’ (i.e., managers) and 32% reported unprotected sex with clients. In multivariate models, unprotected sex was significantly associated with longer involvement in sex work, greater exposure to harms, and no protection from mommies. Mommies’ protection moderated the effect of sex-work harms on unprotected sex with clients. Our ethnography indicated that mommies played a core role in sex-work networks. Such networks provide a basis for social capital; they are not only profitable economically, but also protect FSWs from sex-work harms. Effective HIV prevention interventions for FSWs in China must address the occupational safety and health of FSWs by facilitating social capital and protection agency (e.g., mommies) in the sex-work industry. PMID:22375698

  18. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  19. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concepts of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory for resolving gender conflict by using the concepts of social and psychological capital. PMID:25941756

  20. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  1. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

Zhang, Minli; Yang, Wenpo

Real estate investment is a high-risk, high-return economic activity; the key to real estate risk analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry also faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and finally a fuzzy comprehensive evaluation method is presented. The method is not only scientifically sound in theory but also reliable in application, providing an effective means of real estate investment risk assessment and offering investors guidance on risk factors and forecasts.
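Fuzzy comprehensive evaluation, as commonly applied to investment risk, combines a factor-weight vector with a membership matrix over risk grades; the weights, factors, and membership degrees below are hypothetical illustrations, not the paper's data.

```python
weights = [0.4, 0.35, 0.25]  # e.g. market, financial, and policy risk factors
grades = ["low", "medium", "high"]
# membership[i][j]: degree to which factor i indicates risk grade j
membership = [
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
    [0.5, 0.4, 0.1],
]

# Weighted-average composition: b_j = sum_i w_i * r_ij
b = [sum(w * row[j] for w, row in zip(weights, membership))
     for j in range(len(grades))]
overall = grades[b.index(max(b))]  # grade with the largest composite membership
```

The weighted-average operator is one common composition rule; max-min composition is another choice when dominant single factors should drive the grade.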

  2. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  3. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).

  4. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces. PMID:21571428
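The basic element described above, attaching a pdf to a measurement sequence and reading risk off its tail, can be sketched as follows. The measurements, the 600 Bq/kg threshold, and the lognormal choice are illustrative assumptions, not the paper's (40)K data.

```python
import math
import statistics

measurements = [410.0, 520.0, 365.0, 480.0, 610.0, 455.0, 390.0, 540.0]  # Bq/kg

# Fit a lognormal by taking the sample mean and stdev of the log-data.
logs = [math.log(x) for x in measurements]
mu, sigma = statistics.mean(logs), statistics.stdev(logs)

def exceedance_prob(threshold):
    """P(X > threshold) under the fitted lognormal."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

risk_600 = exceedance_prob(600.0)  # chance a measurement exceeds 600 Bq/kg
```

Repeating the fit per sampling location yields the kind of spatial risk surface the paper maps.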

  5. Risk-based methods applicable to ranking conceptual designs

    SciTech Connect

    Breeding, R.J.; Ortiz, K.; Ringland, J.T.; Lim, J.J.

    1993-11-01

In Genichi Taguchi's latest book on quality engineering, an emphasis is placed on robust design processes in which quality engineering techniques are brought "upstream," that is, they are utilized as early as possible, preferably in the conceptual design stage. This approach was used in a study of possible future safety system designs for weapons. As an experiment, a method was developed for using probabilistic risk analysis (PRA) techniques to rank conceptual designs for performance against a safety metric for ultimate incorporation into a Pugh matrix evaluation. This represents a high-level UW application of PRA methods to weapons. As with most conceptual designs, details of the implementation were not yet developed; many of the components had never been built, let alone tested. Therefore, our application of risk assessment methods was forced to be at such a high level that the entire evaluation could be performed on a spreadsheet. Nonetheless, the method produced numerical estimates of safety in a manner that was consistent, reproducible, and scrutable. The results enabled us to rank designs to identify areas where returns on research efforts would be the greatest. The numerical estimates were calibrated against what is achievable by current weapon safety systems. The use of expert judgement is inescapable, but these judgements are explicit and the method is easily implemented in a spreadsheet computer program.

  6. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine is subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review paper discusses the various aspects of and approaches to benefit-risk assessment in medicine along a chronological pathway. The review discusses all types of benefit-risk assessment a medicinal product undergoes during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review, we conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvement not only in methodology but also in cooperation between organizations can improve benefit-risk assessment. PMID:21683115

  7. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for the development of vulnerability functions of different elements at risk: empirical, analytical, and expert estimation. The paper addresses empirical methods of seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes presented in different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. To estimate expected damage states of buildings and constructions for earthquakes according to the OSR-97B map (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites; the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in

  8. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently little used in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  9. Risk analysis for critical asset protection.

    PubMed

    McGill, William L; Ayyub, Bilal M; Kaminskiy, Mark

    2007-10-01

    This article proposes a quantitative risk assessment and management framework that supports strategic asset-level resource allocation decision making for critical infrastructure and key resource protection. The proposed framework consists of five phases: scenario identification, consequence and criticality assessment, security vulnerability assessment, threat likelihood assessment, and benefit-cost analysis. Key innovations in this methodology include its initial focus on fundamental asset characteristics to generate an exhaustive set of plausible threat scenarios based on a target susceptibility matrix (which we refer to as asset-driven analysis) and an approach to threat likelihood assessment that captures adversary tendencies to shift their preferences in response to security investments based on the expected utilities of alternative attack profiles assessed from the adversary perspective. A notional example is provided to demonstrate an application of the proposed framework. Extensions of this model to support strategic portfolio-level analysis and tactical risk analysis are suggested. PMID:18076495
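The threat-shifting idea in the likelihood assessment phase can be sketched as follows: the adversary is assumed to prefer the attack profile with the highest expected utility, so a security investment that lowers one profile's success probability shifts preference toward the alternatives. The profiles and numbers below are hypothetical, not the article's notional example.

```python
def preferred_profile(profiles):
    """profiles: {name: (success_prob, payoff, cost)} from the adversary's view."""
    utility = {n: p * v - c for n, (p, v, c) in profiles.items()}
    best = max(utility, key=utility.get)
    return best, utility

before = {"cyber": (0.6, 100.0, 10.0), "physical": (0.3, 100.0, 5.0)}
best_before, u_before = preferred_profile(before)

# A defensive investment cuts the cyber success probability; the adversary
# model then shifts preference to the now-more-attractive alternative.
after = dict(before, cyber=(0.2, 100.0, 10.0))
best_after, u_after = preferred_profile(after)
```

In the benefit-cost phase, this shift is what makes naive "harden the top scenario" allocations misleading: reducing one scenario's likelihood raises the others'.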

  10. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
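The eigenvalue-based instability criterion can be illustrated with a plain Monte Carlo probability estimate (the paper uses fast probability integration and adaptive importance sampling instead; the one-degree-of-freedom system and the damping distribution below are assumptions for illustration):

```python
import numpy as np

def unstable(c, k=4.0):
    """A second-order system x'' + c x' + k x = 0 is unstable if any
    eigenvalue of its first-order state matrix has a positive real part."""
    A = np.array([[0.0, 1.0], [-k, -c]])
    return np.real(np.linalg.eigvals(A)).max() > 1e-12

# Treat the damping coefficient as uncertain and estimate P(instability)
rng = np.random.default_rng(42)
samples = rng.normal(loc=0.1, scale=0.1, size=20_000)
p_instability = np.mean([unstable(c) for c in samples])
```

For this single-degree-of-freedom example, instability occurs exactly when the sampled damping is negative, so the Monte Carlo estimate should approach the normal tail probability of about 0.16.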

  11. Risk determination method for accidental water basin contamination based on risk source coupling with sensitive targets.

    PubMed

    Li, Zongfeng; Zeng, Bo; Zhou, Tinggang; Li, Guowei; Zhu, Xiaobo

    2016-01-01

    Accidental water basin pollution seriously threatens human health and ecological security, but rapid, effective methods for evaluating this threat are lacking. This paper aims to develop a risk evaluation method for basin accidents by coupling the risk source with sensitive targets to evaluate the zone accident risk levels of basins and prevent the accidental environmental pollution of water. This method incorporates the interplay between risk sources and sensitive targets by evaluating the zone risk levels of water environments from different sources, effectiveness of the risk source control mechanisms, vulnerability of sensitive targets and spatial and temporal relationships between these sources and targets. Using the Three Gorges Reservoir region as an example, a risk system for water basin pollution incidents consisting of a risk indicator quantification system, a risk zoning method and a verification method for the zoning results is developed and implemented. The results were verified in a field investigation, which showed that the risk zoning model provides rapid, effective and reliable zoning results. This research method could serve as a theoretical reference and technological support for evaluating water basin accident risks. Furthermore, the results are useful for evaluating and protecting the aquatic environments in the Three Gorges Reservoir region. PMID:26207430

  12. NATO PILOT STUDY ON ADVANCED CANCER RISK ASSESSMENT METHODS

    EPA Science Inventory

    NCEA scientists are participating in a study of advanced cancer risk assessment methods, conducted under the auspices of NATO's Committee on the Challenges of Modern Society. The product will be a book of case studies that illustrate advanced cancer risk assessment methods, avail...

  13. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  14. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and increased project cost, this research presents a methodology for dynamic risk assessment and management of rockbursts in D&B tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods for each step in the process are given. The main parts include a microseismic method for early warning of rockburst occurrence probability, an estimation method for assessing the potential consequences of rockburst risk, an evaluation method using a new quantitative index that considers both occurrence probability and consequences to determine the level of rockburst risk, and a dynamic updating process. Specifically, this research briefly describes the referenced microseismic warning method, but focuses on the analysis of consequences and the associated risk assessment and management of rockbursts. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified with cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (totalling 11.6 km of D&B tunnels).
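A quantitative index combining occurrence probability and consequence can be sketched as follows (the probability-times-consequence score and the level thresholds are illustrative assumptions, not the authors' index):

```python
def rockburst_risk_level(p_occurrence, consequence, thresholds=(0.1, 0.3, 0.6)):
    """Map a probability-weighted consequence score to a discrete risk level.

    p_occurrence: estimated probability of a rockburst (0-1), e.g. from
    microseismic early warning; consequence: normalized severity (0-1).
    """
    score = p_occurrence * consequence
    low, medium, high = thresholds
    if score < low:
        return "low"
    if score < medium:
        return "medium"
    if score < high:
        return "high"
    return "extreme"
```

Because both inputs can be re-evaluated as excavation proceeds, the level can be updated in real time, which is the dynamic aspect the abstract emphasizes.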

  15. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  16. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  17. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  18. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  19. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  20. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-06-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were: (1) a checklist without risk estimation (Tool A), (2) a checklist with a risk scale (Tool B), (3) a risk calculation without a formal hazard identification stage (Tool C), and (4) a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of the more analytic tools in less time. Their main limitations were a lack of contextual information for the identified hazards and a greater dependency on the user's expertise and ability to tackle hazards of a different nature. Tools C and D used more systematic approaches than Tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of (1) its comprehensive structure with respect to the steps suggested in risk management, (2) its dynamic approach to hazard identification, and (3) its use of data resulting from the risk analysis. PMID:26864350

  1. Method for Analyzing District Level IAI Data Bases to Identify Learning Opportunity Risks.

    ERIC Educational Resources Information Center

    Milazzo, Patricia; And Others

    A learning opportunity risk is defined as an absence of instruction or insufficient attention to proficiency at an early grade of instruction in a subject matter which will generate serious learning problems in later grades. A method for identifying such risks has been derived from analysis of district-level Instructional Accomplishment…

  2. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ...The United States Environmental Protection Agency (EPA) is requesting information and citations on approaches and methods for the planning, analysis, assessment, and characterization of cumulative risks to human populations and the environment. The EPA is developing guidelines for the assessment of cumulative risk as defined and characterized in the EPA 2003 publication Framework for......

  3. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  4. Impact of NDE reliability developments on risk-informed methods

    SciTech Connect

    Walker, S.M.; Ammirato, F.V.

    1996-12-01

    Risk informed inspection procedures are being developed to more effectively and economically manage degradation in plant piping systems. A key element of this process is applying nondestructive examination (NDE) procedures capable of detecting specific damage mechanisms that may be operative in particular locations. Thus, the needs of risk informed analysis are closely coupled with a firm understanding of the capability of NDE.

  5. Reliability/Risk Methods and Design Tools for Application in Space Programs

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Smart, Christian

    1999-01-01

    Since 1984, NASA has funded several major programs to develop reliability/risk methods and tools for engineers to apply in the design and assessment of aerospace hardware. Two probabilistic software tools that show great promise for practical application are the finite element code NESSUS and the system risk analysis code QRAS. This paper examines NASA's past, present, and future directions in reliability and risk engineering applications. Both the NESSUS and QRAS software tools are described in detail.

  6. Comparing risks from low-level radioactive waste disposal on land and in the ocean: A review of agreements/statutes, scenarios, processing/packaging/disposal technologies, models, and decision analysis methods

    SciTech Connect

    Moskowitz, P.D.; Kalb, P.D.; Morris, S.C.; Rowe, M.D.; Marietta, M.; Anspaugh, L.; McKone, T.

    1989-10-01

    This report gives background information on: The history of LLW disposal in the US; agreements, statutes and regulations for the disposal of LLW; disposal scenarios and alternative treatment options for LLW; methods and models which could be used to assess and compare risks associated with land and ocean options for LLW disposal; technical and methodological issues associated with comparing risks of different options; and, roles of decision making approaches in comparing risks across media. 63 refs., 23 figs., 33 tabs.

  7. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  8. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define ''consequences'' by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  9. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  10. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking. PMID:11538075

  11. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  12. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  13. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  14. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  15. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  16. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartments in which fires could result in the failure of components necessary for reactor cold shutdown. It shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2) (more than 7.5% of CDF), in which a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, leading directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  17. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.
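Since the analysis maps technical risk onto LCOE, the standard definition of that metric, discounted lifetime cost divided by discounted lifetime generation, can be sketched as follows (all cash-flow values are invented for illustration):

```python
def lcoe(capex, annual_cost, annual_mwh, years, rate):
    """Levelized cost of energy: discounted costs / discounted generation."""
    disc_costs = capex + sum(annual_cost / (1 + rate) ** t
                             for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + rate) ** t
                      for t in range(1, years + 1))
    return disc_costs / disc_energy  # $ per MWh

example = lcoe(capex=4_000_000, annual_cost=120_000,
               annual_mwh=40_000, years=30, rate=0.07)
```

Propagating uncertainty in R&D outcomes through the cost and performance inputs of a formula like this is what turns judgments about technical risk into a distribution over LCOE.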

  18. Analysis of exogenous components of mortality risks.

    PubMed

    Blinkin, V L

    1998-04-01

    A new technique for deriving exogenous components of mortality risks from national vital statistics has been developed. Each observed death rate D_ij (where i corresponds to calendar time (a year or interval of years) and j denotes the number of the corresponding age group) was represented as D_ij = A_j + B_i*C_j, and the unknown quantities A_j, B_i, and C_j were estimated by a special procedure using the least-squares principle. The coefficients of variation do not exceed 10%. It is shown that the term A_j can be interpreted as the endogenous component of the death rate and the second term B_i*C_j as the exogenous component. The aggregate of endogenous components A_j can be described by a regression function corresponding to the Gompertz-Makeham law, A(tau) = gamma + beta*e^(alpha*tau), where gamma, beta, and alpha are constants, tau is age, A(tau) evaluated at tau = tau_j is identical to A(tau_j), i.e., A_j, and tau_j is the value of age tau in the jth age group. The coefficients of variation for this representation do not exceed 4%. An analysis of exogenous risk levels in the Moscow and Russian populations during 1980-1995 shows that since 1992 all components of exogenous risk in the Moscow population had been increasing up to 1994. The greatest contribution to the total level of exogenous risk came from lethal diseases, with a death rate of 387 deaths per 100,000 persons in 1994, i.e., 61.9% of all deaths. The dynamics of exogenous mortality risk during 1990-1994 in the Moscow population and in the Russian population excluding Moscow were identical: the risk had been increasing, and its value in the Russian population was higher than in the Moscow population. PMID:9637078
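The decomposition D_ij = A_j + B_i*C_j and the Gompertz-Makeham form of the endogenous term can be illustrated numerically (the parameter values and the exogenous age profile below are invented for illustration, not the fitted values from the study):

```python
import math

def gompertz_makeham(tau, gamma, beta, alpha):
    """Endogenous death-rate component A(tau) = gamma + beta * e^(alpha*tau)."""
    return gamma + beta * math.exp(alpha * tau)

# Hypothetical parameters and age groups
gamma, beta, alpha = 4e-4, 3e-5, 0.085
ages = [30, 50, 70]
endogenous = [gompertz_makeham(t, gamma, beta, alpha) for t in ages]

# An exogenous component B_i * C_j added on top gives the observed rate D_ij
B_i = 1.2                               # calendar-time factor for one year i
C = {30: 2e-4, 50: 3e-4, 70: 4e-4}      # age profile of exogenous risk
observed = [a + B_i * C[t] for a, t in zip(endogenous, ages)]
exogenous_share = [(d - a) / d for d, a in zip(observed, endogenous)]
```

The exponential endogenous term dominates at high ages, so the exogenous share of total mortality shrinks with age, which is why separating the two components matters for interpreting risk trends.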

  19. Predicted 10-year risk of cardiovascular disease is influenced by the risk equation adopted: a cross-sectional analysis

    PubMed Central

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Mellalieu, Stephen D; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2014-01-01

    Background Validated risk equations are currently recommended to assess individuals to determine those at ‘high risk’ of cardiovascular disease (CVD). However, there is no longer a risk ‘equation of choice’. Aim This study examined the differences between four commonly-used CVD risk equations. Design and setting Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, south Wales. Method Analysis of 790 individuals (474 females, 316 males) with no prior diagnosis of CVD or diabetes. Ten-year CVD risk was predicted by entering the relevant variables into the QRISK2, Framingham Lipids, Framingham BMI, and JBS2 risk equations. Results The Framingham BMI and JBS2 risk equations predicted a higher absolute risk than the QRISK2 and Framingham Lipids equations, and CVD risk increased concomitantly with age irrespective of which risk equation was adopted. Only a small proportion of females (0–2.1%) were predicted to be at high risk of developing CVD using any of the risk algorithms. The proportion of males predicted at high risk ranged from 5.4% (QRISK2) to 20.3% (JBS2). After age stratification, few differences between isolated risk factors were observed in males, although a greater proportion of males aged ≥50 years were predicted to be at ‘high risk’ independent of risk equation used. Conclusions Different risk equations can influence the predicted 10-year CVD risk of individuals. More males were predicted at ‘high risk’ using the JBS2 or Framingham BMI equations. Consideration should also be given to the number of isolated risk factors, especially in younger adults when evaluating CVD risk. PMID:25267049

  20. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-01

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPN) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks. PMID:19640668
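The RPN ranking described above can be sketched in a few lines (the failure modes and O/D/S scores are invented for illustration):

```python
def rpn(occurrence, detection, severity):
    """Risk Priority Number: RPN = O x D x S, each scored on a 1-10 scale."""
    return occurrence * detection * severity

# Hypothetical failure modes with (O, D, S) scores
failure_modes = {
    "sample misidentified by operator": (7, 6, 8),
    "spectrometer wavelength drift":    (3, 4, 7),
    "library spectrum outdated":        (4, 5, 6),
}

# Highest-RPN modes are the first candidates for corrective action
ranked = sorted(failure_modes.items(),
                key=lambda kv: rpn(*kv[1]), reverse=True)
```

Re-scoring after corrective actions and recomputing the ranking mirrors the repeat-FMEA step the abstract describes.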

  1. Incorporating Classification Uncertainty in Competing-Risks Nest-Failure Analysis

    EPA Science Inventory

    Nesting birds risk nest failure due to many causes. Though partitioning risk of failure among causes has long been of interest to ornithologists, formal methods for estimating competing risk have been lacking.

  2. A meta-analysis on depression and subsequent cancer risk

    PubMed Central

    2007-01-01

    Background The authors tested the hypothesis that depression is a possible factor influencing the course of cancer by reviewing prospective epidemiological studies and calculating summary relative risks. Methods Studies were identified by computerized searches of Medline, Embase and PsycINFO, as well as manual searches of reference lists of selected publications. Inclusion criteria were cohort design, population-based sample, structured measurement of depression, and outcome of cancer known for depressed and non-depressed subjects. Results Thirteen eligible studies were identified. Based on eight studies with complete crude data on overall cancer, our summary relative risk (95% confidence interval) was 1.19 (1.06–1.32). After adjustment for confounders we pooled a summary relative risk of 1.12 (0.99–1.26). No significant association was found between depression and subsequent breast cancer risk, based on seven heterogeneous studies, with or without adjustment for possible confounders. Subgroup analysis of studies with a follow-up of ten years or more, however, resulted in a statistically significant summary relative risk of 2.50 (1.06–5.91). No significant associations were found for lung, colon or prostate cancer. Conclusion This review suggests a tendency towards a small and marginally significant association between depression and subsequent overall cancer risk, and towards a stronger increase in breast cancer risk emerging many years after a previous depression. PMID:18053168
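The pooling of study-level relative risks can be sketched with a standard inverse-variance fixed-effect calculation (this is a generic textbook method, not necessarily the authors' exact procedure; the two input triples below reuse numbers quoted in the abstract purely as an example):

```python
import math

def pool_fixed_effect(studies):
    """Inverse-variance fixed-effect pooling of relative risks.

    studies: iterable of (rr, lower_95, upper_95) per study.
    Returns (pooled_rr, lower_95, upper_95).
    """
    num = den = 0.0
    for rr, lo, hi in studies:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from the 95% CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

summary = pool_fixed_effect([(1.19, 1.06, 1.32), (1.12, 0.99, 1.26)])
```

Pooling is done on the log scale because log relative risks are approximately normally distributed, which is also why the standard error is recovered from the CI by dividing its log-width by 2 x 1.96.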

  3. Conceptual issues with risk analysis in Switzerland

    NASA Astrophysics Data System (ADS)

    Nicolet, Pierrick; Jaboyedoff, Michel; Lévy, Sébastien

    2015-04-01

    Risk analysis is a tricky procedure in which one can easily make mistakes. Indeed, although risk equations are rather general, transferring a methodology to another context or hazard type can often lead to inaccuracies or even significant errors. To illustrate this, common mistakes made with the Swiss methodology are presented, together with possible solutions. These include the following: Risk analysis for moving objects takes only the process dimension into account (e.g. the length of a road section potentially affected by a landslide), not the object dimension (e.g. the length of the cars). This is a fair simplification as long as the object dimension is considerably smaller than the process dimension. However, when the object is large compared to the process (e.g. rockfalls on a train), the results will be wrong. This problem can be illustrated by considering two blocks: according to this methodology, a 1 m diameter block would be twice as likely to reach a train as a 50 cm block, which is obviously not correct. In the literature on rockfall risk analysis for roads and railways, the block dimension is usually neglected in favour of the object dimension, which is a fair assumption in this context. However, it is possible to include both dimensions by using the sum of the two lengths instead of one of them. Risk analysis is usually performed using 3 different scenarios, for 3 different ranges of return periods, namely 1-30, 30-100 and 100-300 years. In order to be conservative, the operator commonly considers the magnitude of the worst event that occurs with a return period between the class bounds, which means that the operator evaluates the magnitude reached or exceeded with return periods of 30, 100 and 300 years respectively.
Then, since the magnitude corresponds to the upper bounds of the classes, risk is calculated using the frequency corresponding to these return periods and not to the middle of the class (and also subtracting the

  4. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
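The response surface idea, replacing the expensive debris-impact computation with a cheap fitted surrogate, can be sketched with a one-variable quadratic fit (the strike-probability values and the choice of a quadratic surrogate are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Synthetic "expensive model" outputs: strike probability vs. abort time (s)
abort_time = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 50.0])
p_strike   = np.array([0.02, 0.05, 0.11, 0.16, 0.12, 0.06])

# Fit a quadratic response surface to the sampled points,
# then evaluate it anywhere cheaply instead of re-running the full model
coeffs = np.polyfit(abort_time, p_strike, deg=2)
surface = np.poly1d(coeffs)
estimate = float(surface(25.0))
```

Once fitted, a surrogate like this can be evaluated thousands of times inside an overall ascent-abort risk model at negligible cost, which is the role the abstract describes.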

  5. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
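
The constrained alternating least-squares factorization at the heart of the method can be sketched as follows. The non-negativity constraint and the synthetic data are assumptions for illustration, and the patent's weighting/unweighting steps are omitted:

```python
import numpy as np

def als_factorization(D, n_components, n_iter=200, seed=0):
    """Constrained alternating least squares, D ≈ C @ S.T, with
    non-negativity on both factors (a common constraint in multivariate
    curve resolution; the patent's exact constraints may differ)."""
    rng = np.random.default_rng(seed)
    C = rng.random((D.shape[0], n_components))
    for _ in range(n_iter):
        # Solve D ≈ C S^T alternately for S and C, clipping negatives.
        S = np.linalg.lstsq(C, D, rcond=None)[0].T   # (channels, k)
        S = np.clip(S, 0, None)
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T  # (pixels, k)
        C = np.clip(C, 0, None)
    return C, S

# Two synthetic non-negative "spectra" mixed into a data matrix:
rng = np.random.default_rng(1)
S_true = rng.random((50, 2))           # spectral shapes
C_true = rng.random((100, 2))          # concentration intensities
D = C_true @ S_true.T
C, S = als_factorization(D, 2)
print(np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))  # relative residual
```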

  6. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E. )

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  7. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and .gamma.-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy .gamma. rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The .gamma.-ray portion of each spectrum is analyzed by a standard Ge .gamma.-ray analysis program. This method can be applied to any analysis involving x- and .gamma.-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the .gamma.-ray analysis and accommodated during the x-ray analysis.

  8. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and [gamma]-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy [gamma] rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The [gamma]-ray portion of each spectrum is analyzed by a standard Ge [gamma]-ray analysis program. This method can be applied to any analysis involving x- and [gamma]-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the [gamma]-ray analysis and accommodated during the x-ray analysis.

  9. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decision makers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms; for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how, as an opponent's decision behavior is observed, this approach allows learning about the validity of each of the rationality models used to predict his decisions, by computing the models' posterior probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. PMID:26133501
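
The Bayesian model averaging step over rival rationality models can be sketched as follows; the opponent models, the three-action space and the observed actions are invented for illustration:

```python
import numpy as np

# Three rival "rationality models", each giving a probability
# distribution over the opponent's 3 possible actions.
models = {
    "random":  np.array([1/3, 1/3, 1/3]),
    "nash":    np.array([0.6, 0.3, 0.1]),
    "level_k": np.array([0.1, 0.2, 0.7]),
}
prior = {name: 1/3 for name in models}   # uniform prior over models

def update(prior, observed_actions):
    """Bayes-update the model probabilities after each observed action."""
    post = dict(prior)
    for a in observed_actions:
        for name, p_act in models.items():
            post[name] *= p_act[a]
        z = sum(post.values())
        post = {name: v / z for name, v in post.items()}
    return post

post = update(prior, observed_actions=[2, 2, 0, 2])
# The predictive distribution for the next action averages over models:
predictive = sum(post[m] * models[m] for m in models)
print(post, predictive)
```

Because the opponent mostly played action 2, the posterior concentrates on the model that favors that action, which is exactly the "measure of validity" the abstract describes.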

  10. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  11. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals a number of shortcomings in this field that can give rise to various managerial risks. The article identifies and analyses some of them and gives brief guidance for their appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  12. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
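
A program of this kind evaluates the classical hazard integral: the annual rate of exceeding a ground-motion level, summed over magnitude contributions weighted by the attenuation scatter. A minimal sketch, with an illustrative attenuation function and Gutenberg-Richter parameters that are assumptions, not those of the actual program:

```python
import math

def gr_rate(m_lo, m_hi, a=4.0, b=1.0):
    """Gutenberg-Richter annual rate of events with m_lo <= M < m_hi."""
    return 10 ** (a - b * m_lo) - 10 ** (a - b * m_hi)

def attenuation_ln_median(m, r_km):
    """Illustrative attenuation: ln median PGA (g) vs magnitude, distance."""
    return -4.0 + 1.0 * m - 1.3 * math.log(r_km + 25.0)

def exceedance_rate(pga_g, r_km, sigma_ln=0.6, m_min=5.0, m_max=8.0, dm=0.1):
    """Annual rate of exceeding `pga_g` at distance `r_km` from the source."""
    lam, m = 0.0, m_min
    while m < m_max:
        rate = gr_rate(m, m + dm)
        # Probability of exceedance given lognormal ground-motion scatter:
        z = (math.log(pga_g) - attenuation_ln_median(m + dm / 2, r_km)) / sigma_ln
        lam += rate * 0.5 * math.erfc(z / math.sqrt(2))
        m += dm
    return lam

print(exceedance_rate(0.1, r_km=30.0))  # annual exceedance rate
```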

  13. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures have an associated set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, to understand the extent of the propagation of uncertainties throughout the process, from inundation studies through to risk analysis, and how much a proper flood risk analysis may vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), so as to evaluate the uncertainty propagation (UP) in design flood risk estimation, both in numerical and in cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to it, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Method application results show better robustness than traditional analysis
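
Of the methods mentioned, Monte Carlo is the simplest to sketch: sample every uncertain input and push the samples through the whole chain, so the spread of the final damage estimate reflects the propagated uncertainty. All sub-models and parameters below are toy assumptions:

```python
import numpy as np

# Monte Carlo uncertainty propagation through a simplified flood-risk
# chain (rainfall -> discharge -> depth -> damage). Each stage's model
# and parameters are illustrative only.
rng = np.random.default_rng(42)
n = 100_000

rainfall = rng.lognormal(mean=4.0, sigma=0.3, size=n)   # mm, gauge uncertainty
runoff_coef = rng.uniform(0.3, 0.6, size=n)             # hydrologic uncertainty
discharge = runoff_coef * rainfall                      # toy hydrology
depth = 0.05 * discharge ** 0.8                         # toy hydraulics, m
damage = np.clip(depth / 3.0, 0, 1) * 1e6               # toy damage curve, EUR

# The output is a distribution, not a single number:
print(np.mean(damage), np.percentile(damage, [5, 95]))
```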

  14. Forecasting method of national-level forest fire risk rating

    NASA Astrophysics Data System (ADS)

    Qin, Xian-lin; Zhang, Zi-hui; Li, Zeng-yuan; Yi, Hao-ruo

    2008-11-01

    The risk level of forest fire not only depends on weather, topography, human activities and socio-economic conditions, but is also closely related to the types, growth, moisture content, and quantity of forest fuel on the ground. Timely acquisition of information about the growth and moisture content of forest fuel, and about climate, for the whole country is critical to national-level forest fire risk forecasting. The development and application of remote sensing (RS), geographic information systems (GIS), databases, the internet, and other modern information technologies have provided important technical means for macro-regional forest fire risk forecasting. In this paper, quantified forecasting of national-level forest fire risk was studied using a Fuel State Index (FSI) and a Background Composite Index (BCI). The FSI was estimated using Moderate Resolution Imaging Spectroradiometer (MODIS) data. National meteorological data and other basic data on the distribution of fuel types and forest fire risk rating were standardized on the ArcGIS platform to calculate the BCI. The FSI and the BCI were used to calculate the Forest Fire Danger Index (FFDI), which is regarded as a quantitative indicator for national forest fire risk forecasting and forest fire risk rating, shifting from qualitative description to quantitative estimation. Major forest fires that occurred in recent years were taken as examples to validate the above method, and the results indicated that the method can be used for quantitative forecasting of national-level forest fire risks.
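
The abstract does not give the formula combining the two indices, but the idea of turning FSI and BCI into a quantitative FFDI and then a rating can be sketched as follows; the weights, the linear form and the class bounds are all assumptions:

```python
def fire_danger_index(fsi, bci, w_fuel=0.6, w_background=0.4):
    """Illustrative combination of a Fuel State Index and a Background
    Composite Index (both assumed normalized to [0, 1]) into a Forest
    Fire Danger Index; the weights and the linear form are assumptions,
    not the authors' formula."""
    return w_fuel * fsi + w_background * bci

def danger_rating(ffdi):
    # Hypothetical class breaks mapping the quantitative index to a rating.
    for bound, label in [(0.2, "low"), (0.4, "moderate"),
                         (0.6, "high"), (0.8, "very high")]:
        if ffdi < bound:
            return label
    return "extreme"

print(danger_rating(fire_danger_index(0.9, 0.7)))  # extreme
```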

  15. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, specific model-based applications and project results have been published based on a variety of approaches, parametrizations and calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established, based on simple value ranges and probabilities as well as fuzzy expressions and fractal metrics. As a specific approach, the work on DEM resolution-dependent 'slope spectra' is considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study, focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large-area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  16. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for the implementation of wind energy projects in Romania, which could have complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, together with the reasonable time periods that may arise. Renewable energy technologies face a number of constraints that delay the scaling-up of their production process, their transport process, their equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion places the analysis of our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  17. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighing all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of the different risks in coastal cities.
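
The AHP weighting step mentioned above derives indicator weights from the principal eigenvector of a pairwise-comparison matrix; the comparison values below are invented for five illustrative indicators:

```python
import numpy as np

# Invented reciprocal pairwise-comparison matrix for 5 indicators
# (A[i, j] = how much more important indicator i is than indicator j).
A = np.array([
    [1,   3,   2,   4,   5],
    [1/3, 1,   1/2, 2,   3],
    [1/2, 2,   1,   3,   3],
    [1/4, 1/2, 1/3, 1,   2],
    [1/5, 1/3, 1/3, 1/2, 1],
], dtype=float)

# Principal eigenvector gives the weights; the eigenvalue gives a
# consistency check on the judgments.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # normalized weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)
cr = ci / 1.12                    # 1.12 = random index for n = 5
print(w, cr)                      # CR < 0.1 means acceptable consistency
```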

  18. System Analysis and Risk Assessment System.

    Energy Science and Technology Software Center (ESTSC)

    2000-11-20

    Version 00 SARA4.16 is a program that allows the user to review the results of a Probabilistic Risk Assessment (PRA) and to perform limited sensitivity analysis on these results. This tool is intended for a less technically oriented user and does not require the level of understanding of PRA concepts required by a full PRA analysis tool. With this program a user can review the information generated by a PRA analyst and compare the results to those generated by making limited modifications to the data in the PRA. Also included in this program is the ability to graphically display the information stored in the database. This information includes event trees, fault trees, P&IDs and uncertainty distributions. SARA 4.16 is incorporated in the SAPHIRE 5.0 code package.

  19. [Risk analysis in radiation therapy: state of the art].

    PubMed

    Mazeron, R; Aguini, N; Deutsch, É

    2013-01-01

    Five radiotherapy accidents, two of which were serial, occurred in France from 2003 to 2007 and led the authorities to establish a roadmap for securing radiotherapy. By analogy with industrial processes, a 2008 technical decision from the French Nuclear Safety Authority requires radiotherapy professionals to conduct analyses of risks to patients. The process of risk analysis had been tested in three pilot centers, before the occurrence of the accidents, with the creation of experience feedback units. The regulation now requires all radiotherapy departments to have similar structures to collect precursor events, incidents and accidents, to perform analyses following rigorous methods, and to initiate corrective actions. At the same time, departments are also required to conduct a priori analyses, which are less intuitive and usually require the help of a quality engineer, with the aim of reducing risk. The progressive implementation of these devices is part of an overall policy to improve the quality of radiotherapy. Since 2007, no radiotherapy accident has been reported. PMID:23787020

  20. Project 6: Cumulative Risk Assessment (CRA) Methods and Applications

    EPA Science Inventory

    Project 6: CRA Methods and Applications addresses the need to move beyond traditional risk assessment practices by developing CRA methods to integrate and evaluate impacts of chemical and nonchemical stressors on the environment and human health. Project 6 has three specific obje...

  1. Best self visualization method with high-risk youth.

    PubMed

    Schussel, Lorne; Miller, Lisa

    2013-08-01

    The healing process of the Best Self Visualization Method (BSM) is described within the framework of meditation, neuroscience, and psychodynamic theory. Cases are drawn from the treatment of high-risk youth, who have histories of poverty, survival of sexual and physical abuse, and/or current risk for perpetrating abuse. Clinical use of BSM is demonstrated in two case illustrations, one of group psychotherapy and another of individual therapy. PMID:23775428

  2. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods

    PubMed Central

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-01-01

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large lesions or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems to be more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent among these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive methods including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, high-risk cases for post-ESD bleeding, which include cases with the use of antithrombotic agents or which require large resection, are increasing. Although there have been many reports about associated risk factors and methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we provide an overview of the risk factors and the methods of preventing post-ESD bleeding reported in previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  3. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods.

    PubMed

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-07-14

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large lesions or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems to be more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent among these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive methods including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, high-risk cases for post-ESD bleeding, which include cases with the use of antithrombotic agents or which require large resection, are increasing. Although there have been many reports about associated risk factors and methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we provide an overview of the risk factors and the methods of preventing post-ESD bleeding reported in previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  4. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  5. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  6. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  7. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  8. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  9. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  10. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  11. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  12. Code System to Calculate Integrated Reliability and Risk Analysis.

    Energy Science and Technology Software Center (ESTSC)

    2002-02-18

    Version 04. IRRAS Version 4.16, the latest in the stand-alone IRRAS series (2.0, 2.5, 4.0, 4.15), is a program developed to perform the functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA). This program includes functions that allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included are features that allow the analyst to generate reports and displays documenting the results of an analysis. Since this software is a very detailed technical tool, the user should be familiar with PRA concepts and the methods used to perform these analyses. Be sure to review the PSR-405/SAPHIRE 7.06 package, released in January 2000, which includes four programs: the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors.
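The cut-set quantification such tools automate can be illustrated with a minimal sketch. The two-train fault tree, gate layout, and component names below are hypothetical, and real PRA codes like IRRAS use far more efficient algorithms than this brute-force boolean expansion.

```python
def minimize(cutsets):
    """Keep only minimal cut sets: drop any that is a superset of another."""
    return {c for c in cutsets if not any(o < c for o in cutsets)}

def basic(name):
    """A basic event contributes a single one-element cut set."""
    return {frozenset([name])}

def or_gate(*inputs):
    """OR gate: union of the input cut-set collections."""
    out = set()
    for cs in inputs:
        out |= cs
    return minimize(out)

def and_gate(*inputs):
    """AND gate: cartesian combination of the input cut sets."""
    out = {frozenset()}
    for cs in inputs:
        out = {a | b for a in out for b in cs}
    return minimize(out)

# Hypothetical two-train system: the top event occurs if both trains fail;
# component A is shared between trains (a common-cause contributor).
train1 = or_gate(basic("A"), basic("B"))
train2 = or_gate(basic("A"), basic("C"))
top = and_gate(train1, train2)
print(sorted(sorted(c) for c in top))  # [['A'], ['B', 'C']]
```

The shared component A surfaces as a single-element minimal cut set, which is exactly the kind of insight cut-set analysis is meant to expose.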

  13. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiographic method is an informational source for characterizing the functional state of the heart. Electrocardiogram parameters form an integrated map of many component characteristics of the cardiac system and depend on the disturbance behavior of each element. In this work we attempt to construct a skeleton diagram of perturbations of the cardiac system by describing its basic components and the connections between them through transition functions, written as first- and second-order differential equations, in order to build up and analyze the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose our own coordinate system attached to the heart. A comparative analysis of electrocardiograms was conducted using the correlation method.
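A minimal sketch of the correlation comparison the abstract describes: a normalized (Pearson) correlation between two equal-length signals. The sample waveforms are hypothetical stand-ins, not actual ECG leads.

```python
import math

def normalized_correlation(x, y):
    """Pearson correlation coefficient between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical sampled waveforms: y is a scaled copy of x, z is distorted
x = [0.0, 0.3, 1.0, 0.4, 0.1, -0.2, 0.0, 0.2]
y = [2 * v for v in x]   # identical morphology -> correlation 1.0
z = [v ** 2 for v in x]  # altered morphology -> lower correlation
print(round(normalized_correlation(x, y), 3))  # 1.0
print(normalized_correlation(x, z) < 1.0)      # True
```

Because the measure is amplitude-invariant, it separates changes in waveform shape from simple gain differences between recordings.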

  14. OVERVIEW: MYCOTOXIN RISK ANALYSIS IN THE USA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of the risk assessment process is to provide the basis for decisions that will allow successful management of the risk. Poor risk management decisions that result from inadequate risk assessment can be enormously costly, both economically and in terms of human or animal health. Accco...

  15. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system, and computer readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
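A sketch of the event-tree side of such an analysis: each accident sequence's frequency is the initiating-event frequency multiplied by the branch probabilities along its path. The tree shape and numbers below are hypothetical; in DEFT the pivot-node probabilities would come from solving dynamic fault trees rather than being fixed constants.

```python
def sequence_probability(initiator_freq, branch_probs):
    """Frequency of one accident sequence: initiating-event frequency
    times each branch probability along the path through the event tree."""
    p = initiator_freq
    for b in branch_probs:
        p *= b
    return p

# Hypothetical event tree: initiating event IE, then two pivot systems.
ie_freq = 1e-2       # occurrences per year
p_sys1_fail = 1e-3   # would come from a (dynamic) fault tree solution
p_sys2_fail = 5e-2

sequences = {
    "S1: both succeed": sequence_probability(ie_freq, [1 - p_sys1_fail, 1 - p_sys2_fail]),
    "S2: sys2 fails":   sequence_probability(ie_freq, [1 - p_sys1_fail, p_sys2_fail]),
    "S3: sys1 fails":   sequence_probability(ie_freq, [p_sys1_fail]),
}
total = sum(sequences.values())
print(abs(total - ie_freq) < 1e-12)  # True: the sequences partition the initiator
```

The partition check at the end is a useful sanity test in any event-tree quantification: sequence frequencies must sum back to the initiating-event frequency.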

  16. Working session 5: Operational aspects and risk analysis

    SciTech Connect

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  17. [Irrationality and risk--a problem analysis].

    PubMed

    Bergler, R

    1995-04-01

    The way one experiences risks and one's risk behaviour is not centrally the result of rational considerations and decisions. Contradictions and incompatibilities can be proved empirically: (1) General paranoiac hysterical fear of risks, (2) complete repression of health risks, (3) unsureness and inability in dealing with risks, (4) prejudice against risks, (5) admiring risk behaviour, (6) seeking dangerous ways to test one's own limits, (7) readiness for aggression by not subjectively being able to control the risk, (8) naively assessing a risk positively thus intensifying pleasurable sensation, (9) increased preparedness to take risks in groups, (10) increased preparedness to take risk in anonymity, (11) accepting risks as a creative challenge for achievement motivation, (12) loss of innovation when avoiding risks. The awareness and assessment of risk factors is dependent on (1) personality factors (achievement-motivated persons experience risk as a creative challenge; risk behaviour being a variation in stimulus and exploration of one's environment), (2) the social back-ground (booster function), (3) the culture-related assessment of being able to control risks and (4) age (youthful risk behaviour being a way of coping with developmental exercises, e.g. purposely breaking parent's rules, provocatively demonstrating adult behaviour, a means to solve frustrating performance failures, advantages through social acceptance). The way one copes with risk factors is based on subjective estimation of risk; this assessment is made anew for each area of behaviour according to the specific situation. Making a decision and acting upon it is the result of simultaneously assessing the possible psychological benefit-cost factors of taking risks. The extent, to which the threat of risk factors is felt, depends on the importance one attaches to certain needs and benefits of quality of life and, in addition to this, what the subjective probabilities are that one is likely to be

  18. Corticosteroids and Pediatric Septic Shock Outcomes: A Risk Stratified Analysis

    PubMed Central

    Atkinson, Sarah J.; Cvijanovich, Natalie Z.; Thomas, Neal J.; Allen, Geoffrey L.; Anas, Nick; Bigham, Michael T.; Hall, Mark; Freishtat, Robert J.; Sen, Anita; Meyer, Keith; Checchia, Paul A.; Shanley, Thomas P.; Nowak, Jeffrey; Quasney, Michael; Weiss, Scott L.; Banschbach, Sharon; Beckman, Eileen; Howard, Kelli; Frank, Erin; Harmon, Kelli; Lahni, Patrick; Lindsell, Christopher J.; Wong, Hector R.

    2014-01-01

    Background The potential benefits of corticosteroids for septic shock may depend on initial mortality risk. Objective We determined associations between corticosteroids and outcomes in children with septic shock who were stratified by initial mortality risk. Methods We conducted a retrospective analysis of an ongoing, multi-center pediatric septic shock clinical and biological database. Using a validated biomarker-based stratification tool (PERSEVERE), 496 subjects were stratified into three initial mortality risk strata (low, intermediate, and high). Subjects receiving corticosteroids during the initial 7 days of admission (n = 252) were compared to subjects who did not receive corticosteroids (n = 244). Logistic regression was used to model the effects of corticosteroids on 28-day mortality and complicated course, defined as death within 28 days or persistence of two or more organ failures at 7 days. Results Subjects who received corticosteroids had greater organ failure burden, higher illness severity, higher mortality, and a greater requirement for vasoactive medications, compared to subjects who did not receive corticosteroids. PERSEVERE-based mortality risk did not differ between the two groups. For the entire cohort, corticosteroids were associated with increased risk of mortality (OR 2.3, 95% CI 1.3–4.0, p = 0.004) and a complicated course (OR 1.7, 95% CI 1.1–2.5, p = 0.012). Within each PERSEVERE-based stratum, corticosteroid administration was not associated with improved outcomes. Similarly, corticosteroid administration was not associated with improved outcomes among patients with no comorbidities, nor in groups of patients stratified by PRISM. Conclusions Risk stratified analysis failed to demonstrate any benefit from corticosteroids in this pediatric septic shock cohort. PMID:25386653
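For readers unfamiliar with the odds ratios reported above, here is a minimal sketch of how an OR and its Wald (Woolf) confidence interval are computed from a 2x2 table. The counts are hypothetical, not the study's data, and the study itself used multivariable logistic regression rather than a raw 2x2 table.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% Wald (Woolf) confidence interval from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    oratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(oratio) - z * se)
    hi = math.exp(math.log(oratio) + z * se)
    return oratio, lo, hi

# Hypothetical counts: deaths among treated vs. untreated subjects
o, lo, hi = odds_ratio_ci(40, 212, 20, 224)
print(round(o, 2))  # 2.11
print(round(lo, 2), round(hi, 2))
```

An interval that excludes 1.0, as here, is what licenses statements like "associated with increased risk of mortality" in the abstract.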

  19. Pressure Systems Stored-Energy Threshold Risk Analysis

    SciTech Connect

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective February 2007, brought to light potential weaknesses regarding the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based upon pressure or any other criteria. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as an appropriate criterion to establish a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.
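One way to put such a stored-energy threshold into practice is sketched below using the Brode equation for a compressed gas, E = (P1 - P0) * V / (gamma - 1). Whether this is the specific formula PNNL adopted is not stated in the abstract, so treat the formula choice and the example vessel as assumptions.

```python
def brode_stored_energy_lbf_ft(gauge_pressure_psi, volume_ft3, gamma=1.4):
    """Stored energy of a compressed-gas volume via the Brode equation,
    E = (P1 - P0) * V / (gamma - 1), returned in lbf-ft.
    gauge_pressure_psi is (P1 - P0) in psi; 1 psi = 144 lbf/ft^2."""
    return gauge_pressure_psi * 144.0 * volume_ft3 / (gamma - 1.0)

THRESHOLD_LBF_FT = 1000.0  # the PNNL hazard threshold discussed above

# Hypothetical example: a 1 ft^3 vessel holding air at 10 psig
energy = brode_stored_energy_lbf_ft(10.0, 1.0)
print(round(energy))               # 3600
print(energy > THRESHOLD_LBF_FT)   # True -> treat as a pressure hazard
```

Note how quickly modest pressures exceed a 1000 lbf-ft threshold; this is why a stored-energy criterion, rather than pressure alone, is attractive for screening.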

  20. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Center (JRC) in the frame of its institutional research activities. The scope of the GHSL is to develop, test, and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications supporting global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geospatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observation on SB-04 Global Urban Observation and Information, with various international partners, and with World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  1. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups. PMID:23160540

  2. Movement recognition technology as a method of assessing spontaneous general movements in high risk infants.

    PubMed

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D; Trenell, Michael; Plötz, Thomas

    2014-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954
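A toy sketch of the sensor-data side of such work: simple per-window features (mean, standard deviation, and mean-crossing rate as a crude frequency proxy) from one accelerometer axis. The synthetic 3 Hz signal and this feature set are illustrative assumptions; real movement-recognition systems use far richer features and learned classifiers.

```python
import math

def movement_features(signal, fs=50.0):
    """Per-window summary features from one accelerometer axis:
    mean, standard deviation, and mean-crossing rate (crossings per second)."""
    n = len(signal)
    mean = sum(signal) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in signal) / n)
    crossings = sum(1 for a, b in zip(signal, signal[1:])
                    if (a - mean) * (b - mean) < 0)
    return {"mean": mean, "sd": sd, "crossings_per_s": crossings * fs / n}

# Synthetic 2-second window: a 0.5 g, 3 Hz oscillation standing in for limb movement
fs = 50.0
signal = [0.5 * math.sin(2 * math.pi * 3 * (i / fs) + 0.1)
          for i in range(int(2 * fs))]
feats = movement_features(signal, fs)
print(round(feats["sd"], 3))      # amplitude / sqrt(2) for a sine: 0.354
print(feats["crossings_per_s"])   # roughly twice the 3 Hz frequency
```

Features like these, computed over sliding windows, are what a downstream classifier would consume when flagging atypical general movements.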

  3. Movement Recognition Technology as a Method of Assessing Spontaneous General Movements in High Risk Infants

    PubMed Central

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D.; Trenell, Michael; Plötz, Thomas

    2015-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954

  4. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  5. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering
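The cost-risk-curve idea can be sketched as a small Monte Carlo: each WBS element gets a (low, most-likely, high) triangular cost distribution, and percentiles of the summed draws form the risk curve. The element values and the triangular choice are assumptions for illustration, not the LaRC method itself.

```python
import random

def triangular_cost_risk(elements, n=100_000, seed=1):
    """Monte Carlo cost-risk curve: each element's cost is triangular
    (low, most-likely, high); returns selected percentiles of total cost."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in elements)
        for _ in range(n)
    )
    return {p: totals[int(p / 100 * (n - 1))] for p in (10, 50, 70, 90)}

# Hypothetical WBS elements (low, most-likely, high) in $M
wbs = [(1.0, 1.5, 3.0), (0.5, 0.8, 2.0), (2.0, 2.5, 4.0)]
curve = triangular_cost_risk(wbs)
print({p: round(v, 2) for p, v in curve.items()})
```

Reading the output as "there is a 70% chance the project costs no more than the 70th-percentile value" is exactly the probability-level framing the abstract describes, without assuming a closed-form shape for the total-cost distribution.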

  6. Methods for Cancer Epigenome Analysis

    PubMed Central

    Nagarajan, Raman P.; Fouse, Shaun D.; Bell, Robert J.A.; Costello, Joseph F.

    2014-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, including a strong connection between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of abundant 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tissues. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is a largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in epigenetic marks in tumor etiology, drug response and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable the accuracy and interpretation of the data from the profiling methods. PMID:22956508
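As a small concrete example of the quantitative DNA-methylation profiling mentioned, the sketch below computes per-site methylation levels (beta values) from methylated/unmethylated counts. The site names and counts are hypothetical, and the offset convention is borrowed from array processing rather than being specific to any method named above.

```python
def beta_value(methylated, unmethylated, offset=100):
    """Methylation level at a CpG site from methylated/unmethylated counts:
    beta = M / (M + U + offset), bounded in [0, 1). The offset stabilizes
    low-signal sites (a convention from array processing); for deep
    sequencing counts it is often set to 0."""
    return methylated / (methylated + unmethylated + offset)

# Hypothetical read counts at three CpG sites (methylated, unmethylated)
sites = [("cg001", 180, 20), ("cg002", 5, 95), ("cg003", 50, 50)]
for name, m, u in sites:
    print(name, round(beta_value(m, u, offset=0), 2))
# cg001 0.9   cg002 0.05   cg003 0.5
```

Genome-wide, this per-site statistic is what tumor-versus-normal comparisons of hyper- and hypomethylation are built on.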

  7. Structural reliability analysis and seismic risk assessment

    SciTech Connect

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for the safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability over the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of a stationary Gaussian process with zero mean and a Kanai-Tajimi spectrum. All possible seismic hazards at a site, represented by a hazard curve, are also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. Using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load, and ground earthquake acceleration are presented, and a fragility curve for PRA studies is also constructed.
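The fragility-times-hazard computation described can be sketched as follows. The lognormal fragility form is standard in seismic PRA, but the discretized hazard curve and capacity parameters below are illustrative assumptions, not the paper's containment results.

```python
import math

def lognormal_fragility(a, median_capacity, beta):
    """Conditional limit-state probability at peak ground acceleration a:
    P_f(a) = Phi(ln(a / A_m) / beta), a lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(a / median_capacity) /
                                 (beta * math.sqrt(2.0))))

# Hypothetical hazard curve: annual exceedance frequency at discrete PGA levels
pga = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]                 # g
exceed_freq = [1e-2, 2e-3, 6e-4, 2e-4, 8e-5, 3e-5]   # per year

# Annual limit-state frequency: convolve the fragility with hazard increments
Am, beta = 0.9, 0.35   # hypothetical median capacity (g) and log-standard deviation
freq = 0.0
for i in range(len(pga) - 1):
    occ = exceed_freq[i] - exceed_freq[i + 1]   # frequency of PGA in this bin
    mid = 0.5 * (pga[i] + pga[i + 1])
    freq += occ * lognormal_fragility(mid, Am, beta)
print(f"{freq:.2e} per year")
```

At the median capacity the fragility is exactly 0.5 by construction, which makes a convenient spot check when validating an implementation.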

  8. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be influenced by failures or operational mistakes of the DMCS. To avoid such situations, the DMCS must undergo sufficient risk assessment and appropriate precautions must be taken. We propose an operational RM method using FMEA for DMCS. To develop the method, we gathered and compared FMEA results for DMCS and compiled a list of failure modes, failures, and countermeasures. Applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that had not previously been identified.
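A minimal sketch of the FMEA bookkeeping such a method builds on: ranking failure modes by the classic Risk Priority Number, RPN = severity x occurrence x detection. The failure modes and ratings below are hypothetical, and the paper's own failure-mode/countermeasure list is richer than this.

```python
def risk_priority_number(severity, occurrence, detection):
    """Classic FMEA metric: RPN = S * O * D, each typically rated 1-10."""
    return severity * occurrence * detection

# Hypothetical failure modes for a drug-manufacturing computerized system,
# each with (severity, occurrence, detection) ratings
failure_modes = [
    ("batch record not saved",       9, 3, 2),
    ("sensor calibration drift",     6, 5, 6),
    ("operator enters wrong recipe", 8, 4, 3),
]
ranked = sorted(failure_modes,
                key=lambda fm: risk_priority_number(*fm[1:]),
                reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {risk_priority_number(s, o, d):3d}  {name}")
```

Ranking by RPN is what lets countermeasure effort be spent first on the modes where severity, likelihood, and poor detectability compound.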

  9. INTERIM REPORT IMPROVED METHODS FOR INCORPORATING RISK IN DECISION MAKING

    SciTech Connect

    Clausen, M. J.; Fraley, D. W.; Denning, R. S.

    1980-08-01

    This paper reports observations and preliminary investigations in the first phase of a research program covering methodologies for making safety-related decisions. The objective has been to gain insight into NRC perceptions of the value of formal decision methods, their possible applications, and how risk is, or may be, incorporated in decision making. The perception of formal decision making techniques, held by various decision makers, and what may be done to improve them, were explored through interviews with NRC staff. An initial survey of decision making methods, an assessment of the applicability of formal methods vis-a-vis the available information, and a review of methods of incorporating risk and uncertainty have also been conducted.

  10. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Limitation of risk: Protective methods. 223.11 Section 223.11 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) FISCAL SERVICE, DEPARTMENT OF THE TREASURY FINANCIAL MANAGEMENT SERVICE SURETY COMPANIES DOING BUSINESS WITH THE UNITED STATES §...