Science.gov

Sample records for risk analysis method

  1. Common Methods for Security Risk Analysis

    DTIC Science & Technology

    2007-11-02

Workshops was particularly influential among Canadian tool-designers in the late 1980s. These models generally favour a software tool solution simply...tools that have too small a market to justify extensive software development. Also, most of the risk management standards that came out at this...companies developing specialized risk analysis tools, such as the Vulcanizer project of DOMUS Software Inc. The latter incorporated fuzzy logic to

  2. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

Much attention has been paid to quantitative risk analysis (QRA) research in recent years, owing to the increasingly severe disasters that have occurred in the process industries. Because of the computational complexity, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, such as the individual risk contour in SAFETI, is mainly based on consequence analysis results from dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations; this is quite different from the real situation in a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. A more rigorous model, such as a computational fluid dynamics (CFD) model, can overcome these limitations, but it cannot by itself resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. Such a technique need not be limited to risk analysis at ground level; it may also be extended to aerial, submarine, or space risk analyses in the near future.

  3. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risks in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-02-01

This paper discusses a new method developed to analyse flood risks in river deltas. Risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding, and because the effect of upstream breaches on downstream water levels and flood risks must be taken into account. A Monte Carlo-based flood risk analysis framework for policy making was developed, which considers both storm surges and river flood waves and includes hydrodynamic interaction effects on flood risks. It was applied to analyse societal flood fatality risks (the probability of events with more than N fatalities) in the Rhine-Meuse delta.
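
The societal risk metric used in this framework, the annual probability of events with more than N fatalities, is essentially an FN curve read off Monte Carlo output. A minimal Python sketch with a purely illustrative event model (the flood probability and the lognormal fatality distribution are assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stylized model: in each simulated year a flood occurs with
# some probability; fatalities per flood are drawn from a heavy-tailed
# distribution.  All parameters are illustrative.
n_years = 100_000
flood_prob = 0.01                                   # annual flood probability
floods = rng.random(n_years) < flood_prob
fatalities = np.where(
    floods, rng.lognormal(mean=3.0, sigma=1.5, size=n_years), 0.0)

def exceedance_prob(fatalities, n):
    """Annual probability of an event with more than n fatalities (one FN-curve point)."""
    return np.mean(fatalities > n)

for n in (10, 100, 1000):
    print(f"P(N > {n:4d}) per year ~ {exceedance_prob(fatalities, n):.2e}")
```

Plotting these points on log-log axes against n gives the familiar FN curve used to judge societal risk criteria.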

  4. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risk in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-10-01

    This paper discusses a new method for flood risk assessment in river deltas. Flood risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding and the effect of upstream breaches on downstream water levels and flood risk must be taken into account. This paper presents a Monte Carlo-based flood risk analysis framework for policy making, which considers both storm surges and river flood waves and includes effects from hydrodynamic interaction on flood risk. It was applied to analyse societal flood fatality risk in the Rhine-Meuse delta.

  5. [Discussion on the building of post market risk analysis method in hemodialysis device].

    PubMed

    Xu, Honglei; Peng, Xiaolong; Tian, Xiaojun; Wang, Peilian

    2014-09-01

This paper discusses building a post-market risk analysis method for hemodialysis devices from the standpoint of government supervision. It proposes practical research methods for post-market risk identification and estimation for hemodialysis devices, provides technical guidance to help the government put risk management of hemodialysis devices into effect, and offers a reference for enterprises carrying out post-market risk evaluation of their own products.

  6. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  7. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  8. Review of Research Trends and Methods in Nano Environmental, Health, and Safety Risk Analysis.

    PubMed

    Erbis, Serkan; Ok, Zeynep; Isaacs, Jacqueline A; Benneyan, James C; Kamarthi, Sagar

    2016-08-01

Despite the many touted benefits of nanomaterials, concerns remain about their possible environmental, health, and safety (EHS) risks in terms of toxicity, long-term accumulation effects, and dose-response relationships. Published studies on the EHS risks of nanomaterials have increased significantly over the past decade and a half, with most focused on nanotoxicology. Researchers are still learning about the health consequences of nanomaterials and how to make environmentally responsible decisions regarding their production. This article characterizes the scientific literature on nano-EHS risk analysis to map state-of-the-art developments in the field and chart guidance for future directions. First, keyword co-occurrence networks in the nano-EHS literature published over the past decade are analyzed to identify intellectual turning points and research trends in nano-risk analysis studies. The exposure groups targeted in emerging nano-EHS studies are also assessed. System engineering methods for risk, safety, uncertainty, and system reliability analysis are then reviewed, followed by detailed descriptions of how these methods are applied to analyze nanomaterial EHS risks. Finally, the trends, methods, future directions, and opportunities for system engineering methods in nano-EHS research are discussed. The analysis of the nano-EHS literature presented in this article provides important insights into the risk assessment and risk management tools associated with nanotechnology, nanomanufacturing, and nano-enabled products.

  9. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    SciTech Connect

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.; Browitt, Robert D.

    2007-09-30

RAMPART™ (Risk Assessment Method-Property Analysis and Ranking Tool) is a new type of computer software package for assessing risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). It is designed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk a given building faces from certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado, and flood), crime (inside and outside the building), fire, and terrorism are calculated. These hazards may cause losses of various kinds: RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on screen and in a written report.

  10. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing, and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), obtained by multiplying crisp values of the risk factors: the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal, or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O, and D. A new risk assessment system based on fuzzy set theory and fuzzy rule-base theory is applied to assess and rank the risks associated with failure modes that could appear in the operation of a Turn 55 CNC lathe. Two case studies demonstrate the methodology, drawing a parallel between the RPNs obtained by the traditional method and by fuzzy logic. The results show that the proposed approach can reduce duplicate RPN values and yield a more accurate, reasonable risk assessment, helping assure the stability of the product and process.
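
The crisp RPN calculation the paper criticizes, and one simple fuzzy alternative, can be sketched in a few lines. The triangular-number arithmetic and centroid defuzzification used here are a common textbook simplification, not necessarily the exact rule base of the study, and all ratings are hypothetical:

```python
import numpy as np

def crisp_rpn(severity, occurrence, detection):
    """Traditional FMEA risk priority number: RPN = S * O * D (each rated 1-10)."""
    return severity * occurrence * detection

def tri_fuzzy_rpn(s, o, d):
    """Fuzzy RPN from triangular numbers (a, b, c): multiply the three factors
    vertex-wise (a common approximation of triangular-number multiplication)
    and defuzzify by the centroid of the resulting triangle."""
    rpn = tuple(np.prod(x) for x in zip(s, o, d))
    return sum(rpn) / 3.0  # centroid defuzzification

# Two hypothetical failure modes of a lathe spindle (illustrative ratings).
print(crisp_rpn(8, 3, 4))   # 96
print(crisp_rpn(6, 4, 4))   # 96  <- duplicate RPN: the crisp method cannot rank them
print(tri_fuzzy_rpn((7, 8, 9), (2, 3, 4), (3, 4, 5)))
print(tri_fuzzy_rpn((5, 6, 7), (3, 4, 5), (3, 4, 5)))
```

The two fuzzy scores differ even though the crisp RPNs tie, which is the duplicate-RPN problem the abstract refers to.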

  11. Genotype relative risks: Methods for design and analysis of candidate-gene association studies

    SciTech Connect

Schaid, D.J.; Sommer, S.S.

    1993-11-01

Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls that result from different ethnic origins and population stratification in the two groups. The authors present analysis methods based on two genotype relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without it and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without it. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested: a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method that conditions on parental genotype data when Hardy-Weinberg equilibrium does not hold. Results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. Tier 1 detects the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. Tier 2 tests for association of the abnormal variant with disease, such as by the likelihood methods presented. Tier 3 confirms positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive rather than dominant and that, for recessive disease and large relative risks, the necessary sample sizes may be feasible. 19 refs., 2 figs., 2 tabs.
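
Under Hardy-Weinberg equilibrium, the two genotype relative risks can be illustrated with a simple enrichment estimate: each genotype's frequency among cases relative to its expected population frequency, normalized to the zero-copy genotype. This is a sketch of the idea only, not the authors' likelihood machinery, and the counts and allele frequency are hypothetical:

```python
def genotype_relative_risks(case_counts, allele_freq):
    """Estimate genotype relative risks psi1 (one copy) and psi2 (two copies)
    versus zero copies, assuming Hardy-Weinberg genotype frequencies in the
    source population.  case_counts = (n0, n1, n2) cases carrying 0/1/2 copies."""
    p = allele_freq
    hw = ((1 - p) ** 2, 2 * p * (1 - p), p ** 2)   # expected genotype frequencies
    n0, n1, n2 = case_counts
    # Relative risk is proportional to the enrichment of each genotype in cases.
    baseline = n0 / hw[0]
    psi1 = (n1 / hw[1]) / baseline
    psi2 = (n2 / hw[2]) / baseline
    return psi1, psi2

# Hypothetical data: 100 cases, candidate allele frequency 0.3.
psi1, psi2 = genotype_relative_risks((30, 50, 20), 0.3)
print(f"psi1 = {psi1:.2f}, psi2 = {psi2:.2f}")
```

A pattern with psi2 markedly above psi1 (as here) points away from pure dominant inheritance, which is the kind of hypothesis the likelihood tests in the paper formalize.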

  12. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  13. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective for risk management and policy analysis, particularly when the inputs (such as the allowable pollutant discharge and the pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted recourse measure that controls the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as the major indicators for identifying water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk-measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but a low percentage of the overall net benefit, so it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies that trade off socioeconomic development against environmental sustainability.

  14. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

According to the regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to help the company meet rules and regulations and to assess and describe environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life, and in addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  15. Uncertainty analysis in regulatory programs: Application factors versus probabilistic methods in ecological risk assessments of chemicals

    SciTech Connect

    Moore, D.R.J.; Elliot, B.

    1995-12-31

In assessments of toxic chemicals, sources of uncertainty may be dealt with by two basic approaches: application factors and probabilistic methods. In regulatory programs, the most common approach is to calculate a quotient by dividing the predicted environmental concentration (PEC) by the predicted no-effects concentration (PNEC). PNECs are usually derived from laboratory bioassays, thus requiring the use of application factors to account for the uncertainty introduced by the extrapolation from the laboratory to the field and from measurement to assessment endpoints. Using this approach, often with worst-case assumptions about exposure and species sensitivities, the hope is that chemicals with a quotient of less than one will have a very low probability of causing adverse ecological effects. This approach has received widespread criticism recently, particularly because it tends to be overly conservative and does not adequately estimate the magnitude and probability of adverse effects. On the plus side, application factors are simple to use, accepted worldwide, and may be used with limited effects data in a quotient calculation. The alternative approach is to use probabilistic methods such as Monte Carlo simulation, Bayes' theorem, or other techniques to estimate risk. Such methods often have rigorous statistical assumptions and may have large data requirements. Stating an effect in probabilistic terms, however, forces the identification of sources of uncertainty and the quantification of their impact on risk estimation. In this presentation the authors discuss the advantages and disadvantages of using application factors and probabilistic methods to deal with uncertainty in ecological risk assessments of chemicals. Based on this analysis, recommendations are presented to assist in choosing the appropriate approach for different types of regulatory programs dealing with toxic chemicals.
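
The contrast between the two approaches can be made concrete in a few lines: the quotient method reduces everything to a single PEC/PNEC ratio, while the probabilistic method propagates distributions and reports an exceedance probability directly. All concentrations and distribution parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Quotient approach: point estimates plus an application factor.
pec = 2.0                    # predicted environmental concentration (ug/L)
lab_noec = 50.0              # laboratory no-effect concentration (ug/L)
application_factor = 10.0    # lab-to-field extrapolation
pnec = lab_noec / application_factor
quotient = pec / pnec
print(f"PEC/PNEC quotient = {quotient:.2f}")   # < 1 -> presumed acceptable

# Probabilistic approach: treat both exposure and effect threshold as
# uncertain and estimate P(exposure exceeds effect) by Monte Carlo.
exposure = rng.lognormal(np.log(pec), 0.8, size=100_000)
effect = rng.lognormal(np.log(pnec), 0.5, size=100_000)
risk = np.mean(exposure > effect)
print(f"P(exposure > effect) ~ {risk:.3f}")
```

Note how a quotient comfortably below one can still correspond to a non-trivial exceedance probability once the spread of both distributions is accounted for.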

  16. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making.

  17. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

Combined effects of socio-economic, environmental, technological, and political factors affect fresh cooling water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic change. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from a thermoelectric systems operation and regulation perspective. A systematic approach to characterizing a stream environment in relation to extreme drought occurrence, duration, and deficit volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water-supply risk analysis and projection methods from a thermoelectric power systems operation and management perspective. The study defines thermoelectric drought as a shortage of cooling water, due to stressed supply or water temperatures beyond operable limits for an extended period of time, that requires power plants to reduce production or shut down completely. It presents a thermoelectric drought risk characterization framework that considers the heat content and water quantity facets of adequate water availability for uninterrupted operation of such plants and the safety of their surroundings. In addition, it outlines mechanisms to identify the rate of occurrence of such droughts and to stochastically quantify the subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound nonhomogeneous Poisson process. This study also demonstrates how
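
A compound nonhomogeneous Poisson process of the kind mentioned in the abstract can be simulated directly: event times via Lewis-Shedler thinning, with an independent random loss attached to each accepted event. The rate function, horizon, and loss distribution below are illustrative assumptions, not values from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_compound_nhpp(rate_fn, rate_max, horizon, loss_fn, n_sims=20_000):
    """Monte Carlo of a compound nonhomogeneous Poisson process: candidate
    event times from a homogeneous process at rate_max, thinned by
    rate_fn(t)/rate_max, with an i.i.d. loss per accepted event.
    Returns the mean total loss over the horizon."""
    totals = np.empty(n_sims)
    for i in range(n_sims):
        t, total = 0.0, 0.0
        while True:
            t += rng.exponential(1.0 / rate_max)       # next candidate event
            if t > horizon:
                break
            if rng.random() < rate_fn(t) / rate_max:   # thinning acceptance
                total += loss_fn()
        totals[i] = total
    return totals.mean()

# Hypothetical drought model: occurrence rate ramps up over a 10-year
# horizon; each drought imposes an exponentially distributed loss.
rate = lambda t: 0.2 + 0.05 * t            # droughts per year
mean_loss = simulate_compound_nhpp(rate, rate_max=0.7, horizon=10.0,
                                   loss_fn=lambda: rng.exponential(1.0))
# Analytically, E[total] = E[loss] * integral of rate = 1.0 * (2.0 + 2.5) = 4.5
print(f"mean total loss ~ {mean_loss:.2f}")
```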

  18. Risk analysis before launch

    NASA Astrophysics Data System (ADS)

    Behlert, Rene

    1988-08-01

A quality methodology is proposed based on risk analysis and the observation of technical facts. The procedures for quantifying a risk are described and examples are given. A closed-loop quality analysis is described, and overall mission safety goals are set out. The concept of maintenance is developed into evolutionary maintenance. It is shown that a large volume of data must be processed to apply the proposed methods, which requires the use of computer data processing.

  19. Methods of analysis for chemicals that disrupt cellular signaling pathways: risk assessment for potential endocrine disruptors.

    PubMed

    Umezawa, Yoshio; Ozawa, Takeaki; Sato, Moritoshi; Inadera, Hidekuni; Kaneko, Shuichi; Kunimoto, Manabu; Hashimoto, Shin-ichi

    2005-01-01

Here we present a basic concept and several examples of methods of analysis for chemicals that disrupt cellular signaling pathways, in view of risk assessment for potential endocrine-disrupting chemicals (EDCs). The key cellular signaling pathways include 1) ER/coactivator interaction, 2) AR translocation into the nucleus, 3) ER/NO/sGC/cGMP, 4) ER/Akt, 5) ER/Src, 6) ER/Src/Grb2, and 7) ER/Ca2+/CaM/CaMK pathways. These were visualized in relevant live cells using newly developed fluorescent and bioluminescent probes. Changes in cellular signals were thereby observed in nongenomic pathways of steroid hormones upon treatment of the target cells with steroid hormones and related chemicals. This method of analysis appears to be a rational approach to high-throughput prescreening (HTPS) of biohazardous chemicals, EDCs in particular. Also described is the screening of gene expression by serial analysis of gene expression (SAGE) and gene chips upon applying EDCs to breast cancer cells, mouse livers, and human neuroblastoma NB-1 cells.

  20. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    PubMed

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

Thermal and chemical conversion processes that convert sewage sludge, pasty waste, and other pre-processed waste into energy are increasingly common, for both economic and ecological reasons. Fluidized-bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion, and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners, and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage in which hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity × Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various tools, such as HAZOP, FMEA, etc.; our analysis is based on FMEA. Using MOSAR, we identified five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator, and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis therefore focused on the burners, looking at the ways they failed and at the effects and criticality of those failures (FMEA). We were thus able to identify a number of critical factors, such as the incoming gas lines and the ignition electrode.
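
The macroscopic ranking step, scoring each undesired-event scenario on a Severity × Probability grid, can be sketched in a few lines. The scenario names and ratings below are hypothetical placeholders, not the study's actual grid:

```python
# Hypothetical ranking grid in the spirit of the macroscopic MOSAR stage:
# each undesired-event scenario gets a severity and probability class (1-4),
# and scenarios are ranked by the product to prioritize the microscopic study.
scenarios = {
    "burner gas supply failure": (4, 3),
    "biomass feed blockage":     (2, 3),
    "cyclone abrasion leak":     (3, 1),
}

ranked = sorted(scenarios.items(), key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
for name, (sev, prob) in ranked:
    print(f"{name:28s} S={sev} P={prob} rank score={sev * prob}")
```

In this toy grid the burner scenario ranks first, mirroring the paper's finding that burner malfunction deserved the detailed FMEA.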

  1. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a human servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on the grounds that it was too dangerous for a human crew in the post-Columbia environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) that O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  2. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science serves as a basis for decision making on risk. The model covers five elements (evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision) and relates these elements to the domains of experts and decision makers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part).

  3. A Refinement of Risk Analysis Procedures for Trichloroethylene Through the Use of Monte Carlo Method in Conjunction with Physiologically Based Pharmacokinetic Modeling

    DTIC Science & Technology

    1993-09-01

This study refines risk analysis procedures for trichloroethylene (TCE) using a physiologically based pharmacokinetic (PBPK) model in conjunction...promulgate, and better present, more realistic standards. Keywords: risk analysis, physiologically based pharmacokinetics (PBPK), trichloroethylene, Monte Carlo method.

  4. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

This is a two-part thesis. Part 1 presents an approach towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. The construction of an artificial chamber to maintain controlled environmental conditions, and the components/chemicals used in artificial soil formulation, are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and a statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years in the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels), and a safety map were generated using the field-measured data. FMECA, a statistical reliability tool that uses the Risk Priority Number (RPN), is applied to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study of PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  5. New Methods for the Analysis of Heartbeat Behavior in Risk Stratification

    PubMed Central

    Glass, Leon; Lerma, Claudia; Shrier, Alvin

    2011-01-01

    Developing better methods for risk stratification for tachyarrhythmic sudden cardiac death remains a major challenge for physicians and scientists. Since the transition from sinus rhythm to ventricular tachycardia/fibrillation happens by different mechanisms in different people, it is unrealistic to think that a single measure will be adequate to provide a good index for risk stratification. We analyze the dynamical properties of ventricular premature complexes over 24 h in an effort to understand the underlying mechanisms of ventricular arrhythmias and to better understand the arrhythmias that occur in individual patients. Two-dimensional density plots, called heartprints, correlate characteristic features of the dynamics of premature ventricular complexes and the sinus rate. Heartprints show distinctive characteristics in individual patients. Based on a better understanding of the nature of transitions from sinus rhythm to sudden cardiac death and the mechanisms of arrhythmia prior to cardiac arrest, it should be possible to develop better methods for risk stratification. PMID:22144963

  6. FOOD RISK ANALYSIS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  7. Modelling childhood caries using parametric competing risks survival analysis methods for clustered data.

    PubMed

    Stephenson, J; Chadwick, B L; Playle, R A; Treasure, E T

    2010-01-01

    Caries in primary teeth is an ongoing issue in children's dental health. Its quantification is affected by clustering of data within children and the concurrent risk of exfoliation of primary teeth. This analysis of caries data of 103,776 primary molar tooth surfaces from a cohort study of 2,654 British children aged 4-5 years at baseline applied multilevel competing risks survival analysis methodology to identify factors significantly associated with caries occurrence in primary tooth surfaces in the presence of the concurrent risk of exfoliation, and assessed the effect of exfoliation on caries development. Multivariate multilevel parametric survival models were applied at surface level to the analysis of the sound-carious and sound-exfoliation transitions to which primary tooth surfaces are subject. Socio-economic class, fluoridation status and surface type were found to be the strongest predictors of primary caries, with the highest rates of occurrence and lowest median survival times associated with occlusal surfaces of children from poor socio-economic class living in non-fluoridated areas. The concurrent risk of exfoliation was shown to reduce the distinction in survival experience between different types of surfaces, and between surfaces of teeth from children of different socio-economic class or fluoridation status. Clustering of data had little effect on inferences of parameter significance.

  8. Method for improved prediction of bone fracture risk using bone mineral density in structural analysis

    NASA Technical Reports Server (NTRS)

    Cann, Christopher E. (Inventor); Faulkner, Kenneth G. (Inventor)

    1992-01-01

    A non-invasive in-vivo method of analyzing a bone for fracture risk includes obtaining data from the bone, such as by computed tomography or projection imaging, which data represents a measure of bone material characteristics such as bone mineral density. The distribution of the bone material characteristics is used to generate a finite element method (FEM) mesh from which the load capability of the bone can be determined. In determining load capability, the bone is mathematically compressed, and stress, strain, force, and force/area versus bone material characteristics are determined.

  9. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.; Mutlu, S.

    2015-06-01

    A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor-quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide hazards must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these hazards separately and then combining the analyses would ensure a more realistic risk evaluation according to population density than analyzing several risks based on a single parameter. In this study, an integrated seismic risk analysis of central Eskişehir was performed based on two earthquake-related parameters, liquefaction and amplification. The analysis used a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geological and topographical conditions of the region. According to the integrated seismic risk analysis of the Eskişehir residential area, the populated area is found to be generally at medium to high risk during a potential earthquake.
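    A simple weighting combination of two hazard layers of the kind described can be sketched cell-wise; the index values, weights, and class breaks below are hypothetical, not those of the Eskişehir study:

```python
# Hedged sketch of a simple-weighting integration of two seismic hazard layers.
import numpy as np

# Hypothetical normalized hazard indices (0 = none, 1 = severe) on a toy 2x2 grid
liquefaction = np.array([[0.2, 0.6],
                         [0.9, 0.4]])
amplification = np.array([[0.3, 0.8],
                          [0.5, 0.1]])

w_liq, w_amp = 0.5, 0.5                                  # layer weights
integrated = w_liq * liquefaction + w_amp * amplification

# Classify the integrated index for mapping
labels = np.select([integrated < 0.33, integrated < 0.66],
                   ["low", "medium"], default="high")
print(labels)
```

    In practice each layer would be a georeferenced raster and the weights would come from expert judgment or calibration.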

  10. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions

    PubMed Central

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R

    2005-01-01

    Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process
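    The criticality arithmetic above can be sketched in a few lines; the per-mode function mirrors the abstract's definition, and only the reported totals (3415 and 1397) come from the text:

```python
# Criticality index (CI) per failure mode, as defined in the abstract:
# likelihood of occurrence x severity of effect x (lack of) detection probability.
def criticality_index(occurrence, severity, detection):
    return occurrence * severity * detection

# Reported totals over all 18 failure modes (from the abstract)
old_total, new_total = 3415, 1397
reduction = 1 - new_total / old_total
print(f"Overall CI reduction: {reduction:.0%}")
```

    The same product form is used mode by mode, so per-mode reduction factors (such as the factor of 36 for re-transcription errors) are ratios of old to new CI.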

  11. Mission Risk Diagnostic (MRD) Method Description

    DTIC Science & Technology

    2012-02-01

    Mission Risk Diagnostic (MRD) Method Description. Christopher Alberts, Audrey Dorofee. February 2012. Technical Note CMU/SEI-2012-TN-005. Contents include: Analyzing Risk (Tactical Risk Analysis; Mission Risk Analysis); Mission Risk Diagnostic (MRD) Concepts (Identify Mission and Objectives; Tailoring an Existing Set of Drivers; Analyze Drivers); Mission Risk Diagnostic (MRD) Method (MRD Structure; Prepare ...).

  12. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  13. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  14. The "Dry-Run" Analysis: A Method for Evaluating Risk Scores for Confounding Control.

    PubMed

    Wyss, Richard; Hansen, Ben B; Ellis, Alan R; Gagne, Joshua J; Desai, Rishi J; Glynn, Robert J; Stürmer, Til

    2017-03-06

    A propensity score (PS) model's ability to control confounding can be assessed by evaluating covariate balance across exposure groups after PS adjustment. The optimal strategy for evaluating a disease risk score (DRS) model's ability to control confounding is less clear. DRS models cannot be evaluated through balance checks within the full population, and they are usually assessed through prediction diagnostics and goodness-of-fit tests. A proposed alternative is the "dry-run" analysis, which divides the unexposed population into "pseudo-exposed" and "pseudo-unexposed" groups so that differences on observed covariates resemble differences between the actual exposed and unexposed populations. With no exposure effect separating the pseudo-exposed and pseudo-unexposed groups, a DRS model is evaluated by its ability to retrieve an unconfounded null estimate after adjustment in this pseudo-population. We used simulations and an empirical example to compare traditional DRS performance metrics with the dry-run validation. In simulations, the dry run often improved assessment of confounding control, compared with the C statistic and goodness-of-fit tests. In the empirical example, PS and DRS matching gave similar results and showed good performance in terms of covariate balance (PS matching) and controlling confounding in the dry-run analysis (DRS matching). The dry-run analysis may prove useful in evaluating confounding control through DRS models.

  15. Assessing the Risk of Secondary Transfer Via Fingerprint Brush Contamination Using Enhanced Sensitivity DNA Analysis Methods.

    PubMed

    Bolivar, Paula-Andrea; Tracey, Martin; McCord, Bruce

    2016-01-01

    Experiments were performed to determine the extent of cross-contamination of DNA resulting from secondary transfer due to fingerprint brushes used on multiple items of evidence. Analysis of both standard and low copy number (LCN) STR was performed. Two different procedures were used to enhance sensitivity, post-PCR cleanup and increased cycle number. Under standard STR typing procedures, some additional alleles were produced that were not present in the controls or blanks; however, there was insufficient data to include the contaminant donor as a contributor. Inclusion of the contaminant donor did occur for one sample using post-PCR cleanup. Detection of the contaminant donor occurred for every replicate of the 31 cycle amplifications; however, using LCN interpretation recommendations for consensus profiles, only one sample would include the contaminant donor. Our results indicate that detection of secondary transfer of DNA can occur through fingerprint brush contamination and is enhanced using LCN-DNA methods.

  16. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  17. Methods for Multitemporal Analysis of Satellite Data Aimed at Environmental Risk Monitoring

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Scognamiglio, A.

    2012-08-01

    In recent years the topic of environmental monitoring has gained particular importance, partly owing to the reduced short-term stability and predictability of climatic events. Facing this situation, often in terms of emergency, involves high and unpredictable costs for public agencies. Prevention of damages caused by natural disasters does not concern only weather forecasts, but requires constant attention and the practice of monitoring and control of human activity on the territory. Practically, the problem is not knowing if and when an event will affect a determined area, but recognizing the possible damages if the event happened, adopting adequate measures to reduce them to a minimum, and acquiring the tools necessary for a timely intervention. On the other hand, the surveying technologies should be as accurate and updatable as possible in order to guarantee high standards, which involves the analysis of a great amount of data. The management of such data requires integration and calculation systems with specialized software, and fast and reliable connection and communication networks. To meet such requirements, current satellite technology, with recurrent data acquisition for the timely generation of cartographic products that are updated and coherent with the territorial investigation, offers the possibility of filling the temporal gap between the need for urgent information and official reference information. Among evolved image processing techniques, change detection analysis is useful for identifying temporal environmental variations, helping to reduce the user's intervention through process automation and progressively improving the qualitative and quantitative accuracy of the results. The research investigates automatic methods for detecting land cover transformations by means of "change detection" techniques executable on satellite data that are heterogeneous in spatial and spectral resolution, with homogenization and registration in a unique

  18. Automated Method for Analysis of Mammographic Breast Density - A Technique for Breast Cancer Risk Estimation

    DTIC Science & Technology

    2006-07-01


  19. Utility Theory as a Method to Minimise the Risk in Deformation Analysis Decisions

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Neumann, Ingo

    2014-11-01

    Deformation monitoring usually focuses on detecting whether the monitored objects satisfy given properties (e.g. being stable or not) and on making further decisions to minimise the risks, for example the consequences and costs in case of collapse of artificial objects and/or natural hazards. With this intention, a methodology relying on hypothesis testing and utility theory is reviewed in this paper. The main idea of utility theory is to judge each possible outcome with a utility value. The presented methodology makes it possible to minimise the risk of an individual monitoring project by considering the costs and consequences of all possible situations within the decision process. It is not the danger that the monitored object may collapse that is reduced; rather, the risk (the utility values multiplied by the danger) can be described more appropriately, and therefore more valuable decisions can be made. In particular, the opportunity to minimise the risk through the design of the measurement process is a key issue. In this paper, the application of the methodology to two of the classical cases in hypothesis testing is discussed in detail: 1) the probability density functions (pdfs) of the tested objects under both the null and alternative hypotheses are known; 2) only the pdf under the null hypothesis is known and the alternative hypothesis is treated as the pure negation of the null hypothesis. Afterwards, a practical example in deformation monitoring is introduced and analysed. Additionally, the way in which the magnitudes of the utility values (consequences of a decision) influence the decision is considered and discussed at the end.
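    A minimal sketch of a utility-based decision rule of this kind follows, for the first classical case (both pdfs known); the Gaussian parameters, prior, and utility values are hypothetical, not taken from the paper:

```python
# Hedged sketch: choose between H0 ("stable") and H1 ("deformed") by
# maximising expected utility. All numbers below are hypothetical.
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

def decide(displacement_mm, prior_h1=0.1):
    l0 = normal_pdf(displacement_mm, mu=0.0, sigma=2.0)   # pdf under H0: stable
    l1 = normal_pdf(displacement_mm, mu=10.0, sigma=2.0)  # pdf under H1: deformed
    p1 = prior_h1 * l1 / (prior_h1 * l1 + (1 - prior_h1) * l0)
    # Utility of each decision given the true state: a missed deformation
    # (accepting H0 when H1 is true) is far more costly than a false alarm.
    u = {("accept", "H0"): 0.0, ("accept", "H1"): -1000.0,
         ("reject", "H0"): -50.0, ("reject", "H1"): -10.0}
    expected = {d: (1 - p1) * u[(d, "H0")] + p1 * u[(d, "H1")]
                for d in ("accept", "reject")}
    return max(expected, key=expected.get)  # maximise expected utility

print(decide(1.0), decide(8.0))
```

    Changing the utility magnitudes shifts the displacement at which the decision flips, which is exactly the sensitivity the paper discusses.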

  20. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
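    The combination step can be sketched with NumPy; the two consequence dimensions, their covariance, and the scores below are hypothetical, not the ERA project's data:

```python
# Hedged sketch of combining two consequence dimensions (e.g. cost, schedule)
# with Mahalanobis distance; scores and covariance are hypothetical.
import numpy as np

def mahalanobis(x, mean, cov):
    d = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(np.sqrt(d @ np.linalg.inv(cov) @ d))

scores, mean = [4.0, 3.0], [0.0, 0.0]
correlated = np.array([[4.0, 1.5],
                       [1.5, 1.0]])          # covariance between dimensions
uncorrelated = np.diag(np.diag(correlated))  # same variances, zero covariance

print(mahalanobis(scores, mean, correlated))
# With zero covariance the measure reduces to the variance-normalized
# Euclidean distance sqrt((4/2)**2 + (3/1)**2) = sqrt(13):
print(mahalanobis(scores, mean, uncorrelated))
```

    The first call discounts the part of the score explained by correlation between the dimensions; the second illustrates the reduction to normalized Euclidean distance noted in the abstract.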

  1. Analysis of the LaSalle Unit 2 nuclear power plant: Risk Methods Integration and Evaluation Program (RMIEP). Volume 8, Seismic analysis

    SciTech Connect

    Wells, J.E.; Lappa, D.A.; Bernreuter, D.L.; Chen, J.C.; Chuang, T.Y.; Johnson, J.J.; Campbell, R.D.; Hashimoto, P.S.; Maslenikov, O.R.; Tiong, L.W.; Ravindra, M.K.; Kincaid, R.H.; Sues, R.H.; Putcha, C.S.

    1993-11-01

    This report describes the methodology used and the results obtained from the application of a simplified seismic risk methodology to the LaSalle County Nuclear Generating Station Unit 2. This study is part of the Level I analysis being performed by the Risk Methods Integration and Evaluation Program (RMIEP). Using the RMIEP-developed event and fault trees, the analysis resulted in a seismically induced core damage frequency point estimate of 6.0E-7/yr. This result, combined with the component importance analysis, indicated that system failures were dominated by random events. The dominant components included diesel generator failures (failure to swing, failure to start, failure to run after start) and the condensate storage tank.

  2. Recasting risk analysis methods in terms of object-oriented modeling techniques

    SciTech Connect

    Wyss, G.D.; Craft, R.L.; Vandewart, R.L.; Funkhouser, D.R.

    1998-08-01

    For more than two decades, risk analysts have relied on powerful logic-based models to perform their analyses. However, the applicability of these models has been limited because they can be complex and expensive to develop. Analysts must frequently start from scratch when analyzing a new (but similar) system because the understanding of how the system works exists only in the mind of the analyst and is only incompletely instantiated in the actual logic model. This paper introduces the notion of using explicit object-oriented system models, such as those embodied in computer-aided software engineering (CASE) tools, to document the analyst's understanding of the system and appropriately capture how the system works. It also shows that from these models, standard assessment products, such as fault trees and event trees, can be automatically derived.

  3. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  4. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report focuses on new methods based on the following models: synergetic, physical, and computational, and concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and the creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered according to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering some deterministic and stochastic factors. The last approach concerns the results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment that provides alignment of the temperature fields in a product for the formation of equal-strength welded joints is

  5. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million in claims. To address such problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, which were extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification through image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of the software. Each criterion and its sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly
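    The AHP weighting step for five criteria of this kind can be sketched as follows; the Saaty-scale pairwise comparison values are hypothetical, not those used in the study:

```python
# Hedged sketch of AHP criterion weighting via the principal eigenvector.
import numpy as np

criteria = ["slope", "aspect", "elevation", "geology", "land use"]
# Reciprocal pairwise comparison matrix: A[i, j] = importance of i over j,
# with A[j, i] = 1 / A[i, j]. Values are hypothetical.
A = np.array([
    [1.0, 3.0, 2.0, 4.0, 1.0],
    [1/3, 1.0, 1/2, 2.0, 1/3],
    [1/2, 2.0, 1.0, 3.0, 1/2],
    [1/4, 1/2, 1/3, 1.0, 1/4],
    [1.0, 3.0, 2.0, 4.0, 1.0],
])

# Criterion weights = normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = np.abs(principal) / np.abs(principal).sum()
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```

    In a full AHP application the consistency ratio of the comparison matrix would also be checked before accepting the weights.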

  6. [Groundwater pollution risk mapping method].

    PubMed

    Shen, Li-na; Li, Guang-he

    2010-04-01

    Existing methods for groundwater vulnerability assessment do not involve contamination source elements, and systematic, effective techniques and parameter systems for groundwater pollution risk mapping are currently lacking. By analyzing the structure of the groundwater system and the characteristics of contaminant sources, and by coupling intrinsic groundwater vulnerability with contaminant sources, integrated multi-index models were developed in this paper to evaluate the risk sources of groundwater contamination and to produce groundwater pollution risk maps. The models were applied to a large-scale karst groundwater source in northern China as a case study. The results indicated that overlaying the vulnerability assessment with pollution risk sources can effectively identify the high-risk regions of groundwater pollution, and the method may provide necessary support for the supervision of groundwater pollution.
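    The coupling of intrinsic vulnerability with contaminant sources can be sketched as a cell-wise overlay; the grid values and threshold below are hypothetical, not the study's indices:

```python
# Hedged sketch of a vulnerability x contaminant-source overlay for risk mapping.
import numpy as np

vulnerability = np.array([[0.2, 0.8],
                          [0.5, 0.9]])  # intrinsic vulnerability index (0-1)
source_hazard = np.array([[0.1, 0.9],
                          [0.7, 0.2]])  # contaminant source load index (0-1)

risk = vulnerability * source_hazard    # cell-wise multiplicative overlay
high_risk = risk > 0.5                  # cells flagged for supervision
print(risk)
print(high_risk)
```

    Only cells that score high on both layers are flagged, which is the point of coupling vulnerability with sources rather than mapping either alone.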

  7. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  8. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  9. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis.

  10. Risk analysis and meat hygiene.

    PubMed

    Hathaway, S C

    1993-12-01

    Meat hygiene consists of three major activities: post-mortem inspection; monitoring and surveillance for chemical hazards; and maintenance of good hygienic practice throughout all stages between slaughter and consumption of meat. Risk analysis is an applied science of increasing importance to these activities in the following areas: facilitating the distribution of pre-harvest, harvest and post-harvest inspection resources, proportional to the likelihood of public health and animal health hazards; establishing internationally-harmonized standards and specifications which are consistent and science-based; and improving the safety and wholesomeness of meat and meat products in local and international trade. Risk analysis, in one form or another, is well developed with respect to establishing standards and specifications for chemical hazards; methods for risk analysis of post-mortem meat inspection programmes are beginning to emerge. However, risk analysis of microbiological hazards in meat and meat products presents particular difficulties. All areas of application currently suffer from a lack of international agreement on risk assessment and risk management methodology.

  11. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency is analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of the flood variables on the final hydrologic risk values. The proposed method is applied to risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate that the entropy method performs best in quantifying the distribution of flood duration. The bivariate hydrologic risk is then generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for flood volumes less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without a significant decrease in the bivariate risk for flood peak-duration.
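The joint return period underlying this kind of bivariate risk can be illustrated with a Gumbel copula, a common choice for positively dependent flood variables. The entropy-based marginals are abstracted away here; `theta`, the quantiles, and the mean inter-arrival time are illustrative values, not the study's fitted parameters.

```python
import math

def gumbel_copula(u, v, theta=2.0):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period_and(u, v, theta=2.0, mu=1.0):
    """'AND' joint return period: both variables exceed their quantiles.
    u, v are marginal non-exceedance probabilities; mu is the mean
    inter-arrival time of flood events in years."""
    return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

# Both peak and volume at their 10-year marginal level (u = v = 0.9):
t_and = joint_return_period_and(0.9, 0.9)
```

With theta = 1 the Gumbel copula reduces to independence (C(u, v) = uv), which makes the formula easy to check by hand; for theta > 1 the "AND" return period exceeds the marginal 10-year period.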

  12. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather, they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation
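The FMEA core of the TG-100 approach reduces to scoring each failure mode and ranking by risk priority number (RPN = occurrence x severity x detectability, each on a 1-10 scale). The failure modes and scores below are invented for illustration, not taken from the report.

```python
def rpn(occurrence, severity, detectability):
    """Risk priority number for one failure mode (each score on a 1-10 scale)."""
    return occurrence * severity * detectability

# Hypothetical IMRT-process failure modes with (O, S, D) scores:
failure_modes = {
    "wrong CT dataset imported":   (2, 9, 7),
    "MLC leaf calibration drift":  (4, 6, 3),
    "plan parameters mistyped":    (5, 8, 4),
}

# QM effort is directed at the highest RPNs first:
ranked = sorted(failure_modes, key=lambda m: rpn(*failure_modes[m]), reverse=True)
```

Even this toy ranking shows the method's point: a hard-to-detect, moderately frequent failure can outrank a more visible one, which a purely device-centric QC program would miss.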

  13. A multifactorial analysis of obesity as CVD risk factor: Use of neural network based methods in a nutrigenetics context

    PubMed Central

    2010-01-01

Background Obesity is a multifactorial trait and an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology underlying obesity and to identify genetic variations and/or nutrition-related factors that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject, a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake of calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used for the analysis of the available data: i) a multi-layer feed-forward ANN combined with a parameter-decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) that combines genetic algorithms and the popular back-propagation training algorithm. Results PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify, among the initial 63 variables describing genetic variations, nutrition and gender, the most important factors for classifying a subject into one of the BMI-related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold cross-validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under the receiver operating characteristic curve were used to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model had a mean accuracy of 61.46% on the 3-CV testing sets. Conclusions The ANN

  14. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach; however, the advantages of quantitative risk assessment are achieved at a greater expenditure of money, time and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources to minimize risk in an expeditious and fiscally responsible manner.

  15. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

Urban and intercity highways are generally exposed to many kinds of natural disaster risk. Natural hazards and disasters must be taken into consideration from highway design through construction and operation, and later during maintenance and repair, and assessing the risks posed by adverse events is very important for project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highway; in addition, the assets at risk and the impacts of the events must be examined and rated in their own right. To this end, improvements against natural hazards and disasters are identified using the Failure Mode Effects Analysis (FMEA) method, and their effects are analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and severity, and finding the most economic and effective solution. Besides guiding the measures taken against the identified risks, this analysis method can also provide public institutions with information about the nature of these risks when required, so that the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments, the most important of which can be listed as follows: • Natural disasters: 1. meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.) • Human-originated disasters: 1. transport accidents (traffic accidents) originating from road surface defects (icing

  16. [Comparative analysis of two different methods for risk assessment of groundwater pollution: a case study in Beijing plain].

    PubMed

    Wang, Hong-na; He, Jiang-tao; Ma, Wen-jie; Xu, Zhen

    2015-01-01

Groundwater contamination risk assessment is important for groundwater pollution prevention planning and for evaluating groundwater exploitation potential. Recently, the UN and WP assessment systems have become focuses of international research. In both systems, the assessment framework and indices are drawn from five aspects: intrinsic vulnerability, aquifer storage, groundwater quality, groundwater resource protection zones and contamination load; however, the five factors are combined in different ways. In order to expound the difference between the UN and WP assessment systems, and to explain its main causes, both systems were applied to the Beijing Plain, China, and the maps constructed from the two risk assessment systems were compared. The results showed that both groundwater contamination risk assessment maps were in accordance with actual conditions and were similar in their spatial distribution trends. However, the coverage areas at the same risk level differed quite significantly. This also revealed that, during system construction, the structural hierarchy, the overlaying principles and the classification method may affect the resulting groundwater contamination risk assessment map. The UN and WP assessment systems are both suitable for groundwater contamination risk assessment of the plain, but their emphases differ.

  17. Comparison of nonlinear methods symbolic dynamics, detrended fluctuation, and Poincaré plot analysis in risk stratification in patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Voss, Andreas; Schroeder, Rico; Truebner, Sandra; Goernig, Matthias; Figulla, Hans Reiner; Schirdewan, Alexander

    2007-03-01

    Dilated cardiomyopathy (DCM) has an incidence of about 20/100 000 new cases per annum and accounts for nearly 10 000 deaths per year in the United States. Approximately 36% of patients with dilated cardiomyopathy (DCM) suffer from cardiac death within five years after diagnosis. Currently applied methods for an early risk prediction in DCM patients are rather insufficient. The objective of this study was to investigate the suitability of short-term nonlinear methods symbolic dynamics (STSD), detrended fluctuation (DFA), and Poincaré plot analysis (PPA) for risk stratification in these patients. From 91 DCM patients and 30 healthy subjects (REF), heart rate and blood pressure variability (HRV, BPV), STSD, DFA, and PPA were analyzed. Measures from BPV analysis, DFA, and PPA revealed highly significant differences (p<0.0011) discriminating REF and DCM. For risk stratification in DCM patients, four parameters from BPV analysis, STSD, and PPA revealed significant differences between low and high risk (maximum sensitivity: 90%, specificity: 90%). These results suggest that STSD and PPA are useful nonlinear methods for enhanced risk stratification in DCM patients.
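Poincaré plot analysis, one of the three nonlinear methods compared here, is commonly summarized by the SD1/SD2 descriptors: the dispersion of successive RR intervals across and along the identity line. The short RR series below is invented for illustration.

```python
import math
import statistics

def poincare_sd1_sd2(rr):
    """SD1/SD2 of a Poincare plot of successive RR intervals (ms).
    SD1 captures short-term (beat-to-beat) variability, SD2 longer-term
    variability."""
    x, y = rr[:-1], rr[1:]
    # Rotate each (RR_n, RR_n+1) point 45 degrees: across / along identity line
    d1 = [(b - a) / math.sqrt(2) for a, b in zip(x, y)]
    d2 = [(b + a) / math.sqrt(2) for a, b in zip(x, y)]
    return statistics.pstdev(d1), statistics.pstdev(d2)

sd1, sd2 = poincare_sd1_sd2([800, 810, 790, 805, 795, 802])
```

In risk-stratification use, SD1, SD2, and their ratio are computed per subject from much longer recordings and then compared between risk groups.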

  18. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of government management. In conjunction with methods of mathematical modeling, spatial-temporal analysis and economic tools, risk assessment makes it possible, when analyzing a situation, to determine the level of safety of the population, workers and consumers, and to select priority resources and threat factors as points for applying effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazard and danger. At the implementation stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected so that efforts are concentrated on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Relevant further tasks include: enforcing the methodology of health risk analysis in the field of assurance of sanitary and epidemiological well-being and workers' health; developing the informational and analytical base, in particular establishing "exposure-response" models for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens.

  19. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

RAVEN has three major functionalities: 1. It provides a graphical user interface for the pre- and post-processing of RELAP-7 input and output. 2. It provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. It provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, support vector machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies could be used to solve optimization and uncertainty propagation problems within the RAVEN framework.
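The classical Latin hypercube sampling mentioned among RAVEN's sampling strategies can be sketched in pure Python. This is a generic textbook LHS on the unit hypercube, not RAVEN's implementation: each dimension is split into equal-probability strata, one sample is drawn per stratum, and the strata are independently permuted across dimensions.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Classic LHS on [0, 1)^n_dims: one stratified draw per stratum and
    dimension, with strata randomly permuted across dimensions."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        for i, s in enumerate(strata):
            # Uniform draw inside stratum s of width 1/n_samples
            samples[i][d] = (s + rng.random()) / n_samples
    return samples

pts = latin_hypercube(10, 2)
```

The stratification guarantees that every marginal is covered evenly, which is why LHS typically converges faster than plain Monte Carlo for the same number of code runs.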

  20. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers, even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study is used to illustrate this approach, with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study
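The kernel density step can be illustrated in one dimension, e.g. risk observations placed along a single street. The locations, bandwidth, and Gaussian kernel below are illustrative assumptions, not the study's GIS workflow.

```python
import math

def kde_1d(points, x, bandwidth=25.0):
    """Gaussian kernel density estimate at position x (e.g. metres along a
    street) from point risk observations; bandwidth in the same units."""
    def gauss(u):
        return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)
    return sum(gauss((x - p) / bandwidth) for p in points) / (len(points) * bandwidth)

# Hypothetical trash-pile locations (m along one street); a school sits at 100 m:
trash = [90.0, 95.0, 110.0, 300.0, 310.0]
risk_at_school = kde_1d(trash, 100.0)
risk_far_away = kde_1d(trash, 500.0)
```

The GIS version does the same smoothing in two dimensions over the digitized video points, producing the "hotspot" surfaces the abstract describes around schools.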

  1. Modified risk graph method using fuzzy rule-based approach.

    PubMed

    Nait-Said, R; Zidani, F; Ouzraoui, N

    2009-05-30

The risk graph is one of the most popular methods used to determine the safety integrity level (SIL) for safety instrumented functions. However, the conventional risk graph, as described in the IEC 61508 standard, is subjective and suffers from an interpretation problem of risk parameters; thus, it can lead to inconsistent outcomes that may result in conservative SILs. To overcome this difficulty, a modified risk graph using a fuzzy rule-based system is proposed. This novel version of the risk graph uses fuzzy scales to assess risk parameters, and calibration may be made by varying risk parameter values. Furthermore, the outcomes, which are numerical values of the risk reduction factor (the inverse of the probability of failure on demand), can be compared directly with those given by quantitative and semi-quantitative methods such as fault tree analysis (FTA), quantitative risk assessment (QRA) and layers of protection analysis (LOPA).
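The quantitative outcome mentioned above, a risk reduction factor (RRF) as the inverse of the probability of failure on demand (PFD), maps onto the IEC 61508 low-demand SIL bands. A sketch under that standard mapping (the example PFD value is illustrative):

```python
def sil_from_pfd(pfd):
    """SIL band for an average PFD in low-demand mode (IEC 61508),
    or 0 if the PFD is too high to claim any SIL."""
    bands = [(1e-5, 4), (1e-4, 3), (1e-3, 2), (1e-2, 1)]
    for lower, sil in bands:
        if lower <= pfd < lower * 10:
            return sil
    return 0

def rrf(pfd):
    """Risk reduction factor: the inverse of the probability of failure on demand."""
    return 1.0 / pfd

sil = sil_from_pfd(5e-3)   # PFD of 5e-3 falls in the SIL 2 band
```

Because the modified risk graph outputs a numerical RRF rather than a category, its results can be checked directly against PFD targets derived from FTA, QRA, or LOPA.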

  2. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined and many of the existing techniques are described in light of this definition and their specific role in program management and

  3. Relative risk regression analysis of epidemiologic data.

    PubMed

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risks processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks as have simulation studies of relative risk estimation

  4. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students through a step-by-step, role-play process of risk identification and quantification. The aim is to assess the risk posed by natural hazards (rockfall, snow avalanche, flooding…) to a characteristic alpine village, oriented toward affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. planning of the study and definition of stakeholders; 2. hazard identification; 3. risk analysis; 4. risk assessment; 5. proposition of mitigation measures; 6. risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The risk matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, the results of different groups can be compared and a summary report printed out. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.
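A risk matrix like the one used in the workshop can be sketched as a simple lookup from likelihood and consequence classes. The class boundaries and the example hazard below are illustrative assumptions, not the workshop's actual calibration.

```python
def risk_level(likelihood, consequence):
    """Qualitative risk level from integer classes 1 (lowest) to 4 (highest)."""
    score = likelihood * consequence          # 1..16
    if score >= 12:
        return "extreme"
    if score >= 6:
        return "high"
    if score >= 3:
        return "moderate"
    return "low"

# e.g. rockfall on the village access road: likely (3), major consequences (4):
level = risk_level(3, 4)
```

In the workshop setting, each object (building, road, lift) gets a cell in the matrix, and the "extreme" and "high" cells are the ones that drive the choice and cost-benefit ranking of mitigation measures.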

  5. The Risks of Multimedia Methods

    PubMed Central

    Lenert, Leslie A.; Ziegler, Jennifer; Lee, Tina; Unfred, Christine; Mahmoud, Ramy

    2000-01-01

    an actor's gender may influence the willingness of viewers to gamble to gain health benefits (or risk attitude). Conclusions: Educators and researchers considering the use of multimedia methods for decision support need to be aware of the potential for the race and gender of patients or actors to influence preferences for health states and thus, potentially, medical decisions. PMID:10730601

  6. The Components of Microbiological Risk Analysis.

    PubMed

    Liuzzo, Gaetano; Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-02-03

    The paper describes the process of risk analysis in a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described.

  8. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  9. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    Carlos Castillo, Jerel Nelson

    2010-12-31

BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  10. Assessing environmental risks for high intensity agriculture using the material flow analysis method--a case study of the Dongting Lake basin in South Central China.

    PubMed

    Yin, Guanyi; Liu, Liming; Yuan, Chengcheng

    2015-07-01

This study primarily examined the assessment of environmental risk in high-intensity agricultural areas. The Dongting Lake basin, one of the major grain-producing areas in China, was taken as a case study. Using data obtained from 1989 to 2012, we applied Material Flow Analysis (MFA) to show the material consumption, pollutant output and production storage in the agricultural-environmental system, and assessed the environmental risk index on the basis of the MFA results. The results predict that the environmental quality of the Dongting Lake area will remain unsatisfactory for the foreseeable future. The direct material input (DMI) declined by 13.9%, the domestic processed output (DPO) increased by 28.21%, the intensity of material consumption (IMC) decreased by 36.7%, the intensity of material discharge (IMD) increased by 10%, the material productivity (MP) increased by 27 times, the environmental efficiency (EE) increased by 15.31 times, and the material storage (PAS) increased by 0.23%. The DMI and DPO were higher in rural places on the edge of cities, whereas the risk from urban agriculture has risen because DMI and DPO are increasing faster in the cities than in the counties. The composite environmental risk index increased from 0.33 to 0.96, indicating that the total environmental risk changed gradually but seriously during the 24 years assessed. The driving factors that affect environmental risk in high-intensity agriculture can be divided into five classes: social, economic, human, natural and disruptive incidents. This study discussed a number of effective measures for protecting the environment while ensuring food production yields. Additional research in other areas, and certain improvements of this method in future studies, may be necessary to develop a more effective way of managing and controlling agricultural-environmental interactions.
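The MFA-derived intensity ratios named above have simple forms. This is a sketch under assumed textbook definitions (economic output per unit direct material input for MP, output per unit processed output for EE), with invented figures rather than the study's data.

```python
def material_productivity(output_value, dmi):
    """MP: economic output per unit direct material input (DMI)."""
    return output_value / dmi

def environmental_efficiency(output_value, dpo):
    """EE: economic output per unit domestic processed output (DPO)."""
    return output_value / dpo

# Illustrative annual figures for one basin (e.g. Mt, Mt, million yuan):
dmi, dpo, output_value = 120.0, 45.0, 900.0
mp = material_productivity(output_value, dmi)
ee = environmental_efficiency(output_value, dpo)
```

Tracking these ratios year by year is what lets an MFA study report multi-fold MP and EE gains even while absolute pollutant output (DPO) rises.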

  11. [The cascade scheme as a methodical platform for analysis of health risks in space flight and partially and fully analog conditions].

    PubMed

    Ushakov, I B; Poliakov, A V; Usov, V M

    2011-01-01

Space anthropoecology, a subsection of human ecology, studies various aspects of physiological, psychological, social and professional adaptation to the extreme environment of space flight and to human life and work in partially and fully analogous conditions on Earth. Both space flight and simulated extreme conditions are characterized by high human safety standards and a substantial analytical base that supports on-line analysis of the torrent of incoming information. Evaluating and responding to emerging undesired developments, with the aim of curbing their impact on the functioning of the crew-vehicle-environment system and on human health, draws on the complete body of knowledge about risks to human health and performance. Space crew safety issues are tackled by experts of many specialties, which emphasizes the importance of integrated methodical approaches to risk estimation and mitigation, to setting up barriers against adverse trends in human physiology and psychology in challenging conditions, and to minimizing delayed effects on professional longevity and disorders of behavioral reactions.

  12. At-Risk Youngsters: Methods That Work.

    ERIC Educational Resources Information Center

    Obiakor, Festus E.

    This paper examines problems faced by youngsters at risk of failure in school, and discusses methods for helping them succeed in educational programs. At-risk youngsters confront many problems in school and in mainstream society, and are frequently misidentified, misdiagnosed, and improperly instructed. Problems faced by at-risk youngsters…

  13. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  14. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.

  15. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is the process of determining the risks present in system states and protecting the system from vulnerabilities. However, security testing typically does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes the identification, evaluation, and assessment of risks. A threat modeling approach identifies the threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection, and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. The risk metrics considered for testing include risk impact, risk possibility, and risk threshold; the risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time; risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS, and MTRS are considered. The performance of the proposed system is analyzed using the Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43%, whereas FSM coverage of up to 91.49% is achieved. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks and improves testing efficiency.
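    The risk metrics above lend themselves to a small sketch. Assuming (our reading, not stated explicitly in the abstract) that the threshold is computed as the product of impact and possibility, and with invented test cases and cutoff, a reduced suite and its TSRR might be computed like this:

    ```python
    # Sketch of risk-driven test case prioritization; metric names follow the
    # abstract, but the product formula, cutoff, and test cases are assumptions.

    def risk_threshold(impact, possibility):
        """Threshold grows with both impact and possibility (the abstract says
        it is directly proportional to each)."""
        return impact * possibility

    def reduce_suite(test_cases, cutoff):
        """Keep only test cases whose state risk meets the cutoff."""
        return [t for t in test_cases
                if risk_threshold(t["impact"], t["possibility"]) >= cutoff]

    def tsrr(original, reduced):
        """Test Suite Reduction Rate: fraction of cases removed, as a percent."""
        return 100.0 * (len(original) - len(reduced)) / len(original)

    suite = [  # hypothetical system states with invented risk scores
        {"name": "login_overflow",   "impact": 0.9, "possibility": 0.7},
        {"name": "report_export",    "impact": 0.3, "possibility": 0.2},
        {"name": "session_fixation", "impact": 0.8, "possibility": 0.5},
        {"name": "help_page_links",  "impact": 0.1, "possibility": 0.4},
    ]
    reduced = reduce_suite(suite, cutoff=0.3)
    print([t["name"] for t in reduced])          # only high-risk states remain
    print(f"TSRR = {tsrr(suite, reduced):.2f}%")
    ```

    Low-risk states drop out of the suite, which is the mechanism behind the reduced test case selection time reported in the abstract.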

  16. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.
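    As a toy illustration of the decision-analytic flavor described above (the targets, probabilities, and losses are invented, and this omits the framework's machinery for coherently assessing the adversary model): the defender forms a predictive probability distribution over the attacker's actions and picks the defense that minimizes expected loss.

    ```python
    # Minimal sketch of choosing a defense against a predictive model of an
    # intelligent adversary's actions. All names and numbers are hypothetical.

    def best_defense(defenses, attack_probs, loss):
        """Return the defense minimizing expected loss, where
        loss[(defense, attack)] is the damage under that pairing."""
        def expected_loss(d):
            return sum(p * loss[(d, a)] for a, p in attack_probs.items())
        return min(defenses, key=expected_loss)

    defenses = ["harden_site_A", "harden_site_B"]
    attack_probs = {"attack_A": 0.7, "attack_B": 0.3}   # defender's predictive model
    loss = {
        ("harden_site_A", "attack_A"): 10, ("harden_site_A", "attack_B"): 80,
        ("harden_site_B", "attack_A"): 90, ("harden_site_B", "attack_B"): 15,
    }
    print(best_defense(defenses, attack_probs, loss))
    ```

    The substance of adversarial risk analysis lies in how that predictive distribution is assessed, by modeling the attacker's own decision problem, which this sketch takes as given.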

  17. A Course of Instruction in Risk Analysis.

    DTIC Science & Technology

    Contents: Risk analysis course schedule; Problems and perspectives - an introduction to a course of instruction in risk analysis; Analytical techniques; Overview of the process of risk analysis; Network analysis; RISCA: USALMC’s network analyzer program; Case studies in risk analysis; Armored vehicle launched bridge (AVLB); Micom-air defense missile warhead/fuze subsystem performance; Helicopter performance risk analysis; High performance fuze

  18. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

    In some cancer clinical studies, researchers have an interest in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. This methodology is quite flexible in that it can incorporate both a semiparametric method using the Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development in a retrospective clinical study on oropharyngeal cancer patients. PMID:27486300

  19. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.

  20. Psychiatrists' follow-up of identified metabolic risk: a mixed-method analysis of outcomes and influences on practice

    PubMed Central

    Patterson, Sue; Freshwater, Kathleen; Goulter, Nicole; Ewing, Julie; Leamon, Boyd; Choudhary, Anand; Moudgil, Vikas; Emmerson, Brett

    2016-01-01

    Aims and method To describe and explain psychiatrists' responses to metabolic abnormalities identified during screening. We carried out an audit of clinical records to assess rates of monitoring and follow-up practice. Semi-structured interviews with 36 psychiatrists followed by descriptive and thematic analyses were conducted. Results Metabolic abnormalities were identified in 76% of eligible patients screened. Follow-up, recorded for 59%, was variable but more likely with four or more abnormalities. Psychiatrists endorse guidelines but ambivalence about responsibility, professional norms, resource constraints and skills deficits as well as patient factors influences practice. Therapeutic optimism and desire to be a ‘good doctor’ supported comprehensive follow-up. Clinical implications Psychiatrists are willing to attend to physical healthcare, and obstacles to recommended practice are surmountable. Psychiatrists seek consensus among stakeholders about responsibilities and a systemic approach addressing the social determinants of health inequities. Understanding patients' expectations is critical to promoting best practice. PMID:27752343

  1. Sexual and injection-related risks in Puerto Rican-born injection drug users living in New York City: A mixed-methods analysis

    PubMed Central

    2011-01-01

    Background These data were collected as part of the National HIV Behavioral Surveillance (NHBS) study. NHBS is a cross-sectional study to investigate HIV behavioral risks among core risk groups in 21 U.S. cities with the highest HIV/AIDS prevalence. This analysis examines data from the NHBS data collection cycle with IDU conducted in New York City in 2009. We explored how the recency of migration from Puerto Rico (PR) to New York City (NYC) impacts both syringe sharing and unprotected sex among injection drug users (IDU) currently living in NYC. Methods We used a mixed-methods approach to examine differences in risk between US-born IDU, PR IDU who migrated to NYC more than three years ago (non-recent migrants), and PR IDU who migrated in the last three years (recent migrants). Respondent-driven sampling (RDS) was used to recruit the sample (n = 514). In addition, qualitative individual and group interviews with recent PR migrants (n = 12) and community experts (n = 2) allowed for an in-depth exploration of the IDU migration process and the material and cultural factors behind continued risk behaviors in NYC. Results In multiple logistic regression controlling for confounding factors, recent migrants were significantly more likely to report unprotected sexual intercourse with casual or exchange partners (adjusted odds ratio [AOR]: 2.81; 95% confidence intervals [CI]: 1.37-5.76) and receptive syringe sharing (AOR = 2.44; 95% CI: 1.20-4.97) in the past year, compared to US-born IDU. HIV and HCV seroprevalence were highest among non-recent migrants. Qualitative results showed that risky injection practices are partly based on cultural norms acquired while injecting drugs in Puerto Rico. These same results also illustrate how homelessness influences risky sexual practices. Conclusions Poor material conditions (especially homelessness) may be key in triggering risky sexual practices. Cultural norms (ingrained while using drugs in PR) around injection drug use are
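    The adjusted odds ratios reported above come from multiple logistic regression; as a minimal numeric illustration of the underlying measure, a crude (unadjusted) odds ratio and Wald 95% confidence interval can be computed from a 2x2 table. The counts here are hypothetical, not the study's data:

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude odds ratio with a Wald 95% CI on the log-odds scale.
        a, b = exposed with/without outcome; c, d = unexposed with/without."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
        lo = math.exp(math.log(or_) - z * se)
        hi = math.exp(math.log(or_) + z * se)
        return or_, lo, hi

    # Hypothetical counts: recent migrants reporting syringe sharing vs. not,
    # against US-born IDU reporting syringe sharing vs. not.
    or_, lo, hi = odds_ratio_ci(30, 20, 40, 70)
    print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```

    A CI excluding 1.0, as in both intervals reported in the abstract, indicates an association unlikely to be due to chance alone; the study's figures additionally adjust for confounders.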

  2. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about
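    A minimal sketch of the comparison logic, under our simplifying assumption (not the authors') that each type of harm is summarized by a likelihood and a magnitude on a common scale: a research risk counts as minimal when, for every harm type, neither element exceeds its daily-life comparator.

    ```python
    # Hypothetical sketch of direct comparative analysis: compare research risks
    # element-by-element (type, likelihood, magnitude) against daily-life risks.

    def is_minimal(research_risks, daily_life_risks):
        """True if every research harm type has a daily-life counterpart whose
        likelihood and magnitude are at least as great."""
        for harm, (likelihood, magnitude) in research_risks.items():
            if harm not in daily_life_risks:
                return False                      # no comparable daily-life risk
            dl_likelihood, dl_magnitude = daily_life_risks[harm]
            if likelihood > dl_likelihood or magnitude > dl_magnitude:
                return False
        return True

    # Illustrative numbers only: anxiety from a CO2 challenge vs. a school exam,
    # as (likelihood, magnitude on a 1-5 scale).
    research = {"anxiety": (0.30, 2)}
    daily    = {"anxiety": (0.40, 3)}
    print(is_minimal(research, daily))   # within the risks of daily life
    ```

    The real analysis is qualitative and normative; the point of the sketch is only that both risks decompose into the same basic elements, so they can be compared element by element.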

  3. Nano risk analysis: advancing the science for nanomaterials risk management.

    PubMed

    Shatkin, Jo Anne; Abbott, Linda Carolyn; Bradley, Ann E; Canady, Richard Alan; Guidotti, Tee; Kulinowski, Kristen M; Löfstedt, Ragnar E; Louis, Garrick; MacDonell, Margaret; Maynard, Andrew D; Paoli, Greg; Sheremeta, Lorraine; Walker, Nigel; White, Ronald; Williams, Richard

    2010-11-01

    Scientists, activists, industry, and governments have raised concerns about health and environmental risks of nanoscale materials. The Society for Risk Analysis convened experts in September 2008 in Washington, DC to deliberate on issues relating to the unique attributes of nanoscale materials that raise novel concerns about health risks. This article reports on the overall themes and findings of the workshop, uncovering the underlying issues for each of these topics that become recurring themes. The attributes of nanoscale particles and other nanomaterials that present novel issues for risk analysis are evaluated in a risk analysis framework, identifying challenges and opportunities for risk analysts and others seeking to assess and manage the risks from emerging nanoscale materials and nanotechnologies. Workshop deliberations and recommendations for advancing the risk analysis and management of nanotechnologies are presented.

  4. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of these arenas currently being investigated involves collaborative studies which illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU from those in the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper will address the relative progress of new risk assessment methodologies, such as fuzzy logic, within the framework of current risk assessment methods at these two institutes.

  5. Draft Waste Management Programmatic Environmental Impact Statement for managing treatment, storage, and disposal of radioactive and hazardous waste. Volume 3, Appendix A: Public response to revised NOI, Appendix B: Environmental restoration, Appendix C, Environmental impact analysis methods, Appendix D, Risk

    SciTech Connect

    1995-08-01

    Volume three contains appendices for the following: Public comments on DOE's proposed revisions to the scope of the waste management programmatic environmental impact statement; Environmental restoration sensitivity analysis; Environmental impact analysis methods; and Waste management facility human health risk estimates.

  6. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions of the command process. These models include simulation analysis and probabilistic risk assessment models.

  7. Comparison of Intake Gate Closure Methods At Lower Granite, Little Goose, Lower Monumental, And McNary Dams Using Risk-Based Analysis

    SciTech Connect

    Gore, Bryan F; Blackburn, Tye R; Heasler, Patrick G; Mara, Neil L

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.

  9. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based on the materials behavior level where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that the method is mature and that it can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  10. [Study on the risk assessment method of regional groundwater pollution].

    PubMed

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, comprising regional groundwater specific vulnerability assessment, regional pollution source characteristics assessment, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with a multi-index comprehensive method, the risk was characterized with the Spatial Analysis tools of ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for differing natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. It was found that the vulnerability index of groundwater in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risks; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high, unevenly distributed, and concentrated north of the Anjia-Xuejia-Zhenglu line, in the city center, and in the southeast, where human activities are more intense and pollution sources are dense.
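    The coupling of the three sub-evaluation systems by a multi-index comprehensive method might look like the following sketch; the weights and per-cell scores are illustrative assumptions, not values from the study.

    ```python
    # Hypothetical multi-index coupling: three normalized sub-indices (specific
    # vulnerability, pollution-source characteristics, health risk of featured
    # pollutants) combine into one pollution-risk score per grid cell.

    def pollution_risk(vulnerability, source, health, weights=(0.4, 0.3, 0.3)):
        """Weighted sum of normalized sub-indices, each in [0, 1].
        The weights are assumed, not taken from the paper."""
        wv, ws, wh = weights
        return wv * vulnerability + ws * source + wh * health

    # Scores for three illustrative grid cells of a study area.
    cells = {
        "urban_core": pollution_risk(0.8, 0.9, 0.7),
        "suburb":     pollution_risk(0.5, 0.4, 0.3),
        "farmland":   pollution_risk(0.6, 0.2, 0.2),
    }
    print(max(cells, key=cells.get))   # cell with the highest composite risk
    ```

    In the actual study this per-cell score would be mapped spatially (with ArcMap) to expose the uneven distribution of risk the abstract describes.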

  11. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing the risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to the time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess the risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power-law dependence on frequency over a selected frequency range, such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure fluctuations in the intervals are used to assess the risk of an adverse clinical event.
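    The power-law fit described here reduces to a linear regression of log power on log frequency over the selected band, with the slope giving the exponent. A minimal sketch using a synthetic spectrum (not patient data), with the band limits taken from the frequency range quoted above:

    ```python
    import math

    def power_law_slope(freqs, power, f_lo=1e-4, f_hi=1e-2):
        """Least-squares fit of log10(power) vs log10(frequency) over
        [f_lo, f_hi]; returns the exponent beta in P(f) ~ f**beta."""
        pts = [(math.log10(f), math.log10(p))
               for f, p in zip(freqs, power) if f_lo <= f <= f_hi and p > 0]
        n = len(pts)
        sx = sum(x for x, _ in pts); sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts); sxy = sum(x * y for x, y in pts)
        return (n * sxy - sx * sy) / (n * sxx - sx * sx)

    # Synthetic spectrum with a known 1/f shape (beta = -1), log-spaced
    # across the band of interest.
    freqs = [10 ** (-4 + 2 * k / 99) for k in range(100)]   # 1e-4 .. 1e-2 Hz
    power = [1.0 / f for f in freqs]                        # P(f) = f**-1
    beta = power_law_slope(freqs, power)
    print(round(beta, 3))
    ```

    In practice the spectrum would first be estimated from the interbeat-interval series over a record of at least fifteen minutes; how the fitted exponent maps to risk is the substance of the patented method and is not reproduced here.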

  12. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  13. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  14. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  15. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though at times we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows: in Section 2 we provide context for this report

  16. Medicare's risk-adjusted capitation method.

    PubMed

    Grimaldi, Paul L

    2002-01-01

    Since 1997, the method used to establish capitation rates for Medicare beneficiaries who are members of risk-bearing managed care plans has undergone several important developments, including the factoring of beneficiary health status into the rate-setting calculations. These changes were expected to increase the number of participating health plans, accelerate Medicare enrollment growth, and slow Medicare spending.

  17. Recent methods for assessing osteoporosis and fracture risk.

    PubMed

    Imai, Kazuhiro

    2014-01-01

    In the management and treatment of osteoporosis, the target is to assess fracture risk and the end-point is to prevent fractures. Traditionally, measurement of bone mineral density (BMD) by dual energy X-ray absorptiometry (DXA) has been the standard method for diagnosing osteoporosis, in addition to assessing fracture risk and therapeutic effects. Quantitative computed tomography (QCT) can quantify volumetric BMD, and cancellous bone can be measured independently of surrounding cortical bone and aortic calcification. Hip structure analysis (HSA) is a method using the DXA scan image and provides useful data for assessing hip fracture risk. Recently, new tools to assess osteoporosis and fracture risk have been developed. One of the recent advances has been the development of the FRAX (Fracture Risk Assessment Tool), which is helpful in conveying fracture risk to patients and providing treatment guidance to clinicians. Another advance is the finite element (FE) method based on data from computed tomography (CT), which is useful for assessing bone strength, fracture risk, and therapeutic effects on osteoporosis. In selecting the most appropriate drug for osteoporosis treatment, assessment by bone metabolic markers is an important factor. In this review, recent patents for assessing osteoporosis and fracture risk are discussed.

  18. General Risk Analysis Methodological Implications to Explosives Risk Management Systems,

    DTIC Science & Technology

    An investigation sponsored by the National Science Foundation has produced, as one of its results, a survey and evaluation of risk analysis methodologies. This paper presents some implications of the survey for risk analysis and decision making for explosives hazards such as may ultimately be

  19. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used, and the resulting hazard levels are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  20. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land use/land cover) are different from those of natural areas, altered interactions between the surface and atmosphere (especially the urban boundary layer, UBL) or between surface and subsurface can affect hydrologic behavior and hence flood hazards. In this research we focus on three main aspects of urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be mathematically described in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these somewhat disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network while scrutinizing energy exchange across places. It can extend our understanding not only of the influence of cities on local climate at rural and larger scales, but also of the interactions through which cities and nature affect each other.

  1. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply as the multifunctional representation. The structural component is modeled by finite elements. The structural responses are obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also about 0.0001.
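
    The multifactor interaction model and its probabilistic evaluation can be sketched as follows. This is a minimal illustration, not the NASA code: the single-variable MFIM form, the strength and temperature numbers, and the 1000 MPa demand level are all assumptions made for demonstration.

```python
import numpy as np

def mfim_property(p0, t, t0, tf, n):
    """One common form of a multifactor interaction model (MFIM):
    the property P degrades from its reference value P0 as the
    variable t moves from the reference t0 toward the limit tf."""
    return p0 * ((tf - t) / (tf - t0)) ** n

rng = np.random.default_rng(42)

# Illustrative values only: ply strength degraded by temperature.
p0 = 1500.0           # reference strength (MPa)
t0, tf = 20.0, 300.0  # reference and limit temperatures (deg C)
n = 0.5               # empirical exponent

# Probabilistic evaluation: sample the uncertain operating
# temperature and propagate it through the model by Monte Carlo.
t = rng.normal(150.0, 15.0, 100_000)
samples = mfim_property(p0, t, t0, tf, n)

# Probability that strength falls below an assumed 1000 MPa demand.
risk = float(np.mean(samples < 1000.0))
print(round(float(np.mean(samples)), 1), round(risk, 4))
```

    Replacing the single temperature term with a product of such terms (stress, cycles, moisture, etc.) recovers the multifunctional character described in the abstract.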

  2. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), with the properties of each ply as the multifunctional representation. The structural component is modeled by finite elements. The structural responses are obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also about 0.0001.

  3. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure-process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust-subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factor in attaining high mission reliability. It is also shown, however, that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
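
    A multi-stage failure-process simulation of this kind can be sketched with Monte Carlo sampling. All numbers below (thruster count, failure rate, number of burn arcs, tolerance for degraded arcs) are invented for illustration and do not reproduce the 1973 study's data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical electric-propulsion stage with redundant thrusters
# flying a sequence of burn arcs.
n_thrusters = 8       # installed thrusters
n_required = 6        # thrusters needed for full thrust
failure_rate = 0.02   # per-thruster failure probability per burn arc
n_arcs = 10           # burn arcs in the mission profile
n_trials = 100_000

working = np.full((n_trials, n_thrusters), True)
degraded_arcs = np.zeros(n_trials)

for _ in range(n_arcs):
    # Thruster failures are permanent in this toy model.
    working &= rng.random((n_trials, n_thrusters)) > failure_rate
    # Arcs flown below full thrust must be compensated by
    # retargeting (trajectory redesign).
    degraded_arcs += working.sum(axis=1) < n_required

# The mission "fails" here if more than 2 arcs were degraded --
# beyond what retargeting can absorb in this illustration.
mission_risk = float(np.mean(degraded_arcs > 2))
print(round(mission_risk, 4))
```

    Swapping the success criterion or the failure-rate inputs is how such a simulation supports the trade studies the abstract describes.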

  4. Modified risk evaluation method. Revision 1

    SciTech Connect

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection.

  5. Risk factors analysis of consecutive exotropia

    PubMed Central

    Gong, Qianwen; Wei, Hong; Zhou, Xu; Li, Ziyuan; Liu, Longqian

    2016-01-01

    Abstract To evaluate clinical factors associated with the onset of consecutive exotropia (XT) following esotropia surgery. Using a retrospective nested case-control design, we reviewed the medical records of 193 patients who had undergone initial esotropia surgery between 2008 and 2015 and had follow-up longer than 6 months. The probable risk factors were evaluated between group 1 (consecutive XT) and group 2 (non-consecutive exotropia). The Pearson chi-square test and Mann–Whitney U test were used for univariate analysis, and a conditional logistic regression model was applied to explore the potential risk factors for consecutive XT. Consecutive exotropia occurred in 23 (11.9%) of 193 patients. Patients who had undergone large bilateral medial rectus recession (BMR) (P = 0.017) had a high risk of developing consecutive XT. Oblique dysfunction (P = 0.001) and adduction limitation (P = 0.000) were associated with a high risk of consecutive XT, which was confirmed in the conditional logistic regression analysis. In addition, a large amount of BMR (6 mm or more) was associated with a higher incidence of adduction limitation (P = 0.045). The surgical methods and preoperative factors did not appear to influence the risk of developing consecutive XT (P > 0.05). The amount of surgery could be optimized to reduce the risk of consecutive XT. The presence of oblique overaction and postoperative adduction limitation may be associated with a high risk of consecutive XT, which may require close supervision and/or even earlier surgical intervention. PMID:27977611

  6. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail technology has become one of the necessities of modern life for correspondence between individuals. Given this, it is important that e-mail messages, servers, and clients, and the correspondence exchanged between different people, have acceptable security, so that people can use this technology with confidence. In the information age, many financial and non-financial transactions are carried out electronically and data exchange takes place via the internet, where theft and manipulation of data can impose exorbitant costs in terms of integrity, finance, politics, economics, and culture. E-mail correspondence is no exception, and its security is very important. Our review found that no method focusing on risk assessment of e-mail systems has been provided. We examine assessment methods for other systems, along with their strengths and weaknesses, and then adapt Convery's method for assessing network risks to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  7. Loss Exposure and Risk Analysis Methodology (LERAM) Project Database Design.

    DTIC Science & Technology

    1996-06-01

    MISREPS) to more capably support system safety engineering concepts such as hazard analysis and risk management. As part of the Loss Exposure and Risk Analysis Methodology (LERAM) project, the research into the methods which we employ to report, track, and analyze hazards has resulted in a series of low

  8. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad-scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternatives, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be presented in rolled-up form to support high-level strategy decisions.

  9. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large-scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  10. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  11. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
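
    The Monte Carlo side of such a spreadsheet exercise can be reproduced in a few lines. The investment figures, distributions, and discount rate below are hypothetical, chosen only to show the mechanics of sampling an NPV distribution and reading off risk metrics.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical investment: uncertain capital cost and annual cash
# flow drive the net present value (NPV).
capital = rng.triangular(9e6, 10e6, 12e6, n)   # $, skewed upward
cash_flow = rng.normal(2.0e6, 0.4e6, n)        # $/yr
years, rate = 10, 0.10

# Discount a level annual cash flow (ordinary annuity factor).
annuity = (1 - (1 + rate) ** -years) / rate
npv = cash_flow * annuity - capital

# Risk metrics: probability of loss and the 5th-percentile NPV.
p_loss = float(np.mean(npv < 0))
npv_p5 = float(np.percentile(npv, 5))
print(round(p_loss, 3), round(npv_p5 / 1e6, 2))
```

    The analytical counterpart discussed in the article would propagate the same distributions through moment formulas instead of sampling.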

  12. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz the 1990 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm.
Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
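
    The tail-conditional measure f(4) can be illustrated numerically. The sketch below uses simulated normal returns and a 5th-percentile partition point, both assumptions made for demonstration; in the PMRM the partition point is chosen according to decision-maker-specified probabilities, and a real analysis would use historical return data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated daily portfolio returns (hypothetical: 0.05% mean,
# 1% volatility).
returns = rng.normal(0.0005, 0.01, 500_000)

# Partition the return distribution at a lower-tail threshold and
# take the conditional expectation below it.  This tail-conditional
# expectation plays the role of the PMRM's extreme-risk measure f(4).
threshold = float(np.percentile(returns, 5))
f4 = float(returns[returns <= threshold].mean())

print(round(threshold, 4), round(f4, 4))
```

    Note that f(4) is strictly worse (more negative) than the partition point itself, which is why it captures extreme-event severity that volatility alone misses.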

  13. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent 7.0 M earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences but in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. Particularly in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability aligns with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquakes, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a seismic intensity map and contemplates only the resident population from the census as a proxy for human exposure.
In the present work we map and analyze the spatio-temporal distribution of

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and with it the plant, to stop operating, and can furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
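
    A semi-quantitative ranking of this kind can be sketched as a probability-consequence matrix. The band boundaries below are invented for illustration; API 581 defines its own category boundaries.

```python
# Sketch of a semi-quantitative risk matrix in the style of API 581.
# Probability is ranked 1-5 and consequence A-E; a cell such as "4C"
# combines the two rankings.

PROB_ORDER = "12345"   # 1 = lowest likelihood, 5 = highest
CONS_ORDER = "ABCDE"   # A = lowest consequence, E = highest

def risk_level(prob: str, cons: str) -> str:
    """Map a (probability, consequence) cell to a qualitative level."""
    score = PROB_ORDER.index(prob) + CONS_ORDER.index(cons)  # 0..8
    if score <= 2:
        return "low"
    if score <= 4:
        return "medium"
    if score <= 6:
        return "medium-high"
    return "high"

# HRSG components from the abstract, with their assessed cells.
for name, cell in [("HP superheater", "4C"),
                   ("HP evaporator", "4C"),
                   ("HP economizer", "3C")]:
    print(name, cell, risk_level(cell[0], cell[1]))
```

    With these illustrative band boundaries the cells reproduce the qualitative levels reported in the abstract (4C medium-high, 3C medium).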

  15. Improving causal inferences in risk analysis.

    PubMed

    Cox, Louis Anthony Tony

    2013-10-01

    Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data-driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi-experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change-point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure-specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure-induced health effects, helping to overcome pervasive false-positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health.
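
    A minimal version of the Granger-causality test mentioned above can be written directly with least squares: does adding lags of a hypothesized cause x reduce the residual error in predicting y beyond what y's own lags achieve? The synthetic data (x driving y with a one-step lag) is an assumption made for demonstration.

```python
import numpy as np

def granger_f(y, x, lags):
    """F-statistic testing whether lagged x helps predict y beyond
    lagged y (a minimal bivariate Granger-causality test via OLS)."""
    n = len(y)
    rows = n - lags
    Y = y[lags:]
    # Restricted design: intercept + lags of y only.
    Xr = np.column_stack([np.ones(rows)] +
                         [y[lags - k: n - k] for k in range(1, lags + 1)])
    # Unrestricted design: also include lags of x.
    Xu = np.column_stack([Xr] +
                         [x[lags - k: n - k] for k in range(1, lags + 1)])
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df_u = rows - Xu.shape[1]
    return (rss_r - rss_u) / lags / (rss_u / df_u)

rng = np.random.default_rng(3)
n = 2000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):          # x drives y with a one-step delay
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()

f_xy = float(granger_f(y, x, lags=1))   # large: x Granger-causes y
f_yx = float(granger_f(x, y, lags=1))   # small: y does not cause x
print(round(f_xy, 1), round(f_yx, 2))
```

    Comparing the statistic against an F distribution gives the p-value; the asymmetry of the two statistics is what distinguishes predictive direction from mere association.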

  16. Socioeconomic Considerations in Dam Safety Risk Analysis.

    DTIC Science & Technology

    1987-06-01

    The analytical review and summary critique of literature related to risk analysis was conducted for the purpose of highlighting those ideas, concepts...alternative solutions. The critique of the philosophical and analytical bases of risk analysis was further directed toward the specific problem of dam safety risk analysis. Dam safety is unique in that it represents an extreme situation characteristic of low probability/high consequence event

  17. A comparison of radiological risk assessment methods for environmental restoration

    SciTech Connect

    Dunning, D.E. Jr.; Peterson, J.M.

    1993-09-01

    Evaluation of risks to human health from exposure to ionizing radiation at radioactively contaminated sites is an integral part of the decision-making process for determining the need for remediation and selecting remedial actions that may be required. At sites regulated under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a target risk range of 10^-4 to 10^-6 incremental cancer incidence over a lifetime is specified by the US Environmental Protection Agency (EPA) as generally acceptable, based on the reasonable maximum exposure to any individual under current and future land use scenarios. Two primary methods currently being used in conducting radiological risk assessments at CERCLA sites are compared in this analysis. Under the first method, the radiation dose equivalent (i.e., Sv or rem) to the receptors of interest over the appropriate period of exposure is estimated and multiplied by a risk factor (cancer risk/Sv). Alternatively, incremental cancer risk can be estimated by combining the EPA's cancer slope factors (previously termed potency factors) for radionuclides with estimates of radionuclide intake by ingestion and inhalation, as well as radionuclide concentrations in soil that contribute to external dose. The comparison of the two methods has demonstrated that resulting estimates of lifetime incremental cancer risk under these different methods may differ significantly, even when all other exposure assumptions are held constant, with the magnitude of the discrepancy depending upon the dominant radionuclides and exposure pathways for the site. The basis for these discrepancies, the advantages and disadvantages of each method, and the significance of the discrepant results for environmental restoration decisions are presented.
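
    The arithmetic of the two methods can be contrasted with a toy calculation. Every coefficient below is invented (not a regulatory dose factor or slope factor); the point is only that the two computational routes need not agree even with consistent exposure inputs.

```python
# Illustrative comparison of the two CERCLA risk-assessment methods
# described above, with made-up numbers.

# Method 1: dose equivalent x risk factor.
annual_dose_sv = 2.0e-4          # assumed committed dose (Sv/yr)
years_exposed = 30
risk_per_sv = 5.0e-2             # assumed cancer-risk factor (1/Sv)
risk_method1 = annual_dose_sv * years_exposed * risk_per_sv

# Method 2: intake x slope factor, summed over pathways.
intake_bq = {"ingestion": 1.0e4, "inhalation": 2.0e3}  # lifetime Bq
slope_risk_per_bq = {"ingestion": 3.0e-8, "inhalation": 9.0e-8}
risk_method2 = sum(intake_bq[p] * slope_risk_per_bq[p] for p in intake_bq)

# The two estimates differ even though both describe the same site.
print(f"{risk_method1:.1e} {risk_method2:.1e} "
      f"ratio={risk_method1 / risk_method2:.2f}")
```

    In practice the discrepancy arises because dose-to-risk factors and pathway-specific slope factors embed different dosimetric and epidemiological assumptions, as the abstract notes.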

  18. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  19. A Comparison of Disease Risk Analysis Tools for Conservation Translocations.

    PubMed

    Dalziel, Antonia Eleanor; Sainsbury, Anthony W; McInnes, Kate; Jakob-Hoff, Richard; Ewen, John G

    2017-03-01

    Conservation translocations are increasingly used to manage threatened species and restore ecosystems. Translocations increase the risk of disease outbreaks in the translocated and recipient populations. Qualitative disease risk analyses have been used as a means of assessing the magnitude of any effect of disease and the probability of the disease occurring associated with a translocation. Currently, multiple alternative qualitative disease risk analysis packages are available to practitioners. Here we compare the ease of use, expertise required, transparency, and results from three different qualitative disease risk analyses, using a translocation of the endangered New Zealand passerine, the hihi (Notiomystis cincta), as a model. We show that the three methods use fundamentally different approaches to define hazards. Different methods are used to produce estimations of the risk from disease, and the estimations differ for the same hazards. Transparency of the process varies between methods, from no referencing or explanation of the evidence used to justify decisions, through to full documentation of resources, decisions, and assumptions made. Evidence to support decisions on estimation of risk from disease is important, so that knowledge acquired in the future, for example from translocation outcomes, can be used to improve risk estimation for future translocations. The information documenting each disease risk analysis differs, along with the emphasis of the questions asked within each package. The expertise required to commence a disease risk analysis varies, and an action flow chart tailored for the non-wildlife health specialist is included in one method, but completion of the disease risk analysis requires wildlife health specialists with epidemiological and pathological knowledge in all three methods.
We show that disease risk analysis package choice may play a greater role in the overall risk estimation of the effect of disease on animal populations

  20. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  1. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are global physical hazards with negative environmental and socio-economic impacts at local and regional scales. The technological evolution of recent decades, especially in the field of geoinformatics, has offered new advantages to hydrological modelling. This study seeks to use this technology in order to quantify flood risk. The study area is an ungauged catchment; using mainly GIS hydrological and geomorphological analysis together with a GIS-based distributed unit hydrograph model, a series of outcomes has been produced. More specifically, this paper examines the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was performed using a digital elevation model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to flooding. Unit hydrographs are, as is well known, useful when there is a lack of data, and in this work, based on the time-area method, a sequence of flood risk assessments was made using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
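
    The time-area routing underlying such a GIS unit-hydrograph model can be sketched as a convolution: the time-area histogram gives the contributing area in each travel-time interval, and outflow is rainfall excess convolved with that histogram. The histogram and rainfall values below are invented, not the Kladeos data.

```python
import numpy as np

# Contributing area (km^2) per 1-hour travel-time interval, and
# rainfall excess (mm) per 1-hour storm step -- illustrative only.
area_km2 = np.array([2.0, 5.0, 8.0, 6.0, 3.0])
rain_excess_mm = np.array([4.0, 10.0, 6.0])

# 1 mm of excess over 1 km^2 = 1000 m^3; spread over 3600 s
# converts the convolution to a discharge in m^3/s.
unit = 1000.0 / 3600.0
discharge = np.convolve(rain_excess_mm, area_km2) * unit  # m^3/s

peak = float(discharge.max())
print(np.round(discharge, 1), round(peak, 1))
```

    In the GIS workflow the histogram comes from the DEM-derived travel-time map, so the same convolution can be re-run for any hypothetical storm.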

  2. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  3. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  4. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  5. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  6. Critical review of methods for risk ranking of food related hazards, based on risks for human health.

    PubMed

    van der Fels-Klerx, H J; van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'Agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2016-02-08

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food science, environmental science, and the socio-economic sciences. The review used a predefined search protocol and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and the risk ranking methods characterized. The methods were then clustered, based on their characteristics, into eleven method categories: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health-adjusted life years, multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques, and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.
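
    As one concrete instance, the "scoring method" category can be sketched as a weighted-criteria ranking. The hazards, criteria, scores, and weights below are all invented for illustration.

```python
# Scoring method sketch: each hazard is scored on weighted criteria
# and ranked by total score.

criteria_weights = {"severity": 0.5, "incidence": 0.3, "dread": 0.2}

hazards = {
    "Hazard A": {"severity": 4, "incidence": 2, "dread": 3},
    "Hazard B": {"severity": 2, "incidence": 5, "dread": 1},
    "Hazard C": {"severity": 3, "incidence": 4, "dread": 4},
}

def total_score(scores):
    """Weighted sum of the criterion scores for one hazard."""
    return sum(criteria_weights[c] * scores[c] for c in criteria_weights)

ranking = sorted(hazards, key=lambda h: total_score(hazards[h]),
                 reverse=True)
for h in ranking:
    print(h, round(total_score(hazards[h]), 2))
```

    The review's point holds even in this toy: the ranking depends entirely on the chosen criteria and weights, which is why method selection must match the risk manager's requirements.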

  7. Resource allocation using risk analysis

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2003-01-01

    Allocating limited resources among competing priorities is an important problem in management. In this paper we describe an approach to resource allocation using risk as a metric. We call this approach the Logic-Evolved Decision (LED) approach because we use logic models to generate an exhaustive set of competing options and to describe the often highly complex model used for evaluating the risk reduction achieved by different resource allocations among these options. The risk evaluation then proceeds using probabilistic or linguistic input data.
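
    A simple numerical stand-in for the allocation step is a greedy choice by risk reduction per unit cost. The options and numbers below are invented, and the greedy heuristic is a deliberate simplification of the LED approach, which uses logic models and can take linguistic inputs.

```python
# Toy resource-allocation sketch: choose among competing options by
# risk reduction per unit cost under a fixed budget.

options = [
    ("upgrade fire suppression", 40, 12.0),  # (name, cost, reduction)
    ("harden control room",      70, 15.0),
    ("train response team",      20,  8.0),
    ("replace aging valves",     50, 10.0),
]
budget = 100

# Greedy heuristic: best risk reduction per unit cost first.
chosen, spent, reduction = [], 0, 0.0
for name, cost, dr in sorted(options, key=lambda o: o[2] / o[1],
                             reverse=True):
    if spent + cost <= budget:
        chosen.append(name)
        spent += cost
        reduction += dr

print(chosen, spent, reduction)
```

    Note the greedy pick can leave budget unspent and is not guaranteed optimal; an exhaustive (knapsack-style) search over the option set is what a logic-model enumeration would enable.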

  8. Selecting high-risk micro-scale enterprises using a qualitative risk assessment method.

    PubMed

    Kim, Hyunwook; Park, Dong-Uk

    2006-01-01

    Micro-scale enterprises (MSEs) with fewer than 5 employees have been covered by the scheme of regular workplace environmental inspection and medical health examination since 2002 in Korea. Due to limited resources as well as the vast number of enterprises to be covered, there is an urgent need to focus these efforts on only those high-risk MSEs. To identify them, a qualitative risk assessment methodology was developed combining the hazardous nature of chemicals and exposure potentials as modeled by the HSE with the risk categorization technique of the AIHA. A Risk Index (RI) was determined by combining characteristics specific to chemicals and the scale of use of the chemicals. The method was applied to 514 MSEs that were selected from a random sample of 4000 MSEs. A total of 170 of the 514 MSEs studied were included in the final analysis. The current status and characteristics of MSEs were identified and an RI was assigned to chemicals in each industry. Based on the distribution of RIs, the high-risk MSEs were selected. These include: wood and products of wood, chemicals and chemical products, basic metals, other machinery and equipment, motor vehicles, trailer and semi-trailer manufacturing, and furniture manufacturing. Since these MSEs are high-risk ones, more attention should be focused on them. This method can be applied to other workplaces with no previous history of quantitative workplace inspections.
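    The RI combination described above can be sketched as a band product. The abstract does not give the scoring rules, so the band values, the multiplication rule, and the high-risk threshold below are all assumptions for illustration.

```python
# Hypothetical sketch of a qualitative Risk Index: a hazard band (from chemical
# properties) combined with a scale-of-use band. Band values and threshold are
# invented, not the paper's actual scoring scheme.
HAZARD_BAND = {"low": 1, "medium": 2, "high": 3}
SCALE_BAND = {"small": 1, "moderate": 2, "large": 3}

def risk_index(hazard: str, scale: str) -> int:
    """Combine hazard and scale-of-use bands into a single Risk Index."""
    return HAZARD_BAND[hazard] * SCALE_BAND[scale]

def is_high_risk(hazard: str, scale: str, threshold: int = 6) -> bool:
    """Flag an enterprise as high risk when its RI reaches the threshold."""
    return risk_index(hazard, scale) >= threshold
```

    Ranking enterprises by the distribution of such an index is what lets inspection effort concentrate on the high-risk tail, as the study does for industries.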

  9. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  10. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on the one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  11. Revised Methods for Worker Risk Assessment

    EPA Pesticide Factsheets

    EPA is updating and changing the way it approaches pesticide risk assessments. This new approach will result in more comprehensive and consistent evaluation of potential risks of food use pesticides, non-food use pesticides, and occupational exposures.

  12. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
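    The propagation step described above can be sketched with a toy Monte Carlo model: sample uncertain technical and financial factors, compute a net return, and measure risk as the variability of the returns. The distributions and the profit formula below are invented for illustration, not the CCSI prototype's actual factors or calibration.

```python
# Illustrative sketch only: propagate uncertainty in a few assumed factors
# through a toy profitability model and report mean and variability.
import random
import statistics

random.seed(0)  # reproducible sampling

def net_return():
    capture_efficiency = random.uniform(0.85, 0.95)  # assumed technical factor
    capital_cost = random.gauss(1000.0, 100.0)       # assumed $/kW financial factor
    carbon_price = random.uniform(20.0, 60.0)        # assumed $/tonne
    revenue = carbon_price * capture_efficiency * 30.0  # toy revenue term
    return revenue - 0.08 * capital_cost                # toy annualized cost

samples = [net_return() for _ in range(10_000)]
expected = statistics.mean(samples)   # expected profitability
risk = statistics.stdev(samples)      # variability of net returns as the risk metric
```

    In the prototype this variability is what distinguishes two investments with equal expected return but different exposure to uncertainty.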

  13. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico

    PubMed Central

    Goldenberg, Shira; Strathdee, Steffanie A.; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J.; Patterson, Thomas L.

    2011-01-01

    In 2008, 400 males ≥ 18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental HIV vulnerability among male clients of FSWs in Tijuana, Mexico, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients’ perspectives on venue-based risks. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients’ narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875

  14. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico.

    PubMed

    Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L

    2011-05-01

    In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks.

  15. Dealing with Uncertainty in Chemical Risk Analysis

    DTIC Science & Technology

    1988-12-01

    Dealing with Uncertainty in Chemical Risk Analysis. Thesis by David S. Clement, Captain, USAF (AFIT/GOR/MA/88D-2). Approved for public release; distribution unlimited.

  16. Risk analysis of colorectal cancer incidence by gene expression analysis

    PubMed Central

    Shangkuan, Wei-Chuan; Lin, Hung-Che; Chang, Yu-Tien; Jian, Chen-En; Fan, Hueng-Chuen; Chen, Kang-Hua; Liu, Ya-Fang; Hsu, Huan-Ming; Chou, Hsiu-Ling; Yao, Chung-Tay

    2017-01-01

    Background Colorectal cancer (CRC) is one of the leading cancers worldwide. Several studies have performed microarray data analyses for cancer classification and prognostic analyses. Microarray assays also enable the identification of gene signatures for molecular characterization and treatment prediction. Objective Microarray gene expression data from the online Gene Expression Omnibus (GEO) database were used to distinguish colorectal cancer from normal colon tissue samples. Methods We collected microarray data from the GEO database to establish colorectal cancer microarray gene expression datasets for a combined analysis. Using the Prediction Analysis for Microarrays (PAM) method and the GSEA MSigDB resource, we analyzed the 14,698 genes that were identified through an examination of their expression values between normal and tumor tissues. Results Ten genes (ABCG2, AQP8, SPIB, CA7, CLDN8, SCNN1B, SLC30A10, CD177, PADI2, and TGFBI) were found to be good indicators of the candidate genes that correlate with CRC. From these selected genes, an average of six significant genes were obtained using the PAM method, with an accuracy rate of 95%. The results demonstrate the potential of utilizing a model with the PAM method for data mining. After a detailed review of the published reports, the results confirmed that the screened candidate genes are good indicators for cancer risk analysis using the PAM method. Conclusions Six genes were selected with 95% accuracy to effectively classify normal and colorectal cancer tissues. We hope that these results will provide the basis for new research projects in clinical practice that aim to rapidly assess colorectal cancer risk using microarray gene expression analysis. PMID:28229027
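    The core idea behind PAM is nearest shrunken centroids: class centroids are shrunk toward the overall centroid so that uninformative genes drop out, and samples are assigned to the closest surviving centroid. The toy below sketches only that idea in simplified form (the published algorithm also standardizes by within-class deviations); the expression profiles and shrinkage amount are invented.

```python
# Simplified nearest-shrunken-centroid sketch in the spirit of PAM.
# Toy data: three "genes" for two classes; gene 3 is uninformative and its
# class deviations shrink away, illustrating gene selection.
def centroid(rows):
    n = len(rows)
    return [sum(r[j] for r in rows) / n for j in range(len(rows[0]))]

def shrink(class_cent, overall_cent, delta):
    """Soft-threshold each gene's class deviation toward the overall centroid."""
    out = []
    for c, o in zip(class_cent, overall_cent):
        d = c - o
        sign = 1.0 if d >= 0 else -1.0
        out.append(o + sign * max(abs(d) - delta, 0.0))
    return out

def classify(sample, centroids):
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(sample, centroids[label]))

normal = [[1.0, 5.0, 2.0], [1.2, 4.8, 2.1]]   # invented expression profiles
tumor = [[3.0, 1.0, 2.0], [3.2, 1.2, 2.1]]
overall = centroid(normal + tumor)
cents = {
    "normal": shrink(centroid(normal), overall, 0.5),
    "tumor": shrink(centroid(tumor), overall, 0.5),
}
```

    With a larger shrinkage threshold, more genes collapse to the overall mean in every class, which is how PAM arrives at a small signature such as the six genes reported above.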

  17. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the lack of an effective quantitative system for stress vulnerability assessment in groundwater pollution risk assessment, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the research emphasis chosen, and the ranking of the three representative contaminants' hazards differed from that of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, normalizing the three properties by rank order and unifying the quantified property results would scale the relative property characteristics of the different representative contaminants up or down.
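    The analytic hierarchy process step mentioned above derives property weights from pairwise comparisons. A minimal sketch using the geometric-mean approximation is shown below; the abstract does not give the comparison matrix, so the 3x3 matrix over the three contaminant properties is invented.

```python
# Minimal AHP weight derivation (geometric-mean approximation of the principal
# eigenvector). The pairwise-comparison matrix is a hypothetical example.
def ahp_weights(matrix):
    n = len(matrix)
    geo = []
    for row in matrix:
        p = 1.0
        for v in row:
            p *= v
        geo.append(p ** (1.0 / n))  # geometric mean of each row
    total = sum(geo)
    return [g / total for g in geo]

# e.g., property 1 judged 3x as important as property 2, 5x as property 3
weights = ahp_weights([[1, 3, 5], [1 / 3, 1, 3], [1 / 5, 1 / 3, 1]])
```

    Changing the comparison judgments (the "research emphasis") changes the weights, which is exactly the sensitivity the study reports.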

  18. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  19. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
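    As a flavor of the data assessment methods such a handbook covers, consider the simplest conjugate update: a Beta prior on a failure probability revised with observed demands and failures. The prior and counts below are invented and the example is illustrative, not taken from the document.

```python
# Toy conjugate Bayesian update for a demand-failure probability:
# Beta(alpha, beta) prior + Binomial data -> Beta posterior.
def beta_binomial_update(alpha, beta, failures, trials):
    """Return the posterior (alpha, beta) after observing the data."""
    return alpha + failures, beta + trials - failures

post = beta_binomial_update(1.0, 1.0, failures=2, trials=100)  # uniform prior
posterior_mean = post[0] / (post[0] + post[1])
```

    The posterior mean pulls the raw failure rate 2/100 slightly toward the prior, which is the basic mechanism for combining sparse data with prior information in risk models.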

  20. Cassini nuclear risk analysis with SPARRC

    NASA Astrophysics Data System (ADS)

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  1. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  2. Risking basin analysis: Procedures, methods and case studies in the Arctic National Wildlife Refuge, Alaska, and the Gulf of Mexico, Mexico

    NASA Astrophysics Data System (ADS)

    Rocha Legorreta, Francisco Javier

    Integrated basin analysis was conducted using a state-of-the-art code developed for Excel, interfacing with the Monte Carlo risking program Crystal Ball™, to perform a total uncertainty analysis that can handle as many uncertain inputs as required and as many outputs of interest as needed without increasing the computer time involved. Two main examples using the code are described: the first uses synthetic information and the second uses real but minimal information from the Arctic National Wildlife Refuge (ANWR) Area 1002 (undeformed area, Brookian Sequence), Alaska, USA. In both examples, the code serves to identify which parameters in the input (ranging from uncertain data, uncertain thermal history, uncertain permeability, uncertain fracture coefficients for rocks, uncertain geochemistry kinetics, and uncertain kerogen amounts and types per formation, through to uncertain volumetric factors) contribute most to the uncertainty in any of the selected outputs. Relative importance, relative contribution and relative sensitivity are examined to illustrate when individual parameters need to have their ranges of uncertainty narrowed in order to reduce the range of uncertainty of particular outputs. Relevant results from the ANWR case study revealed that forecasts for oil available charge gave around 2.3 Bbbl; for gas the maximum charge available reached 46 Bm³. These ranges differ considerably from previous assessments because the group of variables used is influenced mainly by the input data, the equation parameters, and intrinsic assumptions. As part of future research, the third section of this work describes the current and prospective areas for gas in the Mexican basins. The main point here is to identify the advances and the important role of the Mexican gas industry as part of a future strategic investment opportunity.
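    The "relative contribution" idea above can be sketched generically: sample the uncertain inputs, run a model, and rank inputs by their squared correlation with the output (a common variance-based proxy also reported by Monte Carlo tools like Crystal Ball). The two inputs and the toy charge model below are invented, not the study's basin model.

```python
# Sketch of ranking uncertain inputs by squared correlation with a toy output.
import random

def corr(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(2)
kerogen = [random.uniform(0.0, 1.0) for _ in range(2000)]       # hypothetical input
permeability = [random.uniform(0.0, 1.0) for _ in range(2000)]  # hypothetical input
charge = [3.0 * k + 0.5 * p for k, p in zip(kerogen, permeability)]  # toy model
contribution = {
    "kerogen": corr(kerogen, charge) ** 2,
    "permeability": corr(permeability, charge) ** 2,
}
```

    An input with the dominant contribution is the one whose uncertainty range is worth narrowing first, which is how the study decides where more data would most reduce output uncertainty.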

  3. Risk Based Requirements for Long Term Stewardship: A Proof-of-Principle Analysis of an Analytic Method Tested on Selected Hanford Locations

    SciTech Connect

    GM Gelston; JW Buck; LR Huesties; MS Peffers; TB Miley; TT Jarvis; WB Andrews

    1998-12-03

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research, and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate (DOE, 1995a), the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that, aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies and remedies will last a shorter period of time than the waste itself, and the resulting contamination will remain hazardous.

  4. The application of risk analysis in aquatic animal health management.

    PubMed

    Peeler, E J; Murray, A G; Thebault, A; Brun, E; Giovaninni, A; Thrush, M A

    2007-09-14

    assessment. Risk analysis has improved decision making in aquatic animal health management by providing a transparent method for using the available scientific information. The lack of data is the main constraint to the application of risk analysis in aquatic animal health. The identification of critical parameters is an important output from risk analysis models which should be used to prioritise research.

  5. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system, based on the notion of fuzzy sets and linguistic variables, addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportions. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.
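    Rolling lowest-level component estimates up to a system-level risk can be sketched with triangular fuzzy numbers weighted by component proportion. The abstract does not give the aggregation rules, so the proportion-weighted sum and the example numbers below are assumptions for illustration.

```python
# Hedged sketch: each component's risk is a triangular fuzzy number (lo, mid, hi)
# aggregated into a system-level fuzzy risk by a proportion-weighted sum.
def weighted_fuzzy_sum(components):
    """components: list of ((lo, mid, hi), proportion). Returns system fuzzy risk."""
    lo = sum(t[0] * w for t, w in components)
    mid = sum(t[1] * w for t, w in components)
    hi = sum(t[2] * w for t, w in components)
    return (lo, mid, hi)

system_risk = weighted_fuzzy_sum([
    ((0.1, 0.2, 0.3), 0.5),   # hypothetical component A, half the system
    ((0.3, 0.4, 0.5), 0.5),   # hypothetical component B
])
```

    Keeping the (lo, mid, hi) triple all the way up preserves the imprecision of the component estimates instead of collapsing them to a single number prematurely.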

  6. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...

  7. Alcohol Consumption and Gastric Cancer Risk: A Meta-Analysis

    PubMed Central

    Ma, Ke; Baloch, Zulqarnain; He, Ting-Ting; Xia, Xueshan

    2017-01-01

    Background We sought to determine by meta-analysis the relationship between drinking alcohol and the risk of gastric cancer. Material/Methods A systematic Medline search was performed to identify all published reports of drinking alcohol and the associated risk of gastric cancer. Initially we retrieved 2,494 studies, but after applying inclusion and exclusion criteria, only ten studies were found to be eligible for our meta-analysis. Results Our meta-analysis showed that alcohol consumption elevated the risk of gastric cancer with an odds ratio (OR) of 1.39 (95% CI 1.20–1.61). Additionally, subgroup analysis showed that only a nested case-control report from Sweden did not support this observation. Subgroup analysis of moderate drinking and heavy drinking also confirmed that drinking alcohol increased the risk of gastric cancer. Publication bias analysis (Begg’s and Egger’s tests) showed p values were more than 0.05, suggesting that the 10 articles included in our analysis did not have a publication bias. Conclusions The results from this meta-analysis support the hypothesis that alcohol consumption can increase the risk of gastric cancer; suggesting that effective moderation of alcohol drinking may reduce the risk of gastric cancer. PMID:28087989
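    The pooled odds ratio above comes from combining per-study estimates. A minimal fixed-effect, inverse-variance pooling of log odds ratios is sketched below; the example study values are invented, not the ten papers in this review, and the actual analysis may have used a different model.

```python
# Fixed-effect inverse-variance pooling of study odds ratios (hedged sketch).
import math

def pool_odds_ratios(studies):
    """studies: list of (odds_ratio, lower_95ci, upper_95ci). Returns (OR, 95% CI)."""
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
        w = 1.0 / se ** 2                                # inverse-variance weight
        num += w * log_or
        den += w
    pooled_log = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(pooled_log - 1.96 * se_pooled), math.exp(pooled_log + 1.96 * se_pooled))
    return math.exp(pooled_log), ci
```

    Weighting by inverse variance means large, precise studies dominate the pooled estimate, which is why a single discordant study (like the Swedish nested case-control report) need not overturn the overall result.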

  8. Risk Analysis Training within the Army: Current Status, Future Trends,

    DTIC Science & Technology

    risk analysis. Since risk analysis training in the Army is... become involved in risk analysis training. He reviews all risk analysis-related training done in any course at the Center. Also provided is information... expected to use the training. Then the future trend in risk analysis training is presented. New courses, course changes, and hardware/software changes that will make risk analysis more palatable are...

  9. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  10. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  11. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills

    NASA Astrophysics Data System (ADS)

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
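    The ranking step above can be sketched as a weighted indicator score cut into three classes. The paper derives the weights by principal component analysis and the class boundaries by K-means cluster analysis; the fixed weights, three grouped indicators, and boundaries below are stand-in assumptions for illustration only.

```python
# Simplified sketch of the risk ranking step with assumed weights and boundaries.
WEIGHTS = {"landfill_source": 0.4, "vadose_zone": 0.35, "aquifer": 0.25}  # assumed

def risk_score(indicators):
    """indicators: dict of indicator name -> value normalized to [0, 1]."""
    return sum(WEIGHTS[k] * indicators[k] for k in WEIGHTS)

def risk_class(score, low=0.33, high=0.66):  # assumed cluster boundaries
    if score < low:
        return "low"
    if score < high:
        return "medium"
    return "high"
```

    Deriving the boundaries from the data (K-means) rather than fixing them a priori is what makes the three classes reflect the actual spread of the 37 sites.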

  12. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.

  13. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
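    The first of the three decision models, expected utility theory, can be sketched for a single household: invest in a loss-reducing measure if expected utility with the measure exceeds expected utility without it. The CRRA utility form, probabilities, damages, and costs below are invented for illustration, not the article's calibration.

```python
# Toy household decision under expected utility theory (one of the three
# decision models compared in the article); all numbers are hypothetical.
def crra_utility(wealth, rho=1.5):
    """Constant relative risk aversion utility (wealth must be positive)."""
    return wealth ** (1 - rho) / (1 - rho)

def invest_in_mitigation(wealth, p_flood, damage, cost, reduction):
    """True if investing in the loss-reducing measure maximizes expected utility."""
    eu_no = (1 - p_flood) * crra_utility(wealth) + p_flood * crra_utility(wealth - damage)
    eu_yes = (1 - p_flood) * crra_utility(wealth - cost) + \
             p_flood * crra_utility(wealth - cost - damage * (1 - reduction))
    return eu_yes > eu_no
```

    Under prospect theory the same comparison would use decision weights and a loss-framed value function instead, which is how the agent-based model generates different aggregate flood risk under different behavioral assumptions.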

  14. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start from the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis that not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers equity theory to resolve the gender conflict by using the concepts of social and psychological capital.

  15. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
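
    The combination described above (fault tree logic, expert-judged event probabilities, Monte Carlo propagation) can be sketched in a few lines. This is a minimal illustration with an invented two-branch tree and made-up beta distributions, not the authors' model of the Swedish system:

```python
import random

random.seed(1)

# Hypothetical basic-event failure probabilities, each with expert-judged
# uncertainty expressed as a beta distribution (alpha, beta).
BASIC_EVENTS = {
    "source_contamination": (2.0, 50.0),
    "treatment_failure":    (1.5, 80.0),
    "pipe_burst":           (3.0, 40.0),
}

def sample_top_event_probability():
    """One Monte Carlo draw: sample each basic-event probability, then
    combine through a toy fault tree (quality OR quantity failure)."""
    p = {name: random.betavariate(a, b) for name, (a, b) in BASIC_EVENTS.items()}
    # Quality failure: contamination passes through a failed treatment (AND gate).
    p_quality = p["source_contamination"] * p["treatment_failure"]
    # Quantity failure: pipe burst (single event in this sketch).
    p_quantity = p["pipe_burst"]
    # Top event: either failure mode (OR gate, events assumed independent).
    return 1.0 - (1.0 - p_quality) * (1.0 - p_quantity)

draws = [sample_top_event_probability() for _ in range(20000)]
mean_p = sum(draws) / len(draws)
draws.sort()
print(f"mean P(failure) = {mean_p:.4f}")
print(f"90% interval    = [{draws[1000]:.4f}, {draws[19000]:.4f}]")
```

    The sorted draws give an uncertainty interval on the failure probability itself, which is what allows comparison against an acceptable-risk target rather than a single point estimate.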

  16. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance.

  17. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).

  18. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
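
    The basic element described, attaching a pdf to a measurement sequence and reading off an exceedance risk, can be sketched with the standard library. The (40)K values, the lognormal choice, and the 500 Bq/kg threshold below are all invented for illustration; the paper's actual formulations are not reproduced here:

```python
from statistics import NormalDist
from math import log

# Hypothetical (40)K activity measurements (Bq/kg) at sampling points.
measurements = [310, 420, 385, 510, 290, 450, 370, 610, 330, 480]

# Attach a convenient pdf: here a lognormal, fitted by moment-matching
# the log-transformed data.
logs = [log(x) for x in measurements]
mu = sum(logs) / len(logs)
sigma = (sum((v - mu) ** 2 for v in logs) / (len(logs) - 1)) ** 0.5
model = NormalDist(mu, sigma)

def exceedance_risk(threshold):
    """Probability that activity at a site exceeds the given threshold."""
    return 1.0 - model.cdf(log(threshold))

print(f"P(activity > 500 Bq/kg) = {exceedance_risk(500):.3f}")
```

    Once a pdf is attached, any regulatory threshold maps to a risk value, which is what makes the spatial risk maps in the paper possible.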

  19. Analysis of labour risks in the Spanish industrial aerospace sector.

    PubMed

    Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael

    2016-01-01

    Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were created and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern: accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis, results that can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.

  20. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-standing use, benefit-risk assessment in medicine remains subject to debate, suffers from a number of limitations, and is still under development. This state-of-the-art review discusses the various aspects of and approaches to benefit-risk assessment in medicine in a chronological pathway, covering all types of benefit-risk assessment a medicinal product undergoes during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. At the end of this review we conclude that benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvement not only in methodology, but also in cooperation between organizations, can improve benefit-risk assessment.

  1. Methods of Cosmochemical Analysis

    NASA Astrophysics Data System (ADS)

    Lahiri, S.; Maiti, M.

    Some radionuclides, like 10Be (T1/2 = 1.5 Ma), 14C (T1/2 = 5,730 years), 26Al (T1/2 = 0.716 Ma), 53Mn (T1/2 = 3.7 Ma), 60Fe (T1/2 = 1.5 Ma), 146Sm (T1/2 = 103 Ma), 182Hf (T1/2 = 9 Ma), and 244Pu (T1/2 = 80 Ma), are either being produced continuously by the interaction of cosmic rays (CR) or might have been produced in supernovae millions of years ago. Analysis of these radionuclides at the ultratrace scale has a strong influence on almost all branches of science, from archaeology to biology and nuclear physics to astrophysics. However, measuring these radionuclides through their decay properties proved to be a borderline problem because of their scarcity in natural archives and their long half-lives. The only viable approach seemed to be mass measurement, for which accelerator mass spectrometry (AMS) is best suited. Apart from AMS, other mass measurement techniques such as inductively coupled plasma-mass spectrometry (ICP-MS), thermal ionization mass spectrometry (TIMS), resonant laser ionization mass spectrometry (RIMS), and secondary ionization mass spectrometry (SIMS) have also been used, with limited sensitivity.

  2. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A... § 417.107(b) for debris. A debris risk analysis must account for risk to populations on land,...

  3. The dissection of risk: a conceptual analysis.

    PubMed

    O'Byrne, Patrick

    2008-03-01

    Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has been largely assumed to be inherently neutral, meaning that its definition and discovery are seen as objective and impartial, and that risk avoidance is natural and logical. Such an oversight requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly those surrounding bio-political nursing care, such as public health, as well as other more current nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk' and expose it as one mechanism for maintaining prescribed/proscribed social practices. An analysis of risk thereby expands the definitions and roles of the discipline and profession of nursing from being dedicated solely to patient care to include its function as a governmental body that unwittingly maintains hegemonic infrastructures.

  4. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical and expert estimation. The paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes presented in different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states of buildings and constructions in the case of earthquakes according to the OSR-97B map (return period T = 1,000 years) within big cities and towns, the cities were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites. The indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region and the Krasnodar area include two elements: percentages of different damage states for settlements with fewer than 1,000 inhabitants, and vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in

  5. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are currently used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to keep results within the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  6. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and increased project costs, this research presents a methodology for dynamic risk assessment and management of rockbursts in D&B tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods associated with each step in the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst risk, an estimation method for assessing the potential consequences of rockburst risk, an evaluation method that uses a new quantitative index considering both occurrence probability and consequences to determine the level of rockburst risk, and a dynamic updating step. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified by cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total length of 11.6 km for the D&B tunnels).

  7. [Statistical prediction methods in violence risk assessment and its application].

    PubMed

    Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song

    2013-06-01

    How to improve violence risk assessment is an urgent global problem. As a necessary part of risk assessment, statistical methods have a remarkable impact. In this study, prediction methods used in violence risk assessment are reviewed from a statistical perspective: logistic regression as an example of a multivariate statistical model, the decision tree model as an example of a data mining technique, and neural network models as an example of artificial intelligence technology. This study provides a basis for further research on violence risk assessment.
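
    As an illustration of the first family mentioned, logistic regression as a multivariate statistical model, the following stdlib-only sketch fits a two-predictor model by gradient descent on invented toy data. Real assessments use validated instruments and far richer data; nothing here is clinical:

```python
import math

# Toy data: two hypothetical risk factors (e.g. prior incidents, score on
# some clinical scale) and a binary outcome; purely illustrative.
X = [(0, 1), (1, 2), (2, 1), (3, 3), (4, 2), (5, 4), (6, 3), (7, 5)]
y = [0, 0, 0, 0, 1, 1, 1, 1]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Fit weights and bias by plain gradient descent on the log-loss.
w, b = [0.0, 0.0], 0.0
lr = 0.1
for _ in range(5000):
    for (x1, x2), target in zip(X, y):
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - target
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict_risk(x1, x2):
    """Predicted probability of the adverse outcome."""
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

print(f"risk(1, 1) = {predict_risk(1, 1):.2f}")
print(f"risk(6, 4) = {predict_risk(6, 4):.2f}")
```

    Decision trees and neural networks, the other two families in the review, replace the linear score inside the sigmoid with recursive splits or learned nonlinear features, but all three output the same kind of risk probability.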

  8. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
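
    For a single second-order mode m*s^2 + c*s + k, the Routh-Hurwitz criterion reduces to requiring positive damping and stiffness, so a probability of instability can be estimated by sampling the uncertain parameters. The sketch below uses crude Monte Carlo with invented distributions; the paper's fast probability integration and adaptive importance sampling are far more efficient than this, and a real rotordynamic model has many coupled modes:

```python
import random

random.seed(42)

def is_stable(m, c, k):
    """Routh-Hurwitz test for m*s^2 + c*s + k = 0: with m > 0 the system
    is stable exactly when both c and k are positive."""
    return c > 0 and k > 0

def p_instability(n=100000):
    """Crude Monte Carlo estimate of the probability of instability.
    Effective damping may go negative, e.g. through destabilizing
    cross-coupling forces (distributions invented for illustration)."""
    unstable = 0
    for _ in range(n):
        m = 1.0
        c = random.gauss(0.5, 0.4)   # mean damping 0.5, large scatter
        k = random.gauss(10.0, 1.0)  # stiffness, well away from zero
        if not is_stable(m, c, k):
            unstable += 1
    return unstable / n

print(f"P(instability) ~ {p_instability():.3f}")
```

    Because instability is usually a rare event, plain sampling like this wastes most draws on stable configurations, which is exactly why the paper turns to fast probability integration and importance sampling.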

  9. Campylobacter detection along the food chain--towards improved quantitative risk analysis by live/dead discriminatory culture-independent methods.

    PubMed

    Stingl, Kerstin; Buhler, Christiane; Krüger, Nora-Johanna

    2015-01-01

    Death, although absolute in its consequence, is not measurable by an absolute parameter in bacteria. Viability assays address different aspects of life, e.g. the capability to form a colony on an agar plate (CFU), metabolic properties or membrane integrity. For food safety, presence of infectious potential is the relevant criterion for risk assessment, currently only partly reflected by the quantification of CFU. It will be necessary for future improved risk assessment, in particular when fastidious bacterial pathogens are implicated, to enhance the informative value of CFU. This might be feasible by quantification of the number of intact and potentially infectious Campylobacter, impermeable to the DNA intercalating dye propidium monoazide (PMA). The latter are quantifiable by the combination of PMA with real-time PCR, although thorough controls have to be developed for standardization and the circumvention of pitfalls. Under consideration of different physiological states of the food-borne pathogen, we provide an overview of current and future suitable detection/quantification targets along the food chain, including putative limitations of detection.

  10. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  11. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  12. Risk Analysis of the Supply-Handling Conveyor System.

    DTIC Science & Technology

    The report documents the risk analysis that was performed on a supply-handling conveyor system. The risk analysis was done to quantify the risks involved for project development, in addition to compliance with the draft AMC regulation on risk analysis. The conveyor system is in the final phase of

  13. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-01-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were: 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis.
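
    The structure of a tool like D, hazard identification followed by a risk matrix, is easy to sketch. The 4x4 matrix below is invented for illustration; real confined-space tools calibrate the severity and likelihood scales to regulation and site data:

```python
# Hypothetical 4x4 risk matrix: severity and likelihood are each rated
# 1-4 and mapped to a qualitative risk level.
RISK_MATRIX = [
    # likelihood:  1        2         3         4
    ["low",     "low",    "medium", "medium"],   # severity 1
    ["low",     "medium", "medium", "high"],     # severity 2
    ["medium",  "medium", "high",   "high"],     # severity 3
    ["medium",  "high",   "high",   "critical"], # severity 4
]

def risk_level(severity, likelihood):
    """Look up the qualitative risk level for 1-4 ratings."""
    return RISK_MATRIX[severity - 1][likelihood - 1]

# Example: an oxygen-deficient atmosphere judged severity 4, likelihood 2.
print(risk_level(4, 2))  # high
```

    The paper's point survives even in this toy form: the matrix only ranks hazards that the preceding identification stage actually surfaced, which is why Tool D's dynamic hazard identification matters.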

  14. Reliability/Risk Methods and Design Tools for Application in Space Programs

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Smart, Christian

    1999-01-01

    Since 1984 NASA has funded several major programs to develop reliability/risk methods and tools for engineers to apply in the design and assessment of aerospace hardware. Two probabilistic software tools that show great promise for practical application are the finite element code NESSUS and the system risk analysis code QRAS. This paper examines NASA's past, present, and future directions in reliability and risk engineering applications. Both the NESSUS and QRAS software tools are detailed.

  15. An approximate method for determining of investment risk

    NASA Astrophysics Data System (ADS)

    Slavkova, Maria; Tzenova, Zlatina

    2016-12-01

    In this work a method for determining investment risk across all economic states is considered. It is connected to matrix games with two players. A definition of risk in a matrix game is introduced, and three properties are proven. An appropriate example is considered.
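
    The abstract does not state the paper's risk definition, so the following is a hedged sketch of one natural construction on a two-player matrix game: the maximin (guaranteed) return, plus a regret-style risk per strategy. The payoff matrix is invented:

```python
# Hypothetical payoff matrix: rows are investment strategies, columns are
# economic states ("nature's" moves); entries are returns.
PAYOFFS = [
    [4, 2, 1],   # strategy A
    [3, 3, 2],   # strategy B
    [6, 1, 0],   # strategy C
]

def maximin(matrix):
    """Guaranteed return: the best worst-case row payoff."""
    return max(min(row) for row in matrix)

def regret_risk(matrix):
    """One plausible risk measure (not necessarily the paper's): for each
    strategy, the gap between the best return achievable anywhere in the
    game and that strategy's own worst case."""
    best = max(max(row) for row in matrix)
    return {i: best - min(row) for i, row in enumerate(matrix)}

print("maximin value:", maximin(PAYOFFS))   # 2 (strategy B)
print("risk by strategy:", regret_risk(PAYOFFS))
```

    Under this reading, the high-upside strategy C also carries the largest risk, which is the kind of trade-off a game-theoretic risk definition is meant to expose.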

  16. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. It is necessary to account for these uncertainties in the design process to produce systems that are more reliable. Systems designed by including and managing uncertainties are more robust and less prone to poor operations as a result of parameter variability. The quantification, analysis and mitigation of uncertainties are challenging tasks as many systems lack historical data. In such an environment, risk or uncertainty quantification becomes subjective because input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input data and the model uncertainties must be considered for a multidisciplinary systems-level risk analysis. This research synthesizes an integrated approach for developing a method for risk analysis. Expert judgment methodology is employed to quantify external risk. This methodology is then combined with a Latin Hypercube Sampling-Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This type of approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules cover the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization; it also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
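
    The propagation stage, Latin Hypercube Sampling feeding a Monte Carlo evaluation, can be sketched with the standard library alone. The two-input "performance model" below is invented; a real study would wrap the multidisciplinary analysis codes:

```python
import random

random.seed(7)

def latin_hypercube(n_samples, n_vars):
    """Stdlib LHS: stratify [0,1) into n_samples bins per variable, draw
    one point per bin, and shuffle bin order independently per variable,
    so every marginal is evenly covered."""
    samples = []
    for _ in range(n_vars):
        bins = list(range(n_samples))
        random.shuffle(bins)
        col = [(b + random.random()) / n_samples for b in bins]
        samples.append(col)
    return list(zip(*samples))  # rows = sample points

# Propagate through a toy performance model (hypothetical): a margin as a
# linear function of two normalized uncertain inputs.
def model(u1, u2):
    return 10.0 + 4.0 * u1 - 6.0 * u2

points = latin_hypercube(1000, 2)
outputs = [model(u1, u2) for u1, u2 in points]
print(f"mean = {sum(outputs) / len(outputs):.2f}")
```

    Compared with plain random sampling, the stratification guarantees exactly one point per bin in every input dimension, which is why LHS reaches stable output statistics with far fewer model evaluations.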

  17. Comparison of Hartmann analysis methods.

    PubMed

    Canovas, Carmen; Ribak, Erez N

    2007-04-01

    Analysis of Hartmann-Shack wavefront sensors for the eye is traditionally performed by locating and centroiding the sensor spots. These centroids provide the gradient, which is integrated to yield the ocular aberration. Fourier methods can replace the centroid stage, and Fourier integration can replace the direct integration. The two--demodulation and integration--can be combined to directly retrieve the wavefront, all in the Fourier domain. Here we applied this full Fourier analysis to circular apertures and real images, and compared it with previous methods of convolution, interpolation, and Fourier demodulation, as well as with a centroid method, which yields the Zernike coefficients of the wavefront. The best performance was achieved for ocular pupils with a small boundary slope or far from the boundary, with acceptable results for images missing part of the pupil. The other Fourier analysis methods had much higher tolerance to noncentrosymmetric apertures.

  18. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T.; Rasmuson, D. M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  19. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define ''consequences'' by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.
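
    The contrast the author draws, threshold-exceedance maps versus expected casualties, can be illustrated numerically. Everything below (zones, populations, the AEGL value, the dose-response parameters) is invented; the only point is that the two measures can diverge widely:

```python
from statistics import NormalDist
from math import log

# Hypothetical exposure zones: (population, concentration in ppm).
zones = [
    (5000, 20.0),
    (2000, 60.0),
    (300, 250.0),
]

AEGL2 = 50.0  # hypothetical guideline threshold, ppm

def prob_casualty(conc, median=400.0, log_sd=0.8):
    """Invented lognormal dose-response: P(serious effect) at this
    concentration. Not toxicological data."""
    return NormalDist(log(median), log_sd).cdf(log(conc))

# Threshold view: head count in zones exceeding the guideline.
exceedance_pop = sum(pop for pop, c in zones if c > AEGL2)
# Risk view: expected casualties summed over all zones.
expected_casualties = sum(pop * prob_casualty(c) for pop, c in zones)

print(f"population above AEGL-2: {exceedance_pop}")
print(f"expected casualties:     {expected_casualties:.0f}")
```

    In this made-up scenario the exceedance map flags thousands of people while the casualty expectation is an order of magnitude smaller, and the reverse mismatch is equally possible, which is the author's argument against treating exceedance zones as consequence estimates.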

  20. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  1. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  2. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; the outcomes of the quantitative method are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods are shown to be suitable for practical application; the choice between them depends on the basic data available for the gas pipelines and on the precision required of the risk assessment.
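
    The individual risk output mentioned above is typically computed, for each map location, as the sum over accident scenarios of annual scenario frequency times the conditional probability of fatality at that location. A minimal sketch with invented scenario data (not the paper's sample networks):

    ```python
    # Sketch: individual risk at one grid point near a pipeline, summed
    # over accident scenarios. All frequencies and fatality probabilities
    # below are illustrative placeholders.

    def individual_risk(scenarios):
        """scenarios: iterable of (frequency_per_year, p_fatality) pairs."""
        return sum(f * p for f, p in scenarios)

    # Hypothetical scenarios: jet fire, flash fire, vapour cloud explosion.
    scenarios = [
        (1.0e-5, 0.6),   # jet fire
        (5.0e-6, 0.3),   # flash fire
        (1.0e-6, 0.9),   # UVCE
    ]

    print(individual_risk(scenarios))  # annual probability of fatality
    ```

    Repeating this sum on a grid of locations yields the individual risk contours referred to in the abstract.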

  3. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered compartments in which fires could result in the failure of components necessary for reactor cold shutdown. It shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2), contributing more than 7.5% of CDF, in which a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, leading directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  4. A generic computerized method for estimate of familial risks.

    PubMed Central

    Colombet, Isabelle; Xu, Yigang; Jaulent, Marie-Christine; Desages, Daniel; Degoulet, Patrice; Chatellier, Gilles

    2002-01-01

    Most guidelines developed for cancer screening and for cardiovascular risk management use rules to estimate familial risk. These rules are complex, difficult to memorize, and require collection of a complete pedigree. This paper describes a generic computerized method to estimate familial risks and its implementation in an internet-based application. The program is based on three generic models: a model of the family; a model of familial risk; and a display model for the pedigree. The model of the family represents each member and supports construction and display of a family tree. The model of familial risk is generic and allows easy updating of the program with new diseases or new rules. It was possible to implement guidelines dealing with breast and colorectal cancer and with prevention of cardiovascular diseases. A first evaluation with general practitioners showed that the program was usable. Its impact on the quality of familial risk estimates should be documented further. PMID:12463810

  5. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis was performed by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results into a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments, by a panel of researchers and experts, of the expected results of the Program's R&D.
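
    The LCOE metric named above is conventionally computed as discounted lifetime cost divided by discounted lifetime energy output. The sketch below uses that generic textbook formulation with made-up plant numbers; it is not the report's spreadsheet tool.

    ```python
    # Generic LCOE: present value of costs / present value of energy.

    def lcoe(costs, energy, rate):
        """costs, energy: per-year lists (year 0 first); rate: discount rate."""
        pv_cost = sum(c / (1 + rate) ** t for t, c in enumerate(costs))
        pv_energy = sum(e / (1 + rate) ** t for t, e in enumerate(energy))
        return pv_cost / pv_energy

    # Hypothetical plant: up-front capital, flat O&M, constant generation
    # over a 20-year life (all values invented for illustration).
    costs = [3000.0] + [80.0] * 20        # k$ per year
    energy = [0.0] + [40_000.0] * 20      # MWh per year
    print(f"LCOE = {lcoe(costs, energy, 0.07) * 1000:.1f} $/MWh")
    ```

    A probabilistic risk analysis would sample the cost and energy inputs from distributions and report the resulting distribution of LCOE rather than a single value.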

  6. Sensitivity analysis of a two-dimensional probabilistic risk assessment model using analysis of variance.

    PubMed

    Mokhtari, Amirhossein; Frey, H Christopher

    2005-12-01

    This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
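
    The ANOVA-based ranking described above can be sketched as follows: bin each input, compute a one-way ANOVA F statistic of the output across the bins, and rank inputs by F. The data here are synthetic, not the MFSPR model's.

    ```python
    import numpy as np

    def anova_f(x, y, bins=5):
        """One-way ANOVA F statistic of output y across quantile bins of x."""
        edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
        groups = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
        grand = y.mean()
        ss_between = ss_within = 0.0
        k = 0                                 # number of non-empty bins
        for g in range(bins):
            yg = y[groups == g]
            if yg.size == 0:
                continue
            k += 1
            ss_between += yg.size * (yg.mean() - grand) ** 2
            ss_within += ((yg - yg.mean()) ** 2).sum()
        return (ss_between / (k - 1)) / (ss_within / (y.size - k))

    rng = np.random.default_rng(0)
    n = 2000
    x1 = rng.uniform(size=n)                  # influential input
    x2 = rng.uniform(size=n)                  # non-influential input
    y = 10.0 * x1 + rng.normal(scale=0.5, size=n)

    print(anova_f(x1, y), anova_f(x2, y))     # F for x1 dwarfs F for x2
    ```

    Because ANOVA compares group means rather than fitting a line, this ranking remains informative for nonlinear and threshold responses, which is the advantage over correlation-based methods noted in the abstract.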

  7. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E. )

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  8. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), covering technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were quantified by Risk Priority Numbers: RPN = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions, and the FMEA was repeated, showing reductions in RPN scores and improvement indices of up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
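
    The ranking step described above can be sketched in a few lines: score each failure mode for O, D and S, compute RPN = O x D x S, and sort. The failure modes and scores below are invented for illustration, not taken from the NIR study.

    ```python
    # Hypothetical FMEA worksheet: (failure mode, O, D, S), each 1-10.
    failure_modes = [
        ("sample holder misplaced",   7, 4, 5),
        ("wrong reference spectrum",  3, 6, 9),
        ("operator skips blank scan", 5, 7, 6),
    ]

    # Rank by Risk Priority Number, highest first.
    ranked = sorted(
        ((name, o * d * s) for name, o, d, s in failure_modes),
        key=lambda item: item[1],
        reverse=True,
    )
    for name, rpn in ranked:
        print(f"RPN {rpn:3d}  {name}")
    ```

    Modes at the top of the list are the ones targeted for corrective action; rerunning the scoring afterwards gives the improvement index (old RPN divided by new RPN).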

  9. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
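
    The tutorial's own examples are in MATLAB and R; as a rough Python analogue (with all model numbers invented), an embarrassingly parallel Monte Carlo risk estimate might look like this:

    ```python
    import multiprocessing as mp
    import random

    def simulate_once(seed):
        """One replication: does a random demand exceed a fixed capacity?"""
        rng = random.Random(seed)
        demand = rng.gauss(100.0, 15.0)
        return 1 if demand > 120.0 else 0

    def failure_probability(n_runs, pool_map=map):
        """Estimate P(failure); pass a parallel map to distribute the runs."""
        return sum(pool_map(simulate_once, range(n_runs))) / n_runs

    if __name__ == "__main__":
        # Replications are independent and seeded separately, so they can
        # be farmed out to one worker process per core without changing
        # the estimate.
        with mp.Pool() as pool:
            print(failure_probability(100_000, pool.map))
    ```

    The key pattern is that the per-replication function is pure and picklable, so swapping `map` for `pool.map` is the only change needed to parallelize.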

  10. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities and on effectively communicating uncertainty analysis results is included. Examples from actual applications are presented.

  11. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  12. Methods for diagnosing the risk factors of stone formation

    PubMed Central

    Robertson, William G.

    2012-01-01

    Objective To compare various systems for assessing the risk of recurrent stones, based on the composition of urine. Methods The relative supersaturation (RSS) of urine, the Tiselius Indices, the Robertson Risk Factor Algorithms (RRFA) and the BONN-Risk Index were compared in terms of the numbers of variables required to be measured, the ease of use of the system and the value of the information obtained. Results The RSS methods require up to 14 analyses in every urine sample but measure the RSS of all the main constituents of kidney stones. The Tiselius Indices and the RRFA require only seven analyses. The Tiselius Indices yield information on the crystallisation potentials (CP) of calcium oxalate and calcium phosphate; the RRFA also provide information on the CP of uric acid. Both methods provide details on the particular urinary abnormalities that lead to the abnormal CP of that urine. The BONN-Risk Index requires two measurements in each urine sample but only provides information on the CP of calcium oxalate. Additional measurements in urine have to be made to identify the cause of any abnormality. Conclusions The methods that are based on measuring RSS are work-intensive and unsuitable for the routine screening of patients. The Tiselius Indices and the RRFA are equally good at predicting the risk of a patient forming further stones. The BONN-Risk Index provides no additional information about the causative factors for any abnormality detected. PMID:26558033

  13. Application of Risk Analysis: Response from a Systems Division,

    DTIC Science & Technology

    A review of theoretical literature reveals that most technical aspects of risk analysis have become a reasonably well-defined process with many... risk analysis in order to enhance its application. Also needed are better tools to enhance use of both subjective judgment and group decision processes...hope that it would lead to increased application of risk analysis in the acquisition process.

  14. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
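
    The factoring step D = CS^T can be sketched as alternating least squares with a non-negativity constraint. The clipping variant below is one common simplification of constrained ALS, not the patent's exact weighted procedure, and the data are synthetic.

    ```python
    import numpy as np

    def als_factor(D, k, n_iter=200, seed=0):
        """Factor D ~ C @ S.T with non-negative C and S (clipping ALS)."""
        rng = np.random.default_rng(seed)
        S = rng.random((D.shape[1], k))                 # spectral shapes
        for _ in range(n_iter):
            # Fix S, solve D = C S^T for C, then clip to enforce C >= 0.
            C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
            # Fix C, solve D = C S^T for S, then clip to enforce S >= 0.
            S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        return C, S

    # Synthetic two-component data set: exact non-negative rank 2.
    rng = np.random.default_rng(1)
    D = rng.random((40, 2)) @ rng.random((2, 30))
    C, S = als_factor(D, 2)
    rel_err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
    print(f"relative reconstruction error: {rel_err:.2e}")
    ```

    In a real SEM-EDS analysis, rows of D would be pixel spectra; inspecting the columns of C (component maps) and S (component spectra) then identifies the chemical phases present.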

  15. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
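
    The Bayesian model-averaging idea described above can be sketched directly: maintain a posterior over candidate opponent rationality models, each giving a predictive distribution over the opponent's actions, and reweight the models as decisions are observed. The models and numbers below are invented for illustration.

    ```python
    # Each candidate model predicts a probability for each opponent action.
    models = {
        "random":     {"attack": 0.5, "wait": 0.5},
        "aggressive": {"attack": 0.9, "wait": 0.1},
        "cautious":   {"attack": 0.2, "wait": 0.8},
    }
    weights = {name: 1 / len(models) for name in models}   # uniform prior

    def update(observed_action):
        """Posterior reweighting after observing one opponent decision."""
        for name in weights:
            weights[name] *= models[name][observed_action]
        z = sum(weights.values())
        for name in weights:
            weights[name] /= z

    def predictive(action):
        """Model-averaged probability of a future opponent action."""
        return sum(w * models[name][action] for name, w in weights.items())

    for _ in range(3):          # opponent attacks three times in a row
        update("attack")
    print(weights)              # "aggressive" should now dominate
    ```

    The posterior weights play exactly the role of the validity measure mentioned in the abstract: models that predict the observed behaviour well accumulate weight, and the supported agent's forecast is the weighted mixture `predictive(...)`.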

  16. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  17. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  18. Assessment of Methods for Estimating Risk to Birds from ...

    EPA Pesticide Factsheets

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16

  19. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
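
    The program described above implements a Cornell-type hazard calculation; the gist can be sketched as follows: the annual rate of exceeding a ground motion level is summed over magnitude bins for each source, using an attenuation function with lognormal scatter. All coefficients below are invented placeholders, not the program's published attenuation functions.

    ```python
    import math
    from statistics import NormalDist

    def ln_median_pga(m, r_km):
        """Placeholder attenuation: ln of median PGA (g) vs magnitude/distance."""
        return -3.5 + 0.9 * m - 1.2 * math.log(r_km)

    def exceedance_rate(pga, r_km, mags, annual_rates, sigma_ln=0.6):
        """Annual rate of PGA > pga for one source, summed over magnitudes."""
        rate = 0.0
        for m, nu in zip(mags, annual_rates):
            dist = NormalDist(ln_median_pga(m, r_km), sigma_ln)
            rate += nu * (1.0 - dist.cdf(math.log(pga)))
        return rate

    mags = [5.0, 6.0, 7.0]
    annual_rates = [0.1, 0.01, 0.001]     # Gutenberg-Richter-like falloff
    for pga in (0.05, 0.10, 0.20):
        r = exceedance_rate(pga, 20.0, mags, annual_rates)
        print(f"PGA {pga:.2f} g: {r:.4f} /yr")
    ```

    Evaluating this rate over a grid of ground motion levels produces the site's hazard curve; summing over multiple sources and integrating over distance, as the FORTRAN program does, follows the same pattern.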

  20. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals a number of shortcomings in this field that can translate into managerial risks. The article identifies and analyses some of them and gives brief guidance for appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  1. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.

  2. Best self visualization method with high-risk youth.

    PubMed

    Schussel, Lorne; Miller, Lisa

    2013-08-01

    The healing process of the Best Self Visualization Method (BSM) is described within the framework of meditation, neuroscience, and psychodynamic theory. Cases are drawn from the treatment of high-risk youth, who have histories of poverty, survival of sexual and physical abuse, and/or current risk for perpetrating abuse. Clinical use of BSM is demonstrated in two case illustrations, one of group psychotherapy and another of individual therapy.

  3. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods

    PubMed Central

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-01-01

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent of these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive measures, including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, cases at high risk of post-ESD bleeding, including patients on antithrombotic agents and lesions requiring large resection, are increasing. Although there have been many reports on associated risk factors and on methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we give an overview of risk factors and preventive methods for post-ESD bleeding from previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  4. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties of two main types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation for each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little on the propagation of uncertainties along the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate through the process, from inundability studies to risk analysis, and how much they can alter a flood risk analysis. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic form. To account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process, allowing the development of a probabilistic model of the system in a deterministic setting by using random variables and polynomials to handle the effects of uncertainty. Results of applying the method show better robustness than traditional analysis
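
    As a baseline for the propagation methods named above, plain Monte Carlo propagation through a flood-risk chain (rainfall to peak discharge to damage) can be sketched as follows. The hydrological and damage models and every number are stand-ins; the paper's PCT-based approach is not reproduced, only the Monte Carlo idea it builds on.

    ```python
    import random

    def damage(rainfall_mm):
        """Toy chain: rainfall -> discharge (rating curve) -> damage curve."""
        discharge = 2.0 * max(rainfall_mm, 0.0) ** 1.1
        return max(0.0, 0.5 * (discharge - 150.0))

    def monte_carlo(n, seed=0):
        """Propagate rainfall uncertainty; return mean and spread of damage."""
        rng = random.Random(seed)
        samples = [damage(rng.gauss(100.0, 20.0)) for _ in range(n)]
        mean = sum(samples) / n
        sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
        return mean, sd

    mean, sd = monte_carlo(20_000)
    print(f"expected damage {mean:.1f}, spread {sd:.1f} (arbitrary units)")
    ```

    PCT replaces the raw sampling with a polynomial expansion of the output in the random inputs, which gives the same moments from far fewer model evaluations, the property the abstract exploits.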

  5. Using Qualitative Disease Risk Analysis for Herpetofauna Conservation Translocations Transgressing Ecological and Geographical Barriers.

    PubMed

    Bobadilla Suarez, Mariana; Ewen, John G; Groombridge, Jim J; Beckmann, K; Shotton, J; Masters, N; Hopkins, T; Sainsbury, Anthony W

    2017-03-01

    Through the exploration of disease risk analysis methods employed for four different UK herpetofauna translocations, we illustrate how disease hazards can be identified, and how the risk of disease can be analysed. Where ecological or geographical barriers between source and destination sites exist, parasite populations are likely to differ in identity or strain between the two sites, elevating the risk from disease and increasing the number and category of hazards requiring analysis. Simplification of the translocation pathway through the avoidance of these barriers reduces the risk from disease. The disease risk analysis tool is intended to aid conservation practitioners in decision making relating to disease hazards prior to implementation of a translocation.

  6. A Handbook of Cost Risk Analysis Methods

    DTIC Science & Technology

    1993-04-01

    considered products MIA publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs...April 1981. Budescu , David V., and Thomas S. Wallsten. "Encoding Subjective Probabilities: A Psychological and Psychometric Review." Management

  7. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray-emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique; the γ-ray portion is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  8. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray-emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique; the γ-ray portion is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  9. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that causes fractures, mostly in older men and women. Meta-analysis is the statistical framework for combining and assessing results obtained from research studies conducted over several years. Many researchers have described meta-analyses of osteoporotic fracture risk and medication non-adherence, assessing bone fracture risk among patients non-adherent versus adherent to osteoporosis therapy. Osteoporosis therapy reduces fracture risk in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of the intervention. Medline, Embase, and CINAHL were searched for observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures in postmenopausal women and men are presented. The use of bisphosphonate therapy for osteoporosis is described alongside other drugs. The authors, design, studies (women %), years (data), follow-up (wks), fracture types, and compliance or persistence results from 2004 to 2009 are shown in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications and treatments are also reviewed.

  10. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results, have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEMs) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters, different metrics are established, based on simple value ranges and probabilities as well as fuzzy expressions and fractal metrics. As a specific approach, the work on DEM resolution-dependent 'slope spectra' is being considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large-area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  11. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
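
    The AHP weighting step mentioned in this abstract can be sketched briefly: a pairwise comparison matrix is built from expert judgments, and indicator weights are taken as the normalized principal eigenvector. The matrix values and indicator names below are invented for illustration, not the paper's actual judgments.

```python
import numpy as np

# Illustrative AHP step: pairwise comparisons for three hypothetical
# indicators (economy vs. society vs. living standards).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
idx = np.argmax(eigvals.real)
principal = eigvecs[:, idx].real
weights = principal / principal.sum()          # normalized priority weights

# Consistency ratio check (random index RI = 0.58 for n = 3).
n = A.shape[0]
ci = (eigvals.real[idx] - n) / (n - 1)
cr = ci / 0.58
print(weights, cr)  # weights sum to 1; CR below 0.1 means acceptable consistency
```

    A CR above 0.1 would normally send the comparison matrix back to the experts for revision before the weights are used.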

  12. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for the implementation of wind energy projects in Romania, which could lead to complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain and the reasonable time periods that may arise were identified. Renewable energy technologies face a number of constraints that delay scaling up their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of major risks was based on the conceptual model set up for the entire project implementation process. Throughout this conceptual model, specific constraints of such a process were identified. Integration risks were examined by an empirical study based on the HAZOP (Hazard and Operability) method. The discussion describes the analysis of our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  13. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  14. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  15. Risk analysis of landslide disaster in Ponorogo, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Koesuma, S.; Saido, A. P.; Fukuda, Y.

    2016-11-01

    Ponorogo is a regency in the south-west of East Java Province, Indonesia, located in the subduction zone between the Eurasian and Australian tectonic plates. Much of its mountainous area is prone to landslide disasters. We collected landslide data for 305 villages in Ponorogo and converted them into a Hazard Index. We then also calculated a Vulnerability Index, an Economic Loss Index, an Environmental Damage Index and a Capacity Index. The risk analysis map is composed of three components: H (Hazard), V (Vulnerability, Economic Loss Index, Environmental Damage Index) and C (Capacity Index). The method is based on regulations number 02/2012 and number 03/2012 of the National Disaster Management Authority (BNPB). It has three classes of risk index, i.e. Low, Medium and High. Ponorogo city has a medium landslide risk index.
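
    The three-component composition described here is commonly expressed in BNPB guidance as risk increasing with hazard and vulnerability and decreasing with capacity. The sketch below assumes the multiplicative form R = H x V / C and the three-class banding; the index values and class thresholds are invented, not the regulation's actual scoring tables.

```python
# Hedged sketch of a BNPB-style risk composition with 0-1 index values
# and a Low/Medium/High banding like the one used in the paper.
def risk_index(hazard, vulnerability, capacity):
    """Combine indices; higher capacity reduces risk (assumed R = H*V/C)."""
    return hazard * vulnerability / capacity

def risk_class(r, low=0.33, high=0.66):
    """Band a risk value into the three classes (thresholds illustrative)."""
    if r < low:
        return "Low"
    if r < high:
        return "Medium"
    return "High"

# Hypothetical village scores, not the Ponorogo data.
r = risk_index(hazard=0.7, vulnerability=0.5, capacity=0.8)
print(round(r, 3), risk_class(r))  # 0.438 Medium
```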

  16. [Risk analysis in radiation therapy: state of the art].

    PubMed

    Mazeron, R; Aguini, N; Deutsch, É

    2013-01-01

    Five radiotherapy accidents, two of them serial, occurred in France from 2003 to 2007 and led the authorities to establish a roadmap for securing radiotherapy. By analogy with industrial processes, a 2008 technical decision from the French Nuclear Safety Authority requires radiotherapy professionals to conduct analyses of risks to patients. The process of risk analysis had been tested in three pilot centers, before the occurrence of the accidents, with the creation of feedback cells. The regulation now requires all radiotherapy services to have similar structures to collect precursor events, incidents and accidents, to perform analyses following rigorous methods and to initiate corrective actions. At the same time, services are also required to conduct a priori analyses, which are less intuitive and usually require the help of a quality engineer, with the aim of reducing risk. The progressive implementation of these devices is part of an overall policy to improve the quality of radiotherapy. Since 2007, no radiotherapy accident has been reported.

  17. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
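
    The general event-tree quantification that DEFT extends can be sketched generically: each sequence probability is a product of branch probabilities, where (in DEFT) a pivot node's failure probability would come from solving a dynamic fault tree. This is not the patented algorithm; the initiator frequency and pivot probabilities below are invented.

```python
# Generic event-tree sequence quantification (not the DEFT solver itself).
initiating_event_freq = 1e-2                       # per year, invented
p_pivot_fail = {"power": 1e-3, "cooling": 5e-3}    # would come from (D)FT solutions

# Sequence of interest: initiator occurs, power succeeds, cooling fails.
p_sequence = (1 - p_pivot_fail["power"]) * p_pivot_fail["cooling"]
freq_damage = initiating_event_freq * p_sequence
print(freq_damage)  # sequence frequency, about 5e-5 per year
```

    What DEFT adds over this static picture is that a pivot node's probability may depend on event order and timing, which a static fault tree cannot express.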

  18. Multivariate analysis methods for spectroscopic blood analysis

    NASA Astrophysics Data System (ADS)

    Wood, Michael F. G.; Rohani, Arash; Ghazalah, Rashid; Vitkin, I. Alex; Pawluczyk, Romuald

    2012-01-01

    Blood tests are an essential tool in clinical medicine, with the ability to diagnose or monitor various diseases and conditions; however, the complexities of these measurements currently restrict them to a laboratory setting. P&P Optica has developed and currently produces patented high-performance spectrometers and is developing a spectrometer-based system for rapid reagent-free blood analysis. An important aspect of this analysis is the need to extract the analyte-specific information from the measured signal so that the analyte concentrations can be determined. To this end, advanced chemometric methods are currently being investigated and have been tested using simulated spectra. A blood plasma model was used to generate Raman, near-infrared, and optical rotatory dispersion spectra with glucose as the target analyte. The potential of combined chemometric techniques, where multiple spectroscopy modalities are used in a single regression model to improve the prediction ability, was investigated using unfold partial least squares and multiblock partial least squares. Results show improvement in the predictions of glucose levels using the combined methods and demonstrate the potential for multiblock chemometrics in spectroscopic blood analysis.
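
    The "unfold" idea of combining modalities can be sketched with simulated data: two spectral blocks are concatenated column-wise and a single linear calibration is fit. Ordinary least squares stands in here for the partial least squares step the authors use, and all spectra are synthetic.

```python
import numpy as np

# Two simulated spectral blocks (stand-ins for Raman and NIR) that both
# carry a glucose signal, combined into one "unfolded" regression.
rng = np.random.default_rng(0)
n_samples, n_raman, n_nir = 100, 30, 20

glucose = rng.uniform(4.0, 10.0, n_samples)           # mmol/L, synthetic
raman = np.outer(glucose, rng.normal(size=n_raman))   # modality 1 signal
raman += 0.01 * rng.normal(size=raman.shape)          # + measurement noise
nir = np.outer(glucose, rng.normal(size=n_nir))       # modality 2 signal
nir += 0.01 * rng.normal(size=nir.shape)

X = np.hstack([raman, nir])                           # unfolded block
coef, *_ = np.linalg.lstsq(X, glucose, rcond=None)
rmse = np.sqrt(np.mean((X @ coef - glucose) ** 2))
print(rmse)  # small: the combined blocks carry the analyte signal
```

    Multiblock PLS differs from this sketch by modeling each block with its own latent variables before combining them, which helps when the blocks have very different scales or noise structures.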

  19. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  20. Development of a preliminary framework for informing the risk analysis and risk management of nanoparticles.

    PubMed

    Morgan, Kara

    2005-12-01

    Decisions are often made even when there is uncertainty about the possible outcomes. However, methods for making decisions with uncertainty in the problem framework are scarce. Presently, safety assessment for a product containing engineered nano-scale particles is a very poorly structured problem. Many fields of study may inform the safety assessment of such particles (e.g., ultrafines, aerosols, debris from medical devices), but engineered nano-scale particles may present such unique properties that extrapolating from other types of studies may introduce, and not resolve, uncertainty. Some screening-level health effects studies conducted specifically on engineered nano-scale materials have been published and many more are underway. However, it is clear that the extent of research needed to fully and confidently understand the potential for health or environmental risk from engineered nano-scale particles may take years or even decades to complete. In spite of the great uncertainty, there is existing research and experience among researchers that can help to provide a taxonomy of particle properties, perhaps indicating a relative likelihood of risk, in order to prioritize nanoparticle risk research. To help structure this problem, a framework was developed from expert interviews of nanotechnology researchers. The analysis organizes the information as a system based on the risk assessment framework, in order to support the decision about safety. In the long term, this framework is designed to incorporate research results as they are generated, and therefore serve as a tool for estimating the potential for human health and environmental risk.

  1. Approaches to uncertainty analysis in probabilistic risk assessment

    SciTech Connect

    Bohn, M.P.; Wheeler, T.A.; Parry, G.W.

    1988-01-01

    An integral part of any probabilistic risk assessment (PRA) is the performance of an uncertainty analysis to quantify the uncertainty in the point estimates of the risk measures considered. While a variety of classical methods of uncertainty analysis exist, applying these methods and developing new techniques consistent with existing PRA data bases and the need for expert (subjective) input have been areas of considerable interest since the pioneering Reactor Safety Study (WASH-1400) in 1975. This report presents the results of a critical review of existing methods for performing uncertainty analyses for PRAs, with special emphasis on identifying data base limitations on the various methods. Both classical and Bayesian approaches have been examined. This work was funded by the US Nuclear Regulatory Commission in support of its ongoing full-scope PRA of the LaSalle nuclear power station. Thus, in addition to the review, this report contains recommendations for a suitable uncertainty analysis methodology for the LaSalle PRA.

  2. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    ...Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide. TRAC-M-TR-14-023, 15 June 2014. Concentration of Risk Model (CORM) Verification and Analysis. Edward M. Masotti, Sam Buttrey. TRADOC Analysis Center - Monterey, 700 Dyer Road, Monterey.

  3. Overcoming barriers to integrating economic analysis into risk assessment.

    PubMed

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, social, and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome.

  4. Flow methods in chiral analysis.

    PubMed

    Trojanowicz, Marek; Kaniewska, Marzena

    2013-11-01

    The methods used for the separation and analytical determination of individual isomers are based on interactions with substances exhibiting optical activity. The currently used methods for the analysis of optically active compounds are primarily high-performance separation methods, such as gas and liquid chromatography using chiral stationary phases or chiral selectors in the mobile phase, and highly efficient electromigration techniques, such as capillary electrophoresis using chiral selectors. Chemical sensors and biosensors may also be designed for the analysis of optically active compounds. As enantiomers of the same compound are characterised by almost identical physico-chemical properties, their differentiation/separation in one-step unit operation in steady-state or dynamic flow systems requires the use of highly effective chiral selectors. Examples of such determinations are reviewed in this paper, based on 105 references. The greatest successes for isomer determination involve immunochemical interactions, enantioselectivity of the enzymatic biocatalytic processes, and interactions with ion-channel receptors or molecularly imprinted polymers. Conducting such processes under dynamic flow conditions may significantly enhance the differences in the kinetics of such processes, leading to greater differences in the signals recorded for enantiomers. Such determinations in flow conditions are effectively performed using surface-plasmon resonance and piezoelectric detections, as well as using common spectroscopic and electrochemical detections.

  5. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact to different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups, taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
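
    Step (iii), cluster analysis of transport pathways, can be illustrated in miniature: k-means applied to synthetic trajectory endpoints groups them into dominant transport directions. The coordinates, cluster count, and deterministic initialization below are assumptions for the sketch, not the study's configuration.

```python
import numpy as np

# Toy cluster analysis: endpoints (x/y offsets from a hypothetical risk
# site) fall into an eastward and a northward pathway.
rng = np.random.default_rng(1)
east = rng.normal([5.0, 0.0], 0.5, size=(50, 2))     # eastward pathway
north = rng.normal([0.0, 5.0], 0.5, size=(50, 2))    # northward pathway
points = np.vstack([east, north])

k = 2
centers = points[[0, 50]]        # seed one center per group (deterministic sketch)
for _ in range(20):              # Lloyd iterations
    labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
    centers = np.array([points[labels == j].mean(0) for j in range(k)])

print(np.round(centers, 1))  # near (5, 0) and (0, 5)
```

    In the actual methodology, trajectories are multi-point paths rather than endpoints, so clustering typically operates on a distance measure between whole trajectories.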

  6. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.
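
    The identification scheme this patent describes (simultaneous current measurements at several working electrodes over a potential range) can be reduced to a brief sketch. The element names, peak potentials, and signal shapes below are invented; real voltammograms and reference tables would replace them.

```python
import numpy as np

# Currents from five working electrodes are averaged over a potential
# sweep, and peak potentials are matched against a hypothetical table.
potentials = np.linspace(-1.0, 1.0, 401)
reference_peaks = {"element A": -0.40, "element B": 0.25}  # invented values

def peak(center, height=1.0, width=0.05):
    return height * np.exp(-0.5 * ((potentials - center) / width) ** 2)

# Five electrodes observe the same solution with independent noise.
rng = np.random.default_rng(2)
true_signal = peak(-0.40) + 0.6 * peak(0.25)
electrodes = [true_signal + 0.02 * rng.normal(size=potentials.size)
              for _ in range(5)]
averaged = np.mean(electrodes, axis=0)   # redundancy suppresses noise

# Report elements whose reference potential coincides with a strong response.
found = [name for name, e0 in reference_peaks.items()
         if averaged[np.abs(potentials - e0).argmin()] > 0.5]
print(found)
```

    The averaging step is one way to read the patent's point that multiple working electrodes give a more positive identification than a single electrode.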

  7. Voltammetric analysis apparatus and method

    SciTech Connect

    Almon, A.C.

    1991-12-31

    An apparatus and method are disclosed for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  8. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  9. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.

  10. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    EPA Pesticide Factsheets

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  11. Risk prediction with machine learning and regression methods.

    PubMed

    Steyerberg, Ewout W; van der Ploeg, Tjeerd; Van Calster, Ben

    2014-07-01

    This is a discussion of issues in risk prediction based on the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler.
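
    The regression side of the machine-learning-versus-regression contrast discussed here can be shown in a few lines: logistic regression produces probability estimates for a dichotomous outcome, fit below by plain gradient ascent on simulated data. The data-generating coefficients are invented for the sketch.

```python
import numpy as np

# Simulated dichotomous outcome with two predictors.
rng = np.random.default_rng(3)
n = 500
x = rng.normal(size=(n, 2))
true_beta = np.array([1.5, -1.0])                  # invented coefficients
p_true = 1 / (1 + np.exp(-(x @ true_beta)))
y = rng.binomial(1, p_true)

# Logistic regression via gradient ascent on the log-likelihood.
beta = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(x @ beta)))
    beta += 0.1 * x.T @ (y - p) / n

print(np.round(beta, 1))  # close to the generating coefficients
```

    Machine learning methods such as random forests estimate the same probabilities nonparametrically; the papers under discussion compare the calibration of the two approaches.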

  12. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for a drug manufacturing computerized system (DMCS). The quality of a drug must not be influenced by failures and operational mistakes of the DMCS. To avoid such situations, the DMCS has to undergo sufficient risk assessment and precautions have to be taken. We propose an operational RM method using FMEA for the DMCS. To develop the method, we gathered and compared FMEA results for DMCSs and developed a list that contains failure modes, failures and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that have not been found before.
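
    FMEA prioritization of a failure-mode list like the one the authors describe is conventionally done with the Risk Priority Number, RPN = severity x occurrence x detection. The sketch below uses that standard scoring; the failure modes and scores are invented, not taken from the paper's list.

```python
# Standard FMEA scoring: S, O, D on a 1-10 scale; higher RPN is
# addressed first when planning countermeasures. Entries are invented.
failure_modes = [
    {"mode": "wrong batch record loaded", "S": 9, "O": 3, "D": 4},
    {"mode": "sensor calibration drift",  "S": 6, "O": 5, "D": 6},
    {"mode": "operator input error",      "S": 7, "O": 6, "D": 3},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
print([(fm["mode"], fm["RPN"]) for fm in ranked])
```

    A design-phase countermeasure that lowers occurrence or improves detection is then re-scored to confirm the RPN dropped.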

  13. Working session 5: Operational aspects and risk analysis

    SciTech Connect

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  14. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  15. Pressure Systems Stored-Energy Threshold Risk Analysis

    SciTech Connect

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective February 2007, brought to light potential weaknesses regarding the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based upon pressure or any other criteria. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as an appropriate criterion to establish a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.
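    As a rough illustration of the stored-energy criterion, the isentropic-expansion model for a compressed ideal gas is one common way to estimate the energy a vessel could release. The formula choice, vessel volume, and pressures below are illustrative assumptions, not the PNNL procedure:

```python
import math

def isentropic_stored_energy(p_abs, p_atm, volume, gamma=1.4):
    """Stored energy (J) of a compressed ideal gas, isentropic-expansion model.

    p_abs, p_atm: absolute pressures in Pa; volume in m^3; gamma: heat-capacity ratio.
    """
    return (p_abs * volume / (gamma - 1.0)) * \
           (1.0 - (p_atm / p_abs) ** ((gamma - 1.0) / gamma))

J_PER_LBF_FT = 1.3558179  # conversion factor, 1 lbf-ft in joules

# Hypothetical example: a 10 L vessel of air at 10 bar(a) venting to 1 atm.
e_joules = isentropic_stored_energy(1.0e6, 101325.0, 0.010)
e_lbf_ft = e_joules / J_PER_LBF_FT
exceeds_threshold = e_lbf_ft > 1000.0  # compare against the 1000 lbf-ft criterion
```

    Even this modest vessel stores several thousand lbf-ft, well above the threshold, which is why a documented basis for the criterion matters.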

  16. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Center (JRC) in the frame of its institutional research activities. The scope of the GHSL is to develop, test, and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform, for applications in support of global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geo-spatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observations on SB-04-Global Urban Observation and Information, and with various international partners, including World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  17. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during Fiscal Year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment (PRA) analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 that addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an on-line tool. In this new configuration, RISMC capabilities could be used to assist and inform reactor operators during real accident scenarios.

  18. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  19. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of the expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This introduces errors from at least three sources. The historical cost data may be in error by some unknown amount. The manager's evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed.
The engineering
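    A cost risk curve of the kind described above can be sketched with a small Monte Carlo simulation. The work-breakdown elements and their triangular (low, mode, high) cost ranges below are hypothetical:

```python
import random

random.seed(1)

# Hypothetical work-breakdown elements: (low, mode, high) cost in $M.
elements = [(2.0, 3.0, 5.0), (1.0, 1.5, 3.0), (4.0, 5.0, 9.0)]

def simulate_total_cost(n=20000):
    """Draw each element's cost from a triangular distribution and sum."""
    totals = []
    for _ in range(n):
        totals.append(sum(random.triangular(lo, hi, mode)
                          for lo, mode, hi in elements))
    return sorted(totals)

totals = simulate_total_cost()

def percentile(sorted_vals, p):
    """Cost value not exceeded with probability p (a point on the risk curve)."""
    return sorted_vals[int(p / 100.0 * (len(sorted_vals) - 1))]

p50 = percentile(totals, 50)  # median project cost
p80 = percentile(totals, 80)  # cost with 80% confidence of not being exceeded
```

    Sweeping the percentile from 0 to 100 traces out the full risk curve, rather than assuming its shape from a single point estimate.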

  20. Cost-effectiveness of various risk stratification methods for asymptomatic ventricular pre-excitation.

    PubMed

    Czosek, Richard J; Anderson, Jeffrey; Cassedy, Amy; Spar, David S; Knilans, Timothy K

    2013-07-15

    Accessory pathways with "high-risk" properties confer a small but potential risk of sudden cardiac death. Pediatric guidelines advocate for either risk stratification or ablation in patients with ventricular pre-excitation but do not advocate specific methodology. We sought to compare the cost of differing risk-stratification methodologies in pediatric patients with ventricular pre-excitation in this single institutional, retrospective cohort study of asymptomatic pediatric patients who underwent risk stratification for ventricular pre-excitation. Institutional methodology consisted of stratification using graded exercise testing (GXT) followed by esophageal testing in patients without loss of pre-excitation and ultimately ablation in high-risk patients or patients who became clinically symptomatic during follow-up. A decision analysis model was used to compare this methodology with hypothetical methodologies using different components of the stratification technique and an "ablate all" method. One hundred and two pediatric patients with asymptomatic ventricular pre-excitation underwent staged risk stratification; 73% of patients were deemed low risk and avoided ablation and the remaining 27% ultimately were successfully ablated. The use of esophageal testing was associated with a 23% (p ≤0.0001) reduction in cost compared with GXT stratification alone and a 48% (p ≤0.0001) reduction compared with the "ablate all" model. GXT as a lone stratification method was also associated with a 15% cost reduction (p ≤0.0001) compared with the "ablate all" method. In conclusion, risk stratification of pediatric patients with asymptomatic ventricular pre-excitation is associated with reduced cost. These outcomes of cost-effectiveness need to be combined with the risks and benefits associated with ablation and risk stratification.
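    The decision-analysis comparison can be illustrated with a toy expected-cost model. All unit costs and clearance probabilities below are invented for illustration; they are not the study's figures:

```python
# Hypothetical unit costs (arbitrary units) and clearance probabilities.
COST_GXT = 1.0        # graded exercise test
COST_ESO = 3.0        # esophageal electrophysiology testing
COST_ABLATION = 30.0
P_CLEARED_BY_GXT = 0.4   # fraction deemed low-risk by GXT alone
P_CLEARED_BY_ESO = 0.33  # fraction of the remainder cleared by esophageal testing

def expected_cost_ablate_all():
    """Every patient proceeds directly to ablation."""
    return COST_ABLATION

def expected_cost_staged():
    """Everyone gets a GXT; non-cleared patients get esophageal testing;
    those still deemed high-risk proceed to ablation."""
    cost = COST_GXT
    not_cleared = 1.0 - P_CLEARED_BY_GXT
    cost += not_cleared * COST_ESO
    still_high_risk = not_cleared * (1.0 - P_CLEARED_BY_ESO)
    cost += still_high_risk * COST_ABLATION
    return cost
```

    With these numbers the staged strategy costs roughly half of ablate-all per patient, mirroring the qualitative finding that stratification reduces cost.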

  1. Environmental risk analysis for indirect coal liquefaction

    SciTech Connect

    Barnthouse, L.W.; Suter, G.W. II; Baes, C.F. III; Bartell, S.M.; Cavendish, M.G.; Gardner, R.H.; O'Neill, R.V.; Rosen, A.E.

    1985-01-01

    This report presents an analysis of the risks to fish, water quality (due to noxious algal blooms), crops, forests, and wildlife of two technologies for the indirect liquefaction of coal: Lurgi and Koppers-Totzek gasification of coal for Fischer-Tropsch synthesis. A variety of analytical techniques were used to make maximum use of the available data to consider effects of effluents on different levels of ecological organization. The most significant toxicants to fish were found to be ammonia, cadmium, and acid gases. An analysis of whole-effluent toxicity indicated that the Lurgi effluent is more acutely toxic than the Koppers-Totzek effluent. Six effluent components appear to pose a potential threat of blue-green algal blooms, primarily because of their effects on higher trophic levels. The most important atmospheric emissions with respect to crops, forests, and wildlife were found to be the conventional combustion products SO2 and NO2. Of the materials deposited on the soil, arsenic, cadmium, and nickel appear of greatest concern for phytotoxicity. 147 references, 5 figures, 41 tables.

  2. Advances in validation, risk and uncertainty assessment of bioanalytical methods.

    PubMed

    Rozet, E; Marini, R D; Ziemons, E; Boulanger, B; Hubert, Ph

    2011-06-25

    Bioanalytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application, in order to trust the critical decisions that will be made with them. Even though several guidelines exist to help perform bioanalytical method validations, there is still a need to clarify the meaning and interpretation of bioanalytical method validation criteria and methodology. Different interpretations can be made of the validation guidelines, as well as of the definitions of the validation criteria. This leads to diverse experimental designs implemented to try to fulfil these criteria. Finally, different decision methodologies can also be interpreted from these guidelines. Therefore, the risk that a validated bioanalytical method may be unfit for its future purpose will depend on the analysts' personal interpretation of these guidelines. The objective of this review is thus to discuss and highlight several essential aspects of method validation, not restricted to chromatographic methods but extending to ligand binding assays owing to their increasing role in biopharmaceutical industries. The points reviewed are the common validation criteria: selectivity, standard curve, trueness, precision, accuracy, limits of quantification and range, dilutional integrity, and analyte stability. Definitions, methodology, experimental design, and decision criteria are reviewed. Two other points closely connected to method validation are also examined: incurred sample reproducibility testing and measurement uncertainty, as they are highly linked to the reliability of bioanalytical results. Their additional implementation is foreseen to strongly reduce the risk of having validated a bioanalytical method unfit for its purpose.

  3. Multiobjective Risk Partitioning: An Application to Dam Safety Risk Analysis

    DTIC Science & Technology

    1988-04-01

    expectation distorts, and almost eliminates, the distinctive features of many viable alternative policy options that could lead to the reduction of the risk...height of the dam) from 20 to 30 million dollars would contribute to a negligible reduction of 0.1 units of conventional (unconditional) expected social...results could be easily influenced by either a change in the return period of the PMH or by the choice of the distribution. Therefore, it is

  4. The Use and Abuse of Risk Analysis in Policy Debate.

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    The best check on the preposterous claims of crisis rhetoric is an appreciation of the nature of risk analysis and how it functions in argumentation. The use of risk analysis is common in policy debate. While the stock issues paradigm focused the debate exclusively on the affirmative case, the advent of policy systems analysis has transformed…

  5. Risk Analysis from a Top-Down Perspective

    DTIC Science & Technology

    1983-07-15

    and focused studies in critical areas. A variety of analyses, such as a localized version of the bottom up risk analysis approach and sensitivity...analysis, focus on these open ended cases to resolve them. Unresolvable decision conflicts include value judgments which risk analysis cannot solve

  6. A novel risk assessment method for landfill slope failure: Case study application for Bhalswa Dumpsite, India.

    PubMed

    Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh

    2017-03-01

    Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to landfills. These extraordinary landfills are facing high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario is then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
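    A minimal sketch of the probabilistic slope-failure idea uses an infinite-slope, cohesionless factor of safety with a randomly sampled MSW friction angle. None of the numbers below come from the Bhalswa study; the distribution, slope angle, and fatality figure are all illustrative assumptions:

```python
import math
import random

random.seed(7)

SLOPE_DEG = 35.0  # hypothetical landfill side-slope angle

def sample_friction_angle():
    # MSW shear-strength property drawn from an assumed normal distribution (deg).
    return random.gauss(33.0, 4.0)

def probability_of_failure(n=50000):
    """Monte Carlo estimate of P(factor of safety < 1)."""
    tan_beta = math.tan(math.radians(SLOPE_DEG))
    fails = 0
    for _ in range(n):
        phi = sample_friction_angle()
        fos = math.tan(math.radians(phi)) / tan_beta  # infinite-slope, cohesionless
        if fos < 1.0:
            fails += 1
    return fails / n

p_fail = probability_of_failure()
# Hypothetical consequence term, e.g. from a run-out length analysis.
expected_fatalities_given_failure = 12.0
risk = p_fail * expected_fatalities_given_failure  # expected fatalities per year
```

    Coupling the failure probability with a run-out-dependent consequence term, rather than reporting the probability alone, is the distinction the abstract draws from existing methods.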

  7. Technical Risk Analysis - Exploiting the Power of MBSE

    DTIC Science & Technology

    2012-11-01

    UNCLASSIFIED DSTO-GD-0734 18. Technical Risk Analysis - Exploiting the Power of MBSE - Despina Tramoundanis, Wayne Power and Daniel Spencer... Functional Risk Analysis (FRA) conducted within a Model Based Systems Engineering (MBSE) environment. FRA is a rigorous technique used to explore potential...

  8. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the... Time scaling factors (Table 1): up to an hour (16-60 min), factor 1.5; brief interrupt (0-15 min), factor 1. In the FMEA formulation, RPN is a product of the three categories
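    The RPN computation can be sketched in a few lines. The severity/occurrence/detection ratings, the time factor, and the defect names below are illustrative assumptions, not values from the white paper:

```python
# RPN = Severity x Occurrence x Detection, each commonly rated 1-10.
# A time-scaling factor (as in Table 1) can further weight the result.
def rpn(severity, occurrence, detection, time_factor=1.0):
    return severity * occurrence * detection * time_factor

# Hypothetical defects and their ratings.
defects = {
    "ui_glitch": rpn(2, 6, 3, 1.0),   # low severity, moderately frequent
    "data_loss": rpn(9, 3, 5, 1.5),   # severe, rarer, costly to detect and fix
}

# Prioritize defects from highest to lowest RPN.
ranked = sorted(defects, key=defects.get, reverse=True)
```

    Sorting by RPN gives the defect-prioritization order the method is built around.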

  9. Methods development to evaluate the risk of upgrading to DCS: The human factor

    SciTech Connect

    Ostrom, L.T.; Wilhelmsen, C.A.

    1995-04-01

    The NRC recognizes that a more complete technical basis for understanding and regulating advanced digital technologies in commercial nuclear power plants is needed. A concern is that the introduction of digital safety systems may have an impact on risk. There is currently no standard methodology for measuring digital system reliability. A tool currently used to evaluate NPP risk in analog systems is the probabilistic risk assessment (PRA). The use of this tool to evaluate digital system risk was considered a potential methodology for determining the risk. To test this hypothesis, it was decided to perform a limited PRA on a single dominant accident sequence. However, a review of existing human reliability analysis (HRA) methods showed that they were inadequate to analyze systems utilizing digital technology. A four-step process was used to adapt existing HRA methodologies to digital environments and to develop new techniques. The HRA methods were then used to analyze an NPP that had undergone a backfit to digital technology in order to determine, as a first step, whether the methods were effective. The very small-break loss of coolant accident sequence was analyzed to determine whether the upgrade to the Eagle-21 process protection system had an effect on risk. The analysis of the very small-break LOCA documented in the Sequoyah PRA was used as the basis of the analysis. The results of the HRA showed that the mean human error probabilities for the Eagle-21 PPS were slightly less than those for the analog system it replaced. One important observation from the analysis is that the operators have increased confidence stemming from the better level of control provided by the digital system. The analysis of the PRA results, which included the human error component and the Eagle-21 PPS, disclosed that the reactor protection system had a higher failure rate than the analog system, although the difference was not statistically significant.

  10. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    DTIC Science & Technology

    2005-05-01

    aging gate structures at dam spillways, there is an increasing risk of potential dam failures due to gate inoperability, malfunction, or under-design...method uses probabilities for more events defined more precisely than in standard practice, and adds criticality analysis to rank each of the potential...a combination of the two. One method defined by Boeing Systems (1998) classifies failure modes according to the three levels defined below in

  11. Quantitative risk analysis for landslides -- Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2004-03-01

    Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence, the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people living or working within the areas most at risk until 2010. The regulation requires carrying out landslide and snow avalanche risk analyses; however, a method to calculate landslide risk adapted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls, and to test it in Bíldudalur, NW-Iceland. Risk analysis, besides risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, risk analysis only is considered, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existing vector data are transferred into raster data using a resolution of 1m x 1m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact, and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20m x 20m and are presented as individual risk to life and object risk to life for each process.
Within the quantitative landslide risk analysis the
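    The per-cell risk function described above, a product of the input raster layers, can be sketched on a toy flattened grid. All cell values are illustrative, not data from Bíldudalur:

```python
# One value per raster cell (flattened 2x2 grid), illustrative numbers only.
hazard        = [0.01, 0.005, 0.0, 0.02]  # annual probability of a damaging event
p_spatial     = [0.8,  0.5,   0.0, 0.9]   # probability the event impacts the cell
p_temporal    = [0.5,  0.5,   0.5, 0.5]   # probability a person is present
p_seasonal    = [1.0,  1.0,   1.0, 0.25]  # probability of seasonal occurrence
vulnerability = [0.7,  0.3,   0.0, 1.0]   # probability of loss of life given impact

def individual_risk():
    """Individual risk to life per cell, as a product of the input layers."""
    return [hazard[i] * p_spatial[i] * p_temporal[i]
            * p_seasonal[i] * vulnerability[i]
            for i in range(len(hazard))]

risk = individual_risk()
```

    In a real GIS the same multiplication runs over 1m x 1m raster layers before the results are upscaled to 20m x 20m.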

  12. Osteoporosis risk prediction using machine learning and conventional methods.

    PubMed

    Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won

    2013-01-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared them with a conventional clinical decision tool, the osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We are the first to compare the performance of machine learning and conventional methods for osteoporosis prediction using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.

  13. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee, therefore, was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  14. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
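    A crude software analogue of the demodulation stage, rectifying the sensor signal and smoothing it to obtain an amplitude envelope, can be sketched as follows. The synthetic signal, resonant frequency, and window size are assumptions for illustration, not the patented design:

```python
import math

def amplitude_envelope(signal, window=25):
    """Crude amplitude envelope: full-wave rectify, then moving-average smooth."""
    rect = [abs(s) for s in signal]
    half = window // 2
    env = []
    for i in range(len(rect)):
        lo, hi = max(0, i - half), min(len(rect), i + half + 1)
        env.append(sum(rect[lo:hi]) / (hi - lo))
    return env

# Synthetic sensor signal: a carrier near an assumed sensor resonance whose
# amplitude is slowly modulated, standing in for flow-dependent variation.
n, f_res = 1000, 40.0
signal = [(1.0 + 0.5 * math.sin(2 * math.pi * 0.5 * i / n))
          * math.sin(2 * math.pi * f_res * i / n)
          for i in range(n)]

env = amplitude_envelope(signal)
mean_amp = sum(env) / len(env)  # one possible flow-indicator quantity
```

    Quantities such as the envelope's mean, variance, or peak spacing would then be fed to the neural network to estimate the flow rate.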

  15. Hybrid methods for rotordynamic analysis

    NASA Technical Reports Server (NTRS)

    Noah, Sherif T.

    1986-01-01

    Effective procedures are presented for the response analysis of the Space Shuttle Main Engine turbopumps under transient loading conditions. Of particular concern is the determination of the nonlinear response of the systems to rotor imbalance in presence of bearing clearances. The proposed procedures take advantage of the nonlinearities involved being localized at only a few rotor/housing coupling joints. The methods include those based on integral formulations for the incremental solutions involving the transition matrices of the rotor and housing. Alternatively, a convolutional representation of the housing displacements at the coupling points is proposed which would allow performing the transient analysis on a reduced model of the housing. The integral approach is applied to small dynamical models to demonstrate the efficiency of the approach. For purposes of assessing the numerical integration results for the nonlinear rotor/housing systems, a numerical harmonic balance procedure is developed to enable determining all possible harmonic, subharmonic, and nonperiodic solutions of the systems. A brief account of the Fourier approach is presented as applied to a two degree of freedon rotor-support system.

  16. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  17. New Method For Classification of Avalanche Paths With Risks

    NASA Astrophysics Data System (ADS)

    Rapin, François

    After the Chamonix-Montroc avalanche event in February 1999, the French Ministry of the Environment wanted to undertake a new examination of the "sensitive avalanche paths", i.e. sites with stakes (in particular housing) whose behaviour cannot be apprehended in a simple way. The stated objective was to establish a tool, a method, making it possible to identify these sites and rank them according to the risk they generate, in order to better distribute the efforts of public policy later on. The proposed tool is based only on objective and quantifiable criteria that are, a priori, relatively quick to obtain. These criteria are gathered in four groups: vulnerability concerned, morphology of the site, known avalanche history, and snow climatology. Each criterion selected is assigned a "weight" according to the group to which it belongs and relative to the other criteria. The tool thus makes it possible to classify sites subject to avalanche risk into a grid of three dangerousness levels: low sensitivity - a priori the site does not warrant a particular avalanche study; doubtful sensitivity - the site may warrant a study specifying the avalanche risk; strong sensitivity - the site warrants a thorough study of the avalanche risk. Depending on the conclusions of these studies, existing measures of prevention and risk management (zoning, protection, alert, rescue) will be examined and supplemented as needed. The result obtained by applying the method by no means requires repeating a thorough study of the avalanche risk that already exists. A priori, fewer than ten percent of the paths will show strong sensitivity. The present method is thus a new decision-support tool for the first phase of identifying and classifying avalanche sites according to the risk they generate.
To be recognized and used under good conditions, this tool was worked out by the search for
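    The weighted-criteria classification can be sketched as a simple scoring function. The group weights, site scores, and class thresholds below are invented for illustration; they are not the French method's actual values:

```python
# Hypothetical group weights and normalized scores in [0, 1] for one site.
weights = {"vulnerability": 4.0, "morphology": 2.0,
           "history": 3.0, "snow_climate": 1.0}
site_scores = {"vulnerability": 0.9, "morphology": 0.5,
               "history": 0.7, "snow_climate": 0.4}

def sensitivity(scores, low=0.35, strong=0.65):
    """Weighted average of criterion scores, mapped to three sensitivity classes."""
    total = sum(weights[k] * scores[k] for k in weights) / sum(weights.values())
    if total < low:
        return "low sensitivity"
    if total < strong:
        return "doubtful sensitivity"
    return "strong sensitivity"
```

    A site scoring high on the heavily weighted vulnerability and history criteria lands in the "strong sensitivity" class and would warrant a thorough avalanche study.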

  18. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds since the expected cost due to subsystem failure is not the only cost involved. The subsystem itself may be very costly. We should not consider either the cost of the subsystem or the expected cost due to subsystem failure separately but should minimize the total of the two costs, i.e., the total of the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. CARRAC methods and a CARRAC computer program employ five models which can be used to cover a wide range of problems. The models contain an option which can include repair of failed modules.
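    The trade-off described above, minimizing subsystem cost plus expected failure cost rather than either alone, can be illustrated with two candidate subsystems. The unit costs, reliabilities, and failure-consequence costs are hypothetical, not CARRAC's models:

```python
# Candidate subsystems: (unit cost, reliability). The consequence cost of a
# subsystem failure is varied to show how the best choice flips.
candidates = {"A": (100.0, 0.990), "B": (180.0, 0.995)}

def total_expected_cost(unit_cost, reliability, failure_cost):
    """Subsystem cost plus expected cost due to subsystem failure."""
    return unit_cost + (1.0 - reliability) * failure_cost

def best_choice(failure_cost):
    """Subsystem minimizing the total of the two costs."""
    return min(candidates,
               key=lambda k: total_expected_cost(*candidates[k], failure_cost))
```

    With a modest failure cost the cheaper, less reliable subsystem A wins; once the consequence cost is high enough (here the crossover is at 16000), the extra cost of the .995-reliability subsystem B is justified.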

  19. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

    The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities that are added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods. In addition, the accuracy and effectiveness of the methods are discussed. The results show that the chips belong either to the AZ91, AZ31, AM50/60, or AJ62 alloy. Some kinds of chips show deviations from the above-mentioned normations. Different impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is not more than 4 mm in the diameter. The sieve analysis shows good results for oily and wet chips. The determination of oil and water shows better results for the application of a Soxhlet compared with the addition of lime and vacuum distillation. The most accurate values for the determination of water and oil are obtained by drying at 110°C (for water) and washing with acetone (for oil) by hand.

  20. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
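    As a rough illustration of the attack-graph idea (not the patented tool itself), the following sketch weights edges by assumed attacker effort and finds the lowest-effort path to a goal state with Dijkstra's algorithm; all states and weights are invented.

```python
import heapq

# Toy attack graph (hypothetical states/weights): nodes are attack states,
# edge weights approximate attacker effort; low-cost paths are high risk.
graph = {
    "start":         [("user_access", 2.0), ("phish_admin", 5.0)],
    "user_access":   [("priv_escalate", 4.0)],
    "phish_admin":   [("root", 1.0)],
    "priv_escalate": [("root", 3.0)],
    "root":          [],
}

def cheapest_paths(graph, source):
    """Dijkstra: minimum total attacker effort from source to every state."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nxt, w in graph[node]:
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return dist

effort = cheapest_paths(graph, "start")
print(effort["root"])  # lowest-effort (hence highest-risk) path cost to goal
```

    The patent's "epsilon optimal paths" would correspond to keeping all paths within some epsilon of this minimum, not just the single cheapest one.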

  1. Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study

    PubMed Central

    Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.

    2012-01-01

    Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501

  2. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
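    A minimal sketch of the nonparametric Kaplan-Meier estimator mentioned above, applied to made-up mission records rather than the Autosub3 fault history:

```python
# Each record is (distance_km, failed); failed=False means the mission ended
# without loss (censored). All data below are invented for illustration.
missions = [(10, True), (25, False), (40, True), (40, True), (60, False), (90, True)]

def kaplan_meier(records):
    """Return [(distance, S(distance))] at each observed failure distance."""
    records = sorted(records)
    n_at_risk = len(records)
    surv, out = 1.0, []
    i = 0
    while i < len(records):
        d = records[i][0]
        deaths = sum(1 for t, f in records if t == d and f)
        ties = sum(1 for t, _ in records if t == d)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            out.append((d, surv))
        n_at_risk -= ties
        i += ties
    return out

for d, s in kaplan_meier(missions):
    print(f"S({d} km) = {s:.3f}")
```

    The survival curve steps down only at observed failure distances; censored missions reduce the number at risk without forcing a step.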

  3. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In the last decades there has been a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The State of the Art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of geotechnical engineering for evaluating bearing capacity, settlements, etc. The paper focuses on the prediction of settlements of shallow foundations, probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements, collected by Monaco et al. (2006), indicates that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates the Beta probability distribution at research sites where DMT tests and observed settlements are available. References Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? 
A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
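    The traditional linear elastic settlement calculation referred to above can be sketched as follows; the layer thicknesses, stress increases, and moduli are invented for illustration.

```python
# Sketch of the 1-D linear elastic approach: settlement is the sum over soil
# layers of (stress increase * thickness / constrained modulus M), where M
# would come from the DMT. All values below are made up.
layers = [  # (stress increase kPa, thickness m, constrained modulus M kPa)
    (50.0, 2.0, 8_000.0),
    (30.0, 3.0, 12_000.0),
]
settlement = sum(ds * h / m for ds, h, m in layers)
print(f"predicted settlement = {settlement * 1000:.1f} mm")
```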

  4. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
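    A toy sketch of the rule-matching idea behind identification and mitigation trees; the rules, element names, and mitigations below are invented, not AutSEC's actual data structures.

```python
# Hypothetical threat rules matched against elements of a data flow diagram;
# each match yields a candidate threat and a suggested mitigation.
rules = [
    {"element": "data_flow", "attr": "unencrypted", "threat": "eavesdropping",
     "mitigation": "use TLS"},
    {"element": "data_store", "attr": "no_acl", "threat": "tampering",
     "mitigation": "add access control"},
]
design = [
    {"name": "client->server", "element": "data_flow", "attrs": {"unencrypted"}},
    {"name": "user_db", "element": "data_store", "attrs": {"no_acl"}},
]

findings = [
    (node["name"], r["threat"], r["mitigation"])
    for node in design
    for r in rules
    if r["element"] == node["element"] and r["attr"] in node["attrs"]
]
for name, threat, mitigation in findings:
    print(f"{name}: {threat} -> {mitigation}")
```

    A real tool would organize such rules as trees and weigh mitigations against specification requirements and cost, as the abstract describes.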

  5. Risk analysis for Arctic offshore operations

    SciTech Connect

    Slomski, S.; Vivatrat, V.

    1986-04-01

    Offshore exploration for hydrocarbons is being conducted in the near-shore regions of the Beaufort Sea. This activity is expected to be intensified and expanded into the deeper portions of the Beaufort, as well as into the Chukchi Sea. The ice conditions in the Beaufort Sea are very variable, particularly in the deeper water regions. This variability greatly influences the probability of success or failure of an offshore operation. For example, a summer exploratory program conducted from a floating drilling unit may require a period of 60 to 100 days on station. The success of such a program depends on: (a) the time when the winter ice conditions deteriorate sufficiently for the drilling unit to move on station; (b) the number of summer invasions by the arctic ice pack, forcing the drilling unit to abandon station; (c) the rate at which first-year ice grows to the ice thickness limit of the supporting icebreakers; and (d) the extent of arctic pack expansion during the fall and early winter. In general, the ice conditions are so variable that, even with good planning, the chance of failure of an offshore operation will not be negligible. Contingency planning for such events is therefore necessary. This paper presents a risk analysis procedure which can greatly benefit the planning of an offshore operation. A floating drilling program and a towing and installation operation for a fixed structure are considered to illustrate the procedure.
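    A contingency-planning calculation of this kind can be sketched as a Monte Carlo simulation; the season-length distribution, invasion counts, and day losses below are all assumed for illustration, not taken from Beaufort Sea data.

```python
import random

random.seed(0)
# Hypothetical sketch: estimate the chance a floating drilling program gets
# its required days on station within the open-water season.
REQUIRED_DAYS = 80

def simulate_season():
    season_length = random.gauss(110, 25)   # open-water days (assumed)
    pack_invasions = random.randint(0, 3)   # summer ice-pack invasions (assumed)
    lost = sum(random.uniform(3, 12) for _ in range(pack_invasions))
    return season_length - lost >= REQUIRED_DAYS

trials = 10_000
p_success = sum(simulate_season() for _ in range(trials)) / trials
print(f"Estimated probability of completing the program: {p_success:.2f}")
```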

  6. Probabilistic risk analysis of groundwater remediation strategies

    NASA Astrophysics Data System (ADS)

    Bolster, D.; Barahona, M.; Dentz, M.; Fernandez-Garcia, D.; Sanchez-Vila, X.; Trinchero, P.; Valhondo, C.; Tartakovsky, D. M.

    2009-06-01

    Heterogeneity of subsurface environments and insufficient site characterization are some of the reasons why decisions about groundwater exploitation and remediation have to be made under uncertainty. A typical decision maker chooses between several alternative remediation strategies by balancing their respective costs with the probability of their success or failure. We conduct a probabilistic risk assessment (PRA) to determine the likelihood of the success of a permeable reactive barrier, one of the leading approaches to groundwater remediation. While PRA is used extensively in many engineering fields, its applications in hydrogeology are scarce. This is because rigorous PRA requires one to quantify structural and parametric uncertainties inherent in predictions of subsurface flow and transport. We demonstrate how PRA can facilitate a comprehensive uncertainty quantification for complex subsurface phenomena by identifying key transport processes contributing to a barrier's failure, each of which is amenable to uncertainty analysis. Probability of failure of a remediation strategy is computed by combining independent and conditional probabilities of failure of each process. Individual probabilities can be evaluated either analytically or numerically or, barring both, can be inferred from expert opinion.
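    The combination of independent and conditional failure probabilities described above can be sketched as follows; the processes and probability values are invented, not the barrier study's results.

```python
# Sketch (hypothetical probabilities): overall barrier failure combined from
# independent and conditional failure probabilities of individual processes.
p_bypass = 0.05                 # P(flow bypasses the barrier)
p_exhaustion = 0.10             # P(reactive material exhausted)
p_breakthrough_given_ok = 0.02  # P(contaminant breakthrough | barrier intact)

# Failure if the plume bypasses the barrier, the reagent is exhausted, or
# breakthrough occurs even though the barrier is otherwise functioning.
p_intact = (1 - p_bypass) * (1 - p_exhaustion)
p_failure = 1 - p_intact * (1 - p_breakthrough_given_ok)
print(f"P(remediation failure) = {p_failure:.4f}")
```

    As the abstract notes, each individual probability could come from analytical models, numerical simulation, or expert opinion.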

  7. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probabilistic convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore, a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of semi-empirical methods: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematics). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
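    The hazard-fragility convolution underlying Risk = P(D >= d | S, V) can be sketched numerically; the severity classes, hazard probabilities, and fragility values below are invented for illustration.

```python
import numpy as np

# Sketch with invented numbers: damage probability as the convolution of a
# hazard pmf over landslide severity with a fragility curve P(D >= d | S).
severity = np.array([1.0, 2.0, 3.0, 4.0])        # kinetic-energy classes
hazard_pmf = np.array([0.50, 0.30, 0.15, 0.05])  # P(severity class)
fragility = np.array([0.01, 0.10, 0.45, 0.90])   # P(damage >= threshold | S)

risk = float(np.dot(hazard_pmf, fragility))       # total P(D >= d)
print(f"P(damage >= threshold) = {risk:.4f}")
```

    Each limit state (aesthetic, functional, structural) would get its own fragility curve and hence its own convolved damage probability.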

  8. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration, or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios; parameters are thereby varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is likewise applicable to other critical network infrastructure. The aim of the approach is to help decision makers choose zones for preventive measures.
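    Merging a vulnerability map with a hazard map can be sketched as an element-wise raster product; the tiny rasters below are invented, and the product rule is one common choice rather than the paper's exact merge.

```python
import numpy as np

# Sketch: risk map as element-wise product of a vulnerability raster (from a
# sensitivity analysis of the network model) and a hazard raster; all values
# here are invented, scaled 0-1.
vulnerability = np.array([[0.2, 0.8],
                          [0.5, 0.1]])
hazard = np.array([[0.9, 0.3],
                   [0.4, 0.7]])

risk_map = vulnerability * hazard
print(risk_map)
print("highest-risk cell:", np.unravel_index(risk_map.argmax(), risk_map.shape))
```

    The highest-risk cells are the natural candidates for the preventive-measure zones the abstract mentions.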

  9. a Meteorological Risk Assessment Method for Power Lines Based on GIS and Multi-Sensor Integration

    NASA Astrophysics Data System (ADS)

    Lin, Zhiyong; Xu, Zhimin

    2016-06-01

    Power lines, exposed to the natural environment, are vulnerable to various meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line and lacks a comprehensive evaluation of the combined effects of multiple meteorological factors. In this paper, we use meteorological monitoring data obtained by multiple sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate a meteorological raster map from discrete meteorological monitoring data using spatial interpolation. Secondly, an expert-scoring-based analytic hierarchy process is used to compute the power line risk index of each kind of meteorological condition and establish a mathematical model of meteorological risk. Applying this model in the ArcGIS raster calculator yields a raster map of the overall meteorological risk for power lines. Finally, overlaying the power line buffer layer on that raster map gives the exact risk index around a given part of the power line, which provides significant guidance for power line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province, China (wind, lightning, rain, ice, and temperature), we carry out the meteorological risk analysis for real power lines, and the experimental results prove the feasibility and validity of the proposed method.
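    The AHP-weighted overlay described above can be sketched as a weighted sum of per-factor rasters; the weights and raster values below are invented, not the paper's expert scores.

```python
import numpy as np

# Sketch (invented weights/rasters): AHP-style weights combine per-factor
# meteorological risk rasters into one overall risk index raster.
weights = {"wind": 0.30, "lightning": 0.25, "rain": 0.20, "ice": 0.15, "temp": 0.10}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # AHP weights are normalized

rasters = {  # per-factor risk scores on a tiny 2x2 grid, scaled 0-1
    "wind":      np.array([[0.7, 0.2], [0.4, 0.9]]),
    "lightning": np.array([[0.1, 0.8], [0.3, 0.5]]),
    "rain":      np.array([[0.6, 0.6], [0.2, 0.4]]),
    "ice":       np.array([[0.0, 0.1], [0.9, 0.2]]),
    "temp":      np.array([[0.5, 0.5], [0.5, 0.5]]),
}

overall = sum(w * rasters[k] for k, w in weights.items())
print(overall)  # overall meteorological risk index per grid cell
```

    In the paper's workflow, the power line buffer layer would then be overlaid on this combined raster to read off the risk along each line segment.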

  10. Methods and Techniques for Risk Prediction of Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Hoffman, Chad R.; Pugh, Rich; Safie, Fayssal

    1998-01-01

    Since the Space Shuttle Accident in 1986, NASA has been trying to incorporate probabilistic risk assessment (PRA) in decisions concerning the Space Shuttle and other NASA projects. One major study NASA is currently conducting in the PRA area is establishing an overall risk model for the Space Shuttle System. The model is intended to provide a tool to predict the Shuttle risk and to perform sensitivity analyses and trade studies, including evaluation of upgrades. Marshall Space Flight Center (MSFC) and its prime contractors, including Pratt and Whitney (P&W), are part of the NASA team conducting the PRA study. MSFC's responsibility involves modeling the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). A major challenge that faced the PRA team was modeling the Shuttle upgrades, mainly the P&W High Pressure Fuel Turbopump (HPFTP) and the High Pressure Oxidizer Turbopump (HPOTP). The purpose of this paper is to discuss the various methods and techniques used for predicting the risk of the P&W redesigned HPFTP and HPOTP.

  11. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    SciTech Connect

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16--18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  12. Methods to evaluate the nutrition risk in hospitalized patients

    PubMed Central

    Erkan, Tülay

    2014-01-01

    The rate of malnutrition is substantially high both in the general population and in patients hospitalized for various chronic conditions. The rate of patients with no marked malnutrition at the time of hospitalization who develop malnutrition during hospitalization is also substantially high. Therefore, different screening methods with different targets currently exist to prevent malnutrition and to avoid overlooking it. These methods should be simple and reliable and should not be time-consuming, so that they can be used in daily practice. Seven nutrition risk screening methods for children have been established to date; however, no consensus has been reached on any one method, as in adults. Assessment of nutrition should be accepted as part of the normal examination, to increase awareness of this issue and to draw attention to it. PMID:26078678

  13. Cleanup standards and pathways analysis methods

    SciTech Connect

    Devgun, J.S.

    1993-09-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect the public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.

  14. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260..., based on Applicant's: (A) Industry outlook; (B) Market position; (C) Management and financial...

  15. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260..., based on Applicant's: (A) Industry outlook; (B) Market position; (C) Management and financial...

  16. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent a significant step forward in the assurance of reactor safety but an imperfect measure of actual reactor risk. (BT)

  17. Risk analysis for plant-made vaccines.

    PubMed

    Kirk, Dwayne D; McIntosh, Kim; Walmsley, Amanda M; Peterson, Robert K D

    2005-08-01

    The production of vaccines in transgenic plants was first proposed in 1990; however, no product has yet reached commercialization. There are several risks during the production and delivery stages of this technology, with potential impact on the environment and on human health. Risks to the environment include gene transfer and exposure to antigens or selectable marker proteins. Risks to human health include oral tolerance, allergenicity, inconsistent dosage, worker exposure, and unintended exposure to antigens or selectable marker proteins in the food chain. These risks are controllable through appropriate regulatory measures at all stages of production and distribution of a potential plant-made vaccine. Successful use of this technology is highly dependent on stewardship and active risk management by the developers of this technology, and on quality standards for production, which will be set by regulatory agencies. Regulatory agencies can also negatively affect the future viability of this technology by requiring that all risks be controlled, or by applying conventional regulations which are overly cumbersome for a plant production and oral delivery system. The value of new or replacement vaccines produced in plant cells and delivered orally must be considered alongside the probability and severity of potential risks in their production and use, and the cost of not deploying this technology--the risk of continuing with the status quo alternative.

  18. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the

  19. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposure at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites were evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.
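    Under a linear scaling of the abstract's own figure (ten per 1000 at the 1 ppm PEL), risk at other exposure levels can be sketched as follows; this is an arithmetic illustration, not a dose-response model.

```python
# Sketch using only the abstract's figures: if lifetime excess risk at the
# 1 ppm PEL is 10 per 1000, a simple linear scaling gives the risk at other
# exposure levels (illustrative only).
RISK_PER_PPM = 10 / 1000  # lifetime excess cancer risk per ppm (from abstract)

for ppm in (0.1, 1.0, 4.0):
    risk = RISK_PER_PPM * ppm
    print(f"{ppm:>4} ppm -> {risk * 1000:.0f} per 1000 workers")
```

    The 4 ppm case reproduces the upper end of the 10 to 40 per 1000 range the abstract reports for LUST sites.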

  20. Proposal of a management method of rockfall risk induced on a road

    NASA Astrophysics Data System (ADS)

    Mignelli, C.; Peila, D.; Lo Russo, S.

    2012-04-01

    Many kilometers of roads are flanked by rock slopes prone to rockfall. The analysis of risks associated with these instabilities is a complex operation requiring precise assessment of the hazard, the vulnerability, and therefore the risk to vehicles on roads along the foothills. Engineering design of protection devices should aim to minimize risk while taking advantage of the most advanced technologies, and decision makers should be equipped with technical tools permitting them to choose the best solution within the context of local maximum acceptable risk levels. Fulfilling safety requirements for mountainside routes in many cases involves implementing protective measures and devices to control and manage rockfall, and evaluating the positive effects of such measures in terms of risk reduction is of key importance. A risk analysis management procedure for roads subject to rockfall phenomena, using a specifically developed method named Rockfall risk Management (RO.MA.), is presented and discussed. The method is based on statistical tools, using as input both data from in situ surveys and historical data. It is important to highlight that historical databases are often not available, and there is usually a lack of useful information owing to incomplete recording of parameters, so an analysis based only on historical data can be difficult to develop. For this purpose, a specific database collection system has been developed to provide a geotechnical and geomechanical description of the studied rock slope. These parameters, together with the data collected from historical databases, define the input parameters of the RO.MA. method. Moreover, to allow quantification of the harm, data from monitoring of the road by the road manager are required. The value of harm is proportional to the number of persons on the road (i.e. people in a vehicle) and the following traffic characteristics: type of vehicles (i.e. bicycles

  1. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    SciTech Connect

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs.

  2. Extended risk-analysis model for activities of the project.

    PubMed

    Kušar, Janez; Rihar, Lidija; Zargi, Urban; Starbek, Marko

    2013-12-01

    Project management of product/service orders has become a mode of operation in many companies. Although these are mostly cyclically recurring projects, risk management is very important for them. An extended risk-analysis model for new product/service projects is presented in this paper, with emphasis on a solution developed at the Faculty of Mechanical Engineering in Ljubljana, Slovenia. The usual risk analysis of project activities is based on evaluating the probability that risk events occur and evaluating their consequences. A third parameter has been added in our model: an estimate of the incidence of risk events. On the basis of the calculated activity risk level, a project team prepares preventive and corrective measures to be taken according to the status indicators. An important advantage of the proposed solution is that the project manager and team members are warned of risk events in time and can thus activate the envisaged preventive and corrective measures as necessary.
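    The three-parameter activity risk level can be sketched as follows; the 1-5 scales, the product formula, and the status thresholds are assumptions for illustration, not the model's published values.

```python
# Sketch (hypothetical 1-5 scales): activity risk level as the product of
# probability, consequence, and the added third parameter, incidence of the
# risk event, with simple status-indicator thresholds.
def risk_level(probability, consequence, incidence):
    return probability * consequence * incidence

def status(level, caution=20, critical=50):
    if level >= critical:
        return "red: corrective measures"
    if level >= caution:
        return "yellow: preventive measures"
    return "green: monitor"

level = risk_level(probability=4, consequence=3, incidence=2)
print(level, "->", status(level))
```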

  3. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  4. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing if a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome. 
For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or
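The substitute scales described above (Control Value in place of consequence, Failure Likelihood in place of outcome likelihood) can be sketched roughly as follows; the 0-1 encodings and the scoring functions are assumptions for illustration and do not reproduce the paper's actual scales.

```python
# Illustrative sketch of the proposed substitute scales.

def control_value(inevitability, opportunity):
    """Higher when the consequence is near-certain without the control,
    yet there is still ample opportunity to actively prevent it."""
    assert 0.0 <= inevitability <= 1.0 and 0.0 <= opportunity <= 1.0
    return inevitability * opportunity

def procedural_risk(value, failure_likelihood):
    """Residual risk: the control's value weighted by how often the
    rule is violated (any non-malicious violation counts as failure)."""
    return value * failure_likelihood

v = control_value(inevitability=0.9, opportunity=0.8)
print(round(procedural_risk(v, failure_likelihood=0.1), 3))
```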

  5. An analysis of the new EPA risk management rule

    SciTech Connect

    Loran, B.; Nand, K.; Male, M.

    1997-08-01

    Due to increasing public concern about risks from handling highly hazardous chemicals at various facilities, a number of state and federal regulatory agencies, such as the Occupational Safety and Health Administration (OSHA) and recently the US Environmental Protection Agency (EPA), have enacted regulations requiring these facilities to perform accidental risk analysis and develop process safety and risk management programs. The regulatory requirements to be fulfilled are described; the major components involved are a Process Hazard Analysis, a Consequence Analysis, and a Management Program. The performance of these analyses and the development of a management program for 21 facilities operated by the City of Los Angeles Department of Water and Power, which treat drinking water supplies with chlorine, are discussed. The effectiveness of the EPA risk management rule in achieving risk reduction is critically analyzed; it is found that, while the rule increases worker and public awareness of the inherent risks present, some of the analytical results obtained may have limited practical application.

  6. Stratospheric Aerosol and Gas Experiment, SAGE III on ISS, An Earth Science Mission on the International Space Station, Schedule Risk Analysis, A Project Perspective

    NASA Technical Reports Server (NTRS)

    Bonine, Lauren

    2015-01-01

    The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.

  7. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt(1), T.R. Dean(1), A. Evans(2), and G. Byfield(2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711; (2) RTI International, RTP, NC. Several...

  8. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis, since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data illustrate the new methodology in real data analysis.
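The weighted-SVM idea underlying this framework can be sketched with a toy example; in the survival setting the per-sample weights would typically come from the censoring distribution (e.g. inverse-probability-of-censoring weights). The tiny subgradient solver, the toy data, and the uniform weights below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def weighted_linear_svm(X, y, w, lam=0.01, lr=0.1, epochs=200):
    """y in {-1,+1}; w are per-sample weights; returns (coef, bias)."""
    rng = np.random.default_rng(0)
    beta, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            margin = y[i] * (X[i] @ beta + b)
            if margin < 1:  # weighted hinge-loss subgradient step
                beta += lr * (w[i] * y[i] * X[i] - lam * beta)
                b += lr * w[i] * y[i]
            else:           # only the regularizer acts on safe points
                beta -= lr * lam * beta
    return beta, b

# Toy separable data: class +1 around (2, 2), class -1 around (-2, -2).
X = np.array([[2.0, 2.1], [1.8, 2.4], [-2.0, -1.9], [-2.2, -2.1]])
y = np.array([1, 1, -1, -1])
w = np.ones(len(y))  # uniform weights for the demo
beta, b = weighted_linear_svm(X, y, w)
print(np.sign(X @ beta + b))
```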

  9. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  10. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data of loads, system response, and consequences, using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 % to 107 % of the current value, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis that can be replicated at regional and national scales.
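The event-tree aggregation behind metrics like "expected annual affected population" reduces to a probability-weighted sum over flood scenarios. The scenario probabilities and consequences below are invented for illustration, not the Oliva case-study numbers.

```python
# Each event-tree branch pairs an annual probability of a flood scenario
# (load x system response) with its consequence; risk is the
# probability-weighted sum over all branches.

scenarios = [
    # (annual probability, affected population)
    (1 / 10,   50),    # frequent, minor flooding
    (1 / 100,  800),   # river overtops defences
    (1 / 500,  4000),  # defence breach during an extreme event
]

expected_annual_affected = sum(p * n for p, n in scenarios)
print(expected_annual_affected)  # 5 + 8 + 8 = 21 people/year
```

A risk-reduction measure is then evaluated by re-running the sum with the modified branch probabilities or consequences and comparing totals.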

  11. Human-centered risk management for medical devices - new methods and tools.

    PubMed

    Janß, Armin; Plogmann, Simon; Radermacher, Klaus

    2016-04-01

    Studies regarding adverse events with technical devices in the medical context showed that in most cases non-usable interfaces are the cause of use deficiencies and therefore a potential harm to the patient and third parties. This is partially due to the lack of suitable methods for interlinking usability engineering and human-centered risk management. Especially regarding the early identification of human-induced errors and the systematic control of these failures, medical device manufacturers and in particular the developers have to be supported in order to guarantee reliable design and error-tolerant human-machine interfaces (HMI). In this context, we developed the HiFEM methodology and a corresponding software tool (mAIXuse) for model-based human risk analysis. Based on a two-fold approach, HiFEM provides a task-type-sensitive modeling structure with integrated temporal relations in order to represent and analyze the use process in a detailed way. The approach can be used from early developmental stages up to the validation process. Results of a comparative study with the HiFEM method and a classical process-failure mode and effect analysis (FMEA) show that the new modeling and analysis technique clearly outperforms the FMEA. Besides, we implemented a new method for systematic human risk control (mAIXcontrol). Accessing information from the method's knowledge base enables the operator to detect the most suitable countermeasures for a respective risk. Forty-one approved generic countermeasure principles have been indexed as a resulting combination of root causes and failures in a matrix. The methodology has been tested in comparison to a conventional approach as well. Evaluation of the matrix and the reassessment of the risk priority numbers by a blind expert demonstrate a substantial benefit of the new mAIXcontrol method.

  12. DMAICR in an ergonomic risks analysis.

    PubMed

    Santos, E F; Lima, C R C

    2012-01-01

    The DMAICR problem-solving methodology is used throughout this paper to show how to implement ergonomics recommendations. The DMAICR method consists of six steps by which ergonomic design problems can be solved: In step D, the project or situation to be assessed and its guiding objectives, known as the demand, are defined. Step M relates to the work, tasks, and organizational protocols, and includes the need for measuring. Step A covers the analysis itself. Step I is the moment of improving or implementing. In step C, control, prevention of prospective troublesome situations, and implementation of management keep the situation under control. Step R is the report. Some relevant technical and conceptual aspects for the comparison of these methodologies are illustrated in this paper. The steps of DMAICR were taken by a multifunctional (multi-professional and multi-disciplinary) team, termed the focus group, composed of selected members of the company and supported by experts in ergonomics.

  13. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e. risk-benefit analysis) we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate ratio of thrombosis to bleeding. Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but final results were influenced by type of surgery, since ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows comparison of multiple competing interventions in the absence of randomized trials with multiple arms, by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
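The simulation step described in the Methods can be sketched for two hypothetical drugs; the event counts, the two-drug comparison, and the acceptability threshold below are invented for illustration and are not the meta-analysis data.

```python
import numpy as np

# Beta posteriors for risk (major bleeding) and benefit (thrombosis)
# proportions, sampled by Monte Carlo and combined into incremental
# quantities and a net clinical benefit at a chosen threshold.

rng = np.random.default_rng(42)
n_sims = 100_000

# Beta(events + 1, non-events + 1) for two hypothetical drugs
bleed_a = rng.beta(12 + 1, 988 + 1, n_sims)   # drug A: 12/1000 bleeds
bleed_b = rng.beta(20 + 1, 980 + 1, n_sims)   # drug B: 20/1000 bleeds
throm_a = rng.beta(30 + 1, 970 + 1, n_sims)   # drug A: 30/1000 thromboses
throm_b = rng.beta(55 + 1, 945 + 1, n_sims)   # drug B: 55/1000 thromboses

incremental_risk = bleed_a - bleed_b          # extra bleeds with A
incremental_benefit = throm_b - throm_a       # thromboses averted by A

# Acceptability threshold: bleeds one is willing to accept per
# thrombosis averted (reference range would come from case fatality).
threshold = 2.0
net_benefit = incremental_benefit - threshold * incremental_risk
prob_a_preferred = float((net_benefit > 0).mean())
print(round(prob_a_preferred, 3))
```

Repeating the calculation across a grid of thresholds traces out the acceptability curve used to pick the option with the best risk-benefit profile.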

  14. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  15. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence-based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitoring remedy effectiveness, and evaluating risk reduction after sediment treatment, disposal, or beneficial reuse. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  16. Obesity and Risk of Thyroid Cancer: Evidence from a Meta-Analysis of 21 Observational Studies

    PubMed Central

    Ma, Jie; Huang, Min; Wang, Li; Ye, Wei; Tong, Yan; Wang, Hanmin

    2015-01-01

    Background Several studies have evaluated the association between obesity and thyroid cancer risk. However, the results remain uncertain. In this study, we conducted a meta-analysis to assess the association between obesity and thyroid cancer risk. Material/Methods Published literature from PubMed, EMBASE, Springer Link, Ovid, Chinese Wanfang Data Knowledge Service Platform, Chinese National Knowledge Infrastructure (CNKI), and Chinese Biology Medicine (CBM) were retrieved before 10 August 2014. We included all studies that reported adjusted risk ratios (RRs), hazard ratios (HRs) or odds ratios (ORs), and 95% confidence intervals (CIs) of thyroid cancer risk. Results Thirty-two studies (n=12 620 676) were included in this meta-analysis. Obesity was associated with a significantly increased risk of thyroid cancer (adjusted RR=1.33; 95% CI, 1.24–1.42; I2=25%). In the subgroup analysis by study type, increased risk of thyroid cancer was found in cohort studies and case-control studies. In subgroup analysis by sex, both obese men and women were at significantly greater risk of thyroid cancer than non-obese subjects. When stratified by ethnicity, significantly elevated risk was observed in Caucasians and in Asians. In the age subgroup analysis, both young and old populations showed increased thyroid cancer risk. Subgroup analysis on smoking status showed that increased thyroid cancer risks were found in smokers and in non-smokers. In the histology subgroup analyses, increased risks of papillary thyroid cancer, follicular thyroid cancer, and anaplastic thyroid cancer were observed. However, obesity was associated with decreased risk of medullary thyroid cancer. Conclusions Our results indicate that obesity is associated with an increased thyroid cancer risk, except medullary thyroid cancer. PMID:25612155
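The pooling step behind an adjusted summary RR like the one reported above can be sketched with fixed-effect inverse-variance weighting; the three (RR, 95% CI) tuples below are invented study results, not data from the included studies.

```python
import math

# Fixed-effect meta-analysis: combine study log relative risks with
# inverse-variance weights derived from their confidence intervals.

studies = [(1.25, 1.05, 1.49), (1.40, 1.10, 1.78), (1.30, 1.15, 1.47)]

num = den = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
    w = 1 / se**2                                    # inverse-variance weight
    num += w * log_rr
    den += w

pooled_log = num / den
half_width = 1.96 / math.sqrt(den)
pooled_rr = math.exp(pooled_log)
print(round(pooled_rr, 2),
      round(math.exp(pooled_log - half_width), 2),
      round(math.exp(pooled_log + half_width), 2))
```

A random-effects model (as suggested by the reported I² statistic) would additionally widen the weights by a between-study variance term.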

  17. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy

    2016-04-01

    This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damages, using a physical model of solid-state lasers and the predictable level of ionizing radiation and space weather. The following main subjects are covered: (a) a solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented with consideration of some deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. This method is based on the Conditional Value-at-Risk measure (CVaR) and the expected loss exceeding Value-at-Risk (VaR). We propose to use a new dynamical-information approach for radiation damage risk assessment of laser elements by cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of space ionizing radiation influence on laser elements were developed for risk assessment in laser safety analysis. These are so-called `black box' or `input-output' models, which seek only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on laser systems and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of
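The two risk measures named above are straightforward to estimate from simulated losses: VaR is the loss quantile at a confidence level, and CVaR is the expected loss given that VaR is exceeded. The lognormal loss sample below stands in for simulated radiation-damage losses and is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

alpha = 0.95
var = np.quantile(losses, alpha)        # 95% Value-at-Risk
cvar = losses[losses >= var].mean()     # expected loss beyond VaR

print(var < cvar)  # CVaR always dominates VaR at the same level
```

Because CVaR averages over the whole tail rather than reading off a single quantile, it is the more conservative of the two measures for risk minimization.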

  18. Pesticide residues in cashew apple, guava, kaki and peach: GC-μECD, GC-FPD and LC-MS/MS multiresidue method validation, analysis and cumulative acute risk assessment.

    PubMed

    Jardim, Andréia Nunes Oliveira; Mello, Denise Carvalho; Goes, Fernanda Caroline Silva; Frota Junior, Elcio Ferreira; Caldas, Eloisa Dutra

    2014-12-01

    A multiresidue method for the determination of 46 pesticides in fruits was validated. Samples were extracted with acidified ethyl acetate, MgSO4 and CH3COONa and cleaned up by dispersive SPE with PSA. The compounds were analysed by GC-FPD, GC-μECD or LC-MS/MS, with LOQs from 1 to 8 μg/kg. The method was used to analyse 238 kaki, cashew apple, guava, and peach fruit and pulp samples, which were also analysed for dithiocarbamates (DTCs) using a spectrophotometric method. Over 70% of the samples were positive, with DTC present in 46.5%, λ-cyhalothrin in 37.1%, and omethoate in 21.8% of the positive samples. GC-MS/MS confirmed the identities of the compounds detected by GC. None of the pesticides found in kaki, cashew apple and guava was authorised for these crops in Brazil. The risk assessment has shown that the cumulative acute intake of organophosphorus or pyrethroid compounds from the consumption of these fruits is unlikely to pose a health risk to consumers.

  19. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
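To make the contrast concrete, the sketch below shows a conventional FMEA risk priority number next to a grey relational coefficient, which ranks failure modes by closeness to an ideal (lowest-risk) reference series instead of multiplying raw scores. The failure modes, ratings, and the simplified grey formula are invented examples, not the authors' fuzzy/grey model.

```python
failure_modes = {
    # name: (severity, occurrence, detectability), each rated 1-10
    "wrong dose displayed": (9, 3, 4),
    "cable disconnects":    (6, 5, 2),
    "button mislabeled":    (4, 7, 6),
}

def rpn(s, o, d):
    """Conventional FMEA risk priority number."""
    return s * o * d

def grey_coefficients(ratings, rho=0.5):
    """Grey relational coefficients vs the ideal reference (1, 1, 1)."""
    deltas = [abs(r - 1) for r in ratings]  # distance to the ideal
    d_max = 9                               # max distance on a 1-10 scale
    return [(rho * d_max) / (d + rho * d_max) for d in deltas]

for name, (s, o, d) in failure_modes.items():
    grade = sum(grey_coefficients((s, o, d))) / 3  # grey relational grade
    print(f"{name}: RPN={rpn(s, o, d)}, grey grade={grade:.2f}")
```

A grade near 1 means the mode sits close to the ideal low-risk reference; unlike the RPN product, the grade is bounded and less sensitive to a single extreme rating.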

  20. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  1. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc., are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the conditio sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
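One of the valuation methods named above, the Black-Scholes formula for a European call, fits in a few lines; the spot, strike, rate, volatility, and maturity inputs are standard textbook numbers chosen for illustration.

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(spot, strike, rate, vol, t):
    """Black-Scholes price of a European call option."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * t) / (vol * sqrt(t))
    d2 = d1 - vol * sqrt(t)
    return spot * norm_cdf(d1) - strike * exp(-rate * t) * norm_cdf(d2)

price = bs_call(spot=100, strike=100, rate=0.05, vol=0.2, t=1.0)
print(round(price, 2))  # ~10.45 for these inputs
```

The same pricing function also drives risk management: bumping an input and repricing yields the "Greek" sensitivities mentioned in the talk.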

  2. Metabolic Disease Risk in Children by Salivary Biomarker Analysis

    PubMed Central

    Goodson, J. Max; Kantarci, Alpdogan; Hartman, Mor-Li; Denis, Gerald V.; Stephens, Danielle; Hasturk, Hatice; Yaskell, Tina; Vargas, Jorel; Wang, Xiaoshan; Cugini, Maryann; Barake, Roula; Alsmadi, Osama; Al-Mutawa, Sabiha; Ariga, Jitendra; Soparkar, Pramod; Behbehani, Jawad; Behbehani, Kazem; Welty, Francine

    2014-01-01

    Objective The study of obesity-related metabolic syndrome or Type 2 diabetes (T2D) in children is particularly difficult because of fear of needles. We tested a non-invasive approach to study inflammatory parameters in an at-risk population of children to provide proof-of-principle for future investigations of vulnerable subjects. Design and Methods We evaluated metabolic differences in 744 11-year-old children selected from underweight, normal healthy weight, overweight and obese categories by analyzing fasting saliva samples for 20 biomarkers. Saliva supernatants were obtained following centrifugation and used for analyses. Results Salivary C-reactive protein (CRP) was 6 times higher, salivary insulin and leptin were 3 times higher, and adiponectin was 30% lower in obese children compared to healthy normal weight children (all P<0.0001). Categorical analysis suggested that there might be three types of obesity in children. Distinctly inflammatory characteristics appeared in 76% of obese children, while in 13%, salivary insulin was high but not associated with inflammatory mediators. The remaining 11% of obese children had high insulin and reduced adiponectin. Forty percent of the non-obese children were found in groups which, based on biomarker characteristics, may be at risk for becoming obese. Conclusions Significantly altered levels of salivary biomarkers in obese children from a high-risk population suggest the potential for developing non-invasive screening procedures to identify T2D-vulnerable individuals and a means to test preventative strategies. PMID:24915044

  3. Application of advanced reliability methods to local strain fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, T. T.; Wirsching, P. H.

    1983-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) might become extremely difficult or very inefficient. This study suggests using a simple, and easily constructed, second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
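The contrast drawn above can be illustrated on a toy limit state: a fast probability integration uses the reliability index at the design point (the idea behind the Rackwitz-Fiessler algorithm), while crude Monte Carlo samples the limit state many times. The simple limit state g(x) = 3 - x for standard normal x is an invented stand-in for the local-strain computer analysis.

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# FORM-style estimate: for g(x) = 3 - x the design point is x = 3, so
# the reliability index is beta = 3 and pf = Phi(-beta).
pf_form = norm_cdf(-3.0)

# Monte Carlo estimate of P(g(X) <= 0) for standard normal X.
rng = np.random.default_rng(0)
x = rng.standard_normal(2_000_000)
pf_mc = float(np.mean(3.0 - x <= 0.0))

print(f"FORM: {pf_form:.5f}  Monte Carlo: {pf_mc:.5f}")
```

For a real problem the limit state is an expensive computer run, which is why the paper approximates it near the design point with a second-degree polynomial before applying fast probability integration.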

  4. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements.

    PubMed

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-02-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix.

  5. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements

    PubMed Central

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-01-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix. PMID:28157149

  6. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to their effects on the risk level, the economic aspects of risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction.
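The combination of fault tree modelling with cost-effectiveness analysis described in this record can be sketched in a few lines. The gate logic is standard, but the event names, probabilities, and cost below are invented for illustration, not taken from the paper:

```python
# Sketch: probabilistic fault tree (independent basic events) combined with
# cost-effectiveness analysis. All event names, probabilities, and costs are
# invented placeholders.

def or_gate(*probs):
    # P(at least one of several independent events)
    p = 1.0
    for x in probs:
        p *= 1.0 - x
    return 1.0 - p

def and_gate(*probs):
    # P(all of several independent events)
    p = 1.0
    for x in probs:
        p *= x
    return p

def p_failure(p_source, p_barrier_a, p_barrier_b):
    # Water safety target missed if the source fails OR both barriers fail
    return or_gate(p_source, and_gate(p_barrier_a, p_barrier_b))

baseline = p_failure(0.01, 0.05, 0.10)
# Candidate measure: halve the source failure probability at annual cost 100
improved = p_failure(0.005, 0.05, 0.10)
cer = 100.0 / (baseline - improved)  # cost per unit of risk reduced
```

The cost-effectiveness ratio expresses each alternative in a common unit (cost per unit of risk reduced), mirroring step (3) of the approach.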

  7. Maternal migration and autism risk: systematic analysis.

    PubMed

    Crafa, Daina; Warfa, Nasir

    2015-02-01

    Autism (AUT) is one of the most prevalent developmental disorders emerging during childhood, and can be amongst the most incapacitating mental disorders. Some individuals with AUT require a lifetime of supervised care. Autism Speaks reported estimated costs for 2012 at £34 billion in the UK, and $3.2 million-$126 billion in the US, Australia and Canada. Ethnicity and migration experiences appear to increase risks of AUT and relate to underlying biological risk factors. Sociobiological stress factors can affect the uterine environment, or relate to stress-induced epigenetic changes during pregnancy and delivery. Epigenetic risk factors associated with AUT also include poor pregnancy conditions, low birth weight, and congenital malformation. Recent studies report that children from migrant communities are at higher risk of AUT than children born to non-migrant mothers, with the exception of Hispanic children. This paper provides the first systematic review into the prevalence and predictors of AUT, with a particular focus on maternal migration stressors and epigenetic risk factors. AUT rates appear higher in certain migrant communities, potentially relating to epigenetic changes after stressful experiences. Although AUT remains a rare disorder, failures to recognize its public health urgency and local community needs continue to leave certain cultural groups at a disadvantage.

  8. Food adulteration: Sources, health risks, and detection methods.

    PubMed

    Bansal, Sangita; Singh, Apoorva; Mangal, Manisha; Mangal, Anupam K; Kumar, Sanjiv

    2017-04-13

    Adulteration in food has been a concern since the beginning of civilization, as it not only decreases the quality of food products but also results in a number of ill effects on health. Authenticity testing of food and detection of adulterants in various food products are required for value assessment and to assure consumer protection against fraudulent activities. Through this review we intend to compile the different types of adulteration found in different food items, the health risks posed by these adulterants, and the detection methods available for them. Concerns about food safety and regulation have ensured the development of various techniques, including physical, biochemical/immunological, and molecular techniques, for adulterant detection in food. Molecular methods are preferable for the detection of biological adulterants in food, whereas physical and biochemical techniques are preferable for the detection of other adulterants.

  9. Successful risk assessment may not always lead to successful risk control: A systematic literature review of risk control after root cause analysis.

    PubMed

    Card, Alan J; Ward, James; Clarkson, P John

    2012-01-01

    Root cause analysis is perhaps the most widely used tool in healthcare risk management, but does it actually lead to successful risk control? Are there categories of risk control that are more likely to be effective? And do healthcare risk managers have the tools they need to support the risk control process? This systematic review examines how the healthcare sector translates risk analysis to risk control action plans and examines how to do better. It suggests that the hierarchy of risk controls should inform risk control action planning and that new tools should be developed to improve the risk control process.

  10. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    USGS Publications Warehouse

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against

  11. Analysis of Affordance, Time, and Adaptation in the Assessment of Industrial Control System Cybersecurity Risk.

    PubMed

    Busby, J S; Green, B; Hutchison, D

    2017-01-17

    Industrial control systems increasingly use standard communication protocols and are increasingly connected to public networks, creating substantial cybersecurity risks, especially when used in critical infrastructures such as electricity and water distribution systems. Methods of assessing risk in such systems have recognized for some time the way in which the strategies of potential adversaries and risk managers interact in defining the risk to which such systems are exposed. But it is also important to consider the adaptations of the systems' operators and other legitimate users to risk controls, adaptations that often appear to undermine these controls, or shift the risk from one part of a system to another. Unlike the case with adversarial risk analysis, the adaptations of system users are typically orthogonal to the objective of minimizing or maximizing risk in the system. We argue that this need to analyze potential adaptations to risk controls is true for risk problems more generally, and we develop a framework for incorporating such adaptations into an assessment process. The method is based on the principle of affordances, and we show how this can be incorporated in an iterative procedure based on raising the minimum period of risk materialization above some threshold. We apply the method in a case study of a small European utility provider and discuss the observations arising from this.

  12. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  13. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  14. Asbestos Workshop: Sampling, Analysis, and Risk Assessment

    DTIC Science & Technology

    2012-03-01

    [Garbled slide extract; original layout lost.] Recoverable content includes: pleural fibrosis (fibrosis of the lining of the cavity holding the lungs); a chest x-ray showing areas of scarring related to asbestosis; a Poisson counting model for sampled asbestos structures (if the expected number of asbestos structures in a sample is λ, the probability of observing exactly x fibers is Poisson-distributed); and an inhalation risk formula: Risk = Exposure × Toxicity = [Air] × ET × EF × IUR, in units of f/cm³ × hour/hour × day/day × (f/cm³)⁻¹.
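The slide's risk formula, Risk = [Air] × ET × EF × IUR, is simple enough to evaluate directly. The values below are invented placeholders for illustration, not authoritative exposure or toxicity figures:

```python
# Sketch of the slide's inhalation risk formula with invented placeholder
# values; the IUR here is illustrative, not an authoritative toxicity value.
air_conc = 0.001       # [Air]: asbestos fiber concentration, f/cm^3
et = 8.0 / 24.0        # ET: exposure time fraction, hours/hour
ef = 250.0 / 365.0     # EF: exposure frequency fraction, days/day
iur = 0.23             # IUR: inhalation unit risk, (f/cm^3)^-1
risk = air_conc * et * ef * iur   # dimensionless lifetime excess risk
```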

  15. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design has been announced of an expert-system computer program, and the development of a prototype of the program, intended for use as a project-management tool. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  16. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating it are provided. The system analyses the health of physical systems and is applicable to mechanical, electrical, chemical, and optical systems.

  17. A review of risk analysis and helicopter air ambulance accidents.

    PubMed

    Nix, Sam; Buckner, Steven; Cercone, Richard

    2014-01-01

    The Federal Aviation Administration announced a final rule in February 2014 that includes a requirement for helicopter air ambulance operators to institute preflight risk analysis programs. This qualitative study examined risk factors that were described in 22 preliminary, factual, and probable cause helicopter air ambulance accident and incident reports that were initiated by the National Transportation Safety Board between January 1, 2011, and December 31, 2013. Insights into the effectiveness of existing preflight risk analysis strategies were gained by comparing these risk factors with the preflight risk analysis guidance that is published by the Federal Aviation Administration in the Flight Standards Information Management System. When appropriate, a deeper understanding of the human factors that may have contributed to occurrences was gained through methodologies that are described in the Human Factors Analysis and Classification System. The results of this study suggest that there are some vulnerabilities in existing preflight risk analysis guidelines that may affect safety in the helicopter air ambulance industry. The likelihood that human factors contributed to most of the helicopter air ambulance accidents and incidents that occurred during the study period was also evidenced. The results of this study suggest that effective risk analysis programs should provide pilots with both preflight and in-flight resources.

  18. Cardiometabolic risk in Canada: a detailed analysis and position paper by the cardiometabolic risk working group.

    PubMed

    Leiter, Lawrence A; Fitchett, David H; Gilbert, Richard E; Gupta, Milan; Mancini, G B John; McFarlane, Philip A; Ross, Robert; Teoh, Hwee; Verma, Subodh; Anand, Sonia; Camelon, Kathryn; Chow, Chi-Ming; Cox, Jafna L; Després, Jean-Pierre; Genest, Jacques; Harris, Stewart B; Lau, David C W; Lewanczuk, Richard; Liu, Peter P; Lonn, Eva M; McPherson, Ruth; Poirier, Paul; Qaadri, Shafiq; Rabasa-Lhoret, Rémi; Rabkin, Simon W; Sharma, Arya M; Steele, Andrew W; Stone, James A; Tardif, Jean-Claude; Tobe, Sheldon; Ur, Ehud

    2011-01-01

    The concepts of "cardiometabolic risk," "metabolic syndrome," and "risk stratification" overlap and relate to the atherogenic process and development of type 2 diabetes. There is confusion about what these terms mean and how they can best be used to improve our understanding of cardiovascular disease treatment and prevention. With the objectives of clarifying these concepts and presenting practical strategies to identify and reduce cardiovascular risk in multiethnic patient populations, the Cardiometabolic Working Group reviewed the evidence related to emerging cardiovascular risk factors and Canadian guideline recommendations in order to present a detailed analysis and consolidated approach to the identification and management of cardiometabolic risk. The concepts related to cardiometabolic risk, pathophysiology, and strategies for identification and management (including health behaviours, pharmacotherapy, and surgery) in the multiethnic Canadian population are presented. "Global cardiometabolic risk" is proposed as an umbrella term for a comprehensive list of existing and emerging factors that predict cardiovascular disease and/or type 2 diabetes. Health behaviour interventions (weight loss, physical activity, diet, smoking cessation) in people identified at high cardiometabolic risk are of critical importance given the emerging crisis of obesity and the consequent epidemic of type 2 diabetes. Vascular protective measures (health behaviours for all patients and pharmacotherapy in appropriate patients) are essential to reduce cardiometabolic risk, and there is growing consensus that a multidisciplinary approach is needed to adequately address cardiometabolic risk factors. Health care professionals must also consider risk factors related to ethnicity in order to appropriately evaluate everyone in their diverse patient populations.

  19. Metabolic syndrome risk factors and dry eye syndrome: a Meta-analysis

    PubMed Central

    Tang, Ye-Lei; Cheng, Ya-Lan; Ren, Yu-Ping; Yu, Xiao-Ning; Shentu, Xing-Chao

    2016-01-01

    AIM To explore the relationship between metabolic risk factors and dry eye syndrome (DES). METHODS Retrieved studies on the association of metabolic syndrome risk factors (hypertension, hyperglycemia, obesity, and hyperlipidemia) and DES were collected from PubMed, Web of Science, and the Cochrane Library in December 2015. Odds ratios (ORs) with 95% confidence intervals (CIs) were pooled to evaluate the final relationship. Subgroup analyses were conducted according to diagnostic criteria of DES. RESULTS Nine cross-sectional studies and three case-control studies were included in this Meta-analysis. The pooled results showed that people with hypertension, hyperglycemia, and hyperlipidemia had a higher risk of suffering from DES (P<0.05), especially for typical DES symptoms. On the other hand, obesity did not increase the risk of DES. CONCLUSION The present Meta-analysis suggests that all metabolic risk factors except obesity were risk factors for DES. PMID:27500114

  20. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that combinations of two convergent linear multistep methods or Runge-Kutta methods produce a new convergent method whose order equals the smaller of the two original methods' orders.
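The stated result, that a combination inherits the smaller of the two orders, can be checked numerically: alternating a first-order method (Euler) with a fourth-order one (classical RK4) on y' = y gives first-order global convergence, so halving the step size roughly halves the error. The test problem and method pairing below are chosen for illustration, not taken from the paper:

```python
# Numerical check (illustrative): alternating Euler (order 1) with classical
# RK4 (order 4) yields a combined method of order min(1, 4) = 1.
import math

def euler_step(f, t, y, h):
    return y + h * f(t, y)

def rk4_step(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def combined_solve(f, t0, y0, t_end, n):
    # Alternate the two convergent methods over n equal steps
    h = (t_end - t0) / n
    t, y = t0, y0
    for i in range(n):
        y = euler_step(f, t, y, h) if i % 2 == 0 else rk4_step(f, t, y, h)
        t += h
    return y

f = lambda t, y: y  # y' = y with y(0) = 1, exact solution e^t
err = lambda n: abs(combined_solve(f, 0.0, 1.0, 1.0, n) - math.e)
ratio = err(100) / err(200)  # close to 2 for a first-order method
```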

  1. Network analysis of wildfire transmission and implications for risk governance

    PubMed Central

    Ager, Alan A.; Evers, Cody R.; Day, Michelle A.; Preisler, Haiganoush K.; Barros, Ana M. G.; Nielsen-Pincus, Max

    2017-01-01

    We characterized wildfire transmission and exposure within a matrix of large land tenures (federal, state, and private) surrounding 56 communities within a 3.3 million ha fire prone region of central Oregon US. Wildfire simulation and network analysis were used to quantify the exchange of fire among land tenures and communities and analyze the relative contributions of human versus natural ignitions to wildfire exposure. Among the land tenures examined, the area burned by incoming fires averaged 57% of the total burned area. Community exposure from incoming fires ignited on surrounding land tenures accounted for 67% of the total area burned. The number of land tenures contributing wildfire to individual communities and surrounding wildland urban interface (WUI) varied from 3 to 20. Community firesheds, i.e. the area where ignitions can spawn fires that can burn into the WUI, covered 40% of the landscape, and were 5.5 times larger than the combined area of the community core and WUI. For the major land tenures within the study area, the amount of incoming versus outgoing fire was relatively constant, with some exceptions. The study provides a multi-scale characterization of wildfire networks within a large, mixed tenure and fire prone landscape, and illustrates the connectivity of risk between communities and the surrounding wildlands. We use the findings to discuss how scale mismatches in local wildfire governance result from disconnected planning systems and disparate fire management objectives among the large landowners (federal, state, private) and local communities. Local and regional risk planning processes can adopt our concepts and methods to better define and map the scale of wildfire risk from large fire events and incorporate wildfire network and connectivity concepts into risk assessments. PMID:28257416
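The incoming/outgoing accounting used in this record can be sketched from a transmission matrix in which entry (i, j) is the area of tenure j burned by fires ignited in tenure i. The tenure names and areas below are invented, not the study's data:

```python
# Sketch: fraction of each tenure's burned area caused by incoming fire,
# from a simulated transmission matrix. burn[i][j] = area in tenure j burned
# by fires ignited in tenure i. Tenure names and areas are invented.
tenures = ["federal", "state", "private"]
burn = [
    [40.0, 15.0, 10.0],  # fires ignited on federal land
    [5.0, 20.0, 8.0],    # fires ignited on state land
    [12.0, 6.0, 30.0],   # fires ignited on private land
]

incoming_fraction = {}
for j, name in enumerate(tenures):
    total = sum(burn[i][j] for i in range(len(tenures)))
    from_outside = total - burn[j][j]     # burned area from other tenures
    incoming_fraction[name] = from_outside / total
```

Here the hypothetical federal tenure receives 17/57, about 30%, of its burned area from incoming fires, the kind of per-tenure figure the study reports.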

  2. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis in the case where a risk assessment includes a separate characterization of variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods, an ANOVA-like model and Sobol sensitivity indices, are used to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment to estimate the risk of contracting listeriosis due to consumption of deli meats.
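A two-dimensional (nested) Monte Carlo of the kind described, with uncertainty in the outer loop and variability in the inner loop, might be sketched as follows. The exponential dose-response form and every parameter value are invented placeholders, not those of the study:

```python
# Sketch of a two-dimensional Monte Carlo: the outer loop samples epistemic
# uncertainty (here, the mean log10 dose), the inner loop samples serving-
# to-serving variability. All distributions and parameters are invented.
import math
import random

def two_dimensional_mc(n_unc=200, n_var=500, seed=1):
    rng = random.Random(seed)
    mean_risks = []
    for _ in range(n_unc):                  # uncertainty loop
        mu = rng.gauss(2.0, 0.3)            # uncertain mean log10 dose
        acc = 0.0
        for _ in range(n_var):              # variability loop
            log_dose = rng.gauss(mu, 0.8)   # variable dose per serving
            dose = 10.0 ** log_dose
            acc += 1.0 - math.exp(-1e-4 * dose)  # P(illness | dose)
        mean_risks.append(acc / n_var)      # risk per serving, given this mu
    mean_risks.sort()
    # 90% uncertainty interval around the per-serving risk
    return mean_risks[int(0.05 * n_unc)], mean_risks[int(0.95 * n_unc)]

low, high = two_dimensional_mc()
```

Keeping the two loops separate is what lets the analyst report an uncertainty band around a variability-averaged risk, rather than a single mixed distribution.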

  3. Risk-benefit analysis: from a logical point of view.

    PubMed

    Spielthenner, Georg

    2012-06-01

    In this paper I am concerned with risk-benefit analysis; that is, the comparison of the risks of a situation to its related benefits. We all face such situations in our daily lives and they are very common in medicine too, where risk-benefit analysis has become an important tool for rational decision-making. This paper explores risk-benefit analysis from a logical point of view. In particular, it seeks a better understanding of the common view that decisions should be made by weighing risks against benefits and that an option should be chosen if its benefits outweigh its risks. I devote a good deal of this paper to scrutinizing this popular view. Specifically, I demonstrate that this mode of reasoning is logically faulty if "risk" and "benefit" are taken in their absolute sense. But I also show that arguing in favour of an action because its benefits outweigh its risks can be valid if we refer to incremental risks and benefits.

  4. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  5. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    SciTech Connect

    Pawel, David; Leggett, Richard Wayne; Eckerman, Keith F; Nelson, Christopher

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  6. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  7. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials with longer than 6 months in duration that reported CV events. Absolute risks and the Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%) with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%), not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
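The Mantel-Haenszel fixed-effects pooling named in the Methods can be sketched directly from 2x2 tables. The trial counts below are invented for illustration, not data from the review:

```python
# Sketch of Mantel-Haenszel fixed-effects pooling of odds ratios over 2x2
# tables (a = events treated, b = non-events treated, c = events control,
# d = non-events control). Trial counts are invented, not review data.

def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

trials = [
    (12, 488, 10, 490),   # hypothetical trial 1
    (8, 292, 9, 291),     # hypothetical trial 2
    (20, 980, 22, 978),   # hypothetical trial 3
]
or_mh = mantel_haenszel_or(trials)  # near 1.0: no detectable effect
```

Each table contributes in proportion to its size, which is why the method behaves well with the sparse event counts typical of CV safety outcomes.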

  8. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collection. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  9. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. Urban Flood Simulation Model (UFSM) and Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period and the flood risk was reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely relying on structural measures. The R-D function is suitable to describe the changes of flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202

  11. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
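    A minimal sketch of the propagation-path idea, assuming a toy subsystem connectivity graph (the subsystem names are invented, not from the NASA report): a breadth-first search finds the interaction path from a hazard source to each vulnerable target.

```python
from collections import deque

def propagation_paths(graph, source, targets):
    """Find one hazard propagation path (shortest, via BFS) from a hazard
    source to each vulnerable target reachable through the connectivity graph."""
    paths = {}
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node in targets and node != source:
            paths[node] = path
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return paths

# Toy connectivity: a power surge can reach avionics only via the shared bus.
connectivity = {
    "power": ["bus"],
    "bus": ["avionics", "thermal"],
    "thermal": [],
    "avionics": [],
}
paths = propagation_paths(connectivity, "power", {"avionics"})
```

    Each discovered path is a candidate point for a risk control (isolation, monitoring) between the hazard source and the vulnerable target.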

  12. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  13. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis on the economics of STOL transports in air passenger traffic established the probability of making the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.
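    A Monte Carlo sketch of this kind of analysis, with all distributions and figures invented for illustration (the study's actual cash-flow model is not reproduced here): the internal rate of return is approximated as annual net cash flow over investment, and the simulation estimates the probability of meeting a target rate.

```python
import random

def simulate_return(n_trials=10000, target_irr=0.10, seed=42):
    """Monte Carlo sketch: probability that the internal rate of return meets
    a target. Cash flows are modeled as a perpetuity, so IRR is approximated
    as annual net cash flow divided by investment."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        investment = 100.0               # fixed fleet investment (arbitrary units)
        revenue = rng.gauss(25.0, 5.0)   # uncertain annual passenger revenue
        cost = rng.gauss(12.0, 3.0)      # uncertain annual operating cost
        irr = (revenue - cost) / investment
        if irr >= target_irr:
            hits += 1
    return hits / n_trials

p = simulate_return()
```

    A fuller model would simulate multi-year cash flows and solve for the discount rate that sets net present value to zero.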

  14. Quantitative landslide risk analysis Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2003-04-01

    Risk analysis, risk evaluation and risk management are integrated in the holistic concept of risk assessment. Internationally, various quantitative, semi-quantitative and qualitative approaches exist to analyse the risk to life and/or the economic risk caused by landslides. In Iceland, a method to carry out snow avalanche risk analysis was developed in 1999, followed by rough guidelines on how to integrate results from landslide hazard assessments into a comprehensive landslide and snow avalanche risk assessment in 2002. The Icelandic regulation on hazard zoning due to snow avalanches and landslides, issued by the Icelandic Ministry of the Environment in the year 2000, aims to prevent people from living or working within the areas most at risk by 2010. The regulation requires landslide and snow avalanche risk analyses to be carried out; however, an approach to calculate landslide risk in detail is still missing. Therefore, the ultimate goal of this study is to develop such a method and apply it in Bíldudalur, NW-Iceland. Within this presentation, the risk analysis focuses on the risk to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potentially damaging events, as well as the distribution of the elements at risk in space and time, under the consideration of changing vulnerabilities, must be determined. Based on existing debris flow and rock fall run-out maps, hazard maps are derived and the respective risks are calculated. Already digitized elements at risk (people in houses) are verified and updated. The damage potential (the number of all of the people living or working at a specific location), derived from official statistics and our own investigations, is attributed to each house. The vulnerability of the elements at risk is mainly based on literature studies. The probability of spatial impact (i.e. of the hazardous event impacting a building) is estimated using benchmarks given in literature, results from field
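    The risk-to-life calculation outlined above multiplies the annual event probability by the probabilities of spatial and temporal impact and by the vulnerability of the exposed person. A minimal sketch with assumed values (not the study's Bíldudalur figures):

```python
def individual_risk(p_event, p_spatial, p_temporal, vulnerability):
    """Annual individual risk of loss of life: the probability of the event,
    times the probability it impacts the location, times the probability the
    person is present, times the probability of death given impact."""
    return p_event * p_spatial * p_temporal * vulnerability

# Illustrative debris-flow scenario for one occupant of one house:
r = individual_risk(p_event=0.01, p_spatial=0.3, p_temporal=0.8, vulnerability=0.5)
```

    Summing such contributions over all hazardous events affecting a location gives the total individual risk used for hazard zoning.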

  15. Comparison of 3 methods for identifying dietary patterns associated with risk of disease.

    PubMed

    DiBello, Julia R; Kraft, Peter; McGarvey, Stephen T; Goldberg, Robert; Campos, Hannia; Baylin, Ana

    2008-12-15

    Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994-2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%-46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals.

  16. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.
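    The AOT evaluation mentioned above is commonly expressed as an incremental conditional core damage probability. A sketch with assumed frequencies (these are illustrative values, not figures from the handbook):

```python
def single_outage_risk(baseline_cdf, conditional_cdf, outage_hours):
    """Incremental conditional core damage probability (ICCDP) for one outage:
    the rise in core damage frequency (per year) while the component is out
    of service, multiplied by the downtime expressed in years."""
    return (conditional_cdf - baseline_cdf) * (outage_hours / 8760.0)

# Assumed values: core damage frequency rises from 1e-5/yr to 8e-5/yr
# while the component is unavailable, for a 72-hour allowed outage time.
iccdp = single_outage_risk(1e-5, 8e-5, outage_hours=72)
```

    Comparing such increments against an acceptance threshold is one way risk-based methods support extending or restricting an AOT.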

  17. Hybrid methods for cybersecurity analysis :

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  18. USAWC Coronary Risk and Fitness Analysis

    DTIC Science & Technology

    1980-06-04

    Smoking is associated with a death rate 60-80% greater than that of nonsmokers and with an increased risk of cancer of the mouth, throat, larynx (voice box), esophagus, pancreas, and bladder. Other risk factors include exposure to hazardous chemicals or other substances and diet (especially high-fat diets, linked to bowel cancer). There is no sure technique for giving up smoking. People smoke for different reasons and what

  19. Employing the radiological and nuclear risk assessment methods (RNRAM) for assessing radiological and nuclear detection architectures

    SciTech Connect

    Brigantic, Robert T.; Eddy, Ryan R.

    2014-03-20

    The United States Department of Homeland Security's Domestic Nuclear Detection Office (DNDO) is charged with implementing domestic nuclear detection efforts to protect the U.S. from radiological and nuclear threats. DNDO is also responsible for coordinating the development of the Global Nuclear Detection Architecture (GNDA). DNDO utilizes a unique risk analysis tool, known as the Radiological and Nuclear Risk Assessment Methods (RNRAM), to conduct a holistic risk assessment of the GNDA. The capabilities of this tool support internal DNDO analyses and have also been used by other entities such as the International Atomic Energy Agency. The model uses a probabilistic risk assessment methodology and includes the ability to assess the effectiveness of layered architectures in the GNDA against an attack by an intelligent, adaptive adversary. This paper overviews the basic structure, capabilities, and use of RNRAM to assess different architectures, and how various risk components are calculated through a series of interconnected modules. Also highlighted is the flexible structure of RNRAM, which can accommodate new modules in order to examine a variety of threat detection architectures and concepts.

  20. [Assessment of non-ionic contrast agents in reducing the risk of side effects: analysis on the basis of voluntary willingness-to-pay measured by the contingent valuation method].

    PubMed

    Sugimura, K

    2000-01-01

    The benefit of replacing ionic contrast agents with non-ionic ones was assessed by employing cost-benefit analysis, a method of medical economic analysis. The benefit derived from replacing ionic with non-ionic contrast agents was assessed by a questionnaire survey of patients using the willingness-to-pay method based on the contingent valuation method. This questionnaire survey was conducted on 204 patients in Shimane Medical University Hospital during the period from October to December 1998. The result of analysis showed that when ionic contrast agents are replaced with non-ionic ones, patients' willingness-to-pay stands at a median value of 12,500 yen and a mean value of 17,082 +/- 1,049 yen. These figures are identical with or larger than the NHI-price differences (12,266-14,234 yen; average 13,287 yen), suggesting that patients think the benefit of reduced side effects from non-ionic contrast agents has a value that is equal to or higher than the actual NHI-prices of these agents. Further, analysis of patient profiles indicated that patients' willingness-to-pay went up with age and income but decreased when age exceeded 60 years, a finding which also suggests that the willingness-to-pay amount is closely related to the economic strength of patients.

  1. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2010-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage in the different countries is related with the different positions concerning labelling of GMO products: based on the principle of substantial equivalence, or rather based on the precautionary principle. The paper provides an overview on how the risks associated with release of GMO in the environments can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs considering also legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm for human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non target organisms, development of resistance in target organisms, and effects on biodiversity.

  3. Anticipating risk for human subjects participating in clinical research: application of Failure Mode and Effects Analysis.

    PubMed

    Cody, Robert J

    2006-03-01

    Failure Mode and Effects Analysis (FMEA) is a method applied in various industries to anticipate and mitigate risk. This methodology can be more systematically applied to the protection of human subjects in research. The purpose of FMEA is simple: prevent problems before they occur. By applying FMEA process analysis to the elements of a specific research protocol, the failure severity, occurrence, and detection rates can be estimated for calculation of a "risk priority number" (RPN). Methods can then be identified to reduce the RPN to levels where the risk/benefit ratio favors human subject benefit, to a greater magnitude than existed in the pre-analysis risk profile. At the very least, the approach provides a checklist of issues that can be individualized for specific research protocols or human subject populations.
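    The RPN calculation can be sketched directly; the 1-10 scales are conventional FMEA usage, and the example scores and protocol step are hypothetical, not taken from the paper:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: the product of three factors, each scored on a
    1-10 ordinal scale. A higher detection score means the failure is
    HARDER to detect, so harder-to-detect failures raise the RPN."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical protocol step: mis-dosing in an unblinded study arm.
before = rpn(severity=8, occurrence=4, detection=6)  # baseline risk profile
after = rpn(severity=8, occurrence=2, detection=2)   # after adding double checks
```

    Mitigations are then chosen to drive the RPN down, as in the `before`/`after` comparison, until the risk/benefit ratio favors the human subject.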

  4. Developing risk management behaviours for nurses through medication incident analysis.

    PubMed

    Johnson, Maree; Tran, Duong Thuy; Young, Helen

    2011-12-01

    The aim of this study was to define risk management behaviours related to medication safety. Mixed methods were used to analyze 318 nursing-related medication incidents reported in an Australian metropolitan hospital. Most incidents did not result in patient harm (93%). Omission of medications was the most frequent incident type, often related to patient absences from the unit or nurses failing to sign for medications. Thematic analysis resulted in the Medication Safety Subscales, comprising 29 behavioural statements within three domains: administering medications, storage and management of medications, and managing adverse events related to medications. The Medication Safety Subscales can be used by managers, educators and clinicians to reinforce the importance of medication safety. Early action by nurses may reduce patient injury.

  5. The semantic distinction between "risk" and "danger": a linguistic analysis.

    PubMed

    Boholm, Max

    2012-02-01

    The analysis combines frame semantic and corpus linguistic approaches in analyzing the role of agency and decision making in the semantics of the words "risk" and "danger" (both nominal and verbal uses). In frame semantics, the meanings of "risk" and of related words, such as "danger," are analyzed against the background of a specific cognitive-semantic structure (a frame) comprising frame elements such as Protagonist, Bad Outcome, Decision, Possession, and Source. Empirical data derive from the British National Corpus (100 million words). Results indicate both similarities and differences in use. First, both "risk" and "danger" are commonly used to represent situations having potential negative consequences as the result of agency. Second, "risk" and "danger," especially their verbal uses (to risk, to endanger), differ in agent-victim structure, i.e., "risk" is used to express that a person affected by an action is also the agent of the action, while "endanger" is used to express that the one affected is not the agent. Third, "risk," but not "danger," tends to be used to represent rational and goal-directed action. The results therefore to some extent confirm the analysis of "risk" and "danger" suggested by German sociologist Niklas Luhmann. As a point of discussion, the present findings arguably have implications for risk communication.

  6. Analysis of dysphagia risk using the modified dysphagia risk assessment for the community-dwelling elderly

    PubMed Central

    Byeon, Haewon

    2016-01-01

    [Purpose] The elderly are susceptible to dysphagia, and complications can be minimized if high-risk groups are screened in early stages and properly rehabilitated. This study provides basic material for the early detection and prevention of dysphagia by investigating the risks of dysphagia and related factors in community-dwelling elders. [Subjects and Methods] Participants included 325 community-dwelling elderly people aged 65 or older. The modified dysphagia risk assessment for the community-dwelling elderly was used to assess dysphagia risk. [Results] Approximately 52.6% (n=171) of participants belonged to the high-risk group for dysphagia. After adjusting for confounding variables, people aged 75+, who used dentures, and who needed partial help in daily living had a significantly higher risk of dysphagia. [Conclusion] It is necessary to develop guidelines for dysphagia for early detection and rehabilitation. PMID:27799680

  7. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  8. Risk analysis of analytical validations by probabilistic modification of FMEA.

    PubMed

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
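    A sketch of the probabilistic modification: replacing categorical occurrence and detection scores with estimated frequencies lets the undetected-failure frequency be computed per failure mode and summed over a set of modes. The failure modes and rates below are invented for illustration, not taken from the NIR example:

```python
def undetected_failure_frequency(occurrence_rate, detection_probability):
    """Probabilistic FMEA: expected frequency of a failure mode occurring
    AND slipping past detection (rate times probability of non-detection)."""
    return occurrence_rate * (1.0 - detection_probability)

# Assumed per-analysis rates for two failure modes of a screening procedure:
modes = {
    "wrong reference spectrum": undetected_failure_frequency(0.02, 0.90),
    "operator skips scan": undetected_failure_frequency(0.05, 0.60),
}
total = sum(modes.values())  # undetected-failure frequency of the full procedure
```

    Severity can then be layered back on categorically, prioritizing the modes whose quantitative undetected-failure frequency is highest.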

  9. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also contains a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables comparing the types and causes of accidents in the military services with those in civil aviation, together with explanations of some of the important differences noted in these tables.

  10. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process.

  11. Analysis of Risk Management in Adapted Physical Education Textbooks

    ERIC Educational Resources Information Center

    Murphy, Kelle L.; Donovan, Jacqueline B.; Berg, Dominck A.

    2016-01-01

    Physical education teacher education (PETE) programs vary on how the topics of safe teaching and risk management are addressed. Common practices to cover such issues include requiring textbooks, lesson planning, peer teaching, videotaping, reflecting, and reading case law analyses. We used a mixed methods design to examine how risk management is…

  12. FIFRA Peer Review: Proposed Risk Assessment Methods Process

    EPA Pesticide Factsheets

    From September 11-14, 2012, EPA participated in a Federal Insecticide, Fungicide and Rodenticide Act Scientific Advisory Panel (SAP) meeting on a proposed pollinator risk assessment framework for determining the potential risks of pesticides to honey bees.

  13. Land Use Adaptation Strategies Analysis in Landslide Risk Region

    NASA Astrophysics Data System (ADS)

    Lin, Yu-Ching; Chang, Chin-Hsin; Chen, Ying-Tung

    2013-04-01

    In order to respond to the impact of climate and environmental change on Taiwanese mountain regions, this study used the GTZ (2004) risk analysis guidelines to assess the landslide risk for 178 Taiwanese mountain towns. Seven indicators were used to assess landslide risk: rainfall distribution, natural environment vulnerability (e.g., rainfall threshold criterion for debris flow, historical disaster frequency, landslide ratio, and road density), physical vulnerability (e.g., population density) and socio-economic vulnerability (e.g., population with higher education, death rate and income). The landslide risk map is obtained by multiplying the 7 indicators together and ranking the product. The map has 5 risk classes; towns in classes 4 to 5 are high landslide risk regions and have high priority for risk reduction. This study identified the regions with high landslide risk and analyzed how they changed after Typhoon Morakot (2009). The spatial distribution showed that, after significant environmental damage, high landslide risk regions moved from central to southern Taiwan. The changing pattern of risk regions points out the necessity of updating the risk map periodically. Based on the landslide risk map and the land use investigation data provided by the National Land Surveying and Mapping Center in 2007, this study calculated the size of the land use area exposed to landslide risk. Based on the above results and discussion, this study can be used to suggest appropriate land use adaptation strategies for reducing landslide risk under the impact of climate and environmental change.
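    The multiply-and-rank step can be sketched as follows; the indicator values, town names, and class-assignment rule are illustrative assumptions, not the study's actual data:

```python
def risk_products(towns):
    """Composite risk score per town: the product of its 7 indicator scores,
    each assumed to be normalized into (0, 1]."""
    scores = {}
    for town, indicators in towns.items():
        p = 1.0
        for v in indicators:
            p *= v
        scores[town] = p
    return scores

def risk_classes(scores, n_classes=5):
    """Rank towns into classes 1 (lowest risk) .. n_classes (highest risk)
    by the order of their composite scores."""
    ordered = sorted(scores, key=scores.get)
    return {town: 1 + (i * n_classes) // len(ordered) for i, town in enumerate(ordered)}

# Hypothetical indicator vectors (rainfall, debris-flow threshold, history,
# landslide ratio, road density, population density, socio-economic):
towns = {
    "A": [0.9, 0.8, 0.7, 0.9, 0.6, 0.8, 0.7],
    "B": [0.2, 0.3, 0.1, 0.2, 0.4, 0.3, 0.2],
    "C": [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5],
}
classes = risk_classes(risk_products(towns))
```

    Towns landing in the top classes would be flagged as high landslide risk regions and prioritized for risk-reduction measures.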

  14. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is in most parts of Sweden expected to result in increased precipitation and increased sea water levels causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites, and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a negative event). The risk analysis is GIS-aided in presenting and visualising the risk, and uses existing databases for quantification of the consequences, represented by ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of the risk at the local level, to assess the need for measures to mitigate the risk. The costs and environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have expressed a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  15. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
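    The nationwide risk estimate described above composes traffic exposure, accident rate, conditional release probability, and consequence. A sketch for a single product on a single route, with all parameter values invented for illustration (not HMTECM outputs):

```python
def route_spill_risk(annual_cars, miles, accident_rate_per_car_mile,
                     p_release, expected_cleanup_cost):
    """Annual environmental risk (expected cleanup cost, in dollars) for one
    product on one route: traffic exposure times accident rate times the
    conditional probability of a release times the consequence per release."""
    return (annual_cars * miles * accident_rate_per_car_mile
            * p_release * expected_cleanup_cost)

# Hypothetical LNAPL shipment: 5,000 cars/yr over an 800-mile route.
risk = route_spill_risk(annual_cars=5000, miles=800,
                        accident_rate_per_car_mile=1e-7,
                        p_release=0.1, expected_cleanup_cost=1.5e6)
risk_per_car_mile = risk / (5000 * 800)  # normalized for comparison across products
```

    In the full analysis, the cleanup-cost term is itself a probabilistic estimate over route-specific soil types and depths to groundwater, and the normalized per-car-mile and per-ton-mile figures enable comparison between chemicals.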

  16. Poultry consumption and prostate cancer risk: a meta-analysis

    PubMed Central

    He, Qian; Wan, Zheng-ce; Xu, Xiao-bing; Wu, Jing

    2016-01-01

    Background. Several kinds of foods are hypothesized to be potential factors contributing to the variation of prostate cancer (PCa) incidence, but the effect of poultry on PCa is still inconsistent and no quantitative assessment has been published to date. We therefore conducted this meta-analysis to clarify the association between them. Materials and Methods. We conducted a literature search of PubMed and Embase for studies examining the association between poultry consumption and PCa up to June 2015. The pooled risk ratio (RR) and corresponding 95% confidence interval (CI) of the highest versus lowest poultry consumption categories were calculated by a fixed-effect or random-effect model. Results. A total of 27 (12 cohort and 15 case-control) studies comprising 23,703 cases and 469,986 noncases were eligible for inclusion. The summary RR of total PCa incidence was 1.03 (95% CI [0.95–1.11]) for the highest versus lowest categories of poultry intake. The heterogeneity between studies was not statistically significant (P = 0.768, I2 = 28.5%). Synthesized analysis of 11 studies on high-stage PCa and 8 studies on chicken exposure also demonstrated a null association. We also did not obtain a significant association in the cohort-study subgroup (RR = 1.04, 95% CI [0.98–1.10]), or in the subgroups of population-based and hospital-based case-control studies. The studies were then divided into three geographic groups: Western countries, Asia and South America. The pooled RRs in these areas did not reveal a statistically significant association between poultry and PCa. Conclusions. This meta-analysis suggests no association between poultry consumption and PCa risk. Further well-designed studies are warranted to confirm the result. PMID:26855875
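
    The highest-versus-lowest pooling step described in the abstract can be sketched with inverse-variance weighting on the log scale. The study inputs below are invented for illustration, not the 27 studies actually analyzed:

```python
import math

# Hypothetical (RR, CI lower, CI upper) per study; not the abstract's actual data.
studies = [(1.10, 0.90, 1.34), (0.95, 0.80, 1.13), (1.05, 0.88, 1.25)]

def fixed_effect_pooled_rr(studies):
    """Inverse-variance pooling of log risk ratios (fixed-effect model)."""
    num = den = 0.0
    for rr, ci_lo, ci_hi in studies:
        log_rr = math.log(rr)
        se = (math.log(ci_hi) - math.log(ci_lo)) / (2 * 1.96)  # SE from CI width
        w = 1.0 / se ** 2                                      # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

rr, lo, hi = fixed_effect_pooled_rr(studies)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

    A random-effect model would additionally widen the weights by a between-study variance estimate when heterogeneity is present.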

  17. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  18. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  19. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute a system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities, and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk-informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures.

  20. Methods for assessing risks of dermal exposures in the workplace.

    PubMed

    McDougal, James N; Boeniger, Mark F

    2002-07-01

    The skin as a route of entry for toxic chemicals has caused increasing concern over the last decade. The assessment of systemic hazards from dermal exposures has evolved over time, often limited by the amount of experimental data available. The result is that there are many methods being used to assess safety of chemicals in the workplace. The process of assessing hazards of skin contact includes estimating the amount of substance that may end up on the skin and estimating the amount that might reach internal organs. In most cases, toxicology studies by the dermal route are not available and extrapolations from other exposure routes are necessary. The hazards of particular chemicals can be expressed as "skin notations", actual exposure levels, or safe exposure times. Characterizing the risk of a specific procedure in the workplace involves determining the ratio of exposure standards to an expected exposure. The purpose of this review is to address each of the steps in the process and describe the assumptions that are part of the process. Methods are compared by describing their strengths and weaknesses. Recommendations for research in this area are also included.

  1. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  2. Component outage data analysis methods. Volume 2: Basic statistical methods

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Mazumdar, M.; McCutchan, D. A.

    1981-08-01

    Statistical methods for analyzing outage data on major power system components such as generating units, transmission lines, and transformers are identified. The analysis methods produce outage statistics from component failure and repair data that help in understanding the failure causes and failure modes of various types of components. Methods for forecasting outage statistics for those components used in the evaluation of system reliability are emphasized.

  3. Relative risk analysis of several manufactured nanomaterials: an insurance industry context.

    PubMed

    Robichaud, Christine Ogilvie; Tanzil, Dicksen; Weilenmann, Ulrich; Wiesner, Mark R

    2005-11-15

    A relative risk assessment is presented for the industrial fabrication of several nanomaterials. The production processes for five nanomaterials were selected for this analysis, based on their current or near-term potential for large-scale production and commercialization: single-walled carbon nanotubes, bucky balls (C60), one variety of quantum dots, alumoxane nanoparticles, and nano-titanium dioxide. The assessment focused on the activities surrounding the fabrication of nanomaterials, exclusive of any impacts or risks with the nanomaterials themselves. A representative synthesis method was selected for each nanomaterial based on its potential for scaleup. A list of input materials, output materials, and waste streams for each step of fabrication was developed and entered into a database that included key process characteristics such as temperature and pressure. The physical-chemical properties and quantities of the inventoried materials were used to assess relative risk based on factors such as volatility, carcinogenicity, flammability, toxicity, and persistence. These factors were first used to qualitatively rank risk, then combined using an actuarial protocol developed by the insurance industry for the purpose of calculating insurance premiums for chemical manufacturers. This protocol ranks three categories of risk relative to a 100 point scale (where 100 represents maximum risk): incident risk, normal operations risk, and latent contamination risk. Results from this analysis determined that relative environmental risk from manufacturing each of these five materials was comparatively low in relation to other common industrial manufacturing processes.

  4. A coffee can, factor analysis, and prediction of antisocial behavior: the structure of criminal risk.

    PubMed

    Kroner, Daryl G; Mills, Jeremy F; Reddon, John R

    2005-01-01

    The predictive accuracy of the Psychopathy Checklist-Revised, Level of Service Inventory-Revised, Violence Risk Appraisal Guide, and the General Statistical Information on Recidivism were compared to four instruments randomly generated from the total pool of original items. None of the four original instruments better predicted post-release failure than the four randomly generated instruments. These results suggest two conclusions: (a) the instruments are only measuring criminal risk, and (b) no single instrument has captured sufficient risk assessment theory to result in better prediction than randomly derived instruments measuring criminal risk. A two-stage factor analysis was completed on 1614 cases. This analysis of the risk items indicated a 4-factor solution and all 4 factors were equal to the original instruments in predicting post-release failure. Thus, the original instruments did not improve prediction over randomly structured scales, nor did the restructuring of items improve risk assessment, suggesting substantial deficiencies in the conceptualization of risk assessment and instrumentation. We argue that developing a risk-based construct, which involves hypothesis testing and an explanation of behavior, is the optimal method to advance risk assessment within the criminal justice and mental health systems. Such an approach would provide targeted areas for clinical intervention that are salient to risk.

  5. Use of Monte Carlo methods in environmental risk assessments at the INEL: Applications and issues

    SciTech Connect

    Harris, G.; Van Horn, R.

    1996-06-01

    The EPA is increasingly considering the use of probabilistic risk assessment techniques as an alternative or refinement of the current point estimate of risk. This report provides an overview of the probabilistic technique called Monte Carlo Analysis. Advantages and disadvantages of implementing a Monte Carlo analysis over a point estimate analysis for environmental risk assessment are discussed. The general methodology is provided along with an example of its implementation. A phased approach to risk analysis that allows iterative refinement of the risk estimates is recommended for use at the INEL.
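
    The contrast between a point estimate and a probabilistic estimate can be sketched as follows. This is an illustrative toy exposure model with invented parameters, not the INEL assessment itself:

```python
import random

random.seed(0)

# Toy model: risk = concentration * intake rate * slope factor, with uncertainty
# on the exposure parameters instead of single point values (all values invented).
def simulate_risk():
    concentration = random.lognormvariate(0.0, 0.5)  # mg/kg, median 1 (hypothetical)
    intake_rate = random.uniform(0.05, 0.15)         # kg/day (hypothetical)
    slope_factor = 1e-3                              # (mg/kg-day)^-1, held fixed
    return concentration * intake_rate * slope_factor

samples = sorted(simulate_risk() for _ in range(10_000))
point_estimate = 1.0 * 0.10 * 1e-3               # every parameter at its central value
p95 = samples[int(0.95 * len(samples))]          # 95th-percentile risk from the distribution
print(f"point estimate: {point_estimate:.2e}, Monte Carlo 95th percentile: {p95:.2e}")
```

    The Monte Carlo output is a full distribution, so the analyst can report an upper percentile rather than a single value whose conservatism is unknown.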

  6. Use of a risk assessment method to improve the safety of negative pressure wound therapy.

    PubMed

    Lelong, Anne-Sophie; Martelli, Nicolas; Bonan, Brigitte; Prognon, Patrice; Pineau, Judith

    2014-06-01

    To improve the safety of negative pressure wound therapy (NPWT), a working group of nurses, hospital pharmacists, physicians and hospital managers performed a risk analysis of the NPWT care process. The failure modes, effects and criticality analysis (FMECA) method was used for this analysis. Failure modes and their consequences were defined and classified as a function of their criticality to identify priority actions for improvement. By contrast to classical FMECA, the criticality index (CI) of each consequence was calculated by multiplying occurrence, severity and detection scores. We identified 13 failure modes, leading to 20 different consequences. The CI of consequences was initially 712, falling to 357 after corrective measures were implemented. The major improvements proposed included the establishment of 6-monthly training cycles for nurses, physicians and surgeons and the introduction of computerised prescription for NPWT. The FMECA method also made it possible to prioritise actions as a function of the criticality ranking of consequences and was easily understood and used by the working group. This study is, to our knowledge, the first to use the FMECA method to improve the safety of NPWT.
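
    The scoring scheme in the abstract (CI = occurrence x severity x detection per consequence, then ranking) can be sketched as follows. The failure-mode labels and scores are invented examples, not the study's actual items:

```python
# Hypothetical consequences with 1-10 scores for occurrence, severity and detection.
consequences = [
    {"name": "wrong pressure setting applied", "occurrence": 4, "severity": 8, "detection": 5},
    {"name": "dressing change delayed",        "occurrence": 6, "severity": 4, "detection": 2},
    {"name": "device alarm ignored",           "occurrence": 3, "severity": 7, "detection": 6},
]

# Criticality index per consequence, as described: occurrence x severity x detection.
for c in consequences:
    c["CI"] = c["occurrence"] * c["severity"] * c["detection"]

# Rank by CI so the highest-criticality consequences get priority corrective actions.
ranked = sorted(consequences, key=lambda c: c["CI"], reverse=True)
total_ci = sum(c["CI"] for c in consequences)
for c in ranked:
    print(f'{c["name"]}: CI = {c["CI"]}')
print("total CI =", total_ci)
```

    Re-scoring after corrective measures and recomputing the total CI gives the kind of before/after comparison reported in the study (712 falling to 357).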

  7. Development of a screening method to assess flood risk on danish national roads and highway systems.

    PubMed

    Nielsen, N H; Larsen, M R A; Rasmussen, S F

    2011-01-01

    A method to assess flood risk on Danish national roads in a large area in the middle and southern part of Jutland, Denmark, was developed for the Danish Road Directorate. Flood risk has gained renewed focus due to the climate changes in recent years, and extreme rain events are expected to become more frequent in the future. The assessment was primarily based on a digital terrain model (DTM) covering 7,500 km² in a 1.6 x 1.6 m grid. The high-resolution terrain model was chosen in order to get an accurate estimation of the potential flooding in the road area and in the immediate vicinity, but it also placed high requirements on the methods, hardware and software applied. The outcome of the analysis was detailed maps (as GIS layers) illustrating the location of depressions with depth, surface area and volume data for each depression. Furthermore, preferential flow paths, catchment boundaries and a ranking of each depression were calculated. The ranking was based on the volume of depressions compared with the upstream catchment and a sensitivity analysis of the runoff coefficient. Finally, a method for assessing flood risk at a more advanced level (hydrodynamic simulation of surface and drainage) was developed and used on a specific blue spot as an example. The case study shows that upstream catchment, depressions, drainage system, and use of hydrodynamic calculations have a great influence on the result. Upstream catchments can contribute greatly to the flooding.

  8. Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing products, it is crucial to assure failure-free and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of failure modes with highest potential risks on the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
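
    The decomposition step can be sketched with a toy component-by-failure-mode count matrix (invented data, not the NASA/NTSB matrices) and a centered singular value decomposition, which is equivalent to PCA:

```python
import numpy as np

# Rows = components, columns = failure modes; entries count how often a failure
# mode was reported for that component (toy data for illustration only).
X = np.array([
    [5, 0, 1, 0],
    [4, 1, 0, 0],
    [0, 3, 4, 1],
    [1, 2, 5, 0],
    [0, 0, 1, 6],
], dtype=float)

Xc = X - X.mean(axis=0)                  # center columns before decomposition
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                           # components projected onto principal axes
explained = s**2 / np.sum(s**2)          # fraction of variance per principal axis

# A low-dimensional (here 2-D) representation supports pattern analysis across
# components without working in the full failure-mode space.
k = 2
print("variance explained by first two axes:", explained[:2].sum())
print("2-D coordinates of each component:\n", scores[:, :k])
```

    Components with similar failure-mode profiles land near each other in the reduced coordinate system, which is the similarity structure the paper exploits.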

  9. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    SciTech Connect

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester; Tuan Q. Tran; Erasmia Lois

    2010-06-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  10. Issues in benchmarking human reliability analysis methods : a literature review.

    SciTech Connect

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.; Hendrickson, Stacey M. Langfitt; Boring, Ronald L.

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  11. Is adaptation or transformation needed? Active nanomaterials and risk analysis

    NASA Astrophysics Data System (ADS)

    Kuzma, Jennifer; Roberts, John Patrick

    2016-07-01

    Nanotechnology has been a key area of funding and policy for the United States and globally for the past two decades. Since nanotechnology research and development became a focus and nanoproducts began to permeate the market, scholars and scientists have been concerned about how to assess the risks that they may pose to human health and the environment. The newest generation of nanomaterials includes biomolecules that can respond to and influence their environments, and there is a need to explore whether and how existing risk-analysis frameworks are challenged by such novelty. To fill this niche, we used a modified approach of upstream oversight assessment (UOA), a subset of anticipatory governance. We first selected case studies of "active nanomaterials," that are early in research and development and designed for use in multiple sectors, and then considered them under several, key risk-analysis frameworks. We found two ways in which the cases challenge the frameworks. The first category relates to how to assess risk under a narrow framing of the term (direct health and environmental harm), and the second involves the definition of what constitutes a "risk" worthy of assessment and consideration in decision making. In light of these challenges, we propose some changes for risk analysis in the face of active nanostructures in order to improve risk governance.

  12. Goodness-of-fit methods for additive-risk models in tumorigenicity experiments.

    PubMed

    Ghosh, Debashis

    2003-09-01

    In tumorigenicity experiments, a complication is that the time to event is generally not observed, so that the time to tumor is subject to interval censoring. One of the goals in these studies is to properly model the effect of dose on risk. Thus, it is important to have goodness of fit procedures available for assessing the model fit. While several estimation procedures have been developed for current-status data, relatively little work has been done on model-checking techniques. In this article, we propose numerical and graphical methods for the analysis of current-status data using the additive-risk model, primarily focusing on the situation where the monitoring times are dependent. The finite-sample properties of the proposed methodology are examined through numerical studies. The methods are then illustrated with data from a tumorigenicity experiment.

  13. A meta-analysis of alcohol consumption and thyroid cancer risk

    PubMed Central

    Wang, Xiaofei; Cheng, Wenli; Li, Jingdong; Zhu, Jingqiang

    2016-01-01

    Background It is still inconclusive whether alcohol consumption affects the risk of thyroid cancer. We conducted a meta-analysis of available epidemiological data to address this issue. Results Compared with nondrinkers, the pooled relative risks (RRs) and corresponding 95% confidence intervals (CIs) of thyroid cancer were 0.80 (95% CI 0.71-0.90) for any drinkers, 0.81 (95% CI 0.70-0.93) for light drinkers and 0.71 (95% CI 0.63-0.79) for moderate drinkers. The dose-response analysis suggested no evidence of a dose-risk relationship between alcohol intake and thyroid cancer risk (P = 0.112). Methods Eligible studies were identified by searching the PubMed and EMbase databases. A total of 24 studies, including 9,990 thyroid cancer cases, were included in this meta-analysis. We defined light alcohol intake as ≤ one drink/day and moderate as > one drink/day. The summary risk estimates were calculated with a random-effects model. A dose-response analysis was also conducted to model the dose-risk relation. Conclusion This meta-analysis confirmed an inverse association between alcohol consumption and thyroid cancer risk. Further studies are needed to better understand the potential mechanisms underlying this association. PMID:27385005

  14. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  15. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping has gained attention. Regional to national risk assessments are needed, for example, for national risk policy developments, for large-scale disaster management planning and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have not received much scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role for agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is shown, as well as the crop type most at risk in a specific region. The risk maps may provide guidance for federal state-wide coordinated designation of retention areas.

  16. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming.
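
    The core idea of detrending before comparing risk can be sketched as follows: fit and remove a linear trend from each price or yield series so that the remaining variability, not the trend, is measured as risk. The series below is an invented toy milk-price example, and ordinary least-squares detrending stands in for the paper's error-component implicit detrending method:

```python
import statistics

# Toy milk-price series (EUR/100 kg, invented): upward trend plus year-to-year noise.
years = list(range(2001, 2008))
prices = [30.1, 31.4, 30.9, 32.8, 32.2, 34.0, 33.5]

def detrended_std(t, y):
    """Remove a fitted linear trend and return the std. dev. of the residuals."""
    n = len(t)
    t_mean = sum(t) / n
    y_mean = sum(y) / n
    slope = sum((ti - t_mean) * (yi - y_mean) for ti, yi in zip(t, y)) \
            / sum((ti - t_mean) ** 2 for ti in t)
    residuals = [yi - (y_mean + slope * (ti - t_mean)) for ti, yi in zip(t, y)]
    return statistics.stdev(residuals)

raw_risk = statistics.stdev(prices)              # inflated by the trend itself
trend_free_risk = detrended_std(years, prices)   # year-to-year variability only
print(f"raw std: {raw_risk:.2f}, detrended std: {trend_free_risk:.2f}")
```

    Comparing detrended standard deviations between the conventional and organic farm groups then isolates genuine price and production risk from secular price developments.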

  17. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  18. [Establishment of Method for Health Risk Assessment of Pollutants from Fixed Sources].

    PubMed

    Chen, Qiang; Wu, Huan-bo

    2016-05-15

    A health risk assessment method for pollutants from fixed sources was developed by applying the AERMOD model to health risk assessment. The method can directly forecast the health risks of toxic pollutants from a source through a given exposure pathway. Using the established method, in combination with source data, the traditional health risk assessment method, and measured data of PAHs in inhalable particulate matter (PM₁₀) in Lanzhou, the health risks of polycyclic aromatic hydrocarbons (PAHs) and benzo[a]pyrene (BaP) in PM₁₀ from three thermal power plants, and the health risks of PAHs and BaP in PM₁₀ at the receptor point via inhalation exposure, were calculated for heating and non-heating seasons. The contribution rates of the health risk caused by the three power plants to the health risk at the receptor point were then calculated. The results showed that the contribution rates were not associated with sex or age, but were associated with time period and risk type. The contribution rates in non-heating seasons were greater than those in heating seasons, and the contribution rates of the carcinogenic risk index were greater than those of the cancer risk value. The reliability of the established method was validated by comparison with the traditional method. This method is applicable to health risk assessment of toxic pollutants from all fixed sources and to the environmental risk assessment component of environmental impact assessment.

  19. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
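
    The loss-function step can be sketched with one common inverted-normal-type form, which maps a process deviation from target to a monetary loss that saturates at a maximum loss. The exact parametrizations of the paper's revised Taguchi and modified inverted normal functions may differ, and all values below are invented:

```python
import math

def inverted_normal_loss(y, target, max_loss, shape):
    """Loss grows with deviation from target and saturates at max_loss."""
    return max_loss * (1.0 - math.exp(-((y - target) ** 2) / (2.0 * shape ** 2)))

TARGET_PRESSURE = 10.0   # bar, hypothetical operating target
MAX_LOSS = 500_000.0     # $, assumed loss when the deviation is catastrophic
SHAPE = 2.0              # controls how quickly loss accumulates with deviation

for deviation in (0.0, 1.0, 3.0, 8.0):
    loss = inverted_normal_loss(TARGET_PRESSURE + deviation,
                                TARGET_PRESSURE, MAX_LOSS, SHAPE)
    print(f"deviation {deviation:4.1f} bar -> loss ${loss:,.0f}")
```

    Summing such per-variable losses over the deviations in a scenario gives the integrated economic consequence that enters the risk calculation.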

  20. Risk analysis of dust explosion scenarios using Bayesian networks.

    PubMed

    Yuan, Zhi; Khakzad, Nima; Khan, Faisal; Amyotte, Paul

    2015-02-01

    In this study, a methodology has been proposed for risk analysis of dust explosion scenarios based on Bayesian network. Our methodology also benefits from a bow-tie diagram to better represent the logical relationships existing among contributing factors and consequences of dust explosions. In this study, the risks of dust explosion scenarios are evaluated, taking into account common cause failures and dependencies among root events and possible consequences. Using a diagnostic analysis, dust particle properties, oxygen concentration, and safety training of staff are identified as the most critical root events leading to dust explosions. The probability adaptation concept is also used for sequential updating and thus learning from past dust explosion accidents, which is of great importance in dynamic risk assessment and management. We also apply the proposed methodology to a case study to model dust explosion scenarios, to estimate the envisaged risks, and to identify the vulnerable parts of the system that need additional safety measures.
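
    The diagnostic-analysis step (inferring which root events most likely led to an observed explosion) can be sketched with a minimal hand-rolled Bayesian network. The structure and all probabilities below are invented for illustration, not the paper's model:

```python
from itertools import product

# Two root events feed a dust-explosion node; the diagnostic query computes the
# posterior of a root event given that an explosion occurred.
p_fine_dust = 0.3     # prior: dust particles in the explosible size range
p_high_oxygen = 0.5   # prior: oxygen concentration high enough

def p_explosion(fine_dust, high_oxygen):
    """Conditional probability table for the explosion node (assumed values)."""
    table = {(True, True): 0.40, (True, False): 0.05,
             (False, True): 0.02, (False, False): 0.001}
    return table[(fine_dust, high_oxygen)]

# Enumerate the joint distribution, then apply Bayes' rule.
marginal = 0.0
posterior_fine_dust = 0.0
for fd, ox in product([True, False], repeat=2):
    joint = (p_fine_dust if fd else 1 - p_fine_dust) \
          * (p_high_oxygen if ox else 1 - p_high_oxygen) \
          * p_explosion(fd, ox)
    marginal += joint
    if fd:
        posterior_fine_dust += joint

posterior_fine_dust /= marginal   # P(fine dust | explosion), well above the 0.3 prior
print(f"P(fine dust | explosion) = {posterior_fine_dust:.2f}")
```

    Ranking root events by how much their posterior rises above the prior identifies the most critical contributors, which is how dust particle properties, oxygen concentration, and safety training emerge as critical in the study.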

  1. Population-standardized genetic risk score: the SNP-based method of choice for inherited risk assessment of prostate cancer

    PubMed Central

    Conran, Carly A; Na, Rong; Chen, Haitao; Jiang, Deke; Lin, Xiaoling; Zheng, S Lilly; Brendler, Charles B; Xu, Jianfeng

    2016-01-01

    Several different approaches are available to clinicians for determining prostate cancer (PCa) risk. The clinical validity of various PCa risk assessment methods utilizing single nucleotide polymorphisms (SNPs) has been established; however, these SNP-based methods have not been compared. The objective of this study was to compare the three most commonly used SNP-based methods for PCa risk assessment. Participants were men (n = 1654) enrolled in a prospective study of PCa development. Genotypes of 59 PCa risk-associated SNPs were available in this cohort. Three methods of calculating SNP-based genetic risk scores (GRSs) were used for the evaluation of individual disease risk such as risk allele count (GRS-RAC), weighted risk allele count (GRS-wRAC), and population-standardized genetic risk score (GRS-PS). Mean GRSs were calculated, and performances were compared using area under the receiver operating characteristic curve (AUC) and positive predictive value (PPV). All SNP-based methods were found to be independently associated with PCa (all P < 0.05; hence their clinical validity). The mean GRSs in men with or without PCa using GRS-RAC were 55.15 and 53.46, respectively, using GRS-wRAC were 7.42 and 6.97, respectively, and using GRS-PS were 1.12 and 0.84, respectively (all P < 0.05 for differences between patients with or without PCa). All three SNP-based methods performed similarly in discriminating PCa from non-PCa based on AUC and in predicting PCa risk based on PPV (all P > 0.05 for comparisons between the three methods), and all three SNP-based methods had a significantly higher AUC than family history (all P < 0.05). 
Results from this study suggest that while the three most commonly used SNP-based methods performed similarly in discriminating PCa from non-PCa at the population level, GRS-PS is the method of choice for risk assessment at the individual level because its value (where 1.0 represents average population risk) can be easily interpreted regardless
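    The three scoring schemes compared in the study can be illustrated with a short sketch. The SNPs, genotypes, odds ratios, and allele frequencies below are made-up values, and the GRS-PS formula shown (per-SNP genotype relative risk divided by the Hardy-Weinberg population-average risk) is one common formulation, assumed here rather than taken from the paper:

    ```python
    import math

    # Made-up data for three hypothetical SNPs in one individual.
    genotypes = [2, 1, 0]               # risk-allele counts (0/1/2)
    odds_ratios = [1.20, 1.15, 1.30]    # per-allele odds ratios
    allele_freqs = [0.30, 0.45, 0.10]   # population risk-allele frequencies

    # GRS-RAC: simple count of risk alleles.
    grs_rac = sum(genotypes)

    # GRS-wRAC: risk alleles weighted by log odds ratio.
    grs_wrac = sum(g * math.log(o) for g, o in zip(genotypes, odds_ratios))

    # GRS-PS: per-SNP relative risk standardized by the population-average risk
    # under Hardy-Weinberg equilibrium, multiplied across SNPs, so that 1.0
    # represents average population risk.
    grs_ps = 1.0
    for g, o, f in zip(genotypes, odds_ratios, allele_freqs):
        mean_risk = (1 - f) ** 2 + 2 * f * (1 - f) * o + f ** 2 * o ** 2
        grs_ps *= o ** g / mean_risk

    print(f"GRS-RAC  = {grs_rac}")
    print(f"GRS-wRAC = {grs_wrac:.3f}")
    print(f"GRS-PS   = {grs_ps:.3f}")
    ```

    The first two scores have no natural reference point, whereas a GRS-PS above 1.0 reads directly as "above average population risk", which is the interpretability advantage the abstract highlights.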

  2. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk to an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN), and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms bring into the risk analysis the knowledge of experts on the investigated system or environment.
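    A minimal sketch of the fuzzy-logic idea: linguistic terms become membership functions, expert rules are combined Mamdani-style, and a crisp risk value is recovered by weighted-average defuzzification. The inputs, membership functions, and rule consequents below are invented for illustration, not taken from the paper:

    ```python
    # Inputs are normalized to [0, 1]: field strength of the IEMI source and
    # coupling efficiency into the installation (both hypothetical quantities).

    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def susceptibility(field_strength, coupling):
        # Degrees of membership in the linguistic term "high" for each input.
        field_high = tri(field_strength, 0.3, 1.0, 1.7)
        coup_high = tri(coupling, 0.3, 1.0, 1.7)
        field_low, coup_low = 1.0 - field_high, 1.0 - coup_high

        # Mamdani-style expert rules (min for AND), each firing a crisp risk level.
        rules = [
            (min(field_high, coup_high), 0.9),  # high field AND high coupling -> high risk
            (min(field_high, coup_low), 0.5),
            (min(field_low, coup_high), 0.5),
            (min(field_low, coup_low), 0.1),    # low field AND low coupling -> low risk
        ]
        # Weighted-average defuzzification back to a crisp risk value.
        total = sum(w for w, _ in rules)
        return sum(w * r for w, r in rules) / total

    print(f"risk = {susceptibility(0.9, 0.8):.2f}")
    ```

    The rule table is where expert knowledge enters as linguistic statements; replacing the crisp consequents with output fuzzy sets and centroid defuzzification would give the full Mamdani scheme.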

  3. The SIMEX approach to measurement error correction in meta-analysis with baseline risk as covariate.

    PubMed

    Guolo, A

    2014-05-30

    This paper investigates the use of SIMEX, a simulation-based measurement error correction technique, for meta-analysis of studies involving the baseline risk of subjects in the control group as explanatory variable. The approach accounts for the measurement error affecting the information about either the outcome in the treatment group or the baseline risk available from each study, while requiring no assumption about the distribution of the true unobserved baseline risk. This robustness property, together with the feasibility of computation, makes SIMEX very attractive. The approach is suggested as an alternative to the usual likelihood analysis, which can provide misleading inferential results when the commonly assumed normal distribution for the baseline risk is violated. The performance of SIMEX is compared to the likelihood method and to the moment-based correction through an extensive simulation study and the analysis of two datasets from the medical literature.
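    The core SIMEX recipe (simulate extra measurement error at several levels, then extrapolate back to the no-error case) can be sketched for a regression slope attenuated by a noisy covariate. The simulated data, the lambda grid, and the quadratic extrapolant below are standard textbook choices for illustration, not the paper's meta-analysis setting:

    ```python
    import random

    random.seed(42)
    n, beta, sigma_u = 2000, 2.0, 0.8      # sample size, true slope, error SD

    x_true = [random.gauss(0, 1) for _ in range(n)]
    y = [beta * x + random.gauss(0, 0.5) for x in x_true]
    w = [x + random.gauss(0, sigma_u) for x in x_true]  # error-prone covariate

    def slope(xs, ys):
        """Ordinary least-squares slope of ys on xs."""
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        sxx = sum((a - mx) ** 2 for a in xs)
        return sxy / sxx

    # Simulation step: add extra error with variance lambda * sigma_u^2 and
    # average the re-estimated slope over B replicates per lambda.
    lambdas = [0.0, 0.5, 1.0]
    B = 50
    est = []
    for lam in lambdas:
        sims = []
        for _ in range(B):
            w_lam = [wi + random.gauss(0, sigma_u * lam ** 0.5) for wi in w]
            sims.append(slope(w_lam, y))
        est.append(sum(sims) / B)

    # Extrapolation step: fit the quadratic through the three (lambda, slope)
    # points and evaluate it at lambda = -1, i.e. zero measurement error.
    l0, l1, l2 = lambdas
    def quad_extrapolate(t):
        return (est[0] * (t - l1) * (t - l2) / ((l0 - l1) * (l0 - l2))
                + est[1] * (t - l0) * (t - l2) / ((l1 - l0) * (l1 - l2))
                + est[2] * (t - l0) * (t - l1) / ((l2 - l0) * (l2 - l1)))

    naive = est[0]
    simex = quad_extrapolate(-1.0)
    print(f"naive slope: {naive:.3f}, SIMEX slope: {simex:.3f} (true slope {beta})")
    ```

    The naive slope is attenuated toward zero by the measurement error; the extrapolated SIMEX estimate recovers most, though typically not all, of the true slope, and it does so without assuming any distribution for the unobserved true covariate.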

  4. A Bayesian adaptive blinded sample size adjustment method for risk differences.

    PubMed

    Hartley, Andrew Montgomery

    2015-01-01

    Adaptive sample size adjustment (SSA) for clinical trials consists of examining early subsets of on-trial data to adjust estimates of sample size requirements. Blinded SSA is often preferred over unblinded SSA because it obviates many logistical complications of the latter and generally introduces less bias. On the other hand, current blinded SSA methods for binary data offer little to no new information about the treatment effect, ignore uncertainties associated with the population treatment proportions, and/or depend on enhanced randomization schemes that risk partial unblinding. I propose an innovative blinded SSA method for use when the primary analysis is a non-inferiority or superiority test regarding a risk difference. The method incorporates evidence about the treatment effect via the likelihood function of a mixture distribution. I compare the new method with an established one and with the fixed sample size study design in terms of maximization of an expected utility function. The new method maximizes the expected utility better than the comparators do, under a range of assumptions. I illustrate the use of the proposed method with an example that incorporates a Bayesian hierarchical model. Lastly, I suggest topics for future study regarding the proposed methods.

  5. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. In recent years, BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed in one currency. This allows adverse and beneficial effects to be compared both qualitatively and quantitatively. A BRA should help policy-makers make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision, much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume.

  6. An evaluation of fracture analysis methods

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1985-01-01

    The results of an experimental and predictive round robin on the application of fracture analysis methods are presented. The objective of the round robin was to verify whether fracture analysis methods currently in use can or cannot predict failure loads on complex structural components containing cracks. Fracture results from tests on a number of compact specimens were used to make the predictions. The accuracy of the prediction methods was evaluated in terms of the variation in the ratio of predicted to experimental failure loads, and the prediction methods are ranked in order of minimum standard error. The range of applicability of the different methods was also considered in assessing their usefulness. For 7075-T651 aluminum alloy, the best methods were: the effective K sub R curve; the critical crack-tip opening displacement (CTOD) criterion using a finite element analysis; and the K sub R curve with the Dugdale model. For the 2024-T351 aluminum alloy, the best methods included: the two-parameter fracture criterion (TPFC); the CTOD parameter using finite element analysis; the K-curve with the Dugdale model; the deformation plasticity failure assessment diagram (DPFAD); and the effective K sub R curve with a limit load condition. For 304 stainless steel, the best methods were: the limit load analysis; the CTOD criterion using finite-element analysis; TPFC; and DPFAD. Some sample experimental results are given in an appendix.

  7. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, both frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the resulting credible risk classes are interpreted.
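    The limited-fluctuation side of this combination can be sketched directly. Under the classical standard (observed claim frequency within k of expected with probability p, Poisson claim counts), roughly 1082 expected claims give full credibility for k = 5% and p = 90%; smaller classes get the square-root partial-credibility factor. The claim counts and rates below are invented:

    ```python
    import math

    def full_credibility_claims(k=0.05, p=0.90):
        """Expected claim count needed for full credibility (Poisson frequency)."""
        # z-value for two-sided coverage p: Phi^{-1}((1+p)/2); 1.645 for p = 0.90.
        z = 1.645 if p == 0.90 else 1.960
        return (z / k) ** 2

    def credibility_factor(n_claims, n_full):
        """Limited-fluctuation (square-root) partial credibility, capped at 1."""
        return min(1.0, math.sqrt(n_claims / n_full))

    n_full = full_credibility_claims()            # the familiar ~1082 claims
    z = credibility_factor(400, n_full)           # a class with only 400 claims
    # Blend the class's observed frequency with the portfolio-wide frequency.
    blended = z * 0.12 + (1 - z) * 0.10
    print(f"full-credibility standard: {n_full:.0f} claims, "
          f"Z = {z:.3f}, blended frequency = {blended:.4f}")
    ```

    In the paper's setting the portfolio-level estimate would come from the GLM rather than a raw average, but the blending step has the same Z-weighted form.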

  8. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event, considering both the fuzziness and randomness of the failure criterion, design parameters and measured data. The credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
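    The Monte Carlo portion of the approach can be sketched with a toy sliding limit state. The parameter distributions and limit-state function below are illustrative stand-ins; the paper's credibility-theoretic handling of fuzzy variables is not reproduced here:

    ```python
    import random

    random.seed(7)
    N = 100_000
    failures = 0
    for _ in range(N):
        # Random draws for the uncertain parameters (hypothetical values).
        friction = random.gauss(0.70, 0.07)   # shear friction coefficient
        weight = random.gauss(50.0, 2.5)      # resultant vertical force (MN/m)
        thrust = random.gauss(30.0, 3.0)      # resultant horizontal load (MN/m)
        # Limit state: sliding failure when resistance falls below driving force.
        if friction * weight < thrust:
            failures += 1

    risk = failures / N
    print(f"estimated sliding failure probability: {risk:.4f}")
    ```

    In the credibility-theory version, fuzzy parameters would be sampled from credibility distributions instead of purely probabilistic ones, but the simulation loop and the failure-frequency estimate keep this same structure.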

  9. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs about these variables and their effects on decision-making, behaviour, and the aetiology of disordered gambling. Fifteen regular lay gamblers (non-problem/low-risk, moderate-risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data were compared to an expert 'map' of risk perception to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by a relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis and on deliberate, innate, contextual, and learned processing evaluations and biases.

  10. Sensitivity and uncertainty analysis of a regulatory risk model

    SciTech Connect

    Kumar, A.; Manocha, A.; Shenoy, T.

    1999-07-01

    Health Risk Assessments (HRAs) are increasingly being used in the environmental decision-making process, from problem identification through to final clean-up activities. A key issue concerning the results of these risk assessments is the uncertainty associated with them, which in past studies has stemmed from highly conservative estimates of risk assessment parameters. The primary purpose of this study was to investigate error propagation through a risk model. A hypothetical glass plant situated in the state of California was studied. Air emissions from this plant were modeled using the ISCST2 model, and the risk was calculated using the ACE2588 model. Downwash was also considered during the concentration calculations. A sensitivity analysis on the risk computations identified five parameters--mixing depth for human consumption, deposition velocity, weathering constant, interception factor for vine crops, and average leaf vegetable consumption--which had the greatest impact on the calculated risk. A Monte Carlo analysis using these five parameters resulted in a distribution with a smaller percentage deviation than the percentage standard deviation of the input parameters.
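    A sensitivity screen like the one described (finding which parameters most affect the calculated risk) can be sketched by perturbing inputs one at a time in a toy multiplicative risk model. The model form and values are invented; only the parameter names echo those the study found influential:

    ```python
    # Baseline values are arbitrary illustrative numbers, not ACE2588 inputs.
    base = {
        "mixing_depth": 2.0,
        "deposition_velocity": 0.01,
        "weathering_constant": 0.05,
        "interception_factor": 0.4,
        "leaf_veg_consumption": 0.15,
    }

    def risk_model(p):
        """Toy model: risk scales with deposition, interception and consumption,
        and is diluted by mixing depth and the weathering constant."""
        return (p["deposition_velocity"] * p["interception_factor"]
                * p["leaf_veg_consumption"]
                / (p["mixing_depth"] * p["weathering_constant"]))

    r0 = risk_model(base)
    sensitivity = {}
    for name in base:
        # One-at-a-time perturbation: +10% on a single parameter.
        perturbed = dict(base, **{name: base[name] * 1.10})
        sensitivity[name] = (risk_model(perturbed) - r0) / r0

    # Rank parameters by the magnitude of their effect on risk.
    for name, s in sorted(sensitivity.items(), key=lambda kv: -abs(kv[1])):
        print(f"{name:22s} {s:+.1%}")
    ```

    In a full study, the top-ranked parameters would then be given probability distributions and propagated jointly by Monte Carlo, as the abstract describes.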

  11. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  12. CERTS: A Comparative Evaluation Method for Risk Management Methodologies and Tools

    DTIC Science & Technology

    1990-03-01

    This thesis develops a comparative evaluation method for computer security risk management methodologies and tools. The subjective biases inherent to...current comparison practices are reduced by measuring unique characteristics of computer security risk management methodologies. Standardized...methodologies and tools to each other. As a demonstration of its effectiveness, our method is applied to four distinct risk management methodologies and

  13. A Tractable Method for Measuring Nanomaterial Risk Using Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Murphy, Finbarr; Sheehan, Barry; Mullins, Martin; Bouwmeester, Hans; Marvin, Hans J. P.; Bouzembrak, Yamine; Costa, Anna L.; Das, Rasel; Stone, Vicki; Tofail, Syed A. M.

    2016-11-01

    While control banding has been identified as a suitable framework for evaluating and determining the potential human health risks associated with exposure to nanomaterials (NMs), the approach currently lacks any implementation that enjoys widespread support. Large inconsistencies in characterisation data, toxicological measurements and exposure scenarios make it difficult to map and compare the risk associated with NMs based on physicochemical data, concentration and exposure route. Here we demonstrate the use of Bayesian networks as a reliable tool for NM risk estimation. This tool is tractable, accessible and scalable. Most importantly, it captures a broad span of data types, from complete, high-quality data sets through to data sets with missing data and/or values with a relatively high spread of probability distribution. The tool is able to learn iteratively in order to further refine forecasts as the quality of available data improves. We demonstrate how this risk measurement approach works on NMs with varying degrees of risk potential, namely carbon nanotubes, silver and titanium dioxide. The results afford even non-experts an accurate picture of the occupational risk probabilities associated with these NMs and, in doing so, demonstrate how NM risk can be distilled into a tractable, quantitative risk comparator.

  14. Dietary Patterns and Pancreatic Cancer Risk: A Meta-Analysis.

    PubMed

    Lu, Pei-Ying; Shu, Long; Shen, Shan-Shan; Chen, Xu-Jiao; Zhang, Xiao-Yan

    2017-01-05

    A number of studies have examined the associations between dietary patterns and pancreatic cancer risk, but the findings have been inconclusive. Herein, we conducted this meta-analysis to assess the associations between dietary patterns and the risk of pancreatic cancer. MEDLINE (provided by the National Library of Medicine) and EBSCO (Elton B. Stephens Company) databases were searched for relevant articles published up to May 2016 that identified common dietary patterns. Thirty-two studies met the inclusion criteria and were finally included in this meta-analysis. A reduced risk of pancreatic cancer was shown for the highest compared with the lowest categories of healthy patterns (odds ratio, OR = 0.86; 95% confidence interval, CI: 0.77-0.95; p = 0.004) and light-moderate drinking patterns (OR = 0.90; 95% CI: 0.83-0.98; p = 0.02). There was evidence of an increased risk for pancreatic cancer in the highest compared with the lowest categories of western-type pattern (OR = 1.24; 95% CI: 1.06-1.45; p = 0.008) and heavy drinking pattern (OR = 1.29; 95% CI: 1.10-1.48; p = 0.002). The results of this meta-analysis demonstrate that healthy and light-moderate drinking patterns may decrease the risk of pancreatic cancer, whereas western-type and heavy drinking patterns may increase the risk of pancreatic cancer. Additional prospective studies are needed to confirm these findings.
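    Pooled odds ratios like those reported here typically come from inverse-variance weighting on the log scale. Below is a fixed-effect sketch with three invented studies; a real meta-analysis of this kind would also assess heterogeneity and likely use a random-effects model:

    ```python
    import math

    # (OR, lower 95% CI, upper 95% CI) for three hypothetical studies.
    studies = [(0.80, 0.65, 0.98), (0.90, 0.75, 1.08), (0.85, 0.70, 1.03)]

    weights, weighted_logs = [], []
    for or_, lo_ci, hi_ci in studies:
        log_or = math.log(or_)
        # Recover the standard error from the width of the 95% CI on the log scale.
        se = (math.log(hi_ci) - math.log(lo_ci)) / (2 * 1.96)
        w = 1 / se ** 2                      # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_or)

    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = (1 / sum(weights)) ** 0.5
    pooled_or = math.exp(pooled_log)
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    print(f"pooled OR = {pooled_or:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
    ```

    Precise studies (narrow CIs) dominate the pooled estimate, which is why the pooled CI is tighter than any single study's.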

  15. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated from pilots' experience at the time they were incorporated into the standards documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published evidence for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. This study therefore attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology for assessing the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate OLS dimensions that provide an acceptable safety level for aircraft operations. This will help in estimating safe and efficient standard dimensions of the OLS and in assessing the risk that objects pose to aircraft operations around airports. In order to assess the existing standards and demonstrate the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  16. A quantitative analysis of fish consumption and stroke risk.

    PubMed

    Bouzan, Colleen; Cohen, Joshua T; Connor, William E; Kris-Etherton, Penny M; Gray, George M; König, Ariane; Lawrence, Robert S; Savitz, David A; Teutsch, Steven M

    2005-11-01

    Although a rich source of n-3 polyunsaturated fatty acids (PUFAs) that may confer multiple health benefits, some fish contain methyl mercury (MeHg), which may harm the developing fetus. U.S. government recommendations for women of childbearing age are to modify consumption of high-MeHg fish to reduce MeHg exposure, while recommendations encourage fish consumption among the general population because of the nutritional benefits. The Harvard Center for Risk Analysis convened an expert panel (see acknowledgements) to quantify the net impact of resulting hypothetical changes in fish consumption across the population. This paper estimates the impact of fish consumption on stroke risk. Other papers quantify coronary heart disease mortality risk and the impacts of both prenatal MeHg exposure and maternal intake of n-3 PUFAs on cognitive development. This analysis identified articles in a recent qualitative literature review that are appropriate for the development of a dose-response relationship between fish consumption and stroke risk. Studies had to satisfy quality criteria, quantify fish intake, and report the precision of the relative risk estimates. The analysis combined the relative risk results, weighting each proportionately to its precision. Six studies were identified as appropriate for inclusion in this analysis, including five prospective cohort studies and one case-control study (total of 24 exposure groups). Our analysis indicates that any fish consumption confers substantial relative risk reduction compared to no fish consumption (12% for the linear model), with the possibility that additional consumption confers incremental benefits (central estimate of 2.0% per serving per week).

  17. A review of the methods used to define glucocorticoid exposure and risk attribution when investigating the risk of fracture in a rheumatoid arthritis population

    PubMed Central

    Robinson, DE; Dennison, EM; Cooper, C; van Staa, TP; Dixon, WG

    2016-01-01

    Background Glucocorticoid therapy is used widely in patients with rheumatoid arthritis (RA) with good efficacy but concerns about safety including fractures. Estimates of fracture risk for any given patient are complicated by the dynamic pattern of glucocorticoid use, where patients vary in their dose, duration and timing of glucocorticoid use. Objective To investigate which methods are currently used to attribute fractures to glucocorticoid exposure and investigate whether such methods can consider individual treatment patterns. Results Thirty-eight studies used five common definitions of risk attribution to glucocorticoid exposure: “current use”, “ever use”, “daily dose”, “cumulative dose” and “time variant”. One study attempted to combine multiple definitions where “cumulative dose” was nested within “daily dose”, covering the effects of dose and duration but not timing. The majority of results demonstrated an equivocal or increased risk of fracture with increased exposure, although there was wide variation, with odds ratios, hazard ratios and relative risks ranging from 0.16 to 8.16. Within definitions there was also variability in the results with the smallest range for “time variant”, 1.07 to 2.8, and the largest for “cumulative dose”, ranging from risk estimates of 0.88 to 8.12. Conclusion Many studies have looked into the effect of glucocorticoids on fracture risk in patients with RA. Despite this, there is no clear consensus about the magnitude of risk. This is a consequence of the varied analysis models and their different assumptions. Moreover, no current analysis method allows consideration of dose, duration and timing of glucocorticoid therapy, preventing a clear understanding of fracture risk for patients and their individual treatment patterns. PMID:27268854

  18. A Micro-Method of Protein Analysis

    DTIC Science & Technology

    The determination of protein by means of Weichselbaum’s (1) biuret method is too inexact when dealing with small quantities of protein (less than 200...microgram/ml initial reactant), owing to the low sensitivity of the color reaction . Although we have used this method for protein analysis of...have searched for a more sensitive colorimetric method. Nielsen (3) recently reported on a method in which the Cu bound by protein in the biuret

  19. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB).

  20. Why Map Issues? On Controversy Analysis as a Digital Method.

    PubMed

    Marres, Noortje

    2015-09-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital "move beyond impartiality." I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter.

  2. A Bayesian approach to probabilistic sensitivity analysis in structured benefit-risk assessment.

    PubMed

    Waddingham, Ed; Mt-Isa, Shahrul; Nixon, Richard; Ashby, Deborah

    2016-01-01

    Quantitative decision models such as multiple criteria decision analysis (MCDA) can be used in benefit-risk assessment to formalize trade-offs between benefits and risks, providing transparency to the assessment process. There is however no well-established method for propagating uncertainty of treatment effects data through such models to provide a sense of the variability of the benefit-risk balance. Here, we present a Bayesian statistical method that directly models the outcomes observed in randomized placebo-controlled trials and uses this to infer indirect comparisons between competing active treatments. The resulting treatment effects estimates are suitable for use within the MCDA setting, and it is possible to derive the distribution of the overall benefit-risk balance through Markov Chain Monte Carlo simulation. The method is illustrated using a case study of natalizumab for relapsing-remitting multiple sclerosis.
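    The end-to-end idea (propagate uncertain treatment effects through an MCDA weighted sum to obtain a distribution of the overall benefit-risk balance) can be sketched with Monte Carlo draws. The criteria, weights, and effect distributions below are invented, and independent normal draws stand in for the paper's Bayesian posterior samples:

    ```python
    import random

    random.seed(1)
    # (weight, mean effect, SD) on a common 0-1 value scale; positive = benefit,
    # negative = risk. All values are hypothetical.
    criteria = {
        "relapse_reduction": (0.5, 0.70, 0.08),
        "serious_infection": (0.3, -0.40, 0.10),
        "infusion_reaction": (0.2, -0.15, 0.05),
    }

    # Each Monte Carlo iteration draws one plausible set of treatment effects
    # and scores the weighted-sum MCDA model.
    scores = []
    for _ in range(20_000):
        total = sum(w * random.gauss(mu, sd) for w, mu, sd in criteria.values())
        scores.append(total)

    scores.sort()
    mean = sum(scores) / len(scores)
    lo, hi = scores[int(0.025 * len(scores))], scores[int(0.975 * len(scores))]
    p_favourable = sum(s > 0 for s in scores) / len(scores)
    print(f"benefit-risk score: {mean:.3f} (95% interval {lo:.3f} to {hi:.3f}), "
          f"P(favourable) = {p_favourable:.3f}")
    ```

    Replacing the independent normals with correlated posterior draws from the trial model (e.g. MCMC output) is what turns this sketch into the Bayesian probabilistic sensitivity analysis the abstract describes.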

  3. 2007 Wholesale Power Rate Case Initial Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2005-11-01

    The Federal Columbia River Power System (FCRPS), operated on behalf of the ratepayers of the PNW by BPA and other Federal agencies, faces many uncertainties during the FY 2007-2009 rate period. Among these uncertainties, the largest revolve around hydro conditions, market prices and river operations for fish recovery. In order to provide a high probability of making its U.S. Treasury payments, BPA performs a Risk Analysis as part of its rate-making process. In this Risk Analysis, BPA identifies key risks, models their relationships, and then analyzes their impacts on net revenues (total revenues less expenses). BPA then uses the ToolKit Model to evaluate the Treasury Payment Probability (TPP) resulting from the rates, risks, and risk mitigation measures described here and in the Wholesale Power Rate Development Study (WPRDS). If the TPP falls short of BPA's standard, additional risk mitigation revenues, such as PNRR and CRAC revenues, are incorporated into the ToolKit modeling until the TPP standard is met. Increased wholesale market price volatility and six years of drought have significantly changed the profile of risk and uncertainty facing BPA and its stakeholders. These present new challenges for BPA in its effort to keep its power rates as low as possible while fully meeting its obligations to the U.S. Treasury. As a result, BPA faces a greater risk of not receiving the level of secondary revenues that has been credited to power rates before those funds are received. In addition to market price volatility, BPA also faces uncertainty around the financial impacts of operations for fish programs in FY 2006 and in the FY 2007-2009 rate period. A new Biological Opinion or possible court-ordered change to river operations in FY 2006 through FY 2009 may reduce BPA's net revenues included in the Initial Proposal. Finally, the FY 2007-2009 risk analysis includes new operational risks as well as a more comprehensive analysis of non-operating risks. 
Both the operational

  4. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system (SDSS) was developed with the aim of analysing the effect of risk reduction planning alternatives on reducing risk now and in the future, and of supporting decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations involved in planning risk reduction measures that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These could be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams, etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular: the various parts of the system are loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  5. Standardised survey method for identifying catchment risks to water quality.

    PubMed

    Baker, D L; Ferguson, C M; Chier, P; Warnecke, M; Watkinson, A

    2016-06-01

    This paper describes the development and application of a systematic methodology to identify and quantify risks in drinking water and recreational catchments. The methodology assesses microbial and chemical contaminants from both diffuse and point sources within a catchment using Escherichia coli, protozoan pathogens and chemicals (including fuel and pesticides) as index contaminants. Hazard source information is gathered by a defined sanitary survey process involving use of a software tool which groups hazards into six types: sewage infrastructure, on-site sewage systems, industrial, stormwater, agriculture and recreational sites. The survey estimates the likelihood of the site affecting catchment water quality, and the potential consequences, enabling the calculation of risk for individual sites. These risks are integrated to calculate a cumulative risk for each sub-catchment and the whole catchment. The cumulative risks process accounts for the proportion of potential input sources surveyed and for transfer of contaminants from upstream to downstream sub-catchments. The output risk matrices show the relative risk sources for each of the index contaminants, highlighting those with the greatest impact on water quality at a sub-catchment and catchment level. Verification of the sanitary survey assessments and prioritisation is achieved by comparison with water quality data and microbial source tracking.
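    The likelihood-consequence scoring and survey-coverage scaling described above can be sketched roughly as follows. The 1-5 scales, the multiplicative site score, and the coverage correction are all assumptions for illustration, not the published tool's actual rules:

```python
# Hypothetical sketch of a sanitary-survey risk score: likelihood and
# consequence are scored per hazard site, multiplied into a site risk,
# and site risks are aggregated into a sub-catchment cumulative risk,
# scaled up for the fraction of potential sources actually surveyed.

def site_risk(likelihood, consequence):
    """Risk score for one hazard site (1-5 scales are assumed)."""
    return likelihood * consequence

def subcatchment_risk(site_risks, fraction_surveyed):
    """Cumulative risk, corrected for unsurveyed potential sources."""
    if not site_risks:
        return 0.0
    return sum(site_risks) / fraction_surveyed

sites = [site_risk(4, 5), site_risk(2, 3), site_risk(1, 2)]  # 20, 6, 2
print(subcatchment_risk(sites, fraction_surveyed=0.8))  # 28 / 0.8 ≈ 35.0
```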

  6. From risk analysis to risk governance - Adapting to an ever more complex future.

    PubMed

    Pfeiffer, Dirk U

    2014-01-01

    Risk analysis is now widely accepted amongst veterinary authorities and other stakeholders around the world as a conceptual framework for integrating scientific evidence into animal health decision making. The resulting risk management for most diseases primarily involves linking epidemiological understanding with diagnostics and/or vaccines. Recent disease outbreaks such as Nipah virus, SARS, avian influenza H5N1, bluetongue serotype 8 and Schmallenberg virus have led to the realisation that we need to explicitly take into account the underlying complex interactions between environmental, epidemiological and social factors, which are often spatially and temporally heterogeneous as well as interconnected across affected regions and beyond. A particular challenge is to obtain an adequate understanding of the influence of human behaviour and to translate this into effective mechanisms leading to appropriate behaviour change where necessary. Both the One Health and ecohealth approaches reflect the need for such a holistic systems perspective; however, the current implementation of risk analysis frameworks for animal health and food safety is still dominated by a natural or biomedical perspective of science, as is the implementation of control and prevention policies. This article proposes to integrate the risk analysis approach with a risk governance framework that explicitly adds the socio-economic context to policy development and emphasizes the need for organisational change and stakeholder engagement.

  7. [Framework analysis method in qualitative research].

    PubMed

    Liao, Xing; Liu, Jian-ping; Robison, Nicola; Xie, Ya-ming

    2014-05-01

    In recent years a number of qualitative research methods have gained popularity within the health care arena. Despite this popularity, the different qualitative analysis methods pose many challenges to most researchers. The present paper responds to needs expressed in recent Chinese medicine research. It focuses on the concepts, nature and application of framework analysis, and especially on how to use it, so as to help newcomers to Chinese medicine research engage with the methodology.

  8. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  9. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    SciTech Connect

    Sanfilippo, Antonio P; Cowell, Andrew J; Gregory, Michelle L; Baddeley, Robert L; Paulson, Patrick R; Tratz, Stephen C; Hohimer, Ryan E

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
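    The support/refute weighting described in the claim can be caricatured as a signed weighted sum over pieces of evidence. The scoring rule and weights below are invented for illustration and are not the patented method:

```python
# Toy illustration of weighted evidence aggregation: each indicator
# either supports (+) or refutes (-) the hypothesis; each piece of
# evidence carries a weight, and the signed weighted sum gives a
# crude score of the hypothesis's plausibility.

def hypothesis_score(evidence):
    """evidence: list of (supports: bool, weight: float) pairs."""
    return sum(w if supports else -w for supports, w in evidence)

evidence = [(True, 0.8), (True, 0.5), (False, 0.3)]
print(hypothesis_score(evidence))  # 0.8 + 0.5 - 0.3 ≈ 1.0
```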

  10. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  11. The Risk Analysis of Reservoir Water Supply under High Turbidity- Case Study of the Shihmen Reservoir

    NASA Astrophysics Data System (ADS)

    Chang, Y.; Chang, L.; Ko, S.; Ho, C.; Chen, Y.

    2010-12-01

    Due to the unstable geological conditions of the Shihmen reservoir basin, the high turbidity of the water flowing from the basin into the reservoir during typhoons causes a rapid increase in the turbidity of reservoir water. Because high water turbidity reduces the capacity of water treatment plants, water shortages could occur more frequently during typhoons and flood seasons. Based on this scenario, this study used Monte Carlo analysis to evaluate water shortage in the Taoyuan area when the Shihmen reservoir water was under conditions of high turbidity. The risk analysis process consisted of four sub-models: rainfall synthesis, rainfall runoff, turbidity prediction, and management of water allocation under conditions of high turbidity. Two methods of turbidity prediction, the artificial neural network (ANN) method and the unit characteristic hydrograph method, were developed and compared. The unit characteristic hydrograph method was modified from the unit hydrograph method and represented the relationship of reservoir inflow or outflow to the turbidity of reservoir water via a unit response function. Comparison of the two methods indicated that the unit characteristic hydrograph method was more stable than the ANN method and rested on physical concepts that are easily understood. Risk analysis showed a 57% probability of water shortage during typhoons; the risk decreased to 34% when the tolerance of water shortage increased by 5%. Results of the case study demonstrated the reliability of the proposed risk assessment procedure and turbidity prediction method, which could be extended to other reservoirs with high turbidity problems to assess the risk of water shortage.
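    The chained sub-model Monte Carlo procedure can be caricatured in a few lines. Every distribution, coefficient, and threshold below is invented for illustration and bears no relation to the Shihmen calibration:

```python
import random

# Illustrative Monte Carlo sketch of a shortage-risk chain:
# synthetic rainfall -> runoff -> turbidity -> shortage decision.
# All numbers are made-up placeholders for the real sub-models.

def simulate_typhoon_event(rng):
    rainfall = rng.lognormvariate(5.0, 0.6)   # mm, synthetic storm
    runoff = 0.7 * rainfall                   # crude runoff ratio
    turbidity = 40.0 * runoff ** 0.8          # NTU, assumed relation
    # above an assumed treatability limit, plant capacity collapses
    treatable_fraction = 1.0 if turbidity < 3000.0 else 0.4
    demand, supply = 1.0, treatable_fraction
    return supply < demand                    # True => shortage

def shortage_risk(n=100_000, seed=42):
    rng = random.Random(seed)
    return sum(simulate_typhoon_event(rng) for _ in range(n)) / n

print(f"estimated shortage probability: {shortage_risk():.2f}")
```

The shortage probability is simply the fraction of simulated events in which treatable supply falls below demand; in the real study each arrow above is a calibrated sub-model rather than a one-line formula.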

  12. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  13. Psoriasis and Risk of Celiac Disease: A Systematic Review and Meta-analysis

    PubMed Central

    Ungprasert, Patompong; Wijarnpreecha, Karn; Kittanamongkolchai, Wonngarm

    2017-01-01

    Background and Objectives: The possible association between psoriasis and celiac disease (CD) has long been observed, but epidemiologic studies attempting to characterize this association have yielded inconclusive results. This meta-analysis was conducted with the aim of summarizing all available data. Methods: We conducted a systematic review and meta-analysis of observational studies that reported relative risk, hazard ratio, odds ratio (OR), or standardized incidence ratio with 95% confidence interval (CI) comparing the risk of CD in patients with psoriasis versus participants without psoriasis. Pooled risk ratio and 95% CI were calculated using the random-effects, generic inverse-variance method of DerSimonian and Laird. Results: Four retrospective cohort studies with 12,912 cases of psoriasis and 24,739 comparators were included in this meta-analysis. The pooled analysis demonstrated a significantly higher risk of CD among patients with psoriasis compared with participants without psoriasis, with a pooled OR of 3.09 (95% CI, 1.92–4.97). Limitations: Most primary studies reported unadjusted effect estimates, raising a concern over confounders. Conclusions: Our meta-analysis demonstrated an approximately 3-fold increased risk of CD among patients with psoriasis. PMID:28216724
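    The pooling step named in the abstract, DerSimonian-Laird random effects with generic inverse-variance weighting, is straightforward to sketch on the log-OR scale. The four ORs and CIs below are made-up inputs, not the primary studies' data:

```python
import math

# DerSimonian-Laird random-effects pooling of odds ratios:
# work on the log scale, back out SEs from the 95% CIs, estimate
# between-study variance tau^2 by the method of moments, then pool.

def dersimonian_laird(ors, ci_los, ci_his):
    y = [math.log(o) for o in ors]
    # se = (ln(hi) - ln(lo)) / (2 * 1.96) for a 95% CI
    se = [(math.log(h) - math.log(lo)) / (2 * 1.96)
          for lo, h in zip(ci_los, ci_his)]
    w = [1.0 / s ** 2 for s in se]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]     # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se_mu = math.sqrt(1.0 / sum(w_re))
    return math.exp(mu), (math.exp(mu - 1.96 * se_mu),
                          math.exp(mu + 1.96 * se_mu))

pooled, (lo, hi) = dersimonian_laird([2.5, 3.4, 2.9, 4.1],
                                     [1.5, 2.0, 1.6, 2.2],
                                     [4.2, 5.8, 5.3, 7.6])
print(f"pooled OR {pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```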

  14. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. 
Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software
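    The link between process performance and an FSO is commonly summarised by the ICMSF outcome inequality H0 − ΣR + ΣI ≤ FSO, with all terms in log10 cfu/g (H0 = initial hazard level, R = reductions such as cooking, I = increases such as growth or recontamination). A toy check, with illustrative numbers only:

```python
# ICMSF-style FSO check: the process outcome H0 - sum(R) + sum(I)
# (all in log10 cfu/g) must not exceed the Food Safety Objective.
# The numeric values below are invented for illustration.

def meets_fso(h0, reductions, increases, fso):
    """Return (meets, outcome) for the ICMSF outcome inequality."""
    outcome = h0 - sum(reductions) + sum(increases)
    return outcome <= fso, outcome

ok, level = meets_fso(h0=3.0, reductions=[6.0], increases=[1.0], fso=-2.0)
print(ok, level)  # a 6-log kill step: 3 - 6 + 1 = -2.0 -> just meets FSO
```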

  15. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that the ALOP is rather a metric of the tolerable public health burden (it addresses the total ‘failure’ that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve an FSO, tools quantifying the effect of processes and intrinsic properties of foods on survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user

  16. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper builds on previous studies applying propensity score methods to multiple treatment variables in order to examine causal moderator effects. The propensity score methods are demonstrated in a case study examining the causal moderator effect where the moderators are categorical and continuous variables. Moderation analysis is an…

  17. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
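    The "good vs. bad returns" idea translates directly into the downside deviation below an investor's target return, and the Sortino ratio built on it. A minimal sketch with made-up monthly returns (not the Credit Suisse/Tremont data):

```python
import math

# Downside deviation: root-mean-square of shortfalls below the target
# (returns above the target contribute zero). The Sortino ratio divides
# mean excess return by this downside deviation.

def downside_deviation(returns, target=0.0):
    shortfalls = [min(0.0, r - target) for r in returns]
    return math.sqrt(sum(s * s for s in shortfalls) / len(returns))

def sortino_ratio(returns, target=0.0):
    dd = downside_deviation(returns, target)
    mean_excess = sum(r - target for r in returns) / len(returns)
    return mean_excess / dd if dd > 0 else float("inf")

rets = [0.02, -0.01, 0.03, -0.04, 0.01, 0.02]   # invented monthly returns
print(downside_deviation(rets), sortino_ratio(rets))
```

Unlike the standard deviation in the Sharpe ratio, the denominator here penalises only returns below the goal, which is what makes the measure sensitive to the asymmetry of Hedge Fund return distributions.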

  18. Method of Breast Reconstruction Determines Venous Thromboembolism Risk Better Than Current Prediction Models

    PubMed Central

    Patel, Niyant V.; Wagner, Douglas S.

    2015-01-01

    Background: Venous thromboembolism (VTE) risk models including the Davison risk score and the 2005 Caprini risk assessment model have been validated in plastic surgery patients. However, their utility and predictive value in breast reconstruction have not been well described. We sought to determine the utility of current VTE risk models in this population and the VTE rate observed in various methods of breast reconstruction. Methods: A retrospective review of breast reconstructions by a single surgeon was performed. One hundred consecutive transverse rectus abdominis myocutaneous (TRAM) patients, 100 consecutive implant patients, and 100 consecutive latissimus dorsi patients were identified over a 10-year period. Patient demographics and presence of symptomatic VTE were collected. 2005 Caprini risk scores and Davison risk scores were calculated for each patient. Results: The TRAM reconstruction group was found to have a higher VTE rate (6%) than the implant (0%) and latissimus (0%) reconstruction groups (P < 0.01). Mean Davison risk scores and 2005 Caprini scores were similar across all reconstruction groups (P > 0.1). The vast majority of patients were stratified as high risk (87.3%) by the VTE risk models. However, only TRAM reconstruction patients demonstrated significant VTE risk. Conclusions: TRAM reconstruction appears to have a significantly higher risk of VTE than both implant and latissimus reconstruction. Current risk models do not effectively stratify breast reconstruction patients at risk for VTE. The method of breast reconstruction appears to have a significant role in patients’ VTE risk. PMID:26090287

  19. Risk Factors for Childhood Stunting in 137 Developing Countries: A Comparative Risk Assessment Analysis at Global, Regional, and Country Levels

    PubMed Central

    Danaei, Goodarz; Andrews, Kathryn G.; Sudfeld, Christopher R.; Fink, Günther; McCoy, Dana Charles; Sania, Ayesha; Smith Fawzi, Mary C.; Fawzi, Wafaie W.

    2016-01-01

    Background Stunting affects one-third of children under 5 y old in developing countries, and 14% of childhood deaths are attributable to it. A large number of risk factors for stunting have been identified in epidemiological studies. However, the relative contribution of these risk factors to stunting has not been examined across countries. We estimated the number of stunting cases among children aged 24–35 mo (i.e., at the end of the 1,000 days’ period of vulnerability) that are attributable to 18 risk factors in 137 developing countries. Methods and Findings We classified risk factors into five clusters: maternal nutrition and infection, teenage motherhood and short birth intervals, fetal growth restriction (FGR) and preterm birth, child nutrition and infection, and environmental factors. We combined published estimates and individual-level data from population-based surveys to derive risk factor prevalence in each country in 2010 and identified the most recent meta-analysis or conducted de novo reviews to derive effect sizes. We estimated the prevalence of stunting and the number of stunting cases that were attributable to each risk factor and cluster of risk factors by country and region. The leading risk worldwide was FGR, defined as being term and small for gestational age, and 10.8 million cases (95% CI 9.1 million–12.6 million) of stunting (out of 44.1 million) were attributable to it, followed by unimproved sanitation, with 7.2 million (95% CI 6.3 million–8.2 million), and diarrhea with 5.8 million (95% CI 2.4 million–9.2 million). FGR and preterm birth was the leading risk factor cluster in all regions. Environmental risks had the second largest estimated impact on stunting globally and in the South Asia, sub-Saharan Africa, and East Asia and Pacific regions, whereas child nutrition and infection was the second leading cluster of risk factors in other regions. Although extensive, our analysis is limited to risk factors for which effect sizes and
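    The attributable-case arithmetic behind such comparative risk assessments typically rests on the population attributable fraction (Levin's formula). A sketch with invented prevalence and relative-risk values; the 44.1 million total stunting cases is the only figure taken from the abstract:

```python
# Population attributable fraction (PAF) for a binary risk factor and
# the number of cases attributable to it. Prevalence and relative risk
# below are invented placeholders, not the study's estimates.

def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (p(RR - 1) + 1)."""
    x = prevalence * (relative_risk - 1.0)
    return x / (x + 1.0)

def attributable_cases(total_cases, prevalence, relative_risk):
    return total_cases * population_attributable_fraction(prevalence,
                                                          relative_risk)

paf = population_attributable_fraction(prevalence=0.25, relative_risk=2.0)
print(round(paf, 3), round(attributable_cases(44_100_000, 0.25, 2.0)))
```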

  20. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  1. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.
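    Mechanically, the benchmark dose (BMD) mentioned above is the dose at which a fitted model's extra risk over background reaches a chosen benchmark response (e.g. 10%). A sketch assuming an exponential (one-hit) dose-response with invented parameters, solved by bisection rather than fitted to data:

```python
import math

# Benchmark-dose arithmetic: given a dose-response model (parameters
# assumed here, not fitted), find the dose whose extra risk over
# background equals the benchmark response (BMR).

def response(dose, background=0.05, slope=0.8):
    """Assumed exponential (one-hit) model: P(effect) at a given dose."""
    return background + (1.0 - background) * (1.0 - math.exp(-slope * dose))

def extra_risk(dose):
    p0 = response(0.0)
    return (response(dose) - p0) / (1.0 - p0)

def benchmark_dose(bmr=0.10, lo=0.0, hi=10.0, tol=1e-9):
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if extra_risk(mid) < bmr:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

print(round(benchmark_dose(), 4))  # ≈ -ln(0.9)/0.8 ≈ 0.1317
```

In regulatory practice the lower confidence limit on the BMD (the BMDL), not the point estimate, replaces the NOAEL; that step needs the fitted model's uncertainty and is omitted here.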

  2. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
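    The two-step idea, a classical least squares (CLS) calibration followed by augmenting the model with the spectral shape of an uncalibrated effect, can be sketched with synthetic spectra. This is an illustration of the general technique, not the patented algorithm's full hybrid CLS/inverse pipeline:

```python
import numpy as np

# CLS calibration on synthetic two-component spectra, then prediction
# with an extra "spectral shape" column (e.g. baseline drift) that was
# absent from the calibration set. All spectra here are synthetic.

rng = np.random.default_rng(0)
n_chan = 50
k1 = np.exp(-((np.arange(n_chan) - 15) ** 2) / 20.0)   # pure spectrum 1
k2 = np.exp(-((np.arange(n_chan) - 35) ** 2) / 30.0)   # pure spectrum 2
C = rng.uniform(0.1, 1.0, size=(20, 2))                # calibration concs
A = C @ np.vstack([k1, k2])                            # calibration spectra

# CLS calibration step: estimate pure-component spectra K from (C, A)
K_hat, *_ = np.linalg.lstsq(C, A, rcond=None)

# A new sample contains an uncalibrated interferent; adding its spectral
# shape as an extra row keeps it from biasing the component estimates.
k_drift = np.linspace(0.0, 1.0, n_chan)                # assumed drift shape
a_new = 0.4 * k1 + 0.7 * k2 + 0.3 * k_drift
K_aug = np.vstack([K_hat, k_drift])
c_est, *_ = np.linalg.lstsq(K_aug.T, a_new, rcond=None)
print(np.round(c_est, 3))  # ≈ [0.4, 0.7, 0.3]
```

Because the data are noiseless, the augmented fit recovers the true coefficients; with real spectra the augmentation removes the bias that the unmodeled shape would otherwise introduce.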

  3. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, flood risk is expected to continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, a need is emerging to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in improving the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels across a spatio-temporal domain and can provide better and more complete

  4. Rocky Flats Plant Live-Fire Range Risk Analysis Report

    SciTech Connect

    Nicolosi, S.L.; Rodriguez, M.A.

    1994-04-01

    The objective of the Live-Fire Range Risk Analysis Report (RAR) is to provide an authorization basis for operation as required by DOE 5480.16. The existing Live-Fire Range does not have a safety analysis-related authorization basis. EG&G Rocky Flats, Inc. has worked with DOE and its representatives to develop a format and content description for development of an RAR for the Live-Fire Range. Development of the RAR is closely aligned with development of the design for a baffle system to control risks from errant projectiles. DOE 5480.16 requires either an RAR or a safety analysis report (SAR) for live-fire ranges. An RAR rather than a SAR was selected in order to gain flexibility to more closely address the safety analysis and conduct of operation needs for a live-fire range in a cost-effective manner.

  5. Mudflow Hazards in the Georgian Caucasus - Using Participatory Methods to Investigate Disaster Risk

    NASA Astrophysics Data System (ADS)

    Spanu, Valentina; McCall, Michael; Gaprindashvili, George

    2014-05-01

    /Management (DRR and DRM). Participatory surveys (and participatory monitoring) elicit local people's knowledge about the specifics of the hazard concerning frequency, timing, warning signals, rates of flow, spatial extent, etc. Significantly, only this local knowledge from informants can reveal essential information about the different vulnerabilities of people and places, and about any coping or adjustment mechanisms that local people have. The participatory methods employed in Mleta included historical discussions with key informants, village social transects, participatory mapping with children, semi-structured interviews with inhabitants, and VCA (Vulnerability & Capacity Analysis). The geomorphological map, produced on the basis of the local geology, was created with ArcGIS. This allowed the assessment of the areas at risk and the production of the corresponding maps. We adapted and tested the software programme CyberTracker, a digital method of field data collection, as a survey tool. Google Earth, OpenStreetMap, Virtual Earth and Ilwis were used for data processing.

  6. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitatively characterises their vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near-real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications at different case study sites are presented to illustrate the platform's potential.

  7. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at the watershed scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds and stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the “source-pathway-target” in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are adapted for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  9. An Empirical Bayes Method for Multivariate Meta-analysis with an Application in Clinical Trials

    PubMed Central

    Chen, Yong; Luo, Sheng; Chu, Haitao; Su, Xiao; Nie, Lei

    2013-01-01

    We propose an empirical Bayes method for evaluating overall and study-specific treatment effects in multivariate meta-analysis with binary outcome. Instead of modeling transformed proportions or risks via commonly used multivariate general or generalized linear models, we directly model the risks without any transformation. The exact posterior distribution of the study-specific relative risk is derived. The hyperparameters in the posterior distribution can be inferred through an empirical Bayes procedure. As our method does not rely on the choice of transformation, it provides a flexible alternative to the existing methods and in addition, the correlation parameter can be intuitively interpreted as the correlation coefficient between risks. PMID:25089070
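
    The empirical-Bayes shrinkage idea can be sketched in a simplified univariate form. This is a beta-binomial analogue with method-of-moments hyperparameters, not the paper's bivariate model, and the study counts are hypothetical.

```python
events = [12, 30, 7, 45, 20]       # events per study (hypothetical)
totals = [100, 250, 60, 400, 150]  # patients per study (hypothetical)

raw = [e / n for e, n in zip(events, totals)]  # raw study-specific risks

# Method-of-moments estimates of the Beta(alpha, beta) prior hyperparameters.
m = sum(raw) / len(raw)
v = sum((r - m) ** 2 for r in raw) / (len(raw) - 1)
common = m * (1 - m) / v - 1
alpha, beta = m * common, (1 - m) * common

# Posterior mean for each study shrinks the raw risk toward the prior mean.
shrunk = [(alpha + e) / (alpha + beta + n) for e, n in zip(events, totals)]
for r, s in zip(raw, shrunk):
    print(f"raw {r:.3f} -> shrunk {s:.3f}")
```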

  10. The Use of Object-Oriented Analysis Methods in Surety Analysis

    SciTech Connect

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  11. Allergy reduces the risk of meningioma: a meta-analysis

    PubMed Central

    Wang, Peng-fei; Ji, Wen-Jun; Zhang, Xiao-hui; Li, Shou-wei; Yan, Chang-Xiang

    2017-01-01

    Meningiomas are the most common brain tumours; however, little is known regarding their aetiology. The data concerning atopic disease and the risk of developing meningioma are inconsistent. Thus, we conducted a meta-analysis to investigate the association between allergic conditions and the risk of developing meningioma. A systematic literature search was conducted using PubMed and Web of Science from Jan 1979 to Feb 2016. Two investigators independently selected the relevant articles according to the inclusion criteria. Eight case-control studies and 2 cohort studies were included in the final analysis, comprising 5,679 meningioma cases and 55,621 control subjects. Compared with no history of allergy, the pooled odds ratio (OR) for allergic conditions was 0.81 (0.70–0.94) for meningioma in a random-effects meta-analysis. Inverse correlations with meningioma occurrence were also identified for asthma and eczema, with pooled ORs of 0.78 (0.70–0.86) and 0.78 (0.69–0.87), respectively. A reduced risk of meningioma occurrence was also identified for hay fever; however, the association was weak (0.88, 95% CI = 0.78–0.99). The source of the observed heterogeneity could be the various confounding variables in the individual studies. Overall, the current meta-analysis indicated that allergy reduces the risk of developing meningioma. Large cohort studies are required to investigate this relationship. PMID:28071746
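
    A random-effects pooled OR of the kind reported above is conventionally computed with the DerSimonian-Laird estimator, sketched below under hypothetical 2x2 study counts (these are not the studies in this meta-analysis).

```python
import math

studies = [  # (cases_exposed, cases_unexposed, controls_exposed, controls_unexposed)
    (40, 60, 80, 70),
    (25, 75, 55, 65),
    (30, 50, 45, 40),
]

log_ors, variances = [], []
for a, b, c, d in studies:
    log_ors.append(math.log((a * d) / (b * c)))
    variances.append(1 / a + 1 / b + 1 / c + 1 / d)  # Woolf's variance of log OR

# Fixed-effect weights and Cochran's Q heterogeneity statistic.
w = [1 / v for v in variances]
fe = sum(wi * y for wi, y in zip(w, log_ors)) / sum(w)
q = sum(wi * (y - fe) ** 2 for wi, y in zip(w, log_ors))

# DerSimonian-Laird between-study variance tau^2 (truncated at zero).
c_term = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - (len(studies) - 1)) / c_term)

# Random-effects pooled estimate and 95% confidence interval.
w_re = [1 / (v + tau2) for v in variances]
pooled = sum(wi * y for wi, y in zip(w_re, log_ors)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
or_, lo, hi = (math.exp(x) for x in (pooled, pooled - 1.96 * se, pooled + 1.96 * se))
print(f"pooled OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```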

  12. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade ('domino') effects. The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper through a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
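
    Crossing a fragility curve with a hazard curve can be sketched as follows: the annual failure probability is approximated as the sum over intensity bins of P[intensity in bin] x P[failure | intensity]. The lognormal fragility parameters and hazard ordinates below are illustrative, not the paper's values.

```python
import math

def fragility(pga, median=0.6, beta=0.5):
    """Lognormal fragility: P(failure | PGA), illustrative tank parameters."""
    return 0.5 * (1 + math.erf(math.log(pga / median) / (beta * math.sqrt(2))))

# Hazard curve: annual rate of exceeding each PGA level (g), hypothetical site.
pga_levels = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
exceed_rates = [2e-2, 8e-3, 3e-3, 1.2e-3, 5e-4, 2e-4]

# Occurrence rate within each bin = difference of successive exceedance rates.
annual_p_fail = 0.0
for i, pga in enumerate(pga_levels):
    nxt = exceed_rates[i + 1] if i + 1 < len(pga_levels) else 0.0
    annual_p_fail += (exceed_rates[i] - nxt) * fragility(pga)

print(f"annual probability of seismic failure ~ {annual_p_fail:.2e}")
```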

  13. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  14. Pitfalls in Pathways: Some Perspectives on Competing Risks Event History Analysis in Education Research

    ERIC Educational Resources Information Center

    Scott, Marc A.; Kennedy, Benjamin B.

    2005-01-01

    A set of discrete-time methods for competing risks event history analysis is presented. The approach used is accessible to the practitioner and the article describes the strengths, weaknesses, and interpretation of both exploratory and model-based tools. These techniques are applied to the impact of "nontraditional" enrollment features (working,…

  15. School Health Promotion Policies and Adolescent Risk Behaviors in Israel: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Tesler, Riki; Harel-Fisch, Yossi; Baron-Epel, Orna

    2016-01-01

    Background: Health promotion policies targeting risk-taking behaviors are being implemented across schools in Israel. This study identified the most effective components of these policies influencing cigarette smoking and alcohol consumption among adolescents. Methods: Logistic hierarchical linear model (HLM) analysis of data for 5279 students in…

  16. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  17. Risk Factors for the Perpetration of Child Sexual Abuse: A Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Whitaker, Daniel J.; Le, Brenda; Hanson, R. Karl; Baker, Charlene K.; McMahon, Pam M.; Ryan, Gail; Klein, Alisa; Rice, Deborah Donovan

    2008-01-01

    Objectives: Since the late 1980s, there has been a strong theoretical focus on psychological and social influences of perpetration of child sexual abuse. This paper presents the results of a review and meta-analysis of studies examining risk factors for perpetration of child sexual abuse published since 1990. Method: Eighty-nine studies published…

  18. Scientific commentary: Strategic analysis of environmental policy risks--heat maps, risk futures and the character of environmental harm.

    PubMed

    Prpich, G; Dagonneau, J; Rocks, S A; Lickorish, F; Pollard, S J T

    2013-10-01

    We summarise our recent efforts on the policy-level risk appraisal of environmental risks. These have necessitated working closely with policy teams and a requirement to maintain crisp and accessible messages for policy audiences. Our comparative analysis uses heat maps, supplemented with risk narratives, and employs the multidimensional character of risks to inform debates on the management of current residual risk and future threats. The policy research and ensuing analysis raises core issues about how comparative risk analyses are used by policy audiences, their validation and future developments that are discussed in the commentary below.

  19. Risk Costs for New Dams: Economic Analysis and Effects of Monitoring

    NASA Astrophysics Data System (ADS)

    Paté-Cornell, M. Elisabeth; Tagaras, George

    1986-01-01

    This paper presents new developments and illustrations of the introduction of risk and costs in cost-benefit analysis for new dams. The emphasis is on a method of evaluation of the risk costs based on the structure of the local economy. Costs to agricultural property as well as residential, commercial, industrial, and public property are studied in detail. Of particular interest is the case of sequential dam failure and the evaluation of the risk costs attributable to a new dam upstream from an existing one. Three real cases are presented as illustrations of the method: the Auburn Dam, the Dickey-Lincoln School Project, and the Teton Dam, which failed in 1976. This last case provides a calibration tool for the estimation of loss ratios. For these three projects, the risk-modified benefit-cost ratios are computed to assess the effect of the risk on the economic performance of the project. The role of a warning system provided by systematic monitoring of the dam is analyzed: by reducing the risk costs, the warning system attenuates their effect on the benefit-cost ratio. The precursors, however, can be missed or misinterpreted: monitoring does not guarantee that the risks to human life can be reduced to zero. This study shows, in particular, that it is critical to consider the risk costs in the decision to build a new dam when the flood area is large and densely populated.
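
    The effect of risk costs on a benefit-cost ratio can be sketched as below: the expected annual risk cost (annual failure probability times conditional losses) is added to the cost side, and a warning system that reduces losses attenuates the penalty. All figures are hypothetical, not those of the projects discussed above.

```python
annual_benefits = 12.0e6     # $/yr (hypothetical)
annual_costs = 8.0e6         # $/yr, capital amortisation + O&M (hypothetical)
p_failure = 1e-4             # annual probability of dam failure (hypothetical)
loss_given_failure = 2.0e9   # $ property losses in the flood area (hypothetical)

risk_cost = p_failure * loss_given_failure   # expected annual risk cost

bc_plain = annual_benefits / annual_costs
bc_risk = annual_benefits / (annual_costs + risk_cost)

# A warning system that, say, halves the conditional losses attenuates
# the effect of the risk costs on the benefit-cost ratio.
bc_warned = annual_benefits / (annual_costs + 0.5 * risk_cost)
print(f"B/C plain {bc_plain:.2f}, risk-modified {bc_risk:.2f}, "
      f"with warning {bc_warned:.2f}")
```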

  20. Safety risk analysis of an innovative environmental technology.

    PubMed

    Parnell, G S; Frimpon, M; Barnes, J; Kloeber, J M; Deckro, R E; Jackson, J A

    2001-02-01

    The authors describe a decision and risk analysis performed for the cleanup of a large Department of Energy mixed-waste subsurface disposal area governed by the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). In a previous study, the authors worked with the site decision makers, state regulators, and U.S. Environmental Protection Agency regional regulators to develop a CERCLA-based multiobjective decision analysis value model and used the model to perform a screening analysis of 28 remedial alternatives. The analysis identified an innovative technology, in situ vitrification, with high effectiveness versus cost. Since this technology had not been used on this scale before, the major uncertainties were contaminant migration and pressure buildup; pressure buildup was a concern due to the potential risks to worker safety. With the help of environmental technology experts, changes to the remedial alternative were identified to mitigate the concerns about contaminant migration and pressure buildup. The analysis results showed that the probability of an event posing a risk to worker safety had been significantly reduced. Based on these results, site decision makers have refocused their test program to examine in situ vitrification and have continued to use the CERCLA-based decision analysis methodology to analyze remedial alternatives.

  1. Rationale and methods of the Cardiometabolic Valencian Study (Escarval-Risk) for validation of risk scales in Mediterranean patients with hypertension, diabetes or dyslipidemia

    PubMed Central

    2010-01-01

    Background The Escarval-Risk study aims to validate cardiovascular risk scales in patients with hypertension, diabetes or dyslipidemia living in the Valencia Community, a European Mediterranean region, based on data from an electronic health recording system, comparing predicted events with those observed during a 5-year follow-up. Methods/Design A prospective 5-year follow-up cohort study has been designed, including 25,000 patients with hypertension, diabetes and/or dyslipidemia attended in usual clinical practice. All information is registered in a unique electronic health recording system (ABUCASIS), the usual way to register clinical practice in the Valencian Health System (primary and secondary care). The system covers about 95% of the population (nearly 5 million people) and is linked with mortality, hospital discharge, prescription, and insurance databases, in which each individual has a unique identification number. Diagnoses in clinical practice are always registered based on ICD-9. Occurrence of cardiovascular (CV) disease is the main outcome of interest. Survival analysis methods will be applied to estimate the cumulative incidence of CV events over time. Discussion The Escarval-Risk study will provide information to validate different cardiovascular risk scales in patients with hypertension, diabetes or dyslipidemia from a low-risk Mediterranean region, the Valencia Community. PMID:21092179
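
    Estimating cumulative incidence from follow-up data can be sketched with a Kaplan-Meier-type estimator, as below. The study's actual methods may differ (e.g. competing-risks estimators), and the follow-up data here are hypothetical: times in years, event=1 is a CV event, event=0 is censoring.

```python
follow_up = [(1.2, 1), (2.5, 0), (3.0, 1), (4.1, 0), (4.8, 1), (5.0, 0)]

def cumulative_incidence(data, horizon):
    """1 - Kaplan-Meier survival at `horizon` for right-censored event times."""
    at_risk = len(data)
    surv = 1.0
    for t, event in sorted(data):
        if t > horizon:
            break
        if event:                      # an event reduces the survival curve
            surv *= 1 - 1 / at_risk
        at_risk -= 1                   # events and censorings both leave the risk set
    return 1 - surv

print(f"5-year cumulative CV incidence ~ {cumulative_incidence(follow_up, 5.0):.3f}")
```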

  2. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  3. Transfer of sampling methods for studies on most-at-risk populations (MARPs) in Brazil.

    PubMed

    Barbosa Júnior, Aristides; Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Kendall, Carl; McFarland, Willi

    2011-01-01

    The objective of this paper was to describe the process of transferring two methods for sampling most-at-risk populations: respondent-driven sampling (RDS) and time-space sampling (TSS). The article describes steps in the process, the methods used in the 10 pilot studies, and lessons learned. The process was conducted in six steps, from a state-of-the-art seminar to a workshop on writing articles with the results of the pilot studies. The principal investigators reported difficulties in the fieldwork and data analysis, independently of the pilot sampling method. One of the most important results of the transfer process is that Brazil now has more than 100 researchers able to sample MARPs using RDS or TSS. The process also enabled the construction of baselines for MARPS, thus providing a broader understanding of the dynamics of HIV infection in the country and the use of evidence to plan the national response to the epidemic in these groups.

  4. The Short- and Long-Term Risk of Stroke after Herpes Zoster: A Meta-Analysis

    PubMed Central

    Liu, Xuechun; Guan, Yeming; Hou, Liang; Huang, Haili; Liu, Hongjuan; Li, Chuanwen; Zhu, Yingying; Tao, Xingyong; Wang, Qingsong

    2016-01-01

    Background Accumulating evidence indicates that stroke risk may be increased following herpes zoster. The aim of this study is to perform a meta-analysis of the current literature to systematically analyze and quantitatively estimate the short- and long-term effects of herpes zoster on the risk of stroke. Methods The Embase, PubMed and Cochrane Library databases were searched for relevant studies up to March 2016. Studies were selected for analysis based on defined inclusion and exclusion criteria. Relative risks with 95% confidence intervals (CIs) were extracted to assess the association between herpes zoster and stroke. Results A total of 8 articles were included in our analysis. The present meta-analysis showed that the risks of stroke after herpes zoster were 2.36 (95% CI: 2.17–2.56) for the first 2 weeks, 1.56 (95% CI: 1.46–1.66) for the first month, 1.17 (95% CI: 1.13–1.22) for the first year, and 1.09 (95% CI: 1.02–1.16) beyond 1 year. Conclusion The results of our study demonstrated that herpes zoster was associated with a higher risk of stroke, but the risk decreased with time after herpes zoster. PMID:27768762
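
    The standard error of a log relative risk can be recovered from a reported 95% CI via se = (ln(upper) - ln(lower)) / (2 x 1.96), a routine step when reusing such estimates in further meta-analysis. The sketch below applies this to the time-stratified estimates reported above.

```python
import math

reported = {            # window: (RR, lower, upper), from the abstract above
    "first 2 weeks": (2.36, 2.17, 2.56),
    "first month":   (1.56, 1.46, 1.66),
    "first year":    (1.17, 1.13, 1.22),
    ">1 year":       (1.09, 1.02, 1.16),
}

results = {}
for window, (rr, lo, hi) in reported.items():
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of the log RR
    z = math.log(rr) / se                            # approximate z-statistic
    results[window] = (se, z)
    print(f"{window}: log-RR se {se:.4f}, z ~ {z:.1f}")
```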

  5. Risk of Chromosomal Abnormalities in Early Spontaneous Abortion after Assisted Reproductive Technology: A Meta-Analysis

    PubMed Central

    Qin, Jun-Zhen; Pang, Li-Hong; Li, Min-Qing; Xu, Jing; Zhou, Xing

    2013-01-01

    Background Studies on the risk of chromosomal abnormalities in early spontaneous abortion after assisted reproductive technology (ART) are relatively controversial and insufficient. Thus, to obtain a more precise evaluation of the risk of embryonic chromosomal abnormalities in first-trimester miscarriage after ART, we performed a meta-analysis of all available case–control studies relating to the cytogenetic analysis of chromosomal abnormalities in first-trimester miscarriage after ART. Methods A literature search was performed in the electronic databases MEDLINE, EMBASE, and the Cochrane Central Register of Controlled Trials (CENTRAL) based on the established strategy. Meta-regression, subgroup analysis, and Galbraith plots were conducted to explore the sources of heterogeneity. Results A total of 15 studies with 1,896 cases and 1,186 controls relevant to the risk of chromosomal abnormalities in first-trimester miscarriage after ART, and 8 studies with 601 cases and 602 controls evaluating the frequency of chromosome anomaly for maternal age ≥35 versus <35, were eligible for the meta-analysis. No statistical difference was found in the risk of chromosomally abnormal miscarriage compared to natural conception or among the different types of ART utilized, whereas the risk of fetal aneuploidy significantly increased with maternal age ≥35 (OR 2.88, 95% CI: 1.74–4.77). Conclusions ART treatment does not present an increased risk for chromosomal abnormalities occurring in a first-trimester miscarriage, but the incidence of fetal aneuploidy could increase significantly with advancing maternal age. PMID:24130752

  6. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy with a predetermined reference measurement so as to obtain a density for said ceramic.

  7. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  8. Efficacy of ACL injury risk screening methods in identifying high-risk landing patterns during a sport-specific task.

    PubMed

    Fox, A S; Bonacci, J; McLean, S G; Saunders, N

    2016-06-12

    Screening methods sensitive to movement strategies that increase anterior cruciate ligament (ACL) loads are likely to be effective in identifying athletes at risk of ACL injury. Current ACL injury risk screening methods are yet to be evaluated for their ability to identify athletes who exhibit high-risk lower limb mechanics during sport-specific maneuvers associated with ACL injury occurrences. The purpose of this study was to examine the efficacy of two ACL injury risk screening methods in identifying high-risk lower limb mechanics during a sport-specific landing task. Thirty-two female athletes were screened using the Landing Error Scoring System (LESS) and Tuck Jump Assessment. Participants also completed a sport-specific landing task, during which three-dimensional kinematic and kinetic data were collected. One-dimensional statistical parametric mapping was used to examine the relationships between screening method scores and the three-dimensional hip and knee joint rotation and moment data from the sport-specific landing. Higher LESS scores were associated with reduced knee flexion from 30 to 57 ms after initial contact (P = 0.003) during the sport-specific landing; however, no additional relationships were found. These findings suggest the LESS and Tuck Jump Assessment may have minimal applicability in identifying athletes who exhibit high-risk landing postures in the sport-specific task examined.

  9. Retinal image analysis for automated glaucoma risk evaluation

    NASA Astrophysics Data System (ADS)

    Nyúl, László G.

    2009-10-01

    Images of the eye ground not only provide an insight to important parts of the visual system but also reflect the general state of health of the entire human body. Automatic retina image analysis is becoming an important screening tool for early detection of certain risks and diseases. Glaucoma is one of the most common causes of blindness and is becoming even more important considering the ageing society. Robust mass-screening may help to extend the symptom-free life of affected patients. Our research is focused on a novel automated classification system for glaucoma, based on image features from fundus photographs. Our new data-driven approach requires no manual assistance and does not depend on explicit structure segmentation and measurements. First, disease independent variations, such as nonuniform illumination, size differences, and blood vessels are eliminated from the images. Then, the extracted high-dimensional feature vectors are compressed via PCA and combined before classification with SVMs takes place. The technique achieves an accuracy of detecting glaucomatous retina fundus images comparable to that of human experts. The "vessel-free" images and intermediate output of the methods are novel representations of the data for the physicians that may provide new insight into and help to better understand glaucoma.

  10. Identifying High-Risk Populations of Tuberculosis Using Environmental Factors and GIS Based Multi-Criteria Decision Making Method

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. Abdul; Shariff, N. M.; Dony, J. F.

    2016-09-01

    Development of an innovative method to enhance the detection of tuberculosis (TB) in Malaysia is the latest agenda of the Ministry of Health. Therefore, a geographical information system (GIS) based index model is proposed as an alternative method for defining potential high-risk areas for local TB cases at Section U19, Shah Alam. It adopts a spatial multi-criteria decision making (MCDM) method for ranking environmental risk factors of the disease on a standardised five-score scale. Scales 1 and 5 represent the lowest and highest risk of TB spread respectively, while scales 3 to 5 are treated as potential risk levels. These standardised scale values are then combined with normalised expert weights (0 to 1) to calculate overall index values and produce a ranked TB map using GIS overlay analysis and weighted linear combination. It was found that 71.43% of the Section comprises potential TB high-risk areas, particularly in urban and densely populated settings. This predictive result is also consistent with the actual 2015 cases, with 76.00% accuracy. The GIS-based MCDM method has demonstrated analytical capability for targeting high-risk spots within the country's TB surveillance and monitoring system, but the result could be strengthened by applying other uncertainty assessment methods.
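
    The weighted linear combination step can be sketched as below: each zone gets standardised 1-5 risk scores per environmental factor, combined with normalised expert weights into an overall index, and zones scoring 3 or above are flagged as potential high risk. The factor names, weights, and zone scores are hypothetical, not those of the study.

```python
weights = {"population density": 0.4, "housing quality": 0.35, "ventilation": 0.25}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # expert weights normalised to 1

zones = {  # standardised scores on the 1 (lowest) to 5 (highest) scale
    "zone 1": {"population density": 5, "housing quality": 4, "ventilation": 3},
    "zone 2": {"population density": 2, "housing quality": 2, "ventilation": 1},
}

def risk_index(scores):
    """Weighted linear combination of standardised factor scores."""
    return sum(weights[f] * s for f, s in scores.items())

for zone, scores in zones.items():
    idx = risk_index(scores)
    label = "potential high risk" if idx >= 3 else "lower risk"
    print(f"{zone}: index {idx:.2f} ({label})")
```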

  11. Hazardous materials transportation: a risk-analysis-based routing methodology.

    PubMed

    Leonelli, P; Bonvicini, S; Spadoni, G

    2000-01-07

    This paper introduces a new methodology based on risk analysis for the selection of the best route for the transport of a hazardous substance. In order to perform this optimisation, the network is considered as a graph composed of nodes and arcs; each arc is assigned a cost per unit vehicle travelling on it and a vehicle capacity. After a short discussion of risk measures suitable for linear risk sources, the arc capacities are introduced by comparing the societal and individual risk measures of each arc with hazardous materials transportation risk criteria; arc costs are then defined to take into account both transportation out-of-pocket expenses and risk-related costs. The optimisation problem can thus be formulated as a 'minimum cost flow problem', which consists of determining, for a specific hazardous substance, the cheapest flow distribution honouring the arc capacities from the origin nodes to the destination nodes. The main features of the optimisation procedure, implemented in the computer code OPTIPATH, are presented. Test results for shipments of ammonia are discussed, and further research developments are proposed.
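
    The routing idea can be sketched as below: arcs whose risk exceeds the acceptability criterion are screened out (the "capacity" role), and the cheapest origin-destination path is found over the remaining arcs. The full paper solves a minimum cost flow problem; a single-vehicle shortest path is shown here for brevity, over a hypothetical network.

```python
import heapq

arcs = [  # (from, to, cost, arc_risk) - hypothetical network
    ("origin", "A", 4.0, 0.2), ("origin", "B", 2.0, 0.9),
    ("A", "dest", 3.0, 0.1), ("B", "dest", 1.0, 0.8),
    ("A", "B", 1.0, 0.3),
]
RISK_CRITERION = 0.5

# Risk screen: keep only arcs meeting the acceptability criterion.
graph = {}
for u, v, cost, risk in arcs:
    if risk <= RISK_CRITERION:
        graph.setdefault(u, []).append((v, cost))

def cheapest(src, dst):
    """Dijkstra shortest-path cost on the risk-screened network."""
    heap, best = [(0.0, src)], {}
    while heap:
        cost, node = heapq.heappop(heap)
        if node in best:
            continue
        best[node] = cost
        for nxt, c in graph.get(node, []):
            if nxt not in best:
                heapq.heappush(heap, (cost + c, nxt))
    return best.get(dst)

print(f"cheapest admissible route cost: {cheapest('origin', 'dest')}")
```

Note that the cheap but risky route through node B is excluded by the screen, so the admissible optimum is the longer path through node A.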

  12. Tropical cyclone risk analysis: a decisive role of its track

    NASA Astrophysics Data System (ADS)

    Chelsea Nam, C.; Park, Doo-Sun R.; Ho, Chang-Hoi

    2016-04-01

    The tracks of 85 tropical cyclones (TCs) that made landfall in South Korea during the period 1979-2010 are classified into four clusters using a fuzzy c-means clustering method. The four clusters are characterized as 1) east-short, 2) east-long, 3) west-long, and 4) west-short, based on their moving routes around the Korean peninsula. We conducted a comparative risk analysis of these four clusters regarding their hazards, exposure, and damages. Here, hazard parameters are calculated independently from two different sources: the best-track data (BT) and the 60 weather stations across the country (WS). The results show distinct characteristics of the four clusters in terms of hazard parameters and economic losses (EL), suggesting a clear track-dependency in overall TC risk. It appears that whether an "effective collision" occurred outweighs the intensity of the TC per se. The EL ranking did not agree with the BT parameters (maximum wind speed, central pressure, or storm radius), but matched the WS parameters (especially daily accumulated rainfall and TC-influenced period). The west-approaching TCs (i.e., the west-long and west-short clusters) generally recorded larger EL than the east-approaching TCs (i.e., the east-short and east-long clusters), although the east-long cluster is the strongest from the BT point of view. This can be explained by the spatial distribution of the WS parameters and the corresponding regional EL maps. West-approaching TCs brought heavy rainfall to the southern regions, helped by topographic effects along their tracks and by their extended stay over the Korean Peninsula during extratropical transition, which was not the case for the east-approaching TCs. On the other hand, some regions had EL not directly proportional to the hazards, which is partly attributed to spatial disparities in wealth and vulnerability. Correlation analysis also revealed the importance of rainfall; daily
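
    The fuzzy c-means method used to cluster the tracks can be sketched as below, applied for brevity to 2-D points (e.g. track centroids) rather than full track shapes, with illustrative synthetic data rather than the 85 TC tracks.

```python
import math
import random

random.seed(1)
# Two well-separated synthetic groups of 2-D points (illustrative data).
points = ([(random.gauss(0, 0.3), random.gauss(0, 0.3)) for _ in range(20)]
          + [(random.gauss(3, 0.3), random.gauss(3, 0.3)) for _ in range(20)])
C, M, N = 2, 2.0, len(points)  # clusters, fuzzifier, sample size

# Random initial membership matrix u[i][k], each row summing to 1.
u = []
for _ in range(N):
    row = [random.random() for _ in range(C)]
    s = sum(row)
    u.append([x / s for x in row])

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

for _ in range(50):
    # Update cluster centres as membership-weighted means.
    centres = []
    for k in range(C):
        wts = [u[i][k] ** M for i in range(N)]
        tot = sum(wts)
        centres.append((sum(w * p[0] for w, p in zip(wts, points)) / tot,
                        sum(w * p[1] for w, p in zip(wts, points)) / tot))
    # Update memberships from inverse relative distances to the centres.
    for i, p in enumerate(points):
        d = [max(dist(p, c), 1e-12) for c in centres]
        for k in range(C):
            u[i][k] = 1.0 / sum((d[k] / d[j]) ** (2 / (M - 1)) for j in range(C))

labels = [max(range(C), key=lambda k: u[i][k]) for i in range(N)]
print("cluster sizes:", [labels.count(k) for k in range(C)])
```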

  13. Engine non-containment: UK risk assessment methods

    NASA Technical Reports Server (NTRS)

    Wallin, J. C.

    1977-01-01

    More realistic guideline data must be developed for use in aircraft design in order to comply with recent changes in British civil airworthiness requirements. Unrealistically pessimistic results were obtained when the methodology developed during the Concorde SST certification program was extended to assess catastrophic risks resulting from uncontained engine rotors.

  14. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... § 223.12 any risk in excess of 10 percent of the latter company's paid-up capital and surplus. (c) Other... property so held may not be disposed of or pledged in any way without the consent of the insuring company....

  15. [Risk assessment and safety evaluation using system normative indexes integration method for non-point source pollution on watershed scale].

    PubMed

    Liu, Jian-Chang; Yan, Yan; Liu, Feng; Ding, Ding; Zhao, Ming

    2008-03-01

    Decision-makers can bring non-point source pollution under control only with sufficient information on the risk trend of non-point source pollution at the watershed scale. A system normative indexes integration evaluation method for system risk trend was developed, addressing situations in which more than one element contributes a probability value to a given trend of the system, and in which system evaluation requires a formula derived from the system structure. On the basis of an analysis of the aspects and characteristics of system risk normalization, a new valuation method was proposed and the relationship between the normalization values of the system and its factors was established. The Lugu Lake Watershed in Southwest China was selected as the study area to assess the risk of non-point source loss to surface water using this method. The results indicate that 1) the overall risk of non-point source loss to surface water in this watershed is high; 2) the system indexes integration evaluation method is a universal method for evaluating the quality or trend of any system and shows great power in comparing several systems; 3) the method helps attain an effective and integrated assessment of a system when combined with other methods.

  16. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods have historically been preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in France: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (the Gradex method (CFGB, 1994), the Agregee method (Margoum, 1992), and the Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and the Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than the standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, remaining relatively moderate up to the 100-year return level.
Results and discussions are here illustrated throughout with the example
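    As a rough illustration of the baseline standard flood frequency analysis (method i above), a GEV distribution can be fitted to an annual-maximum discharge series and used to extrapolate return levels. The synthetic data here stand in for real gauge records; all numbers are invented.

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(42)
# Synthetic annual-maximum flood series (m^3/s); a real study would use gauge records
ams = gumbel_r.rvs(loc=500.0, scale=120.0, size=60, random_state=rng)

c, loc, scale = genextreme.fit(ams)                  # GEV fit (c=0 recovers Gumbel)
q10 = genextreme.isf(1.0 / 10.0, c, loc, scale)      # 10-year return level
q100 = genextreme.isf(1.0 / 100.0, c, loc, scale)    # 100-year return level
```

    The abstract's observation that method-to-method differences grow with return period concerns precisely such extrapolated quantiles (q100 and beyond), where the fitted tail dominates.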

  17. Intercellular Adhesion Molecule-1 (ICAM-1) Polymorphisms and Cancer Risk: A Meta-Analysis

    PubMed Central

    CHENG, Daye; LIANG, Bin

    2015-01-01

    Background: The intercellular adhesion molecule-1 (ICAM-1) Lys469Glu (K469E) and Gly241Arg (G241R) polymorphisms might play important roles in cancer development and progression. However, the results of previous studies are inconsistent. The aim of this study was to evaluate the association between the ICAM-1 K469E and G241R polymorphisms and cancer risk by meta-analysis. Methods: A comprehensive literature search (last updated in November 2013) was conducted to identify case-control studies investigating the association between the ICAM-1 K469E and G241R polymorphisms and cancer risk. Results: A total of 18 case-control studies, comprising 4,844 cancer cases and 5,618 healthy controls, were included in the meta-analysis. No significant overall association was found between the K469E polymorphism and cancer risk. However, subgroup analysis by ethnicity revealed that one genetic comparison (GG vs. AA) was associated with cancer risk in the Asian subgroup, and two genetic models (GG+GA vs. AA and GA vs. AA) in the European subgroup. The G241R polymorphism was significantly associated with cancer risk in the overall analysis, and subgroup analysis by ethnicity showed a significant association in the European subgroup. Conclusion: The ICAM-1 G241R polymorphism might be associated with cancer risk, especially in European populations, but the results do not support the ICAM-1 K469E polymorphism as a risk factor for cancer. PMID:26284202
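    Pooled estimates in genetic-association meta-analyses of this kind are typically obtained by inverse-variance weighting of per-study log odds ratios. A minimal fixed-effect sketch with hypothetical 2x2 tables follows; the paper's actual data and model choices are not reproduced here.

```python
import numpy as np

def pooled_or_fixed(tables):
    """Inverse-variance fixed-effect pooling of log odds ratios.
    tables: list of 2x2 counts (a, b, c, d) = (case-exposed, case-unexposed,
    control-exposed, control-unexposed)."""
    log_or, var = [], []
    for a, b, c, d in tables:
        log_or.append(np.log((a * d) / (b * c)))
        var.append(1 / a + 1 / b + 1 / c + 1 / d)   # Woolf variance of log OR
    w = 1.0 / np.array(var)
    est = np.sum(w * np.array(log_or)) / w.sum()
    se = np.sqrt(1.0 / w.sum())
    ci = (np.exp(est - 1.96 * se), np.exp(est + 1.96 * se))
    return np.exp(est), ci

# Hypothetical 2x2 tables from three case-control studies
or_hat, (lo, hi) = pooled_or_fixed([(40, 60, 30, 70), (55, 45, 40, 60), (25, 75, 20, 80)])
```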

  18. Space flight risk data collection and analysis project: Risk and reliability database

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment, reliability, availability, maintainability, and supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command, ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.

  19. Dietary Patterns and Pancreatic Cancer Risk: A Meta-Analysis

    PubMed Central

    Lu, Pei-Ying; Shu, Long; Shen, Shan-Shan; Chen, Xu-Jiao; Zhang, Xiao-Yan

    2017-01-01

    A number of studies have examined the associations between dietary patterns and pancreatic cancer risk, but the findings have been inconclusive. Herein, we conducted this meta-analysis to assess the associations between dietary patterns and the risk of pancreatic cancer. MEDLINE (provided by the National Library of Medicine) and EBSCO (Elton B. Stephens Company) databases were searched for relevant articles published up to May 2016 that identified common dietary patterns. Thirty-two studies met the inclusion criteria and were finally included in this meta-analysis. A reduced risk of pancreatic cancer was shown for the highest compared with the lowest categories of healthy patterns (odds ratio, OR = 0.86; 95% confidence interval, CI: 0.77–0.95; p = 0.004) and light–moderate drinking patterns (OR = 0.90; 95% CI: 0.83–0.98; p = 0.02). There was evidence of an increased risk for pancreatic cancer in the highest compared with the lowest categories of western-type pattern (OR = 1.24; 95% CI: 1.06–1.45; p = 0.008) and heavy drinking pattern (OR = 1.29; 95% CI: 1.10–1.48; p = 0.002). The results of this meta-analysis demonstrate that healthy and light–moderate drinking patterns may decrease the risk of pancreatic cancer, whereas western-type and heavy drinking patterns may increase the risk of pancreatic cancer. Additional prospective studies are needed to confirm these findings. PMID:28067765

  20. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
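    The hybrid idea, augmenting a classical least squares model with the spectral shape of a non-calibrated component at prediction time, can be illustrated with synthetic Gaussian band shapes. All shapes and amounts below are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
wav = np.linspace(0.0, 1.0, 200)
# Pure-component spectral shapes from the original calibration step
k1 = np.exp(-((wav - 0.3) ** 2) / 0.005)
k2 = np.exp(-((wav - 0.7) ** 2) / 0.005)
K = np.column_stack([k1, k2])

# Measured mixture also contains an un-calibrated interferent shape plus noise
k3 = np.exp(-((wav - 0.4) ** 2) / 0.01)
true_c = np.array([2.0, 1.0])
spectrum = K @ true_c + 0.8 * k3 + 0.01 * rng.standard_normal(wav.size)

c_plain, *_ = np.linalg.lstsq(K, spectrum, rcond=None)    # biased by the interferent
K_aug = np.column_stack([K, k3])                          # hybrid: add the new shape
c_hybrid, *_ = np.linalg.lstsq(K_aug, spectrum, rcond=None)

err_plain = np.abs(c_plain - true_c).max()
err_hybrid = np.abs(c_hybrid[:2] - true_c).max()
```

    Adding the interferent's shape at prediction time recovers the calibrated concentrations far more accurately, which is the essence of the hybrid method described.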

  1. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  2. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  3. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  4. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas

    PubMed Central

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-01-01

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area. PMID:26580644

  5. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas.

    PubMed

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-11-13

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area.

  6. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
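    One common way to turn expert percentile assessments into probability distributions, as described above, is inverse-CDF sampling with piecewise-linear interpolation between the assessed points. The percentile values and the summary index below are purely hypothetical, not the study's numbers.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_from_percentiles(pcts, values, u):
    """Inverse-CDF sampling from an expert's percentile assessments
    (piecewise-linear interpolation between the assessed points)."""
    return np.interp(u, pcts, values)

# Hypothetical expert assessments (10th/50th/90th percentiles)
veh_cost = sample_from_percentiles([0.1, 0.5, 0.9], [800, 1500, 3000], rng.random(10000))
capacity = sample_from_percentiles([0.1, 0.5, 0.9], [4000, 6000, 7500], rng.random(10000))

# Toy cost/benefit summary index combining the sampled factors
net_index = 0.5 * capacity / 1000.0 - veh_cost / 1000.0
p10, p90 = np.quantile(net_index, [0.1, 0.9])   # uncertainty band on the index
```

    Propagating the sampled factors through the cost/benefit model in this Monte Carlo fashion is what turns key-factor uncertainty into uncertainty on the summary indices.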

  7. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
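    A toy version of the optimization described, minimizing annualized construction cost plus expected annual damage over levee height and crown width, can be sketched with an assumed Gumbel water-level distribution and an assumed seepage fragility curve. None of these numbers come from the study.

```python
import numpy as np

def expected_annual_cost(H, W,
                         damage=50e6,             # damage if the levee fails ($)
                         cost_h=2e5, cost_w=1e5,  # annualized $ per m of height / width
                         loc=3.0, scale=0.8):     # Gumbel annual-max water level (m)
    """Annualized construction cost plus expected annual flood damage."""
    p_overtop = 1.0 - np.exp(-np.exp(-(H - loc) / scale))  # Gumbel exceedance of H
    p_seep = 0.5 * np.exp(-0.4 * W)   # assumed fragility: seepage risk falls with width
    p_fail = p_overtop + (1.0 - p_overtop) * p_seep
    return cost_h * H + cost_w * W + p_fail * damage

# Grid search over candidate cross sections
Hs = np.linspace(3.0, 9.0, 61)
Ws = np.linspace(2.0, 20.0, 73)
costs = np.array([[expected_annual_cost(H, W) for W in Ws] for H in Hs])
i, j = np.unravel_index(costs.argmin(), costs.shape)
H_opt, W_opt = Hs[i], Ws[j]
```

    With these assumed parameters the optimum lies in the interior of the grid, and the through-seepage failure probability at the optimum exceeds the overtopping probability, echoing the abstract's finding that intermediate geotechnical failure is the more likely mode.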

  8. Risk-Based Measurement and Analysis: Application to Software Security

    DTIC Science & Technology

    2012-02-01

    [Table-of-contents excerpt: Mission Risk Diagnostic (MRD): driver identification (mission, objectives, drivers, deriving a set of drivers); mission risk; MRD key tasks and steps. Integrated Measurement and Analysis Framework (IMAF): using the IMAF to direct measurement; standard mapping; driver modeling. Summary and next steps: the IMAF and the MRD; additional research.]

  9. Adversarial Risk Analysis for Urban Security Resource Allocation.

    PubMed

    Gil, César; Rios Insua, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis (ARA) provides a framework to deal with risks originating from intentional actions of adversaries. We show how ARA may be used to allocate security resources in the protection of urban spaces. We take into account the spatial structure and consider both proactive and reactive measures, in that we aim at both trying to reduce criminality as well as recovering as best as possible from it, should it happen. We deal with the problem by deploying an ARA model over each spatial unit, coordinating the models through resource constraints, value aggregation, and proximity. We illustrate our approach with an example that uncovers several relevant policy issues.

  10. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches that minimize mission resources such as mass, power, and crew time), and to support development of tools for autonomy, needed for exploration (assessing and maintaining resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information, with simple status drawn from the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  11. A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems

    PubMed Central

    Juhnke, Christin; Bethge, Susanne

    2016-01-01

    Introduction: Effective risk adjustment is given increasing weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment, as well as of the crucial weighting factors they employ, and to analyse the predictive performance of selected methods in international healthcare systems. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted as an encompassing, interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: in terms of risk horizons, in terms of risk factors, or in terms of the combination of indicators included. Within these, a further differentiation into three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, and those based on information about (self-reported) health status. Conclusions and discussion: The final examination of different methods of risk adjustment showed that the methodology used to adjust risks varies, and the models differ greatly in the morbidity indicators they include. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can inform quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts. PMID:28316544

  12. Hybrid Safety Analysis Using Functional and Risk Decompositions

    SciTech Connect

    COOPER,J. ARLIN; JOHNSON,ALICE J.; WERNER,PAUL W.

    2000-07-15

    Safety analysis of complex systems depends on decomposing the systems into manageable subsystems, from which analysis can be rolled back up to the system level. The authors have found that there is no single best way to decompose; in fact hybrid combinations of decompositions are generally necessary to achieve optimum results. They are currently using two backbone coordinated decompositions--functional and risk, supplemented by other types, such as organizational. An objective is to derive metrics that can be used to efficiently and accurately aggregate information through analysis, to contribute toward assessing system safety, and to contribute information necessary for defensible decisions.

  13. NOA: a novel Network Ontology Analysis method.

    PubMed

    Wang, Jiguang; Huang, Qiang; Liu, Zhi-Ping; Wang, Yong; Wu, Ling-Yun; Chen, Luonan; Zhang, Xiang-Sun

    2011-07-01

    Gene ontology analysis has become a popular and important tool in bioinformatics, and current ontology analyses are mainly conducted on an individual gene or a gene list. However, recent molecular network analysis reveals that the same list of genes with different interactions may perform different functions. Therefore, it is necessary to consider molecular interactions to correctly and specifically annotate biological networks. Here, we propose a novel Network Ontology Analysis (NOA) method to perform gene ontology enrichment analysis on biological networks. Specifically, NOA first defines link ontology that assigns functions to interactions based on the known annotations of joint genes via optimizing two novel indexes, 'Coverage' and 'Diversity'. Then, NOA generates two alternative reference sets to statistically rank the enriched functional terms for a given biological network. We compare NOA with traditional enrichment analysis methods in several biological networks, and find that (i) NOA can capture the change of functions not only in dynamic transcription regulatory networks but also in rewiring protein interaction networks, while the traditional methods cannot, and (ii) NOA can find more relevant and specific functions than traditional methods in different types of static networks. Furthermore, a freely accessible web server for NOA has been developed at http://www.aporc.org/noa/.
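    NOA's precise statistics are given in the paper; the general flavor of edge-centric enrichment, asking whether annotated edges are over-represented in a network relative to a reference edge set, can be sketched with a hypergeometric test. The counts below are invented.

```python
from scipy.stats import hypergeom

# Reference edge universe and annotation counts (hypothetical)
M = 10000   # edges in the reference set (e.g. all possible interactions)
K = 300     # reference edges carrying the annotation of interest
n = 80      # edges in the observed network
k = 12      # observed network edges carrying the annotation

# P(X >= k) under hypergeometric sampling without replacement:
# is the annotation enriched among the network's edges?
p_value = hypergeom.sf(k - 1, M, K, n)
```

    Here the expected count is only n*K/M = 2.4 annotated edges, so observing 12 yields a very small p-value; ranking terms by such p-values (against an appropriate reference set) is the core of enrichment analysis on links rather than genes.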

  14. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    SciTech Connect

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster, with *.png in place of *.tif as the file ending. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
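    The three combination rules named above (product, sum, and minimum of the four scaled risk factors) are straightforward cell-wise raster operations. A sketch with random stand-in rasters; the real inputs are the project's scaled *.tif grids.

```python
import numpy as np

rng = np.random.default_rng(0)
# Four risk-factor rasters scaled to [0, 1]
# (reservoir quality, thermal quality, induced seismicity, utilization)
factors = rng.random((4, 50, 50))

combined_product = factors.prod(axis=0)   # penalizes any weak factor strongly
combined_sum = factors.sum(axis=0)        # additive, more forgiving
combined_min = factors.min(axis=0)        # limited by the single worst factor
```

    The choice of rule encodes a judgment about how risk factors interact: the minimum treats the weakest factor as binding, while the sum lets strong factors compensate for weak ones.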

  15. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC to the analysis of triazine herbicide residues in various samples.

  16. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression.

    PubMed

    Crager, Michael R; Tang, Gong

    We propose a method for assessing an individual patient's risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data.
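    A minimal sketch of the fixed-effect version of the combination step: pool a patient's estimated log cumulative hazard across studies by inverse-variance weighting, then convert the pooled cumulative hazard to an event risk via risk = 1 - S(t) = 1 - exp(-Lambda(t)). The numbers are hypothetical and the paper's exact estimators are not reproduced here.

```python
import numpy as np

def combined_risk(log_cumhaz, se):
    """Fixed-effect inverse-variance pooling of patient-specific log cumulative
    hazard estimates from several studies; returns the pooled event risk."""
    log_cumhaz = np.asarray(log_cumhaz)
    w = 1.0 / np.asarray(se) ** 2                    # precision weights
    pooled_log_ch = np.sum(w * log_cumhaz) / w.sum()
    cum_hazard = np.exp(pooled_log_ch)
    return 1.0 - np.exp(-cum_hazard)                 # risk = 1 - exp(-Lambda(t))

# Hypothetical patient-specific estimates (and standard errors) from three studies
risk = combined_risk([-1.8, -1.5, -2.0], [0.3, 0.2, 0.4])
```

    A random-effects variant would add a between-study variance component to each study's variance before weighting, exactly as in conventional meta-analysis.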

  17. Groundwater-risk analysis of New York utilizing GIS technology

    NASA Astrophysics Data System (ADS)

    Hillenbrand, Charles John, III

    Using Geographic Information System (GIS) technology, data layers can be processed and analyzed to produce a regional groundwater-risk grid of New York State (NYS). GIS can be used to assess the potential to introduce contaminants at the ground surface, and assess the potential for the contaminants to migrate through the vadose zone and be introduced to an aquifer at the water-table. The potential to introduce contaminants to the ground surface was assessed utilizing existing database information in combination with the United States Geological Survey (USGS) Multi-Resolution Land Classification (MRLC) land use grid. The databases allowed an analysis of contaminant association with Standard Industrial Classification (SIC) codes, risk evaluation of the contaminants using groundwater intake values protective of human health, the development of SIC code-risk values, the construction of a SIC code-risked facility point coverage, and the construction of a land use-risk grid; this grid assesses the potential to introduce contaminants to the ground surface. Aquifer susceptibility was determined by analyzing vadose zone residence time assuming saturated conditions. Vadose zone residence time is a measure of the vadose zone's ability to attenuate and retard the migration of contaminants. Existing data layers were processed to produce a depth to water-table (vadose zone thickness) grid. Existing GIS data layers of soil, surficial geology and bedrock geology, along with review of literature and pump/slug test data, enabled the creation of thickness, porosity and vertical hydraulic conductivity grids for the three considered components of the vadose zone. The average linear velocity was then calculated for each vadose zone component by dividing their hydraulic conductivity grid by their respective porosity grid. The thickness grid of each vadose zone component was then divided by their respective average linear velocity grid to produce vadose zone residence time grids. 
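    The residence-time arithmetic described, dividing each vadose-zone component's thickness by its average linear velocity K/n (unit hydraulic gradient assumed) and summing over components, is simple to sketch per grid cell. Layer values below are hypothetical.

```python
import numpy as np

def vadose_residence_time(thickness_m, K_m_per_d, porosity):
    """Sum of per-layer travel times under saturated, unit-gradient assumptions:
    t = b / v, with average linear velocity v = K / n."""
    thickness_m = np.asarray(thickness_m, dtype=float)
    v = np.asarray(K_m_per_d, dtype=float) / np.asarray(porosity, dtype=float)
    return float(np.sum(thickness_m / v))            # days

# Hypothetical soil / surficial-geology / bedrock layers for one cell
t_days = vadose_residence_time([1.5, 6.0, 3.0],      # thickness (m)
                               [0.2, 1.0, 0.05],     # K (m/day)
                               [0.40, 0.30, 0.05])   # porosity (-)
```

    In the GIS workflow the same division and summation is applied cell-wise across the thickness, conductivity, and porosity grids to produce the residence-time grid.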
The sum

  18. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  19. Simplified method for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1983-01-01

    A simplified inelastic analysis computer program was developed for predicting the stress-strain history of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a simulated plasticity hardening model. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, and different materials and plasticity models. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.

  20. Association among Dietary Flavonoids, Flavonoid Subclasses and Ovarian Cancer Risk: A Meta-Analysis

    PubMed Central

    You, Ruxu; Yang, Yu; Liao, Jing; Chen, Dongsheng; Yu, Lixiu

    2016-01-01

Background Previous studies have indicated that intake of dietary flavonoids or flavonoid subclasses is associated with ovarian cancer risk, but have presented controversial results. Therefore, we conducted a meta-analysis to derive a more precise estimate of these associations. Methods We performed a search in PubMed, Google Scholar and ISI Web of Science from their inception to April 25, 2015 to select studies on the association among dietary flavonoids, flavonoid subclasses and ovarian cancer risk. The information was extracted by two independent authors. We assessed the heterogeneity, sensitivity, publication bias and quality of the articles. A random-effects model was used to calculate the pooled risk estimates. Results Five cohort studies and seven case-control studies were included in the final meta-analysis. We observed that intake of dietary flavonoids was associated with a decreased ovarian cancer risk, as demonstrated by the pooled RR (RR = 0.82, 95% CI = 0.68–0.98). In a subgroup analysis by flavonoid subtypes, ovarian cancer risk was also decreased for isoflavones (RR = 0.67, 95% CI = 0.50–0.92) and flavonols (RR = 0.68, 95% CI = 0.58–0.80). There was no compelling evidence that consumption of flavones (RR = 0.86, 95% CI = 0.71–1.03) decreased ovarian cancer risk, which partly explained the sources of heterogeneity. The sensitivity analysis indicated stable results, and no publication bias was observed based on the results of funnel plot analysis and Egger’s test (p = 0.26). Conclusions This meta-analysis suggested that consumption of dietary flavonoids and subtypes (isoflavones, flavonols) has a protective effect against ovarian cancer, with a reduced risk of ovarian cancer except for flavones consumption. Nevertheless, further investigations on a larger population covering more flavonoid subclasses are warranted. PMID:26960146
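The random-effects pooling used here (and in several other records in this listing) is typically the DerSimonian-Laird method. A minimal sketch follows; the input log relative risks and standard errors are illustrative, not the twelve studies from this meta-analysis.

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool log relative risks with the DerSimonian-Laird random-effects model."""
    w = [1.0 / s ** 2 for s in se]                  # inverse-variance (fixed) weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)    # between-study variance
    w_re = [1.0 / (s ** 2 + tau2) for s in se]      # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se_p = math.sqrt(1.0 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_p),
                              math.exp(pooled + 1.96 * se_p))

# Illustrative study-level inputs (log RRs and standard errors).
rr, ci = dersimonian_laird([math.log(0.7), math.log(0.9), math.log(0.85)],
                           [0.15, 0.20, 0.10])
```

When Q falls below its degrees of freedom, tau² is truncated at zero and the estimate reduces to the fixed-effect pooled RR.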

  1. Numerical analysis of the orthogonal descent method

    SciTech Connect

    Shokov, V.A.; Shchepakin, M.B.

    1994-11-01

    The author of the orthogonal descent method has been testing it since 1977. The results of these tests have only strengthened the need for further analysis and development of orthogonal descent algorithms for various classes of convex programming problems. Systematic testing of orthogonal descent algorithms and comparison of test results with other nondifferentiable optimization methods was conducted at TsEMI RAN in 1991-1992 using the results.

  2. Iterative methods for design sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Belegundu, A. D.; Yoon, B. G.

    1989-01-01

A numerical method is presented for design sensitivity analysis, using an iterative-method reanalysis of the structure generated by a small perturbation in the design variable; a forward-difference scheme is then employed to obtain the approximate sensitivity. Algorithms are developed for displacement and stress sensitivities, as well as for eigenvalue and eigenvector sensitivities, and the iterative schemes are modified so that the coefficient matrices are constant and therefore decomposed only once.
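The idea of reusing one factorization while iterating on a perturbed system can be sketched as follows; the 2-DOF stiffness matrix and the design variable are hypothetical, and an explicit inverse stands in for the decomposition.

```python
import numpy as np

def reanalyze(K0_inv, dK, f, tol=1e-10, max_iter=50):
    """Iterative reanalysis: solve (K0 + dK) u = f reusing the factorization of K0."""
    u = K0_inv @ f
    for _ in range(max_iter):
        u_new = K0_inv @ (f - dK @ u)
        if np.linalg.norm(u_new - u) < tol:
            break
        u = u_new
    return u

# Hypothetical 2-DOF stiffness depending on a design variable x.
def K(x):
    return np.array([[2.0 + x, -1.0], [-1.0, 2.0 + x]])

f = np.array([1.0, 0.0])
x, dx = 1.0, 1e-6
K0_inv = np.linalg.inv(K(x))        # coefficient matrix "decomposed only once"
u0 = K0_inv @ f
u1 = reanalyze(K0_inv, K(x + dx) - K(x), f)
sensitivity = (u1 - u0) / dx        # forward-difference displacement sensitivity
```

For this system the analytic sensitivity is -K⁻¹u, so the finite-difference result can be checked directly.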

  3. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. 
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk

  4. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk.

  5. Ankylosing spondylitis and risk of venous thromboembolism: A systematic review and meta-analysis

    PubMed Central

    Ungprasert, Patompong; Srivali, Narat; Kittanamongkolchai, Wonngarm

    2016-01-01

Background: Several immune-mediated inflammatory disorders, such as rheumatoid arthritis, psoriatic arthritis, and systemic lupus erythematosus have been linked to an increased risk of venous thromboembolism (VTE). However, the data on ankylosing spondylitis (AS) are limited. Methods: We conducted a systematic review and meta-analysis of observational studies that reported odds ratio, relative risk, hazard ratio, or standardized incidence ratio comparing the risk of VTE and possible pulmonary embolism (PE) in patients with AS versus non-AS participants. Pooled risk ratio and 95% confidence intervals were calculated using the random-effects generic inverse-variance method of DerSimonian and Laird. Results: Of 423 potentially relevant articles, three studies met our inclusion criteria and thus, were included in the data analysis. The pooled risk ratio of VTE in patients with AS was 1.60 (95% confidence interval: 1.05–2.44). The statistical heterogeneity of this study was high with an I² of 93%. Conclusion: Our study demonstrated a statistically significant increased VTE risk among patients with AS. PMID:27890993

  6. [Evaluation of the risk related to repetitive work activities: testing of several methods proposed in the literature].

    PubMed

    Capodaglio, E M; Facioli, M; Bazzini, G

    2001-01-01

Pathologies due to repetitive activity of the upper limbs constitute a growing part of work-related musculo-skeletal disorders. At the moment, there are no universally accepted and validated methods for the description and assessment of work-related risks. Yet, the criteria fundamentally characterizing the exposure are rather clear and consistent. This study reports a practical example of the application of some recent risk assessment methods proposed in the literature, combining objective and subjective measures obtained in the field with traditional activity analysis.

  7. Risk Analysis as Regulatory Science: Toward The Establishment of Standards

    PubMed Central

    Murakami, Michio

    2016-01-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional ‘Standard I’, which has a paternalistic orientation, and ‘Standard II’, established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751

  8. Risk factors for rape re-victimisation: a retrospective analysis.

    PubMed

    Lurie, S; Boaz, M; Golan, A

    2013-11-01

Sexual re-victimisation refers to a pattern in which the sexual assault victim has an increased risk of subsequent victimisation relative to an individual who was never victimised. The purpose of our study was to identify risk factors for a second rape, the severest form of sexual re-victimisation. All rape victims treated at the First Regional Israeli Center for Sexual Assault Victims between October 2000 and July 2010 were included in this retrospective analysis. We compared characteristics of 53 rape victims who were victimised twice to those of 1,939 rape victims who were victimised once. We identified several risk factors for a second rape, which can be used in prevention programmes. These are: psychiatric background, history of social services involvement, adulthood, non-virginity and minority ethnicity.

  9. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    PubMed

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being.

  10. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

Potentially hazardous test activities have historically been a part of Sandia National Labs' mission to design, develop, and test new weapons systems. These test activities include high speed air drops for parachute development, sled tests for component and system level studies, multiple stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance into the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Even though the majority of this report deals primarily with rocket flight safety, these same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications into many of Sandia's current operations. 14 refs., 1 tab.

  11. Probabilistic risk analysis and fault trees: Initial discussion of application to identification of risk at a wellhead

    NASA Astrophysics Data System (ADS)

    Rodak, C.; Silliman, S.

    2012-02-01

    Wellhead protection is of critical importance for managing groundwater resources. While a number of previous authors have addressed questions related to uncertainties in advective capture zones, methods for addressing wellhead protection in the presence of uncertainty in the chemistry of groundwater contaminants, the relationship between land-use and contaminant sources, and the impact on health of the receiving population are limited. It is herein suggested that probabilistic risk analysis (PRA) combined with fault trees (FT) provides a structure whereby chemical transport can be combined with uncertainties in source, chemistry, and health impact to assess the probability of negative health outcomes in the population. As such, PRA-FT provides a new strategy for the identification of areas of probabilistically high human health risk. Application of this approach is demonstrated through a simplified case study involving flow to a well in an unconfined aquifer with heterogeneity in aquifer properties and contaminant sources.
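The fault-tree side of PRA-FT combines basic-event probabilities through AND and OR gates. A minimal sketch, assuming independent events and a hypothetical wellhead-contamination top event not taken from the paper:

```python
def and_gate(probs):
    """AND gate: all input events must occur (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: at least one input event occurs (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical top event: contaminant reaches the well if
# (a source is present AND transport reaches the capture zone) OR monitoring fails.
p_top = or_gate([and_gate([0.3, 0.05]), 0.01])
```

In the PRA-FT framing, the AND-gate inputs would themselves be derived from land-use data and chemical transport modeling rather than fixed numbers.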

  12. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Lab (JPL) reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to earth followed by a rendezvous with the space shuttle in earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions respectively.
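The event-tree-plus-Monte-Carlo step can be sketched as a simulation over a sequence of mission events; the event names and probabilities below are hypothetical stand-ins, not the values from the Sandia/JPL/JSC analyses.

```python
import random

random.seed(1)

# Hypothetical per-event success probabilities for a sample-return sequence.
events = {"launch": 0.98, "landing": 0.90, "sample_collection": 0.95, "return": 0.85}

def simulate(n=100_000):
    """Monte Carlo over the event tree: the mission succeeds only if every event does."""
    successes = 0
    for _ in range(n):
        if all(random.random() < p for p in events.values()):
            successes += 1
    return successes / n

p_success = simulate()
```

For a pure series of independent events the analytic answer is just the product of the probabilities (about 0.71 here); Monte Carlo earns its keep when event probabilities are distributions and partial-success end states must be tallied.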

  13. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
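"Probability of failure for a given severity" can be estimated by simple Monte Carlo at a candidate operating point. The sketch below is illustrative only: the hardness specification, variability, and acceptance threshold are assumptions, not values from the ciprofloxacin study.

```python
import random

random.seed(0)

def prob_failure(mean_hardness, n=50_000):
    """Estimate P(tablet hardness outside spec) at a candidate operating point.

    Spec limits and process variability below are hypothetical.
    """
    lo, hi = 4.0, 8.0  # hypothetical hardness specification (kp)
    failures = sum(1 for _ in range(n)
                   if not lo <= random.gauss(mean_hardness, 0.8) <= hi)
    return failures / n

# A point belongs to the quantitative design space if its risk is acceptably low.
acceptable = prob_failure(6.0) < 0.05
```

Sweeping `mean_hardness` (and the other critical process parameters) over a grid and keeping the points where the estimated failure probability stays below the threshold traces out a probabilistic design space boundary.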

  14. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system

  15. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of the raster will have the same name except *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
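The three combination rules (product, sum, minimum) are cell-wise raster operations. A minimal NumPy sketch, using small hypothetical scaled grids rather than the project's rasters:

```python
import numpy as np

# Hypothetical scaled risk-factor rasters on a common grid (values in [0, 1]).
reservoir   = np.array([[0.8, 0.2], [0.5, 0.9]])
thermal     = np.array([[0.6, 0.4], [0.7, 0.3]])
seismicity  = np.array([[0.9, 0.9], [0.4, 0.8]])
utilization = np.array([[0.5, 0.6], [0.8, 0.7]])

factors = np.stack([reservoir, thermal, seismicity, utilization])

combined_product = factors.prod(axis=0)  # strongly penalizes any weak factor
combined_sum     = factors.sum(axis=0)   # treats the factors as additive
combined_min     = factors.min(axis=0)   # limited by the weakest factor
```

The three rules rank cells differently: the minimum ignores everything but the worst factor, while the sum lets a strong factor compensate for a weak one.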

  16. Metabolic syndrome is associated with increased breast cancer risk: a systematic review with meta-analysis.

    PubMed

    Bhandari, Ruchi; Kelley, George A; Hartley, Tara A; Rockett, Ian R H

    2014-01-01

Background. Although individual metabolic risk factors are reported to be associated with breast cancer risk, controversy surrounds risk of breast cancer from metabolic syndrome (MS). We report the first systematic review and meta-analysis of the association between MS and breast cancer risk in all adult females. Methods. Studies were retrieved by searching four electronic reference databases [PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Web of Science, and ProQuest through June 30, 2012] and cross-referencing retrieved articles. Eligible for inclusion were longitudinal studies reporting associations between MS and breast cancer risk among females aged 18 years and older. Relative risks and 95% confidence intervals were calculated for each study and pooled using random-effects models. Publication bias was assessed quantitatively (Trim and Fill) and qualitatively (funnel plots). Heterogeneity was examined using Q and I² statistics. Results. Representing nine independent cohorts and 97,277 adult females, eight studies met the inclusion criteria. A modest, positive association was observed between MS and breast cancer risk (RR: 1.47, 95% CI, 1.15-1.87; z = 3.13; p = 0.002; Q = 26.28, p = 0.001; I² = 69.55%). No publication bias was observed. Conclusions. MS is associated with increased breast cancer risk in adult women.

  17. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  18. Data analysis method for evaluating dialogic learning.

    PubMed

    Janhonen, S; Sarja, A

    2000-02-01

    The purpose of this paper is to introduce a new method of analysing and evaluating dialogic learning. Dialogic learning offers possibilities that have not previously been found in nursing or nursing education, although some nursing researchers have lately become interested in dialogic nursing interaction between nurses and patients. The stages of analysis of dialogic learning have been illustrated by using an example. The data for this illustration were collected by video-taping a planning process where students for a Master's degree (qualifying them to be nursing instructors in Finland) plan, implement and evaluate a course for nursing students, on the care of terminally ill patients. However, it is possible to use this method of analysis for other dialogic learning situations both in nursing practice (for example, collaborative meetings between experts and patients) and in nursing education (for example, collaborative learning situations). The focus of this method of analysis concentrates on various situations where participants in interaction see the object of discussion from various points of view. This method of analysis helps the participants in the interaction to develop their interactional skills both through an awareness of their own views, and through understanding the other participants' various views in a particular nursing situation.

  19. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  20. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process, by monitoring nonlinear data, are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.
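A common first step in chaotic time series analysis is time-delay embedding of the scalar signal, from which nonlinear measures can then be computed. The sketch below uses the logistic map as a stand-in signal and a deliberately simple trajectory-based measure; the patent's specific nonlinear measures are not reproduced here.

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Time-delay embedding: reconstruct a phase-space trajectory from a scalar series."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Logistic map in its chaotic regime as an illustrative test signal.
x = np.empty(500)
x[0] = 0.4
for i in range(1, 500):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])

traj = delay_embed(x)
# A simple nonlinear measure: mean distance between successive embedded points.
measure = np.mean(np.linalg.norm(np.diff(traj, axis=0), axis=1))
```

Tracking such a measure over successive data windows gives the "time serial trends" that the method compares across process states.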