Science.gov

Sample records for risk analysis method

  1. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario with upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario, leading to a distribution for the frequency of ignition of the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN for fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot-gas-layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable-tray damage time in a compartment fire experiment.
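
    The response-surface idea in the second part (replace expensive fire-code runs with a cheap fitted surrogate, then sample the surrogate for uncertainty analysis) can be sketched as follows. This is a minimal illustration, not the COMPBRN analysis itself: the toy fire_code function, the quadratic surface form, and the input ranges are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for one expensive fire-code run (hypothetical physics):
# returns a hot-gas-layer temperature (C) for a heat release rate
# q (kW) and a ventilation factor v.
def fire_code(q, v):
    return 300.0 + 0.8 * q - 15.0 * v + 0.002 * q ** 2 + rng.normal(0.0, 5.0)

# Run the "code" at a modest number of design points.
q = rng.uniform(100, 1000, 50)
v = rng.uniform(1, 10, 50)
y = np.array([fire_code(qi, vi) for qi, vi in zip(q, v)])

# Fit a quadratic response surface:
# y ~ b0 + b1*q + b2*v + b3*q^2 + b4*v^2 + b5*q*v
X = np.column_stack([np.ones_like(q), q, v, q ** 2, v ** 2, q * v])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Propagate input uncertainty through the cheap surrogate instead of
# re-running the code thousands of times.
qs = rng.normal(500, 100, 10_000)
vs = rng.uniform(2, 8, 10_000)
Xs = np.column_stack([np.ones_like(qs), qs, vs, qs ** 2, vs ** 2, qs * vs])
temps = Xs @ beta
print(f"mean T = {temps.mean():.0f} C, 95th pct = {np.percentile(temps, 95):.0f} C")
```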

  2. Bayes Method Plant Aging Risk Analysis

    1992-03-13

    DORIAN is an integrated package for performing Bayesian aging analysis of reliability data, e.g. for identifying trends in component failure rates and/or outage durations as a function of time. The user must specify several alternative hypothesized aging models (i.e. possible trends) along with prior probabilities indicating the subjective probability that each trend is actually the correct one. DORIAN then uses component failure and/or repair data over time to update these prior probabilities and develop a posterior probability for each aging model, representing the probability that each model is the correct one in light of the observed data rather than a priori. Mean, median, and 5th and 95th percentile trends are also compiled from the posterior probabilities.
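
    A minimal sketch of the kind of Bayesian update DORIAN performs over competing aging models, assuming Poisson failure counts; the two candidate trend models, the prior split, and the data are invented for illustration, and DORIAN's percentile-trend compilation is not reproduced:

```python
import numpy as np

# Hypothetical yearly failure counts for one component over ten years.
years = np.arange(10)
failures = np.array([1, 0, 2, 1, 2, 3, 2, 4, 3, 5])

# Two hypothesized aging models (failures/year); forms and parameters
# are invented for illustration.
models = {
    "no aging (constant rate)": lambda t: 1.5 + 0.0 * t,
    "linear aging trend":       lambda t: 0.5 + 0.35 * t,
}
prior = {"no aging (constant rate)": 0.5, "linear aging trend": 0.5}

# Poisson log-likelihood of the counts under each model; the k! term
# is common to both models and cancels in the comparison.
def log_like(rate_fn):
    lam = rate_fn(years)
    return float(np.sum(failures * np.log(lam) - lam))

log_post = {m: np.log(prior[m]) + log_like(f) for m, f in models.items()}
norm = np.logaddexp(*log_post.values())
for m in models:
    print(f"P({m} | data) = {np.exp(log_post[m] - norm):.3f}")
```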

  3. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and no concentration fluctuations, quite different from the real situation of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can resolve these limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not be limited to risk analysis at ground level, but can also be extended to aerial, submarine, or space risk analyses in the near future.

  4. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating, and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing, and ranking and selection procedures.

  5. Development of partial failure analysis method in probability risk assessments

    SciTech Connect

    Ni, T.; Modarres, M.

    1996-12-01

    This paper presents a new approach to evaluating the effect of partial failures in current Probability Risk Assessments (PRAs). An integrated methodology of thermal-hydraulic analysis and fuzzy-logic simulation using the Dynamic Master Logic Diagram (DMLD) was developed. The thermal-hydraulic analysis in this approach is used to identify the partial-operation effect of any PRA system function in a plant model. The DMLD is used to simulate the system performance under partial failure and to inspect all minimal cut sets of system functions. This methodology can be applied in the context of a full-scope PRA to reduce core damage frequency. An example application of the approach is presented. The partial failure data used in the example are from a survey study of partial failure effects from the Nuclear Plant Reliability Data System (NPRDS).

  6. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  7. Review of Research Trends and Methods in Nano Environmental, Health, and Safety Risk Analysis.

    PubMed

    Erbis, Serkan; Ok, Zeynep; Isaacs, Jacqueline A; Benneyan, James C; Kamarthi, Sagar

    2016-08-01

    Despite the many touted benefits of nanomaterials, concerns remain about their possible environmental, health, and safety (EHS) risks in terms of their toxicity, long-term accumulation effects, or dose-response relationships. Published studies on the EHS risks of nanomaterials have increased significantly over the past decade and a half, with most focused on nanotoxicology. Researchers are still learning about the health consequences of nanomaterials and how to make environmentally responsible decisions regarding their production. This article characterizes the scientific literature on nano-EHS risk analysis to map the state-of-the-art developments in this field and chart guidance for future directions. First, keyword co-occurrence networks are analyzed for nano-EHS literature published in the past decade to identify the intellectual turning points and research trends in nanorisk analysis studies. The exposure groups targeted in emerging nano-EHS studies are also assessed. Systems engineering methods for risk, safety, uncertainty, and system reliability analysis are reviewed, followed by detailed descriptions of how these methods are applied to analyze nanomaterial EHS risks. Finally, the trends, methods, future directions, and opportunities of systems engineering methods in nano-EHS research are discussed. The analysis of the nano-EHS literature presented in this article provides important insights on risk assessment and risk management tools associated with nanotechnology, nanomanufacturing, and nano-enabled products. PMID:26882074

  8. Handbook of methods for risk-based analysis of Technical Specification requirements

    SciTech Connect

    Samanta, P.K.; Vesely, W.E.

    1993-12-31

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgment. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can (1) quantitatively evaluate the risk impact and justify changes based on objective risk arguments and (2) provide a defensible basis for these requirements in regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability- and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.

  9. A comparative analysis of PRA and intelligent adversary methods for counterterrorism risk management.

    PubMed

    Merrick, Jason; Parnell, Gregory S

    2011-09-01

    In counterterrorism risk management decisions, the analyst can choose to represent terrorist decisions as defender uncertainties or as attacker decisions. We perform a comparative analysis of probabilistic risk analysis (PRA) methods including event trees, influence diagrams, Bayesian networks, decision trees, game theory, and combined methods on the same illustrative examples (container screening for radiological materials) to get insights into the significant differences in assumptions and results. A key tenet of PRA and decision analysis is the use of subjective probability to assess the likelihood of possible outcomes. For each technique, we compare the assumptions, probability assessment requirements, risk levels, and potential insights for risk managers. We find that assessing the distribution of potential attacker decisions is a complex judgment task, particularly considering the adaptation of the attacker to defender decisions. Intelligent adversary risk analysis and adversarial risk analysis are extensions of decision analysis and sequential game theory that help to decompose such judgments. These techniques explicitly show the adaptation of the attacker and the resulting shift in risk based on defender decisions. PMID:21418080
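
    The contrast the authors draw, between treating the attacker as a fixed uncertainty (PRA-style) and as an optimizing decision maker, can be illustrated with a toy container-screening sketch. Every number (detection probability, losses, costs, the attacker's response rule) is an assumption, not a value from the paper:

```python
# Toy container-screening example. All numbers are assumptions.
detect = 0.9                        # P(catch | container screened)
loss = 100.0                        # defender loss if an attack succeeds
screen_levels = [0.2, 0.5, 0.8]     # fraction of containers screened
cost = {0.2: 2.0, 0.5: 5.0, 0.8: 8.0}   # hypothetical screening cost

def p_success(x):
    # Attack succeeds if the container is unscreened, or screened but missed.
    return (1 - x) + x * (1 - detect)

# PRA-style: the attacker is a fixed uncertainty, P(attack) = 0.3.
p_attack = 0.3
pra = {x: cost[x] + p_attack * p_success(x) * loss for x in screen_levels}

# Intelligent-adversary style: the attacker observes x and attacks only
# if the success probability clears a (hypothetical) threshold of 0.3.
adv = {x: cost[x] + (p_success(x) > 0.3) * p_success(x) * loss
       for x in screen_levels}

for x in screen_levels:
    print(f"screen {x:.0%}:  PRA risk {pra[x]:6.1f}   adversary risk {adv[x]:6.1f}")
```

    The adversary column shows the deterrence effect the abstract describes: the attacker adapts to the defender's screening level, so risk shifts rather than scaling with a fixed attack probability.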

  10. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing, and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal, or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O, and D. A new risk assessment system based on fuzzy set theory and fuzzy rule-base theory is applied to assess and rank the risks associated with failure modes that could appear in the operation of a Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology, drawing a parallel between the RPNs obtained by the traditional method and by fuzzy logic. The results show that the proposed approach can reduce duplicated RPNs and yield a more accurate and reasonable risk assessment. As a result, the stability of the product and process can be assured.
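
    The duplicated-RPN deficiency is easy to reproduce with the traditional crisp calculation. A hypothetical example (ratings invented, the paper's fuzzy machinery not reproduced):

```python
# Crisp RPN = S * O * D for three hypothetical CNC-lathe failure modes.
modes = {
    "spindle bearing wear":  (7, 4, 5),   # (S, O, D), each on 1-10
    "tool holder loosening": (5, 7, 4),
    "coolant pump failure":  (4, 5, 7),
}
rpn = {m: s * o * d for m, (s, o, d) in modes.items()}
for m, v in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{m:22s} RPN = {v}")
# All three modes tie at RPN = 140 despite very different risk profiles:
# the duplicated-RPN problem that the fuzzy approach is meant to resolve.
```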

  11. Genotype relative risks: Methods for design and analysis of candidate-gene association studies

    SciTech Connect

    Schaid, D.J.; Sommer, S.S.

    1993-11-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. The authors present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which the authors condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. Tier 1 detects the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. Tier 2 tests for association of the abnormal variant with disease, such as by the likelihood methods presented. Tier 3 confirms positive results from tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible. 19 refs., 2 figs., 2 tabs.
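
    A naive frequency-ratio illustration of the two genotype relative risks, using the nontransmitted parental alleles as pseudo-controls, is sketched below. The counts are hypothetical, and the paper's likelihood estimation and hypothesis tests are not reproduced:

```python
# Hypothetical genotype counts: AA = two copies of the candidate allele,
# Aa = one copy, aa = none. "Controls" are pseudo-controls built from
# nontransmitted parental alleles.
cases    = {"AA": 30, "Aa": 90, "aa": 80}
controls = {"AA": 10, "Aa": 70, "aa": 120}
n_case, n_ctrl = sum(cases.values()), sum(controls.values())

def ratio(g):
    # case/control frequency ratio for genotype g
    return (cases[g] / n_case) / (controls[g] / n_ctrl)

# Risk-ratio analogues of the two genetic relative risks:
R1 = ratio("Aa") / ratio("aa")   # one copy vs. no copy
R2 = ratio("AA") / ratio("aa")   # two copies vs. no copy
print(f"R1 (heterozygote vs. non-carrier) = {R1:.2f}")
print(f"R2 (homozygote vs. non-carrier)   = {R2:.2f}")
```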

  12. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  13. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability. PMID:26310705

  14. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company in meeting rules and regulations and in assessing and describing environmental risk in a systematic manner. In the environmental risk analysis, the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition, it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  15. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making.

  16. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

    Combined effects of socio-economic, environmental, technological, and political factors impact fresh cooling-water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from a thermoelectric-system operation and regulation perspective. A systematic approach to characterizing a stream environment in relation to extreme drought occurrence, duration, and deficit-volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water-supply risk analysis and projection methods from a thermoelectric power systems operation and management perspective. The study defines thermoelectric drought as a shortage of cooling water, due to stressed supply or water temperature beyond operable limits, for an extended period of time, requiring power plants to reduce production or completely shut down. It presents a thermoelectric drought risk characterization framework that considers the heat-content and water-quantity facets of adequate water availability for uninterrupted operation of such plants and the safety of their surroundings. In addition, it outlines mechanisms to identify the rate of occurrence of such droughts and to stochastically quantify the subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound nonhomogeneous Poisson process. This study also demonstrates how
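
    A minimal sketch of the compound nonhomogeneous Poisson idea: drought events arrive with a seasonal intensity (simulated here by thinning) and each event carries a random loss. The intensity function, loss distribution, and horizon are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical seasonal intensity (droughts/year); t in years, with the
# fractional part giving the season. Peak rate in late summer.
def intensity(t):
    return 0.8 + 0.6 * np.sin(2 * np.pi * (t - 0.6))

lam_max = 1.4      # upper bound on intensity(t), required by thinning
horizon = 30.0     # planning horizon in years
n_sims = 5_000
totals = np.empty(n_sims)

for i in range(n_sims):
    # Lewis-Shedler thinning: simulate at the bounding rate, keep each
    # candidate event with probability intensity(t) / lam_max.
    t, n_events = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam_max)
        if t > horizon:
            break
        if rng.uniform() < intensity(t) / lam_max:
            n_events += 1
    # Compound part: a hypothetical lognormal loss per drought event.
    totals[i] = rng.lognormal(mean=1.0, sigma=0.8, size=n_events).sum()

print(f"mean 30-yr loss = {totals.mean():.1f}, "
      f"95th percentile = {np.percentile(totals, 95):.1f}")
```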

  17. Risk analysis methods and techniques for veterinary biologicals used in Australia.

    PubMed

    Owusu, J

    1995-12-01

    Advances in modern science and technology and the globalisation of the veterinary manufacturing industry, coupled with the relaxation of trade restrictions by the General Agreement on Tariffs and Trade treaty on sanitary and phytosanitary measures, call for an international approach to standards of acceptable risk and risk analysis methodology. In Australia, different elements of risk analysis are undertaken by different agencies. The agencies employ screening risk assessment, which uses simple worst-case scenarios and conservative data to set priorities and identify issues of limited risk. The approach is multi-factorial, assessing risk to public health, animals and the environment. The major components of the analysis process are risk assessment, risk management, and procedures for communicating and monitoring risk. The author advocates the possible use of quantitative risk assessment, based on acceptable international standards, in making international trade decisions. In the absence of acceptable international standards, it is proposed that countries adopt mutual recognition of comparable standards and specifications employed by national agencies.

  18. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges that a human servicing mission be flown instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Columbia environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) that O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  1. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, covering five elements: evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision. The model relates these elements to the domains of experts and decisionmakers, and to fact-based and value-based domains. We conclude that risk analysis is a scientific field of study when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part).

  2. Does simplifying transport and exposure yield reliable results? An analysis of four risk assessment methods.

    PubMed

    Zhang, Q; Crittenden, J C; Mihelcic, J R

    2001-03-15

    Four approaches for predicting the risk of chemicals to humans and fish under different scenarios were compared to investigate whether it is appropriate to simplify risk evaluations in situations where an individual is making environmentally conscious manufacturing decisions or interpreting toxics release inventory (TRI) data: (1) the relative risk method, that compares only a chemical's relative toxicity; (2) the toxicity persistence method, that considers a chemical's relative toxicity and persistence; (3) the partitioning, persistence toxicity method, that considers a chemical's equilibrium partitioning to air, land, water, and sediment, persistence in each medium, and its relative toxicity; and (4) the detailed chemical fate and toxicity method, that considers the chemical's relative toxicity, and realistic attenuation mechanisms such as advection, mass transfer and reaction in air, land, water, and sediment. In all four methods, the magnitude of the risk was estimated by comparing the risk of the chemical's release to that of a reference chemical. Three comparative scenarios were selected to evaluate the four approaches for making pollution prevention decisions: (1) evaluation of nine dry cleaning solvents, (2) evaluation of four reaction pathways to produce glycerine, and (3) comparison of risks for the chemical manufacturing and petroleum industry. In all three situations, it was concluded that ignoring or simplifying exposure calculations is not appropriate, except in cases where either the toxicity was very great or when comparing chemicals with similar fate. When the toxicity is low to moderate and comparable for chemicals, the chemicals' fate influences the results; therefore, we recommend using a detailed chemical fate and toxicity method because the fate of chemicals in the environment is assessed with consideration of more realistic attenuation mechanisms than the other three methods. In addition, our study shows that evaluating the risk associated

  3. The haplotype-relative-risk (HRR) method for analysis of association in nuclear families

    SciTech Connect

    Knapp, M.; Seuchter, S.A.; Baur, M.P.

    1993-06-01

    One major problem in studying an association between a marker locus and a disease is the selection of an appropriate group of controls. However, this problem of population stratification can be circumvented in a quite elegant manner by family-based methods. The haplotype-relative-risk (HRR) method, which samples nuclear families with a single affected child and uses the parental haplotypes not transmitted to that child as a control individual, represents such a method for estimating the relative risk of a marker phenotype. In the special case of a recessive disease, it was already known that the equivalence of the HRR method with the classical relative risk (RR) obtained from independent samples holds only if the probability θ of a recombination between marker and disease locus is zero. The authors extend this result to an arbitrary mode of inheritance. Furthermore, they compare the distribution of the estimators for HRR and RR and show that, in the case of a positive linkage disequilibrium between a marker and disease allele, the distribution of the estimator for HRR is (stochastically) smaller than that for RR, irrespective of the recombination fraction. The practical implication of this result is that, for the HRR method, there is no tendency to give unduly high risk estimates, even for θ > 0. Finally, the authors give an expression for the standard error of the estimator for HRR by taking into account the nonindependence of transmitted and nontransmitted parental marker alleles in the case of θ > 0. 16 refs., 3 tabs.
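
    A toy calculation in the spirit of the HRR, treating transmitted parental marker alleles as cases and the nontransmitted alleles as internal controls, written in one common odds-ratio form. The counts are hypothetical, and the paper's handling of recombination (θ > 0) and the standard-error expression are not reproduced:

```python
# Hypothetical counts of parental marker alleles: those transmitted to
# the affected child act as "cases", the nontransmitted ones as the
# internal "controls".
transmitted    = {"M1": 120, "M2": 80}    # M1 = allele of interest
nontransmitted = {"M1": 70,  "M2": 130}

f_t  = transmitted["M1"] / sum(transmitted.values())
f_nt = nontransmitted["M1"] / sum(nontransmitted.values())

# Odds-ratio form of the transmitted/nontransmitted comparison.
hrr = (f_t / (1 - f_t)) / (f_nt / (1 - f_nt))
print(f"HRR-style estimate for allele M1: {hrr:.2f}")
```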

  4. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach toward the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance, and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would prevent a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to its smaller cells were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600 and 700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years in the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels), and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study of PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  5. A method for risk analysis across governance systems: a Great Barrier Reef case study

    NASA Astrophysics Data System (ADS)

    Dale, Allan; Vella, Karen; Pressey, Robert L.; Brodie, Jon; Yorkston, Hugh; Potts, Ruth

    2013-03-01

    Healthy governance systems are key to delivering sound environmental management outcomes from global to local scales. There are, however, surprisingly few risk assessment methods that can pinpoint those domains and sub-domains within governance systems that are most likely to influence good environmental outcomes at any particular scale, or those that, if absent or dysfunctional, are most likely to prevent effective environmental management. This paper proposes a new risk assessment method for analysing governance systems. This method is then tested through its preliminary application to a significant real-world context: governance as it relates to the health of Australia's Great Barrier Reef (GBR). The GBR exists at a supra-regional scale along most of the northeastern coast of Australia. Brodie et al (2012 Mar. Pollut. Bull. 65 81-100) have recently reviewed the state and trend of the health of the GBR, finding that overall trends remain of significant concern. At the same time, official international concern over the governance of the reef has recently been signalled globally by the International Union for Conservation of Nature (IUCN). These environmental and political contexts make the GBR an ideal candidate for testing and reviewing the application of improved tools for governance risk assessment.

  6. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident-scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity of both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.

  7. Method for improved prediction of bone fracture risk using bone mineral density in structural analysis

    NASA Technical Reports Server (NTRS)

    Cann, Christopher E. (Inventor); Faulkner, Kenneth G. (Inventor)

    1992-01-01

    A non-invasive, in-vivo method of analyzing a bone for fracture risk includes obtaining data from the bone, such as by computed tomography or projection imaging, which represent a measure of bone material characteristics such as bone mineral density. The distribution of the bone material characteristics is used to generate a finite element method (FEM) mesh from which the load capability of the bone can be determined. In determining load capability, the bone is mathematically compressed, and stress, strain, force, and force/area versus bone material characteristics are determined.

  8. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.; Mutlu, S.

    2015-06-01

    A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide hazards must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these hazards separately and then combining the analyses would ensure a more realistic risk evaluation according to population density than analyzing several risks based on a single parameter. In this study, an integrated seismic risk analysis of central Eskişehir was performed based on two earthquake related parameters, liquefaction and amplification. The analysis used a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geological and the topographical conditions of the region. According to the integrated seismic risk analysis of the Eskişehir residential area, the populated area is found to be generally at medium to high risk during a potential earthquake.
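
    A minimal sketch of the simple weighting method on a toy grid of map cells; the hazard scores, the equal weights, and the risk-band cut-offs are assumptions, not the study's values:

```python
import numpy as np

# Hypothetical normalized hazard scores (0-1) on a 3x3 grid of map cells.
liquefaction = np.array([[0.2, 0.7, 0.9],
                         [0.1, 0.5, 0.8],
                         [0.0, 0.3, 0.6]])
amplification = np.array([[0.4, 0.6, 0.8],
                          [0.2, 0.5, 0.9],
                          [0.1, 0.2, 0.7]])

# Simple weighting: integrated score = w1*L + w2*A (equal weights assumed).
w1, w2 = 0.5, 0.5
score = w1 * liquefaction + w2 * amplification

# Classify each cell into risk bands: 0 = low, 1 = medium, 2 = high.
bands = np.digitize(score, bins=[0.33, 0.66])
print(score.round(2))
print(bands)
```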

  9. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions

    PubMed Central

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R

    2005-01-01

    Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process
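
    The criticality-index arithmetic behind the FMECA comparison can be sketched as follows; the failure modes and scores are hypothetical placeholders, not the study's 18 modes or its published CI values:

```python
# FMECA criticality index CI = occurrence x severity x detection,
# each on an ordinal scale. Modes and scores are hypothetical.
failure_modes = {
    #                          old (O, S, D)   new (O, S, D)
    "re-transcription error":  ((6, 9, 6),     (1, 9, 1)),
    "labelling mistake":       ((4, 8, 5),     (4, 8, 4)),
    "microbial contamination": ((2, 9, 4),     (2, 9, 4)),
}

def ci(scores):
    o, s, d = scores
    return o * s * d

for which, idx in (("old", 0), ("new", 1)):
    total = sum(ci(m[idx]) for m in failure_modes.values())
    print(f"{which} process: total CI = {total}")

for name, (old, new) in failure_modes.items():
    print(f"{name}: CI reduced by a factor of {ci(old) / ci(new):.1f}")
```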

  10. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  11. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  12. Survival analysis in total joint replacement: an alternative method of accounting for the presence of competing risk.

    PubMed

    Fennema, P; Lubsen, J

    2010-05-01

    Survival analysis is an important tool for assessing the outcome of total joint replacement. The Kaplan-Meier method is used to estimate the incidence of revision of a prosthesis over time, but does not account appropriately for competing events which preclude revision. In the presence of competing death, this method will lead to statistical bias and the curve will lose its interpretability. A valid comparison of survival results between studies using the method is impossible without accounting for different rates of competing events. An alternative and easily applicable approach, the cumulative incidence of competing risk, is proposed. Using three simulated data sets and realistic data from a cohort of 406 consecutive cementless total hip prostheses, followed up for a minimum of ten years, both approaches were compared and the magnitude of potential bias was highlighted. The Kaplan-Meier method overestimated the incidence of revision by almost 4% (60% relative difference) in the simulations and more than 1% (31.3% relative difference) in the realistic data set. The cumulative incidence of competing risk approach allows for appropriate accounting of competing risk and, as such, offers an improved ability to compare survival results across studies.
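
    The bias the authors describe can be demonstrated with a short simulation that compares the naive 1 - Kaplan-Meier estimate (death treated as censoring) against the cumulative incidence function. The hazards, follow-up, and sample size are invented; only the estimators are standard:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated follow-up (years): time to revision, time to death (the
# competing event), administrative censoring at 15 years. Hazards are
# invented for illustration.
n = 2000
t_rev, t_death = rng.exponential(40.0, n), rng.exponential(20.0, n)
time = np.minimum.reduce([t_rev, t_death, np.full(n, 15.0)])
event = np.select([t_rev == time, t_death == time], [1, 2], default=0)

order = np.argsort(time)
time, event = time[order], event[order]
at_risk = np.arange(n, 0, -1)            # number still at risk

# 1 - Kaplan-Meier with death treated as censoring (the biased estimate).
km_rev = 1 - np.cumprod(1 - (event == 1) / at_risk)

# Cumulative incidence (Aalen-Johansen): sum of S(t-) * dN_rev / Y,
# where S is event-free survival from either cause.
surv_any = np.cumprod(1 - (event > 0) / at_risk)
s_prev = np.concatenate([[1.0], surv_any[:-1]])
cif_rev = np.cumsum(s_prev * (event == 1) / at_risk)

print(f"revision by 15 y: 1-KM = {km_rev[-1]:.3f}, CIF = {cif_rev[-1]:.3f}")
```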

  13. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  14. Methods for Multitemporal Analysis of Satellite Data Aimed at Environmental Risk Monitoring

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Scognamiglio, A.

    2012-08-01

    In recent years the topic of environmental monitoring has taken on particular importance, owing in part to the reduced short-term stability and predictability of climatic events. Facing this situation, often as an emergency, involves high and unpredictable costs for public agencies. Preventing the damage caused by natural disasters is not only a matter of weather forecasts; it requires constant attention and the practice of monitoring and controlling human activity on the territory. In practice, the problem is not knowing whether and when an event will affect a given area, but recognizing the possible damage if such an event happened, adopting adequate measures to reduce it to a minimum, and securing the tools necessary for a timely intervention. On the other hand, surveying technologies should be as accurate and updatable as possible in order to guarantee high standards, which involves the analysis of a great amount of data. Managing such data requires integration and calculation systems with specialized software and fast, reliable connection and communication networks. To meet these requirements, current satellite technology, with recurrent data acquisition for the timely generation of cartographic products that are updated and coherent with the territorial investigation, offers the possibility of filling the temporal gap between the need for urgent information and official reference information. Among advanced image-processing techniques, change detection analysis is useful for identifying temporal variations in the environment, reducing user intervention through process automation and progressively improving the qualitative and quantitative accuracy of the results. The research investigates automatic methods for detecting land-cover transformations by means of change detection techniques applied to satellite data that are heterogeneous in spatial and spectral resolution, with homogenization and registration in a unique

  15. Methods for evaluating Lyme disease risks using geographic information systems and geospatial analysis.

    PubMed

    Nicholson, M C; Mather, T N

    1996-09-01

    Lyme disease is a tick-transmitted borreliosis of humans and domestic animals emerging as one of the most significant threats to public health in north temperate regions of the world. However, despite a myriad of studies into symptomology, causes, and treatment of the disease, few researchers have addressed the spatial aspects of Lyme disease transmission. Using statewide data collected in Rhode Island (United States) as a test case, we demonstrated that exposure to deer ticks and the risk of contracting Lyme disease occurs mostly in the peridomestic environment. A Geographic Information System model was developed indicating a strong association among Lyme disease in humans, the degree of nymphal blacklegged tick, Ixodes scapularis Say, abundance in the environment, and prevalence of Borrelia burgdorferi infection in ticks. In contrast, occurrence of plant communities suitable for sustaining I. scapularis populations (forests) was not predictive of Lyme disease risk. Instead, we observed a highly significant spatial trend for decreasing number of ticks and incident cases of Lyme disease with increasing latitude. Geostatistics were employed for modeling spatial autocorrelation of tick densities. These findings were combined to create a model that predicts Lyme disease transmission risk, thereby demonstrating the utility of incorporating geospatial modeling techniques in studying the epidemiology of Lyme disease. PMID:8840676

  16. Risk/Stress Analysis.

    ERIC Educational Resources Information Center

    Schwerdtfeger, Don; Howell, Richard E.

    1986-01-01

    Identifies stress as a definite health hazard and risk factor involved in a variety of health situations. Proposes that stress identification efforts be considered in environmental analysis so that a more complete approach to risk assessment and management and health hazard prevention can occur. (ML)

  17. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to the Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
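
    A minimal numerical sketch of the MRISK combination; the consequence vector and the covariance matrix are invented, and only the Mahalanobis formula and its diagonal-covariance reduction come from the description above:

```python
import numpy as np

# Hypothetical normalized consequence scores for one risk on three
# dimensions (e.g., cost, schedule, safety), and an assumed covariance
# of those dimensions across the risk portfolio.
x = np.array([0.6, 0.7, 0.3])
cov = np.array([[0.10, 0.06, 0.00],    # cost and schedule correlated
                [0.06, 0.10, 0.00],
                [0.00, 0.00, 0.10]])

# Mahalanobis combination of the consequence dimensions.
mrisk = float(np.sqrt(x @ np.linalg.inv(cov) @ x))

# With a diagonal covariance the same formula reduces to a Euclidean
# distance normalized by the per-dimension variances.
diag = np.diag(np.diag(cov))
reduced = float(np.sqrt(x @ np.linalg.inv(diag) @ x))

print(f"MRISK score (correlated dims)  = {mrisk:.2f}")
print(f"score with correlations zeroed = {reduced:.2f}")
```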

  18. Analysis of the LaSalle Unit 2 nuclear power plant: Risk Methods Integration and Evaluation Program (RMIEP). Volume 8, Seismic analysis

    SciTech Connect

    Wells, J.E.; Lappa, D.A.; Bernreuter, D.L.; Chen, J.C.; Chuang, T.Y.; Johnson, J.J.; Campbell, R.D.; Hashimoto, P.S.; Maslenikov, O.R.; Tiong, L.W.; Ravindra, M.K.; Kincaid, R.H.; Sues, R.H.; Putcha, C.S.

    1993-11-01

    This report describes the methodology used and the results obtained from the application of a simplified seismic risk methodology to the LaSalle County Nuclear Generating Station Unit 2. This study is part of the Level I analysis being performed by the Risk Methods Integration and Evaluation Program (RMIEP). Using the RMIEP-developed event and fault trees, the analysis resulted in a seismically induced core damage frequency point estimate of 6.0E-7/yr. This result, combined with the component importance analysis, indicated that system failures were dominated by random events. The dominant components included diesel generator failures (failure to swing, failure to start, failure to run once started) and the condensate storage tank.

  19. Recasting risk analysis methods in terms of object-oriented modeling techniques

    SciTech Connect

    Wyss, G.D.; Craft, R.L.; Vandewart, R.L.; Funkhouser, D.R.

    1998-08-01

    For more than two decades, risk analysts have relied on powerful logic-based models to perform their analyses. However, the applicability of these models has been limited because they can be complex and expensive to develop. Analysts must frequently start from scratch when analyzing a new (but similar) system because the understanding of how the system works exists only in the mind of the analyst and is only incompletely instantiated in the actual logic model. This paper introduces the notion of using explicit object-oriented system models, such as those embodied in computer-aided software engineering (CASE) tools, to document the analyst's understanding of the system and appropriately capture how the system works. It also shows that from these models, standard assessment products, such as fault trees and event trees, can be automatically derived.

  20. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.
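
    As a hedged illustration of the elicitation-and-encoding step the handbook describes, the sketch below encodes an expert's (low, most likely, high) estimates as triangular distributions, one of the common distribution types, and propagates them through a simple additive schedule model by Monte Carlo. The activities and numbers are invented.

```python
import random

rng = random.Random(42)
activities = {            # (low, most likely, high) durations in weeks
    "design":      (4, 6, 10),
    "fabrication": (8, 12, 20),
    "test":        (3, 5, 9),
}
n = 10_000
totals = sorted(
    sum(rng.triangular(lo, hi, ml) for lo, ml, hi in activities.values())
    for _ in range(n)
)
print("median total duration:", totals[n // 2])
print("90th percentile:", totals[int(0.9 * n)])
```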

  1. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). New methods based on synergetic, physical, and computational models are presented, and four approaches are considered. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered with respect to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for TPS is presented considering some deterministic and stochastic factors. The last approach concerns results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment which provides alignment of temperature fields in a product for the formation of equal-strength welded joints is

  2. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% is due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek was flooded again, killing 31 people over two days, and insurance firms paid around €150 million for damage claims. To solve these kinds of problems, modern tools such as GIS and Remote Sensing should be utilized. In this study, a software package was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing an object-oriented nearest-neighbor classification by image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of the software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service-Curve Number (SCS-CN) method and used as input for the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly
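
    A minimal sketch of the AHP weighting step for the five criteria named in the abstract; the pairwise comparison matrix is invented, and the geometric-mean approximation stands in for the full eigenvector computation.

```python
import numpy as np

criteria = ["slope", "aspect", "elevation", "geology", "land use"]
# A[i, j] = how much more important criterion i is than criterion j
# (Saaty's 1-9 scale); the matrix must be reciprocal: A[j, i] = 1/A[i, j].
A = np.array([
    [1,   3,   2,   4,   1/2],
    [1/3, 1,   1/2, 2,   1/4],
    [1/2, 2,   1,   3,   1/3],
    [1/4, 1/2, 1/3, 1,   1/5],
    [2,   4,   3,   5,   1  ],
])
# Geometric-mean (logarithmic least squares) approximation of the
# principal eigenvector, normalized to sum to 1.
w = np.prod(A, axis=1) ** (1 / len(A))
w /= w.sum()
for name, weight in zip(criteria, w):
    print(f"{name:10s} {weight:.3f}")
```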

  3. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas are facing many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established aiming at evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which China's key source water area, the Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have high risk values in terms of agricultural pollution. Overall, the risk values of the northern parts of the region of Shiyan, close to the main stream and the reservoir, were higher than those in the south. The results of risk level indicated that five sources were at lower risk level (i.e., level II), two at moderate risk level (i.e., level III), one at higher risk level (i.e., level IV) and three at highest risk level (i.e., level V). Also, risks from industrial discharge are higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. PMID:27016678
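
    A minimal sketch of the entropy weight method mentioned in the abstract, under the assumption of a small invented indicator matrix (rows are pollution sources, columns are risk indicators); the k-means and set pair analysis stages are omitted.

```python
import numpy as np

X = np.array([            # raw indicator values, higher = riskier
    [120.0, 0.8, 3.1],
    [ 40.0, 0.2, 1.0],
    [ 95.0, 0.5, 2.2],
    [ 10.0, 0.9, 0.4],
])
P = X / X.sum(axis=0)                      # column-normalized proportions
k = 1.0 / np.log(len(X))
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = np.where(P > 0, P * np.log(P), 0.0)
e = -k * plogp.sum(axis=0)                 # entropy of each indicator
w = (1 - e) / (1 - e).sum()                # more dispersion -> more weight
risk_scores = (X / X.max(axis=0)) @ w      # simple weighted risk index
print("weights:", np.round(w, 3))
print("scores:", np.round(risk_scores, 3))
```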

  4. Targeted assets risk analysis.

    PubMed

    Bouwsema, Barry

    2013-01-01

    Risk assessments utilising the consolidated risk assessment process as described by Public Safety Canada and the Centre for Security Science utilise the five threat categories of natural, human accidental, technological, human intentional and chemical, biological, radiological, nuclear or explosive (CBRNE). The categories of human intentional and CBRNE indicate intended actions against specific targets. It is therefore necessary to be able to identify which pieces of critical infrastructure represent the likely targets of individuals with malicious intent. Using the consolidated risk assessment process and the target capabilities list, coupled with the CARVER methodology and a security vulnerability analysis, it is possible to identify these targeted assets and their weaknesses. This process can help emergency managers to identify where resources should be allocated and funding spent. Targeted Assets Risk Analysis (TARA) presents a new opportunity to improve how risk is measured, monitored, managed and minimised through the four phases of emergency management, namely, prevention, preparation, response and recovery. To reduce risk throughout Canada, Defence Research and Development Canada is interested in researching the potential benefits of a comprehensive approach to risk assessment and management. The TARA provides a framework against which potential human intentional threats can be measured and quantified, thereby improving safety for all Canadians.

  6. [Groundwater pollution risk mapping method].

    PubMed

    Shen, Li-na; Li, Guang-he

    2010-04-01

    Existing methods for groundwater vulnerability assessment do not take contamination-source elements into account, and systematic, effective techniques and parameter systems for groundwater pollution risk mapping are currently lacking. In this paper, by analyzing the structure of the groundwater system and the characteristics of contaminant sources, and by coupling intrinsic groundwater vulnerability with contaminant sources, integrated multi-index models were developed to evaluate the risk sources of groundwater contamination and to produce groundwater pollution risk maps. The models were applied to a large-scale karst groundwater source in northern China as a case study. The results indicated that overlaying the vulnerability assessment with pollution risk sources could effectively identify the high-risk regions of groundwater pollution, and the method may provide necessary support for the supervision of groundwater pollution.

  7. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  8. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
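
    The contrast the paper draws can be reproduced on synthetic data, as sketched below: a logistic regression reports average effects, while a shallow decision tree (standing in for the paper's unspecified rule learner) recovers the interaction-defined subgroup. The feature names and the withdrawal mechanism are invented.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2000
age = rng.uniform(0, 10, n)            # hypothetical risk factors
distance = rng.uniform(0, 100, n)      # e.g., distance to the study site
# Withdrawal concentrated in one subgroup: young child AND far away.
p = np.where((age < 3) & (distance > 60), 0.6, 0.1)
y = rng.random(n) < p
X = np.column_stack([age, distance])

logit = LogisticRegression().fit(X, y)
print("average effects (coefficients):", logit.coef_)

tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(tree, feature_names=["age", "distance"]))
```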

  9. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study

    PubMed Central

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F.; Vehik, Kendra; Huang, Shuai; Rewers, Marian; Barriga, Katherine; Baxter, Judith; Eisenbarth, George; Frank, Nicole; Gesualdo, Patricia; Hoffman, Michelle; Norris, Jill; Ide, Lisa; Robinson, Jessie; Waugh, Kathleen; She, Jin-Xiong; Schatz, Desmond; Hopkins, Diane; Steed, Leigh; Choate, Angela; Silvis, Katherine; Shankar, Meena; Huang, Yi-Hua; Yang, Ping; Wang, Hong-Jie; Leggett, Jessica; English, Kim; McIndoe, Richard; Dequesada, Angela; Haller, Michael; Anderson, Stephen W.; Ziegler, Anette G.; Boerschmann, Heike; Bonifacio, Ezio; Bunk, Melanie; Försch, Johannes; Henneberger, Lydia; Hummel, Michael; Hummel, Sandra; Joslowski, Gesa; Kersting, Mathilde; Knopff, Annette; Kocher, Nadja; Koletzko, Sibylle; Krause, Stephanie; Lauber, Claudia; Mollenhauer, Ulrike; Peplow, Claudia; Pflüger, Maren; Pöhlmann, Daniela; Ramminger, Claudia; Rash-Sur, Sargol; Roth, Roswith; Schenkel, Julia; Thümer, Leonore; Voit, Katja; Winkler, Christiane; Zwilling, Marina; Simell, Olli G.; Nanto-Salonen, Kirsti; Ilonen, Jorma; Knip, Mikael; Veijola, Riitta; Simell, Tuula; Hyöty, Heikki; Virtanen, Suvi M.; Kronberg-Kippilä, Carina; Torma, Maija; Simell, Barbara; Ruohonen, Eeva; Romo, Minna; Mantymaki, Elina; Schroderus, Heidi; Nyblom, Mia; Stenius, Aino; Lernmark, Åke; Agardh, Daniel; Almgren, Peter; Andersson, Eva; Andrén-Aronsson, Carin; Ask, Maria; Karlsson, Ulla-Marie; Cilio, Corrado; Bremer, Jenny; Ericson-Hallström, Emilie; Gard, Thomas; Gerardsson, Joanna; Gustavsson, Ulrika; Hansson, Gertie; Hansen, Monica; Hyberg, Susanne; Håkansson, Rasmus; Ivarsson, Sten; Johansen, Fredrik; Larsson, Helena; Lernmark, Barbro; Markan, Maria; Massadakis, Theodosia; Melin, Jessica; Månsson-Martinez, Maria; Nilsson, Anita; Nilsson, Emma; Rahmati, Kobra; Rang, Sara; Järvirova, Monica Sedig; Sibthorpe, Sara; Sjöberg, Birgitta; Törn, Carina; Wallin, Anne; Wimar, Åsa; Hagopian, William A.; Yan, Xiang; Killian, Michael; Crouch, Claire Cowen; Hay, Kristen M.; Ayres, Stephen; Adams, Carissa; Bratrude, Brandi; Fowler, Greer; Franco, Czarina; Hammar, Carla; Heaney, Diana; Marcus, Patrick; Meyer, Arlene; Mulenga, Denise; Scott, Elizabeth; Skidmore, Jennifer; Small, Erin; Stabbert, Joshua; Stepitova, Viktoria; Becker, Dorothy; Franciscus, Margaret; Dalmagro-Elias Smith, MaryEllen; Daftary, Ashi; Krischer, Jeffrey P.; Abbondondolo, Michael; Ballard, Lori; Brown, Rasheedah; Cuthbertson, David; Eberhard, Christopher; Gowda, Veena; Lee, Hye-Seung; Liu, Shu; Malloy, Jamie; McCarthy, Cristina; McLeod, Wendy; Smith, Laura; Smith, Stephen; Smith, Susan; Uusitalo, Ulla; Yang, Jimin; Akolkar, Beena; Briese, Thomas; Erlich, Henry; Oberste, Steve

    2016-01-01

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions. PMID:27561809

  11. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis.
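
    A hedged sketch of the augmentation step: once an ordinal ranking exists, eliciting how many times riskier each item is than the next and chaining those multipliers yields a ratio scale on which stand-out risks become visible. The items and multipliers are invented; the article's actual elicitation protocol may differ.

```python
ranking = ["terrorism", "narcotics", "counterfeits", "invasive species"]
adjacent_ratio = [3.0, 1.2, 2.0]   # item i is r times riskier than item i+1

scores = [1.0]                      # build ratio-scale scores bottom-up
for r in reversed(adjacent_ratio):
    scores.insert(0, scores[0] * r)

total = sum(scores)
for name, s in zip(ranking, scores):
    print(f"{name:17s} {s:5.2f}  ({s / total:.0%} of total)")
# A large gap after the first item would mark it as a stand-out risk.
```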

  12. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  13. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency would be analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate the entropy method performs best in quantifying the distribution of flood duration. Bivariate hydrologic risk would then be generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for the flood volume less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without significant decrease of the bivariate risk for flood peak-duration.
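
    The joint ("AND") return period at the heart of the bivariate risk framework can be sketched as below with a Gumbel-Hougaard copula; the marginal non-exceedance probabilities and the dependence parameter are invented, and the study's entropy-based marginal fitting is not reproduced.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence."""
    return math.exp(-(((-math.log(u)) ** theta +
                       (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(u, v, theta, mu=1.0):
    """Mean recurrence (years) of peak AND volume both being exceeded;
    mu is the mean inter-arrival time of floods (1 for annual maxima)."""
    return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

u = 0.99   # non-exceedance probability of a 100-year flood peak
v = 0.98   # non-exceedance probability of a 50-year flood volume
print(and_return_period(u, v, theta=2.5))   # joint "AND" return period
```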

  14. [Cost-effectiveness analysis of preventive methods for occlusal surface according to caries risk: results of a controlled clinical trial].

    PubMed

    Tagliaferro, Elaine Pereira da Silva; Marinho, Daniel Savignon; Pereira, Claudia Cristina de Aguiar; Pardi, Vanessa; Ambrosano, Gláucia Maria Bovi; Meneghim, Marcelo de Castro; Pereira, Antonio Carlos

    2013-11-01

    This study presents the results of a cost-effectiveness analysis in a controlled clinical trial on the effectiveness of a modified glass ionomer resin sealant (Vitremer, 3M ESPE) and the application of fluoride varnish (Duraphat, Colgate) on occlusal surfaces of first permanent molars in children 6-8 years of age (N = 268), according to caries risk (high versus low). Children were examined semiannually by the same calibrated dentist for 24 months after allocation in six groups: high and low risk controls (oral health education every three months); high and low risk with varnish (oral health education every three months + varnish biannually); and high and low risk with sealant (oral health education every three months + a single application of sealant). Economic analysis showed that sealing permanent first molars of high-risk schoolchildren had a C/E ratio of US$ 119.80 per saved occlusal surface and an incremental C/E ratio of US$ 108.36 per additional saved occlusal surface. The study concluded that sealing permanent first molars of high-risk schoolchildren was the most cost-effective intervention.
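
    The cost-effectiveness quantities reported above follow from simple arithmetic, sketched here with invented placeholder numbers rather than the study's data.

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Extra cost per additional unit of effect versus the comparator."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Invented placeholders: education-only control vs. education + sealant.
cost_control, saved_control = 1500.0, 40.0   # $ and saved surfaces
cost_sealant, saved_sealant = 4200.0, 65.0
print(f"C/E (sealant): {cost_sealant / saved_sealant:.2f} $/surface")
print(f"ICER vs control: "
      f"{icer(cost_sealant, saved_sealant, cost_control, saved_control):.2f}"
      " $/surface")
```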

  16. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy

  17. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation

  19. [The management of risks by the global risk analysis].

    PubMed

    Desroches, A

    2013-05-01

    After a reminder of the fundamental concepts of risk management, the author describes global risk analysis (AGR), the name given by the author to the updated APR method which, after several changes to the initial process, aims to cover a broader perimeter of analysis and management, at the level of both structural and business risks of any kind, throughout the system development life cycle, from the feasibility study to dismantling.

  20. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

    The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach. However, the advantages of quantitative risk assessment are achieved at a greater expenditure of money, time, and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources to minimize risk in an expeditious and fiscally responsible manner.

  1. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways located in urban and intercity settings are generally prone to many kinds of natural disaster risks. Natural hazards and disasters that may occur from the highway design stage through construction and operation, and later during maintenance and repair, have to be taken into consideration. Assessing the risks posed by such adverse situations is very important for project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of probable hazards on the highways. However, the assets at risk and the impacts of the events must also be examined and rated in their own right. With these activities realized, intended improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economical and effective solution. Besides guiding the measures taken against the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows: • Natural disasters 1. Meteorological natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.). 2. Geological natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.) • Human-originated disasters 1. Transport accidents (traffic accidents) originating from road surface defects (icing
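
    A minimal sketch of an FMEA-style prioritization of highway hazards of the kind the abstract describes: each failure mode receives severity, occurrence, and detection ratings, and the Risk Priority Number RPN = S x O x D orders the list. Hazards and ratings are invented.

```python
hazards = [
    # (failure mode,                     severity, occurrence, detection)
    ("flood over carriageway",                  8,          5,         4),
    ("rockfall onto roadway",                   9,          3,         6),
    ("pavement icing",                          6,          7,         3),
    ("landslide undermining embankment",       10,          2,         7),
]
# Rank hazards by descending Risk Priority Number.
for name, s, o, d in sorted(hazards, key=lambda h: -(h[1] * h[2] * h[3])):
    print(f"RPN {s * o * d:4d}  {name}")
```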

  2. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict the inflows, capturing not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of failing forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the one actually operated and a scenario-optimization scheme, are evaluated in terms of flood risk and hydropower profit. With the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and most risks come from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias low for reservoir operational purposes.
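
    The lead-time stage of the method can be sketched as follows: the risk is the fraction of ensemble inflow scenarios under which a routed reservoir level exceeds a critical value. The toy mass-balance routing and all parameters below are invented and are far simpler than actual TGR routing.

```python
import numpy as np

rng = np.random.default_rng(1)
n_scenarios, horizon = 1000, 72          # 72 hourly steps of lead time
inflow = rng.lognormal(mean=8.0, sigma=0.4, size=(n_scenarios, horizon))

level0, critical = 160.0, 162.0          # reservoir levels (m)
release = 2500.0                         # constant release (m3/s)
m3s_to_m = 3600.0 / 1.0e8                # 1 h of net inflow over 1e8 m2

levels = level0 + np.cumsum((inflow - release) * m3s_to_m, axis=1)
failed = levels.max(axis=1) > critical   # scenario breaches critical level
print("lead-time flood risk:", failed.mean())
```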

  3. [Comparative analysis of two different methods for risk assessment of groundwater pollution: a case study in Beijing plain].

    PubMed

    Wang, Hong-na; He, Jiang-tao; Ma, Wen-jie; Xu, Zhen

    2015-01-01

    Groundwater contamination risk assessment is important for groundwater contamination prevention planning and for evaluating groundwater exploitation potential. Recently, the UN assessment system and the WP assessment system have become focuses of international research. In both systems, the assessment framework and indices are drawn from five aspects: intrinsic vulnerability, aquifer storage, groundwater quality, groundwater resource protection zones, and contamination load. However, the five factors are combined in different ways. In order to expound the differences between the UN and WP assessment systems, and to explain the main reasons for them, both systems were applied to the Beijing Plain, China. The maps constructed from the UN and WP risk assessment systems were compared. The results showed that both groundwater contamination risk assessment maps were in accordance with actual conditions and were similar in their spatial distribution trends. However, the coverage areas at the same risk level differed quite significantly. This also revealed that, during system construction, the structural hierarchy, the overlay principles, and the classification method can affect the resulting groundwater contamination risk assessment map. The UN and WP assessment systems are both suitable for groundwater contamination risk assessment of the plain; however, their emphases differ.

  4. Comparison of nonlinear methods symbolic dynamics, detrended fluctuation, and Poincaré plot analysis in risk stratification in patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Voss, Andreas; Schroeder, Rico; Truebner, Sandra; Goernig, Matthias; Figulla, Hans Reiner; Schirdewan, Alexander

    2007-03-01

    Dilated cardiomyopathy (DCM) has an incidence of about 20/100 000 new cases per annum and accounts for nearly 10 000 deaths per year in the United States. Approximately 36% of patients with dilated cardiomyopathy (DCM) suffer from cardiac death within five years after diagnosis. Currently applied methods for an early risk prediction in DCM patients are rather insufficient. The objective of this study was to investigate the suitability of short-term nonlinear methods symbolic dynamics (STSD), detrended fluctuation (DFA), and Poincaré plot analysis (PPA) for risk stratification in these patients. From 91 DCM patients and 30 healthy subjects (REF), heart rate and blood pressure variability (HRV, BPV), STSD, DFA, and PPA were analyzed. Measures from BPV analysis, DFA, and PPA revealed highly significant differences (p<0.0011) discriminating REF and DCM. For risk stratification in DCM patients, four parameters from BPV analysis, STSD, and PPA revealed significant differences between low and high risk (maximum sensitivity: 90%, specificity: 90%). These results suggest that STSD and PPA are useful nonlinear methods for enhanced risk stratification in DCM patients.

  5. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis at the present stage of development of Russian society is in demand at all levels of government management. In conjunction with methods of mathematical modeling, spatial-temporal analysis, and economic tools, risk assessment in the analysis of a situation makes it possible to determine the level of safety of the population, workers, and consumers, and to select priority resources and threat factors as points for exerting effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazard and danger. At the realization stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected for concentrating efforts on the objects posing maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated in the system of state hygienic regulation, the formation of the evidence base of harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products, and services. Further tasks appear relevant: extending the application of health risk analysis methodology in the field of assurance of sanitary and epidemiological well-being and workers' health; developing the informational and analytical base with respect to establishing "exposure-response" models for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens. PMID:26155657

  8. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.

    2014-11-01

    A large part of the residential areas in Turkey is at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor-quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide risks must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these risks separately and then combining the analyses is more realistic than producing several hazard maps each based on a single parameter. In this study, an integrated seismic hazard map of central Eskişehir was created from two earthquake-related parameters, liquefaction and amplification, by using a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geologic and topographic conditions of the region. According to the integrated seismic hazard map, the Eskişehir residential area is generally at medium-high risk during a potential earthquake.
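
    A minimal sketch of the simple weighting method: two normalized hazard layers are combined cell by cell into an integrated hazard grid and classified for mapping. The grids and weights are invented.

```python
import numpy as np

liquefaction = np.array([[0.2, 0.7, 0.9],
                         [0.1, 0.5, 0.8],
                         [0.0, 0.3, 0.6]])   # normalized 0-1 per cell
amplification = np.array([[0.4, 0.6, 0.8],
                          [0.3, 0.5, 0.7],
                          [0.2, 0.4, 0.5]])
w_liq, w_amp = 0.6, 0.4                       # weights sum to 1
hazard = w_liq * liquefaction + w_amp * amplification
# Classify into low / medium / high for mapping.
classes = np.digitize(hazard, bins=[0.33, 0.66])
print(hazard.round(2))
print(classes)   # 0 = low, 1 = medium, 2 = high
```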

  9. Sensitivity Analysis Using Risk Measures.

    PubMed

    Tsanakas, Andreas; Millossovich, Pietro

    2016-01-01

    In a quantitative model with uncertain inputs, the uncertainty of the output can be summarized by a risk measure. We propose a sensitivity analysis method based on derivatives of the output risk measure, in the direction of model inputs. This produces a global sensitivity measure, explicitly linking sensitivity and uncertainty analyses. We focus on the case of distortion risk measures, defined as weighted averages of output percentiles, and prove a representation of the sensitivity measure that can be evaluated on a Monte Carlo sample, as a weighted average of gradients over the input space. When the analytical model is unknown or hard to work with, nonparametric techniques are used for gradient estimation. This process is demonstrated through the example of a nonlinear insurance loss model. Furthermore, the proposed framework is extended in order to measure sensitivity to constant model parameters, uncertain statistical parameters, and random factors driving dependence between model inputs.
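
    A sketch of evaluating a distortion risk measure on a Monte Carlo sample, as the abstract describes: the measure is a weighted average of ordered losses with weights given by increments of a distortion function (here the Expected Shortfall distortion). The loss model is a toy, and the finite-difference sensitivity below is a crude stand-in for the paper's exact gradient representation.

```python
import numpy as np

def distortion_risk(losses, g):
    """Weighted average of ordered losses; weights are increments of the
    distortion function g applied to tail probabilities."""
    y = np.sort(losses)                    # ascending
    n = len(y)
    tail = (n - np.arange(n)) / n          # P(loss >= y_i) on the sample
    weights = g(tail) - g(tail - 1.0 / n)  # sums to g(1) - g(0) = 1
    return float(weights @ y)

alpha = 0.95
g_es = lambda s: np.minimum(s / (1.0 - alpha), 1.0)  # Expected Shortfall

rng = np.random.default_rng(7)
x1 = rng.lognormal(0.0, 0.5, 100_000)
x2 = rng.lognormal(0.0, 0.5, 100_000)

def loss(scale):                           # toy nonlinear insurance loss
    return scale * x1 * np.maximum(x2 - 1.0, 0.0)

rho = distortion_risk(loss(1.0), g_es)
eps = 1e-4                                 # common-random-number finite diff
sens = (distortion_risk(loss(1.0 + eps), g_es) - rho) / eps
print("ES_0.95:", rho, "  d rho / d scale:", sens)
```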

  10. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5, and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides a graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin Hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, Support Vector Machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies could also be used to solve optimization and uncertainty propagation problems within the RAVEN framework.
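
    Of the sampling strategies listed, Latin Hypercube sampling is easy to sketch independently of RAVEN itself: one stratified draw per equal-probability bin in each dimension. The input ranges below are invented.

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified draw per equal-probability bin in each dimension."""
    strata = np.stack([rng.permutation(n_samples) for _ in range(n_dims)],
                      axis=1)              # shape (n_samples, n_dims)
    return (strata + rng.random((n_samples, n_dims))) / n_samples

rng = np.random.default_rng(3)
u = latin_hypercube(10, 2, rng)   # uniform(0,1) marginals, stratified
# Map to physical ranges, e.g. a failure rate and a recovery delay.
failure_rate = 1e-5 + u[:, 0] * (1e-3 - 1e-5)
delay_s = 10.0 + u[:, 1] * (300.0 - 10.0)
print(np.column_stack([failure_rate, delay_s]))
```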

  12. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

    Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study illustrates this approach with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these "hotspots". Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study
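
    A minimal sketch of the kernel-density step applied to digitized point data; the point clusters and the school location are invented, and SciPy's Gaussian KDE stands in for the GIS tooling.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
pts = np.vstack([rng.normal(0, 0.5, (40, 2)),     # cluster of trash sites
                 rng.normal(3, 0.8, (25, 2))])    # cluster of standing water
kde = gaussian_kde(pts.T)                         # expects shape (dims, n)
school = np.array([[2.8], [2.9]])                 # hypothetical school site
print("risk density at school:", float(kde(school)))
```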

  13. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students (in a role-play game) through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village with regard to natural hazards (rockfall, snow avalanche, flooding…), and the assessment is oriented to affected objects such as buildings and infrastructure. The workshop contains the following steps: 1. Planning of the study and definition of stakeholders; 2. Hazard identification; 3. Risk analysis; 4. Risk assessment; 5. Proposition of mitigation measures; 6. Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to prompt discussion and decision making. The Risk Matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, there is the possibility to compare the results between different groups and print out a summarizing report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level, as well as by external persons dealing with risk analysis.

  14. The Components of Microbiological Risk Analysis

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-01-01

    The paper describes the process of risk analysis in a food safety perspective. The steps of risk analysis defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication) are analysed. The different components of the risk assessment, risk management and risk communication are further described. PMID:27800384

  15. Risk/benefit analysis

    SciTech Connect

    Crouch, E.A.C.; Wilson, R.

    1982-01-01

    The Reagan administration is intent on rolling back regulations it considers unwise in order to give new life to American industry, but regulations were instituted to protect individuals against long-term hazards. The authors believe these hazards must be assessed before a regulation is modified, suspended, or implemented. They point out the problems inherent in defining, perceiving, and estimating risk. Throughout, they combine theoretical discussions with actual case studies covering the risks associated with nuclear power plants, saccharin use, mass chest radiography, and others. They believe that risk assessment should be distinct from decision making, with the risk assessor supplying clear and objective information about hazards and the probability of damage, as well as pointing out the uncertainties, to policy makers. 149 references, 29 figures, 8 tables.

  16. Fire Risk Implications in Safety Analysis Reports

    SciTech Connect

    Blanchard, A.

    1999-03-31

    Fire can be a significant risk for facilities that store and handle radiological material. Such events must be evaluated as part of a comprehensive safety analysis. SRS has been developing methods to evaluate radiological fire risk in such facilities. These methods, combined with the analysis techniques proposed by DOE-STD-3009-94, have provided a better understanding of how fire risks in nuclear facilities should be managed. To ensure that these new insights are properly disseminated, the DOE Savannah River Office and the Defense Nuclear Facilities Safety Board (DNFSB) requested that Westinghouse Savannah River Company (WSRC) prepare this paper.

  17. Assessing environmental risks for high intensity agriculture using the material flow analysis method--a case study of the Dongting Lake basin in South Central China.

    PubMed

    Yin, Guanyi; Liu, Liming; Yuan, Chengcheng

    2015-07-01

    This study primarily examined the assessment of environmental risk in high-intensity agricultural areas. The Dongting Lake basin, one of the major grain-producing areas in China, was taken as a case study. Using data obtained from 1989 to 2012, we applied Material Flow Analysis (MFA) to show the material consumption, pollutant output, and production storage in the agricultural-environmental system, and assessed the environmental risk index on the basis of the MFA results. The results indicate that the environmental quality of the Dongting Lake area will remain unsatisfactory for the foreseeable future. The direct material input (DMI) declined by 13.9%, the domestic processed output (DPO) increased by 28.21%, the intensity of material consumption (IMC) decreased by 36.7%, the intensity of material discharge (IMD) increased by 10%, the material productivity (MP) increased by 27 times, the environmental efficiency (EE) increased by 15.31 times, and the material storage (PAS) increased by 0.23%. The DMI and DPO were higher in rural places on the edge of cities, whereas the risk from urban agriculture has risen because DMI and DPO are increasing faster in the cities than in the counties. The composite environmental risk index increased from 0.33 to 0.96, indicating that the total environmental risk changed gradually but seriously during the 24 years assessed. The driving factors that affect environmental risk in high-intensity agriculture can be divided into five classes: social, economic, human, natural, and disruptive incidents. This study discussed a number of effective measures for protecting the environment while ensuring food production yields. Additional research in other areas and certain improvements of this method in future studies may be necessary to develop a more effective method of managing and controlling agricultural-environmental interactions.
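    A hedged sketch of the indicator arithmetic: the paper's exact formulas and weights are not reproduced here, so the snippet shows a generic pattern in which per-year material-flow indicators are combined into a min-max-normalised composite risk index; all figures are invented.

```python
# Generic material-flow indicator arithmetic with an assumed weighting scheme;
# the study's actual formulas, units, and weights may differ.
import numpy as np

years = np.array([1989, 2000, 2012])
dmi = np.array([120.0, 115.0, 103.3])   # direct material input (hypothetical)
dpo = np.array([30.0, 34.0, 38.5])      # domestic processed output
output = np.array([10.0, 45.0, 280.0])  # agricultural output value
area = 2.0                              # sown area (hypothetical constant)

imd = dpo / area                        # intensity of material discharge
ee = output / dpo                       # environmental efficiency

def minmax(x):
    return (x - x.min()) / (x.max() - x.min())

# Higher discharge -> more risk; higher efficiency -> less risk (weights assumed).
risk = 0.5 * minmax(imd) + 0.5 * (1.0 - minmax(ee))
for y, r in zip(years, risk):
    print(y, round(float(r), 2))
```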

  18. Budget Risk & Prioritization Analysis Tool

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate, to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  19. [ANALYSIS OF THE METHODIC APPROACHES TO THE OCCUPATIONAL RISK ASSESSMENT AT THE ENTERPRISES OF THE SVERDLOVSK REGION].

    PubMed

    Gurvich, V B; Kuz'min, S V; Plotko, E G; Roslyĭ, O F; Fedoruk, A A; Ruzakov, V O

    2015-01-01

    This study is devoted to the practice of applying the current legislative, regulatory, and procedural base to occupational health and safety issues. The issues of occupational risk assessment at the enterprises of the Sverdlovsk region are discussed. Approaches to the creation of occupational risk management and assessment systems are proposed.

  20. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications. PMID:22489541
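    One concrete flavour of the robust methods the review surveys can be sketched as follows (an illustration chosen here, not the article's own code): when several models of act-to-consequence are plausible and no trusted prior exists over them, acts can be compared by worst-case payoff (maximin) and by minimax regret; all payoffs are hypothetical.

```python
# Robust comparison of acts under deep model uncertainty: maximin payoff and
# minimax regret over a set of plausible models. Numbers are invented.
import numpy as np

# Rows: candidate acts; columns: plausible models mapping act -> payoff.
payoff = np.array([
    [10.0, 8.0, 2.0],   # act A
    [ 6.0, 6.0, 6.0],   # act B (robust but modest)
    [12.0, 1.0, 0.0],   # act C (good under the "best-fitting" model only)
])
acts = ["A", "B", "C"]

maximin = acts[int(np.argmax(payoff.min(axis=1)))]
regret = payoff.max(axis=0) - payoff          # shortfall vs best act per model
minimax_regret = acts[int(np.argmin(regret.max(axis=1)))]
print("maximin choice:", maximin, "| minimax-regret choice:", minimax_regret)
```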

  1. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  2. At-Risk Youngsters: Methods That Work.

    ERIC Educational Resources Information Center

    Obiakor, Festus E.

    This paper examines problems faced by youngsters at risk of failure in school, and discusses methods for helping them succeed in educational programs. At-risk youngsters confront many problems in school and in mainstream society, and are frequently misidentified, misdiagnosed, and improperly instructed. Problems faced by at-risk youngsters…

  3. Post hoc Analysis for Detecting Individual Rare Variant Risk Associations Using Probit Regression Bayesian Variable Selection Methods in Case-Control Sequencing Studies.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Albright, Lisa Cannon; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham; MacInnis, Robert; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catolona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2016-09-01

    Rare variants (RVs) have been shown to be significant contributors to complex disease risk. By definition, these variants have very low minor allele frequencies and traditional single-marker methods for statistical analysis are underpowered for typical sequencing study sample sizes. Multimarker burden-type approaches attempt to identify aggregation of RVs across case-control status by analyzing relatively small partitions of the genome, such as genes. However, it is generally the case that the aggregative measure would be a mixture of causal and neutral variants, and these omnibus tests do not directly provide any indication of which RVs may be driving a given association. Recently, Bayesian variable selection approaches have been proposed to identify RV associations from a large set of RVs under consideration. Although these approaches have been shown to be powerful at detecting associations at the RV level, there are often computational limitations on the total quantity of RVs under consideration and compromises are necessary for large-scale application. Here, we propose a computationally efficient alternative formulation of this method using a probit regression approach specifically capable of simultaneously analyzing hundreds to thousands of RVs. We evaluate our approach to detect causal variation on simulated data and examine sensitivity and specificity in instances of high RV dimensionality as well as apply it to pathway-level RV analysis results from a prostate cancer (PC) risk case-control sequencing study. Finally, we discuss potential extensions and future directions of this work. PMID:27312771
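    A hedged sketch of the general technique named in the abstract: a probit model with Albert-Chib latent-variable augmentation and a George-McCulloch spike-and-slab prior, which is one standard way to realize Bayesian variable selection in this setting. It is not the authors' implementation; the genotype data, prior settings, and scale are illustrative.

```python
# Spike-and-slab Bayesian variable selection for a probit model via Gibbs
# sampling with latent-variable (Albert-Chib) augmentation. Illustrative only.
import numpy as np
from scipy.stats import norm, truncnorm

rng = np.random.default_rng(1)
n, p = 400, 30
X = rng.binomial(1, 0.02, size=(n, p)).astype(float)  # rare-variant genotypes
beta_true = np.zeros(p); beta_true[:3] = 2.0
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

v0, v1, prior_pi = 1e-4, 1.0, 0.1     # spike var, slab var, inclusion prior
beta, gamma, incl = np.zeros(p), np.zeros(p), np.zeros(p)
iters, burn = 2000, 500

for it in range(iters):
    # 1) latent z_i ~ N(x_i'beta, 1), truncated by the observed case status
    mu = X @ beta
    lo = np.where(y == 1, -mu, -np.inf)
    hi = np.where(y == 1, np.inf, -mu)
    z = truncnorm.rvs(lo, hi, loc=mu, scale=1.0, random_state=rng)
    # 2) beta | z, gamma  (conjugate Gaussian update)
    d_inv = np.where(gamma == 1, 1.0 / v1, 1.0 / v0)
    cov = np.linalg.inv(X.T @ X + np.diag(d_inv))
    beta = rng.multivariate_normal(cov @ (X.T @ z), cov)
    # 3) gamma_j | beta_j  (spike vs slab responsibility)
    p1 = prior_pi * norm.pdf(beta, 0.0, np.sqrt(v1))
    p0 = (1 - prior_pi) * norm.pdf(beta, 0.0, np.sqrt(v0))
    gamma = rng.binomial(1, p1 / (p1 + p0))
    if it >= burn:
        incl += gamma

print("posterior inclusion prob., first 5 RVs:",
      np.round(incl / (iters - burn), 2)[:5])
```

    Inclusion probabilities from the post-burn-in draws rank the RVs; a production sampler for the hundreds-to-thousands-of-RV regime the authors target would use blocked or marginalized updates rather than this naive full-covariance step.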

  5. The public-use National Health Interview Survey linked mortality files: methods of reidentification risk avoidance and comparative analysis.

    PubMed

    Lochner, Kimberly; Hummer, Robert A; Bartee, Stephanie; Wheatcroft, Gloria; Cox, Christine

    2008-08-01

    The National Center for Health Statistics (NCHS) conducts mortality follow-up for its major population-based surveys. In 2004, NCHS updated the mortality follow-up for the 1986-2000 National Health Interview Survey (NHIS) years, which because of confidentiality protections was made available only through the NCHS Research Data Center. In 2007, NCHS released a public-use version of the NHIS Linked Mortality Files that includes a limited amount of perturbed information for decedents. The modification of the public-use version included conducting a reidentification risk scenario to determine records at risk for reidentification and then imputing values for either date or cause of death for a select sample of records. To demonstrate the comparability between the public-use and restricted-use versions of the linked mortality files, the authors estimated relative hazards for all-cause and cause-specific mortality risk using a Cox proportional hazards model. The pooled 1986-2000 NHIS Linked Mortality Files contain 1,576,171 records and 120,765 deaths. The sample for the comparative analyses included 897,232 records and 114,264 deaths. The comparative analyses show that the two data files yield very similar results for both all-cause and cause-specific mortality. Analytical considerations when examining cause-specific analyses of numerically small demographic subgroups are addressed.

  6. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems. PMID:22150163

  7. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful. PMID:11051070
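    The additive multiattribute utility calculation underlying such an analysis can be sketched as follows; the strategies, attribute scores, and weights are invented for the example, whereas a real assessment would elicit them from the stakeholders.

```python
# Additive multiattribute utility comparison of protection strategies.
# Attribute utilities are on a 0-1 scale (1 = best); weights sum to 1.
strategies = {
    "do nothing": {"averted_dose": 0.0, "cost": 1.0, "disruption": 1.0},
    "sheltering": {"averted_dose": 0.6, "cost": 0.8, "disruption": 0.7},
    "relocation": {"averted_dose": 0.9, "cost": 0.2, "disruption": 0.1},
}
weights = {"averted_dose": 0.5, "cost": 0.3, "disruption": 0.2}  # elicited

def utility(scores):
    return sum(weights[a] * u for a, u in scores.items())

for name, scores in sorted(strategies.items(), key=lambda kv: -utility(kv[1])):
    print(f"{name:10s} U = {utility(scores):.2f}")
```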

  8. Selecting Needs Analysis Methods.

    ERIC Educational Resources Information Center

    Newstrom, John W.; Lilyquist, John M.

    1979-01-01

    Presents a contingency model for decision making with regard to needs analysis methods. Focus is on 12 methods with brief discussion of their defining characteristics and some operational guidelines for their use. (JOW)

  9. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

    In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. This methodology is quite flexible in that it can incorporate both a semiparametric method using the Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development on a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300

  11. Psychiatrists' follow-up of identified metabolic risk: a mixed-method analysis of outcomes and influences on practice

    PubMed Central

    Patterson, Sue; Freshwater, Kathleen; Goulter, Nicole; Ewing, Julie; Leamon, Boyd; Choudhary, Anand; Moudgil, Vikas; Emmerson, Brett

    2016-01-01

    Aims and method To describe and explain psychiatrists' responses to metabolic abnormalities identified during screening. We carried out an audit of clinical records to assess rates of monitoring and follow-up practice. Semi-structured interviews with 36 psychiatrists followed by descriptive and thematic analyses were conducted. Results Metabolic abnormalities were identified in 76% of eligible patients screened. Follow-up, recorded for 59%, was variable but more likely with four or more abnormalities. Psychiatrists endorse guidelines but ambivalence about responsibility, professional norms, resource constraints and skills deficits as well as patient factors influences practice. Therapeutic optimism and desire to be a ‘good doctor’ supported comprehensive follow-up. Clinical implications Psychiatrists are willing to attend to physical healthcare, and obstacles to recommended practice are surmountable. Psychiatrists seek consensus among stakeholders about responsibilities and a systemic approach addressing the social determinants of health inequities. Understanding patients' expectations is critical to promoting best practice. PMID:27752343

  12. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-Ion Cells

    SciTech Connect

    Xia, Yuzhi; Li, Tianlei; Ren, Fei; Gao, Yanfei; Wang, Hsin

    2014-01-01

    Recently, a pinch-torsion test was developed for safety testing of Li-ion batteries (Ren et al., J. Power Sources, 2013). It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is discovered that the addition of the torsion component significantly increases the maximum principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly less than in the pure pinch test, dramatically improving the applicability of this method to ultra-thick batteries, which would otherwise require loads in excess of machine capability. It is further found that separator failure occurs in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum principal strain is also examined.

  13. Risk analysis of exploration plays

    SciTech Connect

    Rose, P.R.

    1996-08-01

    The most difficult and crucial decision in petroleum exploration is not which prospect to drill, but rather, which new play to enter. Such a decision, whether ultimately profitable or not, commits the Organization to years of involvement, expenditures of $millions, and hundreds of man-years of effort. Even though uncertainties and risks are high, organizations commonly make the new-play decision in a disjointed, non-analytic, even superficial way. The economic consequences of a bad play choice can be disastrous. Using established principles of prospect risk analysis, modern petroleum exploration organizations routinely assign economic value to individual prospects, but they actually operate via exploration programs in plays and trends. Accordingly, the prospect is the economic unit of exploration, whereas the play is the operational unit. Plays can be successfully analyzed as full-cycle economic risk ventures, however, using many principles of prospect risk analysis. Economic measures such as Expected Present Value, DCFROR, etc. apply equally to plays or prospects. The predicted field-size distribution of the play is analogous to the forecast prospect reserves distribution. Economic truncation applies to both. Variance of play reserves is usually much greater than for prospect reserves. Geologic chance factors such as P_reservoir, P_generation, etc., must be distinguished as independent or shared among prospects in the play, so they should be defined so as to apply equally to the play and to its constituent prospects. They are analogous to multiple objectives on a prospect, and are handled differently in performing the risk analysis.
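    A sketch of the shared-versus-independent risking idea in the abstract: play-level chance factors (e.g., generation) succeed or fail once for all prospects, while prospect-level factors (e.g., reservoir) are drawn independently per prospect, and small fields are removed by economic truncation. All probabilities and the field-size distribution are hypothetical.

```python
# Monte Carlo play economics with one shared and one independent chance factor.
import numpy as np

rng = np.random.default_rng(7)
n_trials, n_prospects = 20000, 8
p_shared = 0.4   # P(generation): play-level, resolved once per trial
p_local = 0.5    # P(reservoir): prospect-level, drawn independently
# Lognormal field-size distribution (MMBO), economically truncated at 5 MMBO.
mu, sigma, cutoff = np.log(20.0), 1.0, 5.0

totals = np.zeros(n_trials)
for t in range(n_trials):
    if rng.random() > p_shared:      # shared factor fails: whole play is dry
        continue
    hits = rng.random(n_prospects) < p_local
    sizes = rng.lognormal(mu, sigma, size=hits.sum())
    totals[t] = sizes[sizes > cutoff].sum()   # economic truncation

print("P(play totally dry)      =", np.mean(totals == 0).round(2))
print("mean play reserves (MMBO)=", totals.mean().round(1))
print("10th/90th pct reserves   =", np.percentile(totals, [10, 90]).round(1))
```

    Resolving the shared factor with a single draw per trial is what makes play-reserves variance much larger than prospect-reserves variance, as the abstract notes.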

  15. Draft Waste Management Programmatic Environmental Impact Statement for managing treatment, storage, and disposal of radioactive and hazardous waste. Volume 3, Appendix A: Public response to revised NOI, Appendix B: Environmental restoration, Appendix C, Environmental impact analysis methods, Appendix D, Risk

    SciTech Connect

    1995-08-01

    Volume three contains appendices for the following: Public comments on DOE's proposed revisions to the scope of the waste management programmatic environmental impact statement; Environmental restoration sensitivity analysis; Environmental impact analysis methods; and Waste management facility human health risk estimates.

  16. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of these arenas currently being investigated involves collaborative studies which illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU as opposed to the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper will address the relative progress of new risk assessment methodologies, such as Fuzzy Logic, within the framework of current risk assessment methods at these two institutes.

  17. [Methods of risk assessment and their validation].

    PubMed

    Baracco, Alessandro

    2014-01-01

    A review of the literature shows several methods for the risk assessment of biomechanical overload of the musculoskeletal system in activities with repetitive strain of the upper limbs and manual material handling. The application of these methods should allow the quantification of risk for the working population, the identification of preventive measures to reduce the risk, the evaluation of their effectiveness, and the design of a specific health surveillance scheme. In this paper we analyze the factors which must be taken into account in Occupational Medicine to implement a process of validation of these methods. In conclusion, we believe that new methods able to analyze and reduce risk already in the design phase of the production process will be necessary in the future. PMID:25558718

  18. Risk-Stratified Imputation in Survival Analysis

    PubMed Central

    Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George

    2013-01-01

    Background Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis, and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. Risk-stratified imputation is intended for
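    A minimal sketch of the risk-stratified imputation idea under stated assumptions: each withdrawn subject receives the outcome of a donor drawn from the same treatment-by-covariate stratum whose follow-up exceeds the withdrawal time. Field names and the toy data are hypothetical.

```python
# Risk-stratified imputation sketch: replace a withdrawn subject's censored
# time with the outcome of a donor from the same treatment arm and stratum
# who was still at risk at the withdrawal time.
import numpy as np

rng = np.random.default_rng(3)

def impute_withdrawals(time, event, treat, stratum, withdrawn):
    time, event = time.copy(), event.copy()
    for i in np.flatnonzero(withdrawn):
        # replacement pool: same arm and risk stratum, still at risk, not withdrawn
        pool = np.flatnonzero((treat == treat[i]) & (stratum == stratum[i])
                              & (time > time[i]) & ~withdrawn)
        if pool.size:
            j = rng.choice(pool)
            time[i], event[i] = time[j], event[j]   # carry the donor's outcome
    return time, event

# Tiny hypothetical data set: follow-up time, event flag, arm, risk stratum.
time = np.array([2.0, 5.0, 3.0, 8.0, 1.5, 7.0])
event = np.array([0, 1, 0, 0, 0, 1])
treat = np.array([0, 0, 0, 1, 1, 1])
stratum = np.array([1, 1, 1, 0, 0, 0])
withdrawn = np.array([True, False, False, False, True, False])
print(impute_withdrawals(time, event, treat, stratum, withdrawn))
```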

  19. Comparison of Intake Gate Closure Methods at Lower Granite, Little Goose, Lower Monumental, and McNary Dams Using Risk-Based Analysis

    SciTech Connect

    Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.; Mara, Neil L.; Phan, Hahn K.; Bardy, David M.; Hollenbeck, Robert E.

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.

  1. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  2. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based on the materials behavior level where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that the method is mature and that it can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
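    A hedged sketch of the general approach (not NASA's code): scatter in primitive variables is propagated by simulation to a structural response, and reliability is read off a limit state. The limit-state model and all distributions are invented for illustration.

```python
# Monte Carlo propagation of primitive-variable scatter to a limit state.
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Primitive variables with scatter: yield strength (MPa), load (kN), area (mm^2).
strength = rng.normal(900.0, 45.0, n)
load = rng.lognormal(mean=np.log(150.0), sigma=0.12, size=n)
area = rng.normal(200.0, 4.0, n)

stress = load * 1e3 / area          # MPa
g = strength - stress               # limit state: failure when g <= 0
pf = np.mean(g <= 0.0)
print(f"probability of failure ~ {pf:.1e}, reliability ~ {1 - pf:.5f}")
```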

  3. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining the activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the manipulation process of HPCs performed at our blood center. The data analysis showed that the hazards with higher RPN values and greater impact on the process are loss of dose and tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system with forms designed to be in compliance with the standards in force and by starting implementation of a cryopreservation management module.
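    The RPN arithmetic used in such an FMECA/HACCP integration can be sketched as follows; the hazards and the 1-10 scoring scales are assumed for the example.

```python
# Risk Priority Number ranking: RPN = severity x occurrence x detectability,
# each scored on an assumed 1-10 scale (higher = worse).
hazards = [
    # (hazard, severity, occurrence, detectability)
    ("loss of dose during transfer", 9, 4, 6),
    ("tracking/labelling error",     9, 3, 7),
    ("manual data transcription",    6, 6, 5),
    ("cryobag rupture",              8, 2, 3),
]
ranked = sorted(hazards, key=lambda h: h[1] * h[2] * h[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:3d}  {name}")
```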

  4. [Study on the risk assessment method of regional groundwater pollution].

    PubMed

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, which included: regional groundwater specific vulnerability assessment, regional pollution source characteristics assessment, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with a multi-index comprehensive method, the risk was characterized with the Spatial Analysis tools of ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. It was found that the vulnerability index of groundwater in Changzhou is high and unevenly distributed; that the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risks; and that, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high, unevenly distributed, and concentrated to the north of the Anjia-Xuejia-Zhenglu line, in the center of the city, and in the southeast, where human activities are more intense and pollution sources are dense.

  5. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1 (Health, Food & Nutrition), is mission-oriented and aims, amongst other things, at providing a healthy, safe, and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumer risks; the development and validation of new modelling approaches; the harmonization of risk assessment principles, methodologies, and terminology; the standardization of methods and systems used for the safety evaluation of transgenic food; the provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential unintended effects of genetically modified (GM) foods; and the development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; to the development of a communication platform for genetically modified organism (GMO) producers, retailers, regulatory authorities, and consumer groups to improve safety assessment procedures, risk management strategies, and risk communication; the development and validation of new methods for safety testing of transgenic food; the evaluation of the safety and efficacy of iron supplementation in pregnant women; and the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.

  6. Evaluating methods for estimating existential risks.

    PubMed

    Tonn, Bruce; Stiefel, Dorian

    2013-10-01

    Researchers and commissions contend that the risk of human extinction is high, but none of these estimates have been based upon a rigorous methodology suitable for estimating existential risks. This article evaluates several methods that could be used to estimate the probability of human extinction. Traditional methods evaluated include: simple elicitation; whole evidence Bayesian; evidential reasoning using imprecise probabilities; and Bayesian networks. Three innovative methods are also considered: influence modeling based on environmental scans; simple elicitation using extinction scenarios as anchors; and computationally intensive possible-worlds modeling. Evaluation criteria include: level of effort required by the probability assessors; level of effort needed to implement the method; ability of each method to model the human extinction event; ability to incorporate scientific estimates of contributory events; transparency of the inputs and outputs; acceptability to the academic community (e.g., with respect to intellectual soundness, familiarity, verisimilitude); credibility and utility of the outputs of the method to the policy community; difficulty of communicating the method's processes and outputs to nonexperts; and accuracy in other contexts. The article concludes by recommending that researchers assess the risks of human extinction by combining these methods. PMID:23551083

  7. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  9. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to the time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess the risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power-law dependence on frequency over a selected frequency range, such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure fluctuations in the intervals are used to assess the risk of an adverse clinical event.
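    A minimal sketch of the analysis the patent describes: compute the power spectrum of the interbeat-interval series and fit a power law S(f) ∝ f^α over 10^-4 to 10^-2 Hz, with the exponent serving as the risk index. The interval series here is synthetic rather than ECG-derived.

```python
# Power-spectrum power-law fit over a low-frequency band of an interval series.
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(9)
fs = 0.5                                  # intervals resampled at 2 s spacing (Hz)
x = np.cumsum(rng.normal(0, 1, 2**16))    # synthetic series with ~1/f^2 spectrum

f, s = periodogram(x, fs=fs)
band = (f >= 1e-4) & (f <= 1e-2)
slope, intercept = np.polyfit(np.log10(f[band]), np.log10(s[band]), 1)
print("fitted power-law exponent alpha ~", round(slope, 2))  # ~ -2 here
```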

  10. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. We present the two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. Each cell in the matrix represents a different arrangement of strengths and weaknesses; those arrangements shift gradually as one moves through the table, each cell being optimal for a particular situation. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though oftentimes we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In Section 2 we provide context for this report

  11. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land use/land cover) differ from those of natural areas, altered interactions between the surface and the atmosphere (especially the urban boundary layer, UBL) or between the surface and the subsurface can affect hydrologic behavior and hence flood hazards. In this research we focus on three main aspects of urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be described mathematically in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these somewhat disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network by scrutinizing the energy exchange across locations. We can thereby extend our understanding not only of the influence of cities on local climate in rural areas or at larger scales, but also of the interaction of cities and nature affecting each other.

  12. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  13. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  14. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of selected metallic and composite structures. The method is multiscale and multifunctional, and it is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale), and the properties of each ply are the multifunctional representation. The structural component is modeled by finite elements. The solution for the structural responses is obtained by an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001, and for the composite built-up structure it is also about 0.0001.

  15. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail has become one of the necessities of modern life for correspondence between individuals. Accordingly, e-mail messages, servers and clients, and the correspondence exchanged between different people must offer acceptable security, so that people can trust and use the technology. In the information age, many financial and non-financial transactions are carried out electronically and data are exchanged via the internet, so theft and manipulation of data can impose exorbitant costs in terms of integrity as well as in financial, political, economic and cultural terms. E-mail correspondence is no exception, and its security is very important. Our review found that no method has been provided that focuses on risk assessment for e-mail systems. We therefore examine assessment methods developed for other systems, together with their strengths and weaknesses, and then apply Convery's method, which was developed for assessing network risks, to the assessment of e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  16. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  17. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
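
    As a rough illustration of the Monte Carlo technique such a template automates, the following sketch simulates the net present value (NPV) of a hypothetical investment. The cash-flow figures, distributions and discount rate are invented for illustration and are not taken from the article.

      import numpy as np

      rng = np.random.default_rng(42)
      n_trials = 100_000
      discount_rate = 0.10
      initial_cost = 1_000_000.0

      # Uncertain annual net cash flow over a 10-year project life.
      years = np.arange(1, 11)
      cash_flow = rng.normal(loc=180_000.0, scale=40_000.0, size=(n_trials, years.size))

      discount = (1.0 + discount_rate) ** -years      # discount factor per year
      npv = cash_flow @ discount - initial_cost       # one NPV per trial

      print(f"mean NPV:           {npv.mean():12,.0f}")
      print(f"P(NPV < 0):         {np.mean(npv < 0):.3f}")
      print(f"5th percentile NPV: {np.percentile(npv, 5):12,.0f}")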

  18. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  19. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver-a software that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model
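
    A minimal sketch of the extreme-risk measure described above: f(4) as the conditional expectation of portfolio return over a lower-tail partition of the return distribution. The partitioning point (here the 5th percentile) and the simulated return distribution are assumptions for illustration; the PMRM defines the partition via a chosen exceedance probability.

      import numpy as np

      def f4(returns: np.ndarray, tail_prob: float = 0.05) -> float:
          """Expected portfolio return conditional on the lower tail."""
          threshold = np.quantile(returns, tail_prob)
          return float(returns[returns <= threshold].mean())

      rng = np.random.default_rng(7)
      # Heavy-tailed simulated returns (Student-t with 4 degrees of freedom).
      portfolio_returns = rng.standard_t(df=4, size=50_000) * 0.02

      print(f"volatility (std): {portfolio_returns.std():.4f}")
      print(f"f4 (5% tail):     {f4(portfolio_returns):.4f}")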

  1. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants, and the heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the plant. Corrosion damage can force the HRSG, and hence the power plant, to stop operating, and it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment of standard API 581 thus places the existing equipment at medium risk, and in fact no critical problems were found in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk evaluation was carried out with the aim of reducing risk by optimizing the risk assessment activities.
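
    A hedged sketch of the kind of probability-consequence lookup used in semi-quantitative RBI rankings such as "4C" (probability category 4, consequence category C). The scoring rule and category boundaries below are invented for illustration; API 581 defines its own 5x5 risk matrix.

      # Consequence categories A-E mapped to numeric scores.
      CONSEQUENCE = {"A": 1, "B": 2, "C": 3, "D": 4, "E": 5}

      def risk_level(prob_cat: int, cons_cat: str) -> str:
          """Rank risk from a probability category (1-5) and consequence category (A-E)."""
          score = prob_cat * CONSEQUENCE[cons_cat]
          if score >= 15:
              return "high"
          if score >= 10:
              return "medium-high"
          if score >= 5:
              return "medium"
          return "low"

      for item, (p, c) in {"HP superheater": (4, "C"),
                           "HP evaporator": (4, "C"),
                           "HP economizer": (3, "C")}.items():
          print(f"{item}: {risk_level(p, c)} ({p}{c})")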

  2. [Risk sharing methods in middle income countries].

    PubMed

    Inotai, András; Kaló, Zoltán

    2012-01-01

    The pricing strategy of innovative medicines is based on the therapeutic value in the largest pharmaceutical markets. The cost-effectiveness of new medicines with a value-based ex-factory price is justifiable. Due to international price referencing and parallel trade, the ex-factory price corridor of new medicines has narrowed in recent years. Middle-income countries have less negotiating power to change this narrow drug pricing corridor, although their fair intention is to buy pharmaceuticals from their scarce public resources at lower prices than higher-income countries pay. Therefore, the reimbursement of new medicines at the prices of Western European countries may not be justifiable in Central-Eastern European countries. Confidential pricing agreements (i.e., confidential price discounts, claw-back or rebate schemes) in the lower-income countries of the European Union can alleviate this problem, as prices of new medicines can be adjusted to local purchasing power without influencing the published ex-factory price, and thus the accessibility of patients to these drugs in other countries. In order to control the drug budget, payers in more and more countries tend to apply financial risk-sharing agreements for new medicines to shift the consequences of potential overspending to pharmaceutical manufacturers. The major paradox of financial risk-sharing schemes is that increased mortality, poor patient persistence, reduced access to healthcare providers, and no treatment all reduce pharmaceutical spending. Consequently, payers have recently started to apply outcome-based risk-sharing agreements for new medicines to improve the quality of health care provision. Our paper aims to review and assess the published financial and outcome-based risk-sharing methods. The introduction of outcome-based risk-sharing schemes can be a major advancement in the drug reimbursement strategy of payers in middle-income countries. These schemes can help to reduce the medical uncertainty in coverage

  3. Cumulative Benefit Analysis for Ranking Risk Reduction Actions

    SciTech Connect

    Leverenz, Fred L.; Aysa Jimenez, Julio

    2007-04-25

    The Hazard and Operability (HAZOP) study approach, and other similar methods, are very effective ways to qualitatively identify a comprehensive set of accident scenarios for a facility. If these analyses are modified to incorporate a simple system for evaluating relative risk, such as an order-of-magnitude scoring system, the resultant study can be a very powerful input to developing risk reduction strategies. By adding Risk Reduction Worth (RRW) evaluations for all accident Causes, Safeguards, and proposed Action Items, an analyst can then formulate a strategy to select the minimal set of risk reduction actions that maximizes risk reduction. One strategy for doing this involves the iterative evaluation of RRW after postulating risk reduction actions, until the residual risk reaches a tolerable value; this is termed Cumulative Risk Benefit Analysis. The concept was developed for the evaluation of a set of pipeline pumping stations, and provided valuable insight into how to reduce risk in a sensible, prioritized fashion.
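
    A hedged sketch of the iterative RRW selection described above, taking RRW as the current risk divided by the risk with the action in place, and choosing actions greedily until the residual risk is tolerable. The scenario risks, candidate actions, and reduction factors are all invented.

      # Scenario risks (frequency x consequence) and candidate actions with the
      # factor by which each action reduces each scenario's risk; all invented.
      scenarios = {"S1": 1e-3, "S2": 5e-4, "S3": 2e-4}
      actions = {
          "A1 add relief valve":   {"S1": 0.05},
          "A2 inspection program": {"S1": 0.5, "S2": 0.2},
          "A3 operator training":  {"S2": 0.5, "S3": 0.5},
      }
      tolerable_risk = 3e-4

      def total(risk): return sum(risk.values())

      def apply_action(risk, name):
          return {s: r * actions[name].get(s, 1.0) for s, r in risk.items()}

      remaining, chosen = dict(scenarios), []
      while total(remaining) > tolerable_risk and len(chosen) < len(actions):
          # Pick the remaining action with the highest RRW = R_now / R_with_action.
          best = max((a for a in actions if a not in chosen),
                     key=lambda a: total(remaining) / total(apply_action(remaining, a)))
          chosen.append(best)
          remaining = apply_action(remaining, best)

      print(chosen, f"residual risk = {total(remaining):.1e}")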

  4. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  5. Method ruggedness studies incorporating a risk based approach: a tutorial.

    PubMed

    Borman, Phil J; Chatfield, Marion J; Damjanov, Ivana; Jackson, Patrick

    2011-10-10

    This tutorial explains how well thought-out application of design and analysis methodology, combined with risk assessment, leads to improved assessment of method ruggedness. The authors define analytical method ruggedness as an experimental evaluation of noise factors such as analyst, instrument or stationary phase batch. Ruggedness testing is usually performed upon transfer of a method to another laboratory, however, it can also be employed during method development when an assessment of the method's inherent variability is required. The use of a ruggedness study provides a more rigorous method for assessing method precision than a simple comparative intermediate precision study which is typically performed as part of method validation. Prior to designing a ruggedness study, factors that are likely to have a significant effect on the performance of the method should be identified (via a risk assessment) and controlled where appropriate. Noise factors that are not controlled are considered for inclusion in the study. The purpose of the study should be to challenge the method and identify whether any noise factors significantly affect the method's precision. The results from the study are firstly used to identify any special cause variability due to specific attributable circumstances. Secondly, common cause variability is apportioned to determine which factors are responsible for most of the variability. The total common cause variability can then be used to assess whether the method's precision requirements are achievable. The approach used to design and analyse method ruggedness studies will be covered in this tutorial using a real example.

  6. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  7. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. It demonstrates a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  8. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  9. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  10. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  11. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  12. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and...

  13. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  14. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are global physical hazards with negative environmental and socio-economic impacts on local and regional scales. The technological evolution of recent decades, especially in the field of geoinformatics, has offered new advantages in hydrological modelling. This study uses this technology to quantify flood risk. The study area is an ungauged catchment; using mainly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed unit hydrograph model, a series of outcomes was obtained. More specifically, this paper examined the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was carried out using a Digital Elevation Model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to the flood. Unit hydrographs are, as is known, useful when data are lacking; in this work, a sequence of flood risk assessments was made with the time-area method using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
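
    A minimal sketch of the unit-hydrograph step such a GIS model automates: the direct runoff hydrograph as the discrete convolution of the rainfall-excess series with the unit hydrograph ordinates. The numbers are invented; in the study the ordinates come from DEM-based time-area analysis.

      import numpy as np

      # Unit hydrograph ordinates: basin response to one unit of rainfall excess,
      # one ordinate per time step (e.g. m^3/s per mm of excess).
      unit_hydrograph = np.array([0.0, 0.3, 0.8, 1.0, 0.6, 0.3, 0.1])

      # Rainfall excess (effective rainfall) per time step, in mm.
      rainfall_excess = np.array([2.0, 5.0, 1.0])

      # Direct runoff hydrograph = discrete convolution of excess with the UH.
      runoff = np.convolve(rainfall_excess, unit_hydrograph)
      for t, q in enumerate(runoff):
          print(f"t={t}: Q={q:.2f} m^3/s")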

  15. Student Choices: Using a Competing Risks Model of Survival Analysis.

    ERIC Educational Resources Information Center

    Denson, Katy; Schumacker, Randall E.

    By using a competing risks model, survival analysis methods can be extended to predict which of several mutually exclusive outcomes students will choose based on predictor variables, thereby ascertaining if the profile of risk differs across groups. The paper begins with a brief introduction to logistic regression and some of the basic concepts of…

  16. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  17. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: what has the presence of social scientists in the network meant for the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  18. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.

  19. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico.

    PubMed

    Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L

    2011-05-01

    In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875

  1. Development of an inverse method for coastal risk management

    NASA Astrophysics Data System (ADS)

    Idier, D.; Rohmer, J.; Bulteau, T.; Delvallée, E.

    2013-04-01

    Recent flooding events, like Katrina (USA, 2005) or Xynthia (France, 2010), illustrate the complexity of coastal systems and the limits of traditional flood risk analysis. Among other questions, these events raised issues such as: "how to choose flooding scenarios for risk management purposes?", "how to make a society more aware and prepared for such events?" and "which level of risk is acceptable to a population?". The present paper aims at developing an inverse approach that could seek to address these three issues. The main idea of the proposed method is the inversion of the usual risk assessment steps: starting from the maximum acceptable hazard level (defined by stakeholders as the one leading to the maximum tolerable consequences) to finally obtain the return period of this threshold. Such an "inverse" approach would allow for the identification of all the offshore forcing conditions (and their occurrence probability) inducing a threat for critical assets of the territory, such information being of great importance for coastal risk management. This paper presents the first stage in developing such a procedure. It focuses on estimation (through inversion of the flooding model) of the offshore conditions leading to the acceptable hazard level, estimation of the return period of the associated combinations, and thus of the maximum acceptable hazard level. A first application for a simplified case study (based on real data), located on the French Mediterranean coast, is presented, assuming a maximum acceptable hazard level. Even if only one part of the full inverse method has been developed, we demonstrate how the inverse method can be useful in (1) estimating the probability of exceeding the maximum inundation height for identified critical assets, (2) providing critical offshore conditions for flooding in early warning systems, and (3) raising awareness of stakeholders and eventually enhance preparedness for future flooding events by allowing them to assess
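
    A hedged sketch of the core inverse step: given a maximum acceptable hazard level, solve the flooding model for the offshore forcing that produces it. The model below is a toy monotone function and the threshold is invented; the study inverts a numerical flooding model over combinations of offshore conditions.

      from scipy.optimize import brentq

      def inundation_height(surge_m: float) -> float:
          """Toy monotone flooding model: inundation height vs offshore surge."""
          return 0.8 * surge_m ** 1.3 - 0.5

      max_acceptable_hazard_m = 1.2  # maximum tolerable inundation height

      # Invert the model: find the surge that just produces the acceptable hazard.
      critical_surge = brentq(lambda s: inundation_height(s) - max_acceptable_hazard_m,
                              0.1, 10.0)
      print(f"critical offshore surge: {critical_surge:.2f} m")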

  2. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the problem that the stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depend strongly on the chosen research emphasis, and that the ranking of the three representative contaminants' hazards differs from that of their corresponding properties. This suggests that the subjective tendency of the research emphasis has a decisive impact on the calculation results. In addition, using rank order to normalize the three properties and unify the quantified property results can amplify or attenuate the relative property characteristics of the different representative contaminants.
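
    A minimal sketch of the analytic hierarchy process step mentioned above: property weights derived from the principal eigenvector of a reciprocal pairwise comparison matrix, with a consistency check. The matrix is invented; the rows and columns might stand for properties such as toxicity, mobility, and emitted quantity.

      import numpy as np

      # Reciprocal pairwise comparison matrix on the Saaty 1-9 scale (invented).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigenvalues, eigenvectors = np.linalg.eig(A)
      k = np.argmax(eigenvalues.real)
      weights = np.abs(eigenvectors[:, k].real)
      weights /= weights.sum()
      print("property weights:", np.round(weights, 3))

      # Consistency ratio (random index RI = 0.58 for a 3x3 matrix); want CR < 0.1.
      n = A.shape[0]
      ci = (eigenvalues.real.max() - n) / (n - 1)
      print("consistency ratio:", round(ci / 0.58, 3))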

  3. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. The document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods used to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
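
    A minimal sketch of the kind of conjugate Bayesian data evaluation such guidelines cover: a gamma prior on a component failure rate updated with Poisson-distributed failure counts. The prior parameters and data below are invented.

      from scipy import stats

      # Prior: gamma(shape=a0, rate=b0), e.g. from generic industry data.
      a0, b0 = 2.0, 10_000.0          # prior mean a0/b0 = 2e-4 failures per hour

      # Plant-specific evidence: failures observed over an exposure time.
      failures, exposure_hours = 3, 26_000.0

      # Conjugate update: posterior is gamma(a0 + failures, b0 + exposure).
      a1, b1 = a0 + failures, b0 + exposure_hours
      posterior = stats.gamma(a1, scale=1.0 / b1)

      print(f"posterior mean rate: {posterior.mean():.2e} /h")
      print(f"90% credible interval: ({posterior.ppf(0.05):.2e}, {posterior.ppf(0.95):.2e}) /h")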

  4. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  5. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... affected; (5) Ease of logical data access to the lost, stolen or improperly accessed data in light of...

  6. Risk Based Requirements for Long Term Stewardship: A Proof-of-Principle Analysis of an Analytic Method Tested on Selected Hanford Locations

    SciTech Connect

    GM Gelston; JW Buck; LR Huesties; MS Peffers; TB Miley; TT Jarvis; WB Andrews

    1998-12-03

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  7. Risk based requirements for long term stewardship: A proof-of-principle analysis of an analytic method tested on selected Hanford locations

    SciTech Connect

    Jarvis, T.T.; Andrews, W.B.; Buck, J.W.

    1998-03-01

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  8. Occupational safety and HIV risk among female sex workers in China: A mixed-methods analysis of sex-work harms and mommies

    PubMed Central

    Yi, Huso; Zheng, Tiantian; Wan, Yanhai; Mantell, Joanne E.; Park, Minah; Csete, Joanne

    2013-01-01

    Female sex workers (FSWs) in China are exposed to multiple work-related harms that increase HIV vulnerability. Using mixed-methods, we explored the social-ecological aspects of sexual risk among 348 FSWs in Beijing. Sex-work harms were assessed by property stolen, being underpaid or not paid at all, verbal and sexual abuse, forced drinking; and forced sex more than once. The majority (90%) reported at least one type of harm, 38% received harm protection from ‘mommies’ (i.e., managers) and 32% reported unprotected sex with clients. In multivariate models, unprotected sex was significantly associated with longer involvement in sex work, greater exposure to harms, and no protection from mommies. Mommies’ protection moderated the effect of sex-work harms on unprotected sex with clients. Our ethnography indicated that mommies played a core role in sex-work networks. Such networks provide a basis for social capital; they are not only profitable economically, but also protect FSWs from sex-work harms. Effective HIV prevention interventions for FSWs in China must address the occupational safety and health of FSWs by facilitating social capital and protection agency (e.g., mommies) in the sex-work industry. PMID:22375698

  9. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-05-01

    Analysis of a safeguards system based on the notion of fuzzy sets and linguistic variables addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportion. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bar and featured risk is made.
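
    A minimal sketch of representing a linguistic risk estimate with fuzzy sets, in the spirit of the analysis described above. The triangular membership functions and the input value are invented.

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      # Linguistic values for the possibility of loss, on a 0-1 scale (invented).
      LOW, MEDIUM, HIGH = (0.0, 0.1, 0.3), (0.2, 0.5, 0.8), (0.7, 0.9, 1.0)

      def classify(x):
          grades = {"low": tri(x, *LOW), "medium": tri(x, *MEDIUM), "high": tri(x, *HIGH)}
          return max(grades, key=grades.get), grades

      label, grades = classify(0.45)
      print(label, {k: round(v, 2) for k, v in grades.items()})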

  10. 18F-FDG-PET/CT in the assessment of pulmonary solitary nodules: comparison of different analysis methods and risk variables in the prediction of malignancy

    PubMed Central

    García Vicente, Ana María; Honguero Martínez, Antonio Francisco; Jiménez Londoño, Germán Andrés; Vega Caicedo, Carlos Hugo; León Atance, Pablo; Soriano Castrejón, Ángel María

    2015-01-01

    Objective To compare the diagnostic performance of different metabolic, morphological and clinical criteria for correct presurgical classification of the solitary pulmonary nodule (SPN). Methods Fifty-five patients with SPN were retrospectively analyzed. All patients underwent preoperative 18F-fluorodeoxyglucose (FDG)-positron emission tomography (PET)/computed tomography (CT). Maximum diameter on CT, maximum standardized uptake value (SUVmax), histopathologic result, age, smoking history and gender were obtained. Different criteria were established to classify an SPN as malignant: (I) visually detectable metabolism; (II) SUVmax >2.5 regardless of SPN diameter; (III) a SUVmax threshold depending on SPN diameter; and (IV) a ratio of SUVmax to diameter greater than 1. For each criterion, statistical diagnostic parameters were obtained. Receiver operating characteristic (ROC) analysis was performed to select the best diagnostic SUVmax and SUVmax/diameter cutoffs. Additionally, a predictive model of malignancy of the SPN was derived by multivariate logistic regression. Results Fifteen SPN (27.3%) were benign and 40 (72.7%) malignant. The mean values ± standard deviation (SD) of SPN diameter and SUVmax were 1.93±0.57 cm and 3.93±2.67, respectively. Sensitivity (Se) and specificity (Sp) of the different diagnostic criteria were (I) 97.5% and 13.1%; (II) 67.5% and 53.3%; (III) 70% and 53.3%; and (IV) 85% and 33.3%, respectively. The SUVmax cut-off value with the best diagnostic performance was 1.95 (Se: 80%; Sp: 53.3%). The predictive model had a Se of 87.5% and Sp of 46.7%. SUVmax was an independent predictor of malignancy. Conclusions The assessment by semiquantitative methods did not improve on the Se of visual analysis, and the limited Sp was independent of the method used. However, the predictive model combining SUVmax and age was the best diagnostic approach. PMID:26207210
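
    A hedged sketch of choosing a SUVmax cut-off from labelled data by maximising Youden's index (sensitivity + specificity - 1), one common way such ROC cut-offs are selected; the simulated data below are invented, and the record does not state which selection criterion was used.

      import numpy as np

      rng = np.random.default_rng(1)
      suv_benign = rng.lognormal(mean=0.4, sigma=0.5, size=15)     # label 0
      suv_malignant = rng.lognormal(mean=1.3, sigma=0.5, size=40)  # label 1
      values = np.concatenate([suv_benign, suv_malignant])
      labels = np.concatenate([np.zeros(15), np.ones(40)])

      best = None
      for cut in np.unique(values):
          pred = values >= cut                 # classify as malignant above the cut
          se = pred[labels == 1].mean()        # sensitivity
          sp = (~pred)[labels == 0].mean()     # specificity
          youden = se + sp - 1.0
          if best is None or youden > best[0]:
              best = (youden, cut, se, sp)

      _, cut, se, sp = best
      print(f"best cut-off: SUVmax = {cut:.2f} (Se = {se:.0%}, Sp = {sp:.0%})")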

  11. Communication Network Analysis Methods.

    ERIC Educational Resources Information Center

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  12. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start from the premise that men and women are equal and should be treated equally. These frameworks emphasize the equal distribution of resources between men and women, in the belief that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation and is a barrier to their empowerment and rights; thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. Because these frameworks are based on a proposed equality principle that puts men and women in competing roles, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources, but also incorporates the concepts of social capital, equity, and doing gender; it is based on a perceived-equity principle that puts men and women in complementing roles, which may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of the existing literature and an anecdote from observations made by the author. While criticizing equality theory, the author offers equity theory for resolving the gender conflict by using the concept of social and psychological capital. PMID:25941756

  13. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  14. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality, in conjunction with risk consequence, likelihood, and severity, enables development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. The quantitative assessment approach provides useful risk mitigation information.

  15. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
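
    A minimal sketch of propagating basic-event probabilities through AND/OR gates, the elementary calculation underlying fault tree analysis, assuming independent events. The tree structure and probabilities below are invented; the study's integrated tree is far larger and propagates uncertainty with Monte Carlo simulation.

      from functools import reduce

      def or_gate(*p):
          """P(any event occurs), assuming independence: 1 - prod(1 - p_i)."""
          return 1.0 - reduce(lambda acc, q: acc * (1.0 - q), p, 1.0)

      def and_gate(*p):
          """P(all events occur), assuming independence: prod(p_i)."""
          return reduce(lambda acc, q: acc * q, p, 1.0)

      # Invented basic-event probabilities for a toy source-to-tap tree.
      p_source_failure = 0.01
      p_backup_unavailable = 0.10
      p_treatment_failure = 0.005
      p_distribution_break = 0.02

      # Quantity failure: no water delivered to the consumer.
      p_no_water = or_gate(and_gate(p_source_failure, p_backup_unavailable),
                           p_distribution_break)
      # Quality failure: water delivered (no break) but non-compliant.
      p_bad_water = and_gate(1.0 - p_distribution_break, p_treatment_failure)

      print(f"P(quantity failure) = {p_no_water:.4f}")
      print(f"P(quality failure)  = {p_bad_water:.4f}")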

  16. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
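
    A minimal sketch of the LCOE metric the analysis targets: discounted lifetime costs divided by discounted lifetime generation. All input figures are invented.

      def lcoe(capex, opex_per_year, mwh_per_year, years, discount_rate):
          """Levelized cost of energy: discounted costs / discounted generation."""
          disc = [(1.0 + discount_rate) ** -t for t in range(1, years + 1)]
          costs = capex + sum(opex_per_year * d for d in disc)
          energy = sum(mwh_per_year * d for d in disc)
          return costs / energy

      # Invented figures for a hypothetical geothermal plant.
      print(f"LCOE = {lcoe(40e6, 1.5e6, 180_000, 30, 0.07):.2f} $/MWh")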

  17. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for a risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.

  18. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently are used little in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to be in the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).

  1. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three common types of methods are used to develop vulnerability functions for different elements at risk: empirical, analytical and expert estimation. The paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on the behavior of buildings during strong earthquakes presented in different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate the physical and economic vulnerability of the different building types classified according to the seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate the expected damage states of buildings and constructions for earthquakes according to OSR-97B (return period T=1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites, and the indexes obtained for each unit site were then summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipe line systems located in the highly active seismic zones in

  2. Measuring and modelling pollution for risk analysis.

    PubMed

    Zidek, J V; Le, N D

    1999-01-01

    The great scale and complexity of environmental risk analysis offers major methodological challenges to those engaged in policymaking. In this paper we describe some of those challenges from the perspective gained through our work at the University of British Columbia (UBC). We describe some of our experiences with respect to the difficult problems of formulating environmental standards and developing abatement strategies. A failed but instructive attempt to find support for experiments on a promising method of reducing acid rain will be described. Then we describe an approach to scenario analysis under hypothetical new standards. Even with measurements of ambient environmental conditions in hand the problem of inferring actual human exposures remains. For example, in very hot weather people will tend to stay inside and population levels of exposure to e.g. ozone could be well below those predicted by the ambient measurements. Setting air quality criteria should ideally recognize the discrepancies likely to arise. Computer models that incorporate spatial random pollution fields and predict actual exposures from ambient levels will be described. From there we turn to the statistical issues of measurement and modelling and some of the contributions in these areas by the UBC group and its partners elsewhere. In particular we discuss the problem of measurement error when non-linear regression models are used. We sketch our approach to imputing unmeasured predictors needed in such models, deferring details to references cited below. We describe in general terms how those imputed measurements and their errors can be accommodated within the framework of health impact analysis.

  3. A method for assessing the risks of pipeline operations

    SciTech Connect

    Gloven, M.P.

    1996-09-01

    This paper presents a method for assessing the risks of hazardous liquid and natural gas pipeline systems. The method assesses risk by measuring historical and projected performance data against selected benchmarks which, if exceeded, may indicate that the pipeline has a greater potential for failure or adverse consequence at certain points. Once these areas are determined, plans are developed and implemented to minimize risk.
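
    A toy version of the benchmark screen described above can be sketched in a few lines: flag pipeline segments whose performance measures exceed chosen benchmarks. The measures, benchmark values, and segment data below are all invented for illustration.

        # Benchmark-exceedance screening of pipeline segments (toy data).
        benchmarks = {"leaks_per_100mi_yr": 1.0,
                      "third_party_hits_yr": 2,
                      "low_cp_readings_pct": 5.0}

        segments = {
            "MP 0-25":  {"leaks_per_100mi_yr": 0.4,
                         "third_party_hits_yr": 0,
                         "low_cp_readings_pct": 2.0},
            "MP 25-50": {"leaks_per_100mi_yr": 1.8,
                         "third_party_hits_yr": 3,
                         "low_cp_readings_pct": 7.5},
        }

        for seg, data in segments.items():
            flags = [m for m, v in data.items() if v > benchmarks[m]]
            print(seg, "->", "review: " + ", ".join(flags) if flags else "acceptable")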

  4. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Limitation of risk: Protective methods. 223.11 Section 223.11 Money and Finance: Treasury Regulations Relating to Money and Finance... BUSINESS WITH THE UNITED STATES § 223.11 Limitation of risk: Protective methods. The limitation of...

  5. A predictive Bayesian approach to risk analysis in health care

    PubMed Central

    Aven, Terje; Eidesen, Karianne

    2007-01-01

    Background The Bayesian approach is now widely recognised as a proper framework for analysing risk in health care. However, the traditional text-book Bayesian approach is in many cases difficult to implement, as it is based on abstract concepts and modelling. Methods The essential points of risk analyses conducted according to the predictive Bayesian approach are the identification of observable quantities and the prediction and uncertainty assessment of these quantities, using all the relevant information. The risk analysis summarizes the knowledge and lack of knowledge concerning critical operations and other activities, and in this way gives a basis for making rational decisions. Results It is shown that Bayesian risk analysis can be significantly simplified and made more accessible compared to the traditional text-book Bayesian approach by focusing on predictions of observable quantities and performing uncertainty assessments of these quantities using subjective probabilities. Conclusion The predictive Bayesian approach provides a framework for ensuring quality of risk analysis. The approach acknowledges that risk cannot be adequately described and evaluated simply by reference to summarising probabilities. Risk is defined by the combination of possible consequences and associated uncertainties. PMID:17714597
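
    A minimal sketch of the predictive idea in Python: assess uncertainty about an observable quantity, here the number of adverse events in the next n operations, directly via a posterior predictive distribution. The beta-binomial model, the prior, and the counts are all assumptions for illustration, not the paper's worked example.

        # Posterior predictive for an observable count (beta-binomial model).
        from scipy.stats import betabinom

        a0, b0 = 1, 19              # subjective prior: events believed rare (~5%)
        events, trials = 2, 100     # hypothetical observed operations

        a, b = a0 + events, b0 + trials - events   # standard beta update
        n_future = 50

        for k in range(4):          # distribution of the observable quantity
            print(f"P({k} events in next {n_future}) = "
                  f"{betabinom.pmf(k, n_future, a, b):.3f}")
        print(f"P(>3 events) = {1 - betabinom.cdf(3, n_future, a, b):.3f}")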

  6. Acute proliferative retrolental fibroplasia: multivariate risk analysis.

    PubMed Central

    Flynn, J T

    1983-01-01

    This study has presented a two-way analysis of a data set consisting of demographic, diagnostic, and therapeutic variables against the risk of occurrence of APRLF and its location in the retina in a population of 639 infants with birthweights ranging from 600 to 1500 gm. Univariate and multivariate risk analysis techniques were employed to analyze the data. As established in previous studies, birthweight was a powerful predictor of the outcome variable. Oxygen therapy, as defined and quantified in this study, was not. Duration of ventilatory assistance did seem associated. The population was not uniform. Infants below 1000 gm birthweight had such a high incidence of APRLF that no other exogenous risk factors seemed of significance. Above 1000 gm birthweight, certain factors, particularly duration of ventilation, seemed of predictive strength and significance. PMID:6689564

  7. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill and blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods associated with each step in the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst, an estimation method for assessing the potential consequences of rockburst, an evaluation method using a new quantitative index that considers both occurrence probability and consequences to determine the level of rockburst risk, and dynamic updating of the assessment. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified with cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (totalling 11.6 km of D&B tunnels).
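
    The general shape of such a quantitative index can be sketched as occurrence probability times a consequence score, re-evaluated as monitoring data arrive. The scales, thresholds, and numbers below are invented for illustration and differ from the index actually proposed in the paper.

        # Dynamic "probability x consequence" risk level, updated per round.
        def risk_level(p_occurrence, consequence_score):
            """Map risk onto discrete levels; consequence_score in [0, 1]."""
            r = p_occurrence * consequence_score
            if r < 0.05:
                return "low"
            elif r < 0.20:
                return "moderate"
            return "high"

        # Each excavation round yields a new warning probability and estimate.
        for rnd, (p, c) in enumerate([(0.10, 0.3), (0.35, 0.5), (0.60, 0.8)], 1):
            print(f"Round {rnd}: risk = {p * c:.2f} -> {risk_level(p, c)}")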

  8. Campylobacter detection along the food chain--towards improved quantitative risk analysis by live/dead discriminatory culture-independent methods.

    PubMed

    Stingl, Kerstin; Buhler, Christiane; Krüger, Nora-Johanna

    2015-01-01

    Death, although absolute in its consequence, is not measurable by an absolute parameter in bacteria. Viability assays address different aspects of life, e.g. the capability to form a colony on an agar plate (CFU), metabolic properties or membrane integrity. For food safety, presence of infectious potential is the relevant criterion for risk assessment, currently only partly reflected by the quantification of CFU. It will be necessary for future improved risk assessment, in particular when fastidious bacterial pathogens are implicated, to enhance the informative value of CFU. This might be feasible by quantification of the number of intact and potentially infectious Campylobacter, impermeable to the DNA intercalating dye propidium monoazide (PMA). The latter are quantifiable by the combination of PMA with real-time PCR, although thorough controls have to be developed for standardization and the circumvention of pitfalls. Under consideration of different physiological states of the food-borne pathogen, we provide an overview of current and future suitable detection/quantification targets along the food chain, including putative limitations of detection.

  9. Selected Tools for Risk Analysis in Logistics Processes

    NASA Astrophysics Data System (ADS)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through proper system of risk management. Implementation of complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics processes implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  10. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and a quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Site Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented in the April 2013 report as a basis. The conditions were screened and grouped to determine whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  11. Method for Analyzing District Level IAI Data Bases to Identify Learning Opportunity Risks.

    ERIC Educational Resources Information Center

    Milazzo, Patricia; And Others

    A learning opportunity risk is defined as an absence of instruction or insufficient attention to proficiency at an early grade of instruction in a subject matter which will generate serious learning problems in later grades. A method for identifying such risks has been derived from analysis of district-level Instructional Accomplishment…

  12. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  13. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  14. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  15. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... risk is minimal. (2) An applicant shall perform a risk analysis when a populated area is located...

  16. Import risk analysis: the experience of Italy.

    PubMed

    Caporale, V; Giovannini, A; Calistri, P; Conte, A

    1999-12-01

    The authors propose a contribution to the possible revision of Chapters 1.4.1. and 1.4.2. of the International Animal Health Code (Code) of the Office International des Epizooties (OIE). In particular, data are presented to illustrate some of the inadequacies of both the rationale and the results of the method for risk assessment reported in the Code. The method suggested by the Code for risk assessment is based on the calculation of the 'probability of the occurrence of at least one outbreak' of a given disease following the importation of a given quantity of either live animals or animal products (unrestricted risk estimate). This is usually undertaken when dealing with rare events. For a country such as Italy, this method may not be particularly useful as the frequency of disease outbreaks is what should be estimated, so as to provide decision makers with appropriate and relevant information. Practical use of risk information generated by the use of the OIE risk assessment method for swine vesicular disease (SVD) would have encouraged the Chief Veterinary Officer of Italy to prohibit all imports of swine from the Netherlands and Belgium for at least two years in the early 1990s, with the consequential heavy economic losses for both Italy and the exporting countries. On the contrary, the number of actual outbreaks of the disease due to direct imports of swine from Member States of the European Union (EU), which occurred in Italy in 1992, 1993 and 1994 was very low (two to five outbreaks due to direct imports of swine from the Netherlands and one to two from Belgium). An example of a method for assessing the risks associated with high volumes of trade in commodities is also described. This method is based on the Monte Carlo simulation and provides the information required to evaluate the costs of the strategies compared. The method can be used to predict the number of outbreaks which are likely to occur following importation and enables a comparison to be made of

  17. Key Attributes of the SAPHIRE Risk and Reliability Analysis Software for Risk-Informed Probabilistic Applications

    SciTech Connect

    Curtis Smith; James Knudsen; Kellie Kvarfordt; Ted Wood

    2008-08-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30 to 40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  18. Hypertension and Risk of Cataract: A Meta-Analysis

    PubMed Central

    Yu, Xiaoning; Lyu, Danni; Dong, Xinran; He, Jiliang; Yao, Ke

    2014-01-01

    Background Cataract is the major cause of blindness across the world. Many epidemiologic studies have indicated that hypertension might play an important role in the development of cataract, while others have found no association. We therefore conducted this meta-analysis to determine the relationship between hypertension and the risk of cataract. Methods Studies on the association of hypertension with cataract risk were retrieved from PubMed, Web of Science and the Cochrane Library in June 2014 and were included in the final analysis according to definite inclusion criteria. Odds ratios (OR) or risk ratios (RR) were pooled with 95% confidence intervals (CI) to evaluate the relationship between hypertension and cataract risk. Subgroup analyses were carried out on the basis of cataract type, race and whether studies were adjusted for the main components of metabolic syndrome (MS). Results The final meta-analysis included 25 studies (9 cohort, 5 case-control and 11 cross-sectional) from 23 articles. The pooled results showed that cataract risk in populations with hypertension increased significantly among cohort studies (RR 1.08; 95% CI: 1.05-1.12) and case-control or cross-sectional studies (OR 1.28; 95% CI: 1.12-1.45). This association held among both Mongolian and Caucasian populations, and the significance was not altered by adjustment for the main components of MS. Subgroup analysis on cataract types indicated an increased incidence of posterior subcapsular cataract (PSC) among cohort studies (RR 1.22; 95% CI: 1.03-1.46) and cross-sectional/case-control studies (OR 1.23; 95% CI: 1.09-1.39). No association of hypertension with risk of nuclear cataract was found. Conclusions The present meta-analysis suggests that hypertension increases the risk of cataract, especially PSC. Further efforts should be made to explore the potential biological mechanisms. PMID:25474403
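
    A minimal sketch of the inverse-variance pooling that underlies pooled estimates like those reported above, assuming a fixed-effect model for brevity; the three study estimates are fabricated and are not the studies in this record.

        # Fixed-effect inverse-variance pooling of odds ratios on the log scale.
        import math

        studies = [  # (OR, 95% CI lower, 95% CI upper) - hypothetical
            (1.10, 0.95, 1.27),
            (1.30, 1.05, 1.61),
            (1.25, 1.02, 1.53),
        ]

        num = den = 0.0
        for or_, lo, hi in studies:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE from CI width
            w = 1.0 / se ** 2                                # inverse-variance weight
            num += w * math.log(or_)
            den += w

        pooled, se_pooled = num / den, math.sqrt(1.0 / den)
        print(f"Pooled OR = {math.exp(pooled):.2f}, 95% CI "
              f"{math.exp(pooled - 1.96 * se_pooled):.2f}-"
              f"{math.exp(pooled + 1.96 * se_pooled):.2f}")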

  19. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.
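
    A minimal sketch of the quantification step such tools perform once fault trees are solved into minimal cut sets: combine basic-event probabilities via the min-cut-set upper bound. The events, probabilities, and cut sets below are invented and are not IRRAS's own examples.

        # Minimal-cut-set quantification with the MCUB approximation.
        from functools import reduce

        basic = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE": 5e-4, "POWER": 2e-4}
        min_cut_sets = [("PUMP_A", "PUMP_B"), ("VALVE",), ("POWER", "PUMP_A")]

        def cut_set_prob(cs):
            return reduce(lambda p, e: p * basic[e], cs, 1.0)

        # Min-cut-set upper bound: 1 - prod(1 - P(CS_i)); ~sum for rare events.
        probs = [cut_set_prob(cs) for cs in min_cut_sets]
        top = 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)
        print("Cut set probabilities:", ["%.2e" % p for p in probs])
        print("Top event (MCUB): %.3e" % top)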

  20. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to, and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on the scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools that are well established and readily available for environmental health protection have not yet been developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  1. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  2. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  3. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  4. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  5. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert...

  6. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.
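
    A minimal sketch of how an individual-risk figure is typically assembled in quantitative methods of this kind: sum scenario frequency times fatality probability over the accident outcomes named above. All frequencies and probabilities below are invented for illustration.

        # Individual risk at a fixed receptor near a gas pipeline (toy numbers).
        scenarios = [  # (outcome, frequency per km-year, P(fatality) at receptor)
            ("jet flame",           3.0e-5, 0.40),
            ("fireball",            1.0e-5, 0.50),
            ("UVCE",                5.0e-6, 0.60),
            ("toxic gas diffusion", 2.0e-6, 0.10),
        ]

        exposed_km = 0.2  # pipeline length close enough to affect the receptor
        ir = sum(f * p for _, f, p in scenarios) * exposed_km
        print(f"Individual risk ~ {ir:.2e} per year")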

  7. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. Analysis was performed for compartments in which fires could result in the failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP arises from fire in cable tunnels 61-64, while fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is the so-called 'confinement valves compartment' (room no. A-013/2), where a fire (contributing more than 7.5% of CDF) could result in a loss-of-coolant accident with unavailability of the primary makeup system, which leads directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  8. Sensitivity analysis of a two-dimensional probabilistic risk assessment model using analysis of variance.

    PubMed

    Mokhtari, Amirhossein; Frey, H Christopher

    2005-12-01

    This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
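
    A minimal sketch of the ANOVA-based ranking: bin each model input, run a one-way ANOVA of the output across the bins, and rank inputs by F value. The toy response below stands in for the microbial food safety model; it is not the model from the article.

        # ANOVA-based sensitivity ranking of model inputs (toy example).
        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(0)
        n = 5000
        x1 = rng.uniform(0, 1, n)   # e.g., storage temperature (scaled)
        x2 = rng.uniform(0, 1, n)   # e.g., storage time (scaled)
        x3 = rng.uniform(0, 1, n)   # an inert input
        y = 3 * x1 + x2 + 0.1 * rng.normal(size=n)   # toy response, x1 dominant

        def f_value(x, y, bins=10):
            edges = np.linspace(0, 1, bins + 1)[1:-1]
            groups = [y[np.digitize(x, edges) == b] for b in range(bins)]
            return f_oneway(*groups).statistic

        for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
            print(name, "F =", round(f_value(x, y), 1))
        # Larger F -> the input explains more output variance -> higher priority.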

  9. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.

  10. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E. )

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  11. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
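
    A bare-bones alternating least squares loop in the spirit of the D = CS^T factorization described above, with nonnegativity as the constraint; this is a generic sketch on synthetic data, not the patented weighting procedure.

        # Constrained alternating least squares for D ~ C S^T (toy data).
        import numpy as np

        def als(D, k, iters=200):
            rng = np.random.default_rng(1)
            C = rng.random((D.shape[0], k))
            for _ in range(iters):
                S = np.linalg.lstsq(C, D, rcond=None)[0].T    # solve for S
                S = np.clip(S, 0, None)                        # nonnegative spectra
                C = np.linalg.lstsq(S, D.T, rcond=None)[0].T   # solve for C
                C = np.clip(C, 0, None)                        # nonnegative conc.
            return C, S

        rng = np.random.default_rng(2)
        C_true, S_true = rng.random((100, 2)), rng.random((50, 2))
        D = C_true @ S_true.T + 0.01 * rng.normal(size=(100, 50))
        C, S = als(D, 2)
        print("Relative residual:", np.linalg.norm(D - C @ S.T) / np.linalg.norm(D))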

  12. Incorporating Classification Uncertainty in Competing- risks Nest- failure Analysis

    EPA Science Inventory

    Nesting birds risk nest failure due to many causes. Though partitioning risk of failure among causes has long been of interest to ornithologists, formal methods for estimating competing risk have been lacking.

  13. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985 EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  14. Forecasting method of national-level forest fire risk rating

    NASA Astrophysics Data System (ADS)

    Qin, Xian-lin; Zhang, Zi-hui; Li, Zeng-yuan; Yi, Hao-ruo

    2008-11-01

    The risk level of forest fire not only depends on weather, topography, human activities, and socio-economic conditions, but is also closely related to the types, growth, moisture content, and quantity of forest fuel on the ground. How to acquire timely information about the growth and moisture content of forest fuel and the climate for the whole country is critical to national-level forest fire risk forecasting. The development and application of remote sensing (RS), geographic information systems (GIS), databases, the internet, and other modern information technologies has provided important technical means for macro-regional forest fire risk forecasting. In this paper, quantified forecasting of national-level forest fire risk was studied using a Fuel State Index (FSI) and a Background Composite Index (BCI). The FSI was estimated using Moderate Resolution Imaging Spectroradiometer (MODIS) data. National meteorological data and other basic data on the distribution of fuel types and forest fire risk rating were standardized in the ArcGIS platform to calculate the BCI. The FSI and the BCI were used to calculate the Forest Fire Danger Index (FFDI), which is regarded as a quantitative indicator for national forest fire risk forecasting and forest fire risk rating, shifting from qualitative description to quantitative estimation. Major forest fires that occurred in recent years were taken as examples to validate the above method, and the results indicated that the method can be used for quantitative forecasting of national-level forest fire risks.

  15. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) and high-energy γ rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  16. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) and high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  17. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, barriers and human failure events (HFEs) for human reliability analysis (HRA) are identified and described. The barriers, called target systems, are identified from risk-significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  18. Project 6: Cumulative Risk Assessment (CRA) Methods and Applications

    EPA Science Inventory

    Project 6: CRA Methods and Applications addresses the need to move beyond traditional risk assessment practices by developing CRA methods to integrate and evaluate impacts of chemical and nonchemical stressors on the environment and human health. Project 6 has three specific obje...

  19. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals a number of shortcomings in this field that give rise to varied managerial risks. The article identifies and analyses some of them and gives brief guidance for appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  20. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
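
    The core computation in such a program is the hazard integral: the annual rate of exceeding a ground-motion level equals the source activity rate times the probability, integrated over magnitude and distance, that the attenuated motion exceeds that level. The discretized sketch below uses a toy attenuation law and invented source parameters, not the program's published attenuation functions.

        # Discretized hazard integral for one seismic source (toy parameters).
        import numpy as np
        from scipy.stats import norm

        nu = 0.2                                    # events/yr above Mmin (assumed)
        mags = np.linspace(5.0, 7.5, 26)
        pdf_m = 10.0 ** (-1.0 * mags)               # Gutenberg-Richter, b = 1
        pdf_m /= pdf_m.sum()
        dists = np.linspace(10, 60, 11)             # km, assumed equally likely
        pdf_r = np.full(dists.size, 1.0 / dists.size)
        sigma_ln = 0.6                              # lognormal scatter (assumed)

        def median_pga(m, r):                       # toy attenuation law
            return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r))

        def rate_exceed(a):
            med = median_pga(mags[:, None], dists[None, :])
            p = norm.sf((np.log(a) - np.log(med)) / sigma_ln)  # P(A > a | m, r)
            return nu * (pdf_m[:, None] * pdf_r[None, :] * p).sum()

        for a in (0.05, 0.10, 0.20):                # PGA levels in g
            print(f"lambda(PGA > {a:.2f} g) = {rate_exceed(a):.2e} /yr")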

  1. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year, around the world. Risk assessment procedures carry a set of uncertainties of two main types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with flawed procedures employed in the study of these processes. There is abundant scientific and technical literature on uncertainty estimation in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little on the propagation of uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much they alter a proper analysis of flood risk. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to consider the total uncertainty and understand which factors contribute most to the final uncertainty, the Polynomial Chaos Theory (PCT) was used. It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of the method's application show better robustness than traditional analysis.
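
    A minimal Monte Carlo illustration of how uncertainty propagates along a simplified flood risk chain (rainfall to flow to depth to damage). Every sub-model and error term below is a toy assumption; the study itself applies PCT and related methods to real hydrologic, hydraulic, and damage models.

        # Monte Carlo uncertainty propagation through a toy flood risk chain.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        rain = rng.lognormal(4.0, 0.3, n)                   # mm, design storm
        flow = 2.0 * rain * rng.lognormal(0.0, 0.2, n)      # m3/s, hydrology error
        depth = 0.05 * np.sqrt(flow) * rng.lognormal(0.0, 0.15, n)  # m, hydraulics
        damage = np.clip(depth / 2.0, 0, 1) * 10.0          # MEUR, capped curve

        for q in (5, 50, 95):
            print(f"P{q:02d} damage = {np.percentile(damage, q):.2f} MEUR")
        # The P05-P95 spread shows how input errors compound along the chain.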

  2. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that causes fractures, mostly in older men and women. Meta-analysis is the statistical framework for assessing results obtained from research studies conducted over several years. Meta-analyses of osteoporotic fracture risk with medication non-adherence have been conducted by many researchers to assess bone fracture risk among patients non-adherent versus adherent to osteoporosis therapy. Osteoporosis therapy reduces the risk of fracture in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of intervention. Medline, Embase, and CINAHL were searched for observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures of postmenopausal women and men are presented, and the use of bisphosphonate therapy for osteoporosis is described alongside other drugs. The authors, design, studies (% women), years (data), follow-up (weeks), fracture types, and compliance or persistence results from 2004 to 2009 are summarized in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications and treatments have also been reviewed.

  3. [Legal and methodical aspects of occupational risk management].

    PubMed

    2011-01-01

    Legal and methodical aspects of occupational risk management (ORM) are considered in light of new official documents. The introduction of the notions of risk and risk management into the Labor Code reflects a change in the forms of occupational health and safety. The role of hygienists and occupational medicine professionals in workplace conditions certification (WCC) and periodic medical examinations (PME) is strengthened. ORM could be improved by introducing a prognosis-and-causation block based on IT technologies that could link the WCC and PME systems, thus improving the effectiveness of prophylaxis.

  4. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods

    PubMed Central

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-01-01

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected en bloc enable accurate histological assessment. Taking these factors into consideration, ESD seems more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than with EMR. Bleeding after ESD is the most frequent of these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions including hemorrhagic shock. Even with preventive methods including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, high-risk cases for post-ESD bleeding, which include cases involving the use of antithrombotic agents or requiring large resections, are increasing. Although there have been many reports about associated risk factors and methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we give an overview of the risk factors and methods of preventing post-ESD bleeding reported in previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  5. A method for scenario-based risk assessment for robust aerospace systems

    NASA Astrophysics Data System (ADS)

    Thomas, Victoria Katherine

    In years past, aircraft conceptual design centered around creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, allowing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time-period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps

  6. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for the implementation of wind energy projects in Romania, which could have complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, along with the reasonable time periods that may arise. Renewable energy technologies have to face a number of constraints that delay scaling up their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of the major risks was based on a conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyses our results in the context of renewable energy project implementation in Romania and creates a framework for assessing the energy supply to any entity from renewable sources.

  7. System Analysis and Risk Assessment System.

    2000-11-20

    Version 00 SARA4.16 is a program that allows the user to review the results of a Probabilistic Risk Assessment (PRA) and to perform limited sensitivity analysis on these results. This tool is intended for a less technically oriented user and does not require the level of understanding of PRA concepts required by a full PRA analysis tool. With this program a user can review the information generated by a PRA analyst and compare the results to those generated by making limited modifications to the data in the PRA. Also included in this program is the ability to graphically display the information stored in the database. This information includes event trees, fault trees, P&IDs and uncertainty distributions. SARA 4.16 is incorporated in the SAPHIRE 5.0 code package.

  8. [Risk analysis in radiation therapy: state of the art].

    PubMed

    Mazeron, R; Aguini, N; Deutsch, É

    2013-01-01

    Five radiotherapy accidents, two of them serial, occurred in France from 2003 to 2007 and led the authorities to establish a roadmap for securing radiotherapy. By analogy with industrial processes, a 2008 technical decision of the French Nuclear Safety Authority requires radiotherapy professionals to conduct analyses of risks to patients. The process of risk analysis had been tested in three pilot centers, before the occurrence of the accidents, with the creation of feedback cells. The regulation now requires all radiotherapy departments to have similar structures to collect precursor events, incidents and accidents, to perform analyses following rigorous methods, and to initiate corrective actions. At the same time, departments are also required to conduct a priori analyses, which are less intuitive and usually require the help of a quality engineer, with the aim of reducing risk. The progressive implementation of these mechanisms is part of an overall policy to improve the quality of radiotherapy. Since 2007, no radiotherapy accident has been reported. PMID:23787020

  9. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  10. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  11. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  12. Voltametric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  13. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system, and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  14. Correlation method of electrocardiogram analysis

    NASA Astrophysics Data System (ADS)

    Strinadko, Marina M.; Timochko, Katerina B.

    2002-02-01

    The electrocardiographic method is an informational source for characterizing the functional state of the heart. The electrocardiogram parameters form an integrated map of many component characteristics of the heart system and depend on the disturbance conditions of each component. In this research, an attempt is made to build a skeleton diagram of perturbations of the heart system by describing its basic components and the connections between them through transition functions, written as differential equations of the first and second order, with the purpose of building up and analyzing the electrocardiogram. Noting the vector character of the perturbation and the varying position of the heart in each organism, we propose our own coordinate system attached to the heart. The comparative analysis of electrocardiograms was conducted using the correlation method.

  15. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  16. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Credit risk premium analysis. 260.17 Section 260... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk...

  17. [EFFECT OF FEEDING METHOD ON THE DEVELOPMENT OF ATOPY IN NEWBORNS AT RISK].

    PubMed

    Ivantsiv-Griga, I S

    2015-01-01

    An analysis of risk factors for the manifestation of allergy was carried out; the effect of genetic factors on the development of atopy at the ages of 6 and 12 months was analyzed; the incidence (prevalence) structure at the ages of 6 and 12 months was analyzed; and, based on studies of the effects of feeding, the nature of the formulas, and the timing of formula introduction, conclusions were drawn about methods of optimizing the feeding of newborns at risk. PMID:26118049

  18. Movement Recognition Technology as a Method of Assessing Spontaneous General Movements in High Risk Infants

    PubMed Central

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D.; Trenell, Michael; Plötz, Thomas

    2015-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954

  19. Code System to Calculate Integrated Reliability and Risk Analysis.

    2002-02-18

    Version 04 IRRAS Version 4.16, the latest in a series (2.0, 2.5, 4.0, 4.15), is a program developed for the purpose of performing those functions necessary to create and analyze a complete Probabilistic Risk Assessment (PRA). This program includes functions to allow the user to create event trees and fault trees, to define accident sequences and basic event failure data, to solve system and accident sequence fault trees, to quantify cut sets, and to perform uncertainty analysis on the results. Also included in this program are features to allow the analyst to generate reports and displays that can be used to document the results of an analysis. Since this software is a very detailed technical tool, the user of this program should be familiar with PRA concepts and the methods used to perform these analyses. IRRAS Version 4.16 is the latest in the stand-alone IRRAS series (2.0, 2.5, 4.0, 4.15). Be sure to review the PSR-405/SAPHIRE 7.06 package, which was released in January 2000 and includes four programs: the Integrated Reliability and Risk Analysis System (IRRAS), the System Analysis and Risk Assessment (SARA) system, the Models And Results Database (MAR-D) system, and the Fault tree, Event tree and P&ID (FEP) editors.
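
    The quantification step these tools automate can be illustrated compactly. Below is a minimal sketch, not the IRRAS algorithm itself, assuming independent basic events and invented cut sets (names such as pump_A are hypothetical); it applies the standard minimal-cut-set upper bound for the top-event probability.

```python
import math

# Invented basic-event probabilities and minimal cut sets (illustrative only).
basic = {"pump_A": 1e-3, "pump_B": 1e-3, "dg_fail": 5e-3, "op_error": 1e-2}
cutsets = [["pump_A", "pump_B"], ["dg_fail", "op_error"]]

# Cut set probability = product of its basic-event probabilities (independence
# assumption); the top event uses the min-cut upper bound:
# P(top) <= 1 - prod_i (1 - P(cutset_i)).
p_cut = [math.prod(basic[e] for e in cs) for cs in cutsets]
p_top = 1.0 - math.prod(1.0 - p for p in p_cut)
print(p_cut, f"P(top) <= {p_top:.2e}")
```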

  20. INTERIM REPORT IMPROVED METHODS FOR INCORPORATING RISK IN DECISION MAKING

    SciTech Connect

    Clausen, M. J.; Fraley, D. W.; Denning, R. S.

    1980-08-01

    This paper reports observations and preliminary investigations in the first phase of a research program covering methodologies for making safety-related decisions. The objective has been to gain insight into NRC perceptions of the value of formal decision methods, their possible applications, and how risk is, or may be, incorporated in decision making. The perception of formal decision making techniques, held by various decision makers, and what may be done to improve them, were explored through interviews with NRC staff. An initial survey of decision making methods, an assessment of the applicability of formal methods vis-a-vis the available information, and a review of methods of incorporating risk and uncertainty have also been conducted.

  1. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation is derived from a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economic consequences for different geographical areas and various population groups taking into account social-geophysical factors and probabilities, and using demographic databases based on GIS analysis.
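
    Step (iv), the probability-fields analysis, lends itself to a compact illustration. The sketch below is one simple way such an indicator could be computed, not the authors' exact scheme: it counts the fraction of forward trajectories from a risk site that cross each grid cell. All names and the random-walk trajectories are hypothetical stand-ins for real trajectory-model output.

```python
import numpy as np

def impact_probability(trajectories, lon_edges, lat_edges):
    """Fraction of trajectories passing through each lon/lat grid cell."""
    nx, ny = len(lon_edges) - 1, len(lat_edges) - 1
    counts = np.zeros((nx, ny))
    for traj in trajectories:                     # traj: (N, 2) array of [lon, lat]
        ix = np.digitize(traj[:, 0], lon_edges) - 1
        iy = np.digitize(traj[:, 1], lat_edges) - 1
        ok = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
        for cell in set(zip(ix[ok], iy[ok])):     # count each cell once per trajectory
            counts[cell] += 1
    return counts / len(trajectories)

# Synthetic random-walk "trajectories" from a site at (30E, 68N), for illustration.
rng = np.random.default_rng(0)
trajs = [np.array([30.0, 68.0]) + rng.normal(0, 0.5, (120, 2)).cumsum(axis=0)
         for _ in range(1000)]
prob = impact_probability(trajs, np.linspace(0, 60, 61), np.linspace(50, 80, 31))
```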

  2. Pedophilia: an evaluation of diagnostic and risk prediction methods.

    PubMed

    Wilson, Robin J; Abracen, Jeffrey; Looman, Jan; Picheca, Janice E; Ferguson, Meaghan

    2011-06-01

    One hundred thirty child sexual abusers were diagnosed using each of following four methods: (a) phallometric testing, (b) strict application of Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision [DSM-IV-TR]) criteria, (c) Rapid Risk Assessment of Sex Offender Recidivism (RRASOR) scores, and (d) "expert" diagnoses rendered by a seasoned clinician. Comparative utility and intermethod consistency of these methods are reported, along with recidivism data indicating predictive validity for risk management. Results suggest that inconsistency exists in diagnosing pedophilia, leading to diminished accuracy in risk assessment. Although the RRASOR and DSM-IV-TR methods were significantly correlated with expert ratings, RRASOR and DSM-IV-TR were unrelated to each other. Deviant arousal was not associated with any of the other methods. Only the expert ratings and RRASOR scores were predictive of sexual recidivism. Logistic regression analyses showed that expert diagnosis did not add to prediction of sexual offence recidivism over and above RRASOR alone. Findings are discussed within a context of encouragement of clinical consistency and evidence-based practice regarding treatment and risk management of those who sexually abuse children.

  3. Software Speeds Up Analysis of Breast Cancer Risk

    MedlinePlus

    Software Speeds Up Analysis of Breast Cancer Risk (Study). Doctors were 30 times slower reading ...; software that quickly analyzes mammograms and patient history to determine breast cancer risk could save time and reduce unnecessary biopsies, ...

  4. Mosquito habitat and dengue risk potential in Kenya: alternative methods to traditional risk mapping techniques.

    PubMed

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Rosenshein Bennett, Lauren; Waters, Nigel M

    2014-11-01

    Outbreaks, epidemics and endemic conditions make dengue a disease that has emerged as a major threat in tropical and sub-tropical countries over the past 30 years. Dengue fever creates a growing burden for public health systems and has the potential to affect over 40% of the world population. The problem being investigated is to identify the areas of highest and lowest dengue risk. This paper presents "Similarity Search", a geospatial analysis aimed at identifying these locations within Kenya. Similarity Search develops a risk map by combining environmental susceptibility analysis and geographical information systems, and then compares areas with dengue prevalence to all other locations. Kenya has had outbreaks of dengue during the past 3 years, and we identified areas with the highest susceptibility to dengue infection using bioclimatic variables, elevation and mosquito habitat as input to the model. Comparison of the modelled risk map with the reported dengue epidemic cases obtained from the open-source reporting system ProMED and Government news reports from 1982-2013 confirmed the high-risk locations that were used as the Similarity Search presence cells. Developing the risk model based upon the bioclimatic variables, elevation and mosquito habitat increased the efficiency and effectiveness of the dengue fever risk mapping process.
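
    The core of the "Similarity Search" idea can be sketched in a few lines: standardize the environmental layers, average the profiles of the cells with confirmed dengue presence, and score every other cell by its closeness to that profile. This is a hedged reconstruction assuming a simple Euclidean similarity; the real method may weight variables differently, and all data below are synthetic.

```python
import numpy as np

def similarity_map(layers, presence_mask):
    """layers: (k, rows, cols) stack of bioclimatic/elevation/habitat rasters."""
    k = layers.shape[0]
    X = layers.reshape(k, -1).T                         # one row per grid cell
    Z = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-9)   # standardize each variable
    target = Z[presence_mask.ravel()].mean(axis=0)      # mean profile of presence cells
    dist = np.linalg.norm(Z - target, axis=1)
    return (1.0 / (1.0 + dist)).reshape(layers.shape[1:])  # higher = more similar

rng = np.random.default_rng(1)
layers = rng.normal(size=(5, 40, 40))                   # synthetic environmental stack
presence = np.zeros((40, 40), bool)
presence[10:12, 20:23] = True                           # cells with reported outbreaks
risk = similarity_map(layers, presence)
```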

  5. Working session 5: Operational aspects and risk analysis

    SciTech Connect

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  6. Risk prediction with machine learning and regression methods.

    PubMed

    Steyerberg, Ewout W; van der Ploeg, Tjeerd; Van Calster, Ben

    2014-07-01

    This is a discussion of issues in risk prediction based on the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler.

  7. Pressure Systems Stored-Energy Threshold Risk Analysis

    SciTech Connect

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective February 2007, brought to light potential weaknesses regarding the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based upon pressure or any other criterion. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as an appropriate criterion to establish a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining a potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.

  8. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission, Joint Research Center (JRC) in the frame of its institutional research activities. The scope of GHSL is to develop, test and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform for applications in support of global disaster risk reduction initiatives (DRR) and of regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geo-spatial data, primarily remotely sensed imagery and population data. GHSL also cooperates with the Group on Earth Observation on SB-04-Global Urban Observation and Information, and with various international partners, including the World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years and population data are presented.

  9. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups. PMID:23160540

  10. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  11. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  12. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The air coordination committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." the purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  13. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
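
    The processing chain described (narrow-band filtering around the sensor resonance, envelope demodulation, then flow-indicator quantities) maps naturally onto standard signal-processing primitives. A minimal sketch, assuming a hypothetical 5 kHz resonance and stopping at the feature vector that would feed the neural network:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 50_000.0                        # sample rate, Hz (assumed)
f_res = 5_000.0                      # sensor resonant frequency, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
rng = np.random.default_rng(5)
raw = np.sin(2 * np.pi * f_res * t) * (1 + 0.3 * rng.normal(size=t.size))

b, a = butter(4, [0.9 * f_res, 1.1 * f_res], btype="band", fs=fs)
narrow = filtfilt(b, a, raw)         # narrow band around the sensor resonance
envelope = np.abs(hilbert(narrow))   # demodulated amplitude envelope

features = {                         # illustrative flow-indicator quantities
    "mean": envelope.mean(),
    "std": envelope.std(),
    "cv": envelope.std() / envelope.mean(),
}
print(features)                      # these would be inputs to the neural network
```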

  14. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Tom Riley; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity developed during fiscal year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which shows advances in Probabilistic Risk Assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that prove the value of the RISMC approach for assessing the risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. Firstly, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Secondly, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on the stochastic analysis, such as the probability of core damage, and on the determination of the most risk-relevant factors. We also show some preliminary results regarding the comparison between RELAP5-3D and the new code RELAP-7 for a simplified Pressurized Water Reactor system. Lastly, we present some conceptual ideas regarding the possibility of extending the RISMC capabilities from an off-line tool (i.e., as a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  15. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  16. Environmental risk analysis for indirect coal liquefaction

    SciTech Connect

    Barnthouse, L.W.; Suter, G.W. II; Baes, C.F. III; Bartell, S.M.; Cavendish, M.G.; Gardner, R.H.; O'Neill, R.V.; Rosen, A.E.

    1985-01-01

    This report presents an analysis of the risks to fish, water quality (due to noxious algal blooms), crops, forests, and wildlife of two technologies for the indirect liquefaction of coal: Lurgi and Koppers-Totzek gasification of coal for Fischer-Tropsch synthesis. A variety of analytical techniques were used to make maximum use of the available data to consider effects of effluents on different levels of ecological organization. The most significant toxicants to fish were found to be ammonia, cadmium, and acid gases. An analysis of whole-effluent toxicity indicated that the Lurgi effluent is more acutely toxic than the Koppers-Totzek effluent. Six effluent components appear to pose a potential threat of blue-green algal blooms, primarily because of their effects on higher trophic levels. The most important atmospheric emissions with respect to crops, forests, and wildlife were found to be the conventional combustion products SO2 and NO2. Of the materials deposited on the soil, arsenic, cadmium, and nickel appear of greatest concern for phytotoxicity. 147 references, 5 figures, 41 tables.

  17. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

    The quality of recycled magnesium from chips depends strongly on their exposure to inorganic and organic impurities that are added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods. In addition, the accuracy and effectiveness of the methods are discussed. The results show that the chips belong either to the AZ91, AZ31, AM50/60, or AJ62 alloy. Some kinds of chips show deviations from the above-mentioned specifications. Different impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is not more than 4 mm in diameter. The sieve analysis shows good results for oily and wet chips. The determination of oil and water shows better results for the application of a Soxhlet extractor compared with the addition of lime and vacuum distillation. The most accurate values for the determination of water and oil are obtained by drying at 110°C (for water) and washing with acetone (for oil) by hand.

  18. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
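
    The risk-ranking step of this approach reduces to a shortest-path search over the weighted attack graph. A minimal sketch, with an invented four-node network and effort weights (not from the patent), using a Dijkstra-style search to surface the cheapest attack route:

```python
import heapq

def cheapest_attack_path(graph, start, goal):
    """graph: {node: [(neighbor, effort), ...]}; returns (total_effort, path)."""
    frontier = [(0, start, [start])]
    settled = {}
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return cost, path
        if settled.get(node, float("inf")) <= cost:
            continue                              # already reached more cheaply
        settled[node] = cost
        for nxt, effort in graph.get(node, []):
            heapq.heappush(frontier, (cost + effort, nxt, path + [nxt]))
    return float("inf"), []

attack_graph = {                                  # invented example topology
    "internet": [("web_server", 2), ("vpn", 5)],
    "web_server": [("app_server", 3)],
    "vpn": [("app_server", 1)],
    "app_server": [("database", 4)],
}
print(cheapest_attack_path(attack_graph, "internet", "database"))
# -> (9, ['internet', 'web_server', 'app_server', 'database'])
```

    Enumerating all paths within a factor (1 + epsilon) of this minimum would give the "epsilon optimal" set mentioned above.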

  19. [Methods for mortality analysis in SENTIERI Project].

    PubMed

    De Santis, M; Pasetto, R; Minelli, G; Conti, S

    2011-01-01

    The methods of mortality analysis in Italian polluted sites (IPS) are described. The study concerned 44 IPSs; each one included one or more municipalities. Mortality at the municipality level was studied for the period 1995-2002, using the following indicators: crude rate, standardized rate, standardized mortality ratio (SMR), and SMR adjusted for an ad hoc deprivation index. Regional populations were used as the reference for indirect standardization. The deprivation index was constructed using the 2001 national census variables representing the following socioeconomic domains: education, unemployment, dwelling ownership, and overcrowding. Mortality indicators were computed for 63 single or grouped causes. The results for all 63 analysed causes of death are available for each IPS; in this chapter, the results for each IPS are presented for causes selected on the basis of a priori evidence of risk from local sources of environmental pollution. The procedures and results of the evidence evaluation have been published in the 2010 Supplement of Epidemiology & Prevention devoted to SENTIERI.
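
    For readers unfamiliar with indirect standardization, the SMR computation reduces to a few lines once reference rates are fixed. A minimal sketch with invented numbers (the age bands, rates and counts are hypothetical):

```python
# Local person-years by age band, and regional reference mortality rates.
observed_deaths = 48
person_years   = {"0-44": 60_000, "45-64": 25_000, "65+": 9_000}
reference_rate = {"0-44": 0.0002, "45-64": 0.0008, "65+": 0.0030}

# Expected deaths: apply the reference rates to the local person-years, then sum.
expected = sum(person_years[a] * reference_rate[a] for a in person_years)
smr = observed_deaths / expected
print(f"expected = {expected:.1f}, SMR = {smr:.2f}")  # SMR > 1 => excess mortality
```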

  20. Methods development to evaluate the risk of upgrading to DCS: The human factor

    SciTech Connect

    Ostrom, L.T.; Wilhelmsen, C.A.

    1995-04-01

    The NRC recognizes that a more complete technical basis for understanding and regulating advanced digital technologies in commercial nuclear power plants is needed. A concern is that the introduction of digital safety systems may have an impact on risk. There is currently no standard methodology for measuring digital system reliability. A tool currently used to evaluate NPP risk in analog systems is the probabilistic risk assessment (PRA). The use of this tool to evaluate the digital system risk was considered to be a potential methodology for determining the risk. To test this hypothesis, it was decided to perform a limited PRA on a single dominant accident sequence. However, a review of existing human reliability analysis (HRA) methods showed that they were inadequate to analyze systems utilizing digital technology. A four-step process was used to adapt existing HRA methodologies to digital environments and to develop new techniques. The HRA methods were then used to analyze an NPP that had undergone a backfit to digital technology in order to determine, as a first step, whether the methods were effective. The very small-break loss of coolant accident sequence was analyzed to determine whether the upgrade to the Eagle-21 process protection system had an effect on risk. The analysis of the very small-break LOCA documented in the Sequoyah PRA was used as the basis of the analysis. The analysis of the results of the HRA showed that the mean human error probabilities for the Eagle-21 PPS were slightly less than those for the analog system it replaced. One important observation from the analysis is that the operators have increased confidence stemming from the better level of control provided by the digital system. The analysis of the PRA results, which included the human error component and the Eagle-21 PPS, disclosed that the reactor protection system had a higher failure rate than the analog system, although the difference was not statistically significant.

  1. Capital investment analysis: three methods.

    PubMed

    Gapenski, L C

    1993-08-01

    Three cash flow/discount rate methods can be used when conducting capital budgeting financial analyses: the net operating cash flow method, the net cash flow to investors method, and the net cash flow to equity holders method. The three methods differ in how the financing mix and the benefits of debt financing are incorporated. This article explains the three methods, demonstrates that they are essentially equivalent, and recommends which method to use under specific circumstances.

  2. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
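
    The survival modelling described above is straightforward to reproduce. A minimal sketch of a Kaplan-Meier estimator over mission distance, with Greenwood's formula for the variance; the mission distances and loss indicators below are invented, and ties are ignored for brevity:

```python
import numpy as np

def kaplan_meier(distances, failed):
    """distances: km at which each mission ended; failed: 1 = loss, 0 = censored."""
    order = np.argsort(distances)
    d, e = np.asarray(distances)[order], np.asarray(failed)[order]
    n, S, var_sum, out = len(d), 1.0, 0.0, []
    for i in range(n):
        at_risk = n - i
        if e[i] == 1:                              # a vehicle loss at this distance
            S *= (at_risk - 1) / at_risk
            var_sum += 1.0 / (at_risk * (at_risk - 1))
        out.append((d[i], S, S**2 * var_sum))      # Greenwood variance estimate
    return out

missions = [12, 40, 55, 90, 140, 200]              # km travelled at mission end
loss     = [0,   1,  0,  1,   0,   0]              # 1 = lost, 0 = recovered (censored)
for dist, S, var in kaplan_meier(missions, loss):
    print(f"{dist:5.0f} km  S = {S:.3f}  +/- {1.96 * var**0.5:.3f}")
```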

  3. Novel methods for spectral analysis

    NASA Astrophysics Data System (ADS)

    Roy, R.; Sumpter, B. G.; Pfeffer, G. A.; Gray, S. K.; Noid, D. W.

    1991-06-01

    In this review article, various techniques for obtaining estimates of parameters related to the spectrum of an underlying process are discussed. These techniques include the conventional nonparametric FFT approach and more recently developed parametric techniques such as maximum entropy, MUSIC, and ESPRIT, the latter two being classified as signal-subspace or eigenvector techniques. These estimators span the spectrum of possible estimators in that extremes of a priori knowledge are assumed (nonparametric versus parametric) and extremes in the underlying model of the observed process (deterministic versus stochastic) are involved. The advantage of parametric techniques is their ability to provide very accurate estimates using data from extremely short time intervals. Several applications of these novel methods for frequency analysis of very short time data are presented. These include calculation of dispersion curves, and the density of vibrational states g(ω) for many-body systems, semiclassical transition frequencies, overtone linewidths, and resonance energies of the time-dependent Schrödinger equation for few-body problems.
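
    Of the signal-subspace techniques mentioned, MUSIC is the easiest to sketch. The synthetic example below shows the key property claimed above: accurate frequency estimates from a very short record (64 samples) containing two closely spaced complex sinusoids. The window size and frequency grid are arbitrary choices for illustration, not prescriptions.

```python
import numpy as np
from scipy.signal import find_peaks

def music_spectrum(x, p, freqs, fs=1.0):
    """MUSIC pseudospectrum for p complex sinusoids in x, over a frequency grid."""
    m = len(x) // 2                                  # correlation-matrix dimension
    X = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = X.conj().T @ X / X.shape[0]                  # sample autocorrelation matrix
    _, V = np.linalg.eigh(R)                         # eigenvalues in ascending order
    En = V[:, : m - p]                               # noise-subspace eigenvectors
    a = np.exp(2j * np.pi * np.outer(freqs, np.arange(m)) / fs)  # steering vectors
    return 1.0 / np.linalg.norm(a.conj() @ En, axis=1) ** 2

fs, n = 100.0, 64
t = np.arange(n) / fs
rng = np.random.default_rng(2)
x = (np.exp(2j * np.pi * 20.0 * t) + np.exp(2j * np.pi * 23.0 * t)
     + 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n)))
grid = np.linspace(15.0, 30.0, 601)
spec = music_spectrum(x, p=2, freqs=grid, fs=fs)
peaks, _ = find_peaks(spec)
print(np.sort(grid[peaks[np.argsort(spec[peaks])[-2:]]]))   # ~ [20., 23.]
```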

  4. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In recent decades there has been a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The State of the Art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT-estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of Geotechnical Engineering for evaluating bearing capacity, settlements etc. The paper focuses on the prediction of settlements of shallow foundations, which is probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories comparing DMT-predicted vs. observed settlements was collected by Monaco et al. (2006), indicating that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as the DMT, that improves the accuracy of design parameters reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested introducing the Beta probability distribution, which provides a realistic and useful description of variability for geotechnical design problems. The paper estimates the Beta probability distribution at research sites where DMT tests and observed settlements are available. References Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B

  5. Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study

    PubMed Central

    Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.

    2012-01-01

    Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501

  6. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  7. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  8. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Thereby, parameters are varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data of the same threat scenario derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
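
    The merge step itself is a cell-wise overlay. A minimal sketch, with synthetic rasters standing in for the vulnerability map (from the sensitivity analysis) and the hazard map (from the GIS); risk is taken here as their product, and the top percentile of cells marks candidate zones for preventive measures.

```python
import numpy as np

rng = np.random.default_rng(3)
hazard = rng.uniform(0, 1, size=(50, 50))           # e.g. threat likelihood raster
vulnerability = rng.uniform(0, 1, size=(50, 50))    # e.g. performance-loss raster

risk = hazard * vulnerability                       # cell-wise overlay
hotspots = np.argwhere(risk > np.quantile(risk, 0.95))   # top 5% of cells
print(f"{len(hotspots)} hotspot cells, max risk {risk.max():.2f}")
```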

  9. A structured elicitation method to identify key direct risk factors for the management of natural resources.

    PubMed

    Smith, Michael; Wallace, Ken; Lewis, Loretta; Wagner, Christian

    2015-11-01

    The high level of uncertainty inherent in natural resource management requires planners to apply comprehensive risk analyses, often in situations where there are few resources. In this paper, we demonstrate a broadly applicable, novel and structured elicitation approach to identify important direct risk factors. This new approach combines expert calibration and fuzzy based mathematics to capture and aggregate subjective expert estimates of the likelihood that a set of direct risk factors will cause management failure. A specific case study is used to demonstrate the approach; however, the described methods are widely applicable in risk analysis. For the case study, the management target was to retain all species that characterise a set of natural biological elements. The analysis was bounded by the spatial distribution of the biological elements under consideration and a 20-year time frame. Fourteen biological elements were expected to be at risk. Eleven important direct risk factors were identified that related to surrounding land use practices, climate change, problem species (e.g., feral predators), fire and hydrological change. In terms of their overall influence, the two most important risk factors were salinisation and a lack of water which together pose a considerable threat to the survival of nine biological elements. The described approach successfully overcame two concerns arising from previous risk analysis work: (1) the lack of an intuitive, yet comprehensive scoring method enabling the detection and clarification of expert agreement and associated levels of uncertainty; and (2) the ease with which results can be interpreted and communicated while preserving a rich level of detail essential for informed decision making.

  10. A structured elicitation method to identify key direct risk factors for the management of natural resources.

    PubMed

    Smith, Michael; Wallace, Ken; Lewis, Loretta; Wagner, Christian

    2015-11-01

    The high level of uncertainty inherent in natural resource management requires planners to apply comprehensive risk analyses, often in situations where there are few resources. In this paper, we demonstrate a broadly applicable, novel and structured elicitation approach to identify important direct risk factors. This new approach combines expert calibration and fuzzy based mathematics to capture and aggregate subjective expert estimates of the likelihood that a set of direct risk factors will cause management failure. A specific case study is used to demonstrate the approach; however, the described methods are widely applicable in risk analysis. For the case study, the management target was to retain all species that characterise a set of natural biological elements. The analysis was bounded by the spatial distribution of the biological elements under consideration and a 20-year time frame. Fourteen biological elements were expected to be at risk. Eleven important direct risk factors were identified that related to surrounding land use practices, climate change, problem species (e.g., feral predators), fire and hydrological change. In terms of their overall influence, the two most important risk factors were salinisation and a lack of water which together pose a considerable threat to the survival of nine biological elements. The described approach successfully overcame two concerns arising from previous risk analysis work: (1) the lack of an intuitive, yet comprehensive scoring method enabling the detection and clarification of expert agreement and associated levels of uncertainty; and (2) the ease with which results can be interpreted and communicated while preserving a rich level of detail essential for informed decision making. PMID:27441228
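
    One common way to realize the fuzzy aggregation described above, shown here as a hedged sketch rather than the paper's exact scheme, is to treat each expert's estimate as a triangular fuzzy number (lowest, most likely, highest), average the estimates component-wise with calibration weights, and defuzzify by the centroid. All numbers below are invented.

```python
import numpy as np

experts = np.array([        # (low, mode, high) likelihood of management failure
    [0.10, 0.30, 0.50],
    [0.20, 0.40, 0.70],
    [0.05, 0.25, 0.45],
])
calibration = np.array([0.5, 0.3, 0.2])             # expert weights, summing to 1

agg = (calibration[:, None] * experts).sum(axis=0)  # weighted triangular number
centroid = agg.mean()                               # centroid of a triangle = (l+m+u)/3
spread = agg[2] - agg[0]                            # residual disagreement/uncertainty
print(f"aggregate = ({agg[0]:.2f}, {agg[1]:.2f}, {agg[2]:.2f}); "
      f"point estimate = {centroid:.2f}; spread = {spread:.2f}")
```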

  11. A Meteorological Risk Assessment Method for Power Lines Based on GIS and Multi-Sensor Integration

    NASA Astrophysics Data System (ADS)

    Lin, Zhiyong; Xu, Zhimin

    2016-06-01

    Power lines, exposed to the natural environment, are vulnerable to various kinds of meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line, and lacks a comprehensive evaluation and analysis of the effects of multiple meteorological factors. In this paper, we use multiple kinds of meteorological monitoring data obtained from multiple sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate a meteorological raster map from discrete meteorological monitoring data using spatial interpolation. Secondly, an expert-scoring-based analytic hierarchy process is used to compute the power line risk index for all kinds of meteorological conditions and to establish a mathematical model of meteorological risk. By adopting this model in the raster calculator of ArcGIS, we obtain a raster map showing the overall meteorological risk for the power line. Finally, by overlaying the power line buffer layer on that raster map, we can read the exact risk index around a given segment of the power line, which provides significant guidance for power line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province of China, including wind, lightning, rain, ice and temperature, we carry out the meteorological risk analysis for real power lines, and the experimental results prove the feasibility and validity of the proposed method.
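
    The two computational steps, deriving factor weights by the analytic hierarchy process and combining the factor rasters into a risk index, can be sketched as follows. The pairwise comparison matrix and rasters are invented; the weights come from the principal eigenvector, as is standard for AHP.

```python
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale (invented values):
#             wind  lightning  rain  ice   temperature
A = np.array([
    [1,    2,    3,    1/2,  4],
    [1/2,  1,    2,    1/3,  3],
    [1/3,  1/2,  1,    1/4,  2],
    [2,    3,    4,    1,    5],
    [1/4,  1/3,  1/2,  1/5,  1],
])

eigvals, eigvecs = np.linalg.eig(A)
w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
w /= w.sum()                                   # normalized priority vector
print(np.round(w, 3))                          # ice > wind > lightning > rain > temp

rng = np.random.default_rng(4)
factors = rng.uniform(0, 1, size=(5, 30, 30))  # standardized meteorological rasters
risk_index = np.tensordot(w, factors, axes=1)  # weighted overlay, one raster out
```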

  12. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    SciTech Connect

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994, on the topic of risk assessment of medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred with NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  13. Methods and Techniques for Risk Prediction of Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Hoffman, Chad R.; Pugh, Rich; Safie, Fayssal

    1998-01-01

    Since the Space Shuttle accident in 1986, NASA has been trying to incorporate probabilistic risk assessment (PRA) in decisions concerning the Space Shuttle and other NASA projects. One major study NASA is currently conducting in the PRA area is the establishment of an overall risk model for the Space Shuttle system. The model is intended to provide a tool to predict the Shuttle risk and to perform sensitivity analyses and trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) and its prime contractors, including Pratt and Whitney (P&W), are part of the NASA team conducting the PRA study. MSFC responsibility involves modeling the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). A major challenge facing the PRA team is modeling the Shuttle upgrades. This mainly includes the P&W High Pressure Fuel Turbopump (HPFTP) and the High Pressure Oxidizer Turbopump (HPOTP). The purpose of this paper is to discuss the various methods and techniques used for predicting the risk of the P&W redesigned HPFTP and HPOTP.

  14. Petri net modeling of fault analysis for probabilistic risk assessment

    NASA Astrophysics Data System (ADS)

    Lee, Andrew

    Fault trees and event trees have been widely accepted as the modeling strategy for performing Probabilistic Risk Assessment (PRA). However, there are several limitations associated with fault tree/event tree modeling: (1) it considers only binary events; (2) it assumes independence among basic events; and (3) it does not consider the timing sequence of basic events. This thesis investigates Petri net modeling as a potential alternative for PRA modeling. Petri nets have mainly been used as a simulation tool for queuing and network systems. However, it has been suggested that they could also model failure scenarios, and thus could be a potential modeling strategy for PRA. In this thesis, the transformations required to model the logic gates of a fault tree by Petri nets are explored. The gap between fault tree analysis and Petri net analysis is bridged through gate equivalency analysis. Methods for qualitative and quantitative analysis of Petri nets are presented. Techniques are developed and implemented to revise and tailor traditional Petri net modeling for system failure analysis. The airlock system and the maintenance cooling system of a CANada Deuterium Uranium (CANDU) reactor are used as case studies to demonstrate Petri nets' ability to model system failure and provide a structured approach for qualitative and quantitative analysis. The minimal cut sets and the probability of the airlock system failing to maintain the pressure boundary are obtained. Furthermore, the case study is extended to non-coherent system analysis due to system maintenance.
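
    The gate-equivalency idea is easy to make concrete: a fault-tree AND gate becomes a Petri net transition that fires only when every input place (component failure) is marked. A minimal sketch with invented place names; an OR gate would simply be two transitions sharing the same output place.

```python
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = []                 # list of (input_places, output_places)

    def add_transition(self, inputs, outputs):
        self.transitions.append((inputs, outputs))

    def step(self):
        """Fire the first enabled transition; return False when none can fire."""
        for inputs, outputs in self.transitions:
            if all(self.marking.get(p, 0) > 0 for p in inputs):
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1
                return True
        return False

net = PetriNet({"pump_failed": 1, "valve_failed": 1, "system_failed": 0})
net.add_transition(["pump_failed", "valve_failed"], ["system_failed"])  # AND gate
while net.step():
    pass
print(net.marking)   # {'pump_failed': 0, 'valve_failed': 0, 'system_failed': 1}
```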

  15. Methods to evaluate the nutrition risk in hospitalized patients

    PubMed Central

    Erkan, Tülay

    2014-01-01

    The rate of malnutrition is substantially high both in the general population and among patients hospitalized for various chronic conditions. The rate of patients who show no marked malnutrition at the time of hospitalization but develop malnutrition during their hospital stay is also substantially high. Therefore, different screening methods with different targets currently exist to prevent malnutrition from developing or being overlooked. These methods should be simple and reliable and should not be time-consuming, in order to be usable in daily practice. Seven nutrition risk screening methods for use in children have been established to date. However, as in adults, no consensus has been reached on any single method. It should be accepted that assessment of nutrition is a part of the normal examination, in order to increase awareness of and draw attention to this issue. PMID:26078678

  16. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    49 CFR, Title 49 (Transportation), Vol. 4, 2014-10-01: Credit risk premium analysis, § 260.17. Transportation; Other Regulations Relating to Transportation (Continued); FEDERAL RAILROAD... Financial Assistance. § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  17. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    49 CFR, Title 49 (Transportation), Vol. 4, 2013-10-01: Credit risk premium analysis, § 260.17. Transportation; Other Regulations Relating to Transportation (Continued); FEDERAL RAILROAD... Financial Assistance. § 260.17 Credit risk premium analysis. (a) When Federal appropriations are...

  18. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent a significant step forward in the assurance of reactor safety and, at the same time, an imperfect measure of actual reactor risk. (BT)

  19. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposure at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites was evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.
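
    The reported LUST figures are consistent with simple linear scaling from the risk at the PEL; the sketch below reproduces that arithmetic under a linear no-threshold assumption, with the LUST concentrations chosen as hypothetical round numbers rather than taken from the paper.

```python
# Hedged arithmetic sketch of linear risk scaling, not the paper's actual
# derivation: if a working-lifetime exposure at 1 ppm corresponds to a
# theoretical excess cancer risk of 10 per 1000, a linear no-threshold
# model scales risk in proportion to air concentration.

risk_per_ppm = 10 / 1000          # excess lifetime risk at the 1 ppm PEL

for c_ppm in (1.0, 2.0, 4.0):     # PEL, and two assumed LUST-site levels
    risk = risk_per_ppm * c_ppm   # linear no-threshold extrapolation
    print(f"{c_ppm:.1f} ppm -> {risk * 1000:.0f} excess cancers per 1000 workers")
```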

  20. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt (1), T.R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several...

  1. Proposal of a management method of rockfall risk induced on a road

    NASA Astrophysics Data System (ADS)

    Mignelli, C.; Peila, D.; Lo Russo, S.

    2012-04-01

    Many kilometers of roads run along rock slopes that are prone to rockfall. The analysis of the risks associated with these instabilities is a complex operation requiring precise assessment of the hazard, the vulnerability, and therefore the risk to vehicles on roads along the foothills. Engineering design of protection devices should aim to minimize risk while taking advantage of the most advanced technologies, and decision makers should be equipped with technical tools that let them choose the best solution within the context of local maximum acceptable risk levels. Meeting the safety requirements of mountainside routes in many cases involves implementing protective measures and devices to control and manage rockfall, and evaluating the positive effects of such measures in terms of risk reduction is of key importance. A risk analysis and management procedure for roads subject to rockfall phenomena, using a specifically developed method named Rockfall risk Management (RO.MA.), is presented and discussed. The method is based on statistical tools, using as input both in situ survey data and historical data. It is important to highlight that historical databases are often not available, and useful information is commonly missing because parameters were recorded incompletely, so an analysis based only on historical data can be difficult to develop. For this purpose, a specific database collection system has been developed to provide a geotechnical and geomechanical description of the studied rockside. These parameters, together with the data collected from historical databases, define the input parameters of the RO.MA. method. Moreover, to allow quantification of the harm, data from monitoring of the road by the road manager are required. The value of harm is proportional to the number of persons on the road (i.e., people in a vehicle) and the following traffic characteristics: type of vehicles (i.e. bicycles
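
    The RO.MA. equations themselves are not given in this abstract; the sketch below shows one common textbook formulation of the quantity such methods combine (rockfall frequency times the probability that a vehicle occupies the exposed segment, scaled by vehicle occupancy), with all parameter values hypothetical.

```python
# Minimal road-rockfall risk sketch; a generic formulation, not the
# RO.MA. equations themselves. All numbers are hypothetical.

f_rockfall = 0.5          # rockfalls reaching the road per year (hazard)
aadt = 4000               # average annual daily traffic, vehicles/day
v_kmh = 60.0              # mean vehicle speed, km/h
seg_km = 0.2              # length of exposed road segment, km
veh_len_km = 0.005        # average vehicle length, km
occupants = 1.7           # mean persons per vehicle (harm scaling)

# Fraction of time at least one vehicle is inside the exposed segment
# (low-density approximation: expected occupancy, capped at 1).
p_vehicle = min(1.0, aadt * (seg_km + veh_len_km) / (24.0 * v_kmh))

impacts_per_year = f_rockfall * p_vehicle
persons_at_risk = impacts_per_year * occupants
print(f"P(vehicle in path) = {p_vehicle:.3f}")
print(f"Expected vehicle impacts/yr = {impacts_per_year:.4f}")
print(f"Expected persons exposed/yr = {persons_at_risk:.4f}")
```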

  2. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    SciTech Connect

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs.

  3. Contextual Risk Factors for Low Birth Weight: A Multilevel Analysis

    PubMed Central

    Kayode, Gbenga A.; Amoakoh-Coleman, Mary; Agyepong, Irene Akua; Ansah, Evelyn; Grobbee, Diederick E.; Klipstein-Grobusch, Kerstin

    2014-01-01

    Background Low birth weight (LBW) remains a leading cause of neonatal death and a major contributor to infant and under-five mortality. Its prevalence has not declined in the last decade in sub-Saharan Africa (SSA) and Asia. Some individual-level factors have been identified as risk factors for LBW, but knowledge of contextual risk factors for LBW, especially in SSA, is limited. Methods Contextual risk factors for LBW in Ghana were identified by performing multivariable multilevel logistic regression analysis of 6,900 mothers dwelling in 412 communities that participated in the 2003 and 2008 Demographic and Health Surveys in Ghana. Results Contextual-level factors were significantly associated with LBW: being a rural dweller increased the likelihood of having a LBW infant by 43% (OR 1.43; 95% CI 1.01–2.01; P-value <0.05), while living in poverty-concentrated communities increased the risk of having a LBW infant twofold (OR 2.16; 95% CI 1.29–3.61; P-value <0.01). In neighbourhoods with a high coverage of safe water supply, the odds of having a LBW infant were reduced by 28% (OR 0.74; 95% CI 0.57–0.96; P-value <0.05). Conclusion This study showed contextual risk factors to have independent effects on the prevalence of LBW infants. Being a rural dweller, living in a community with a high concentration of poverty, and low coverage of safe water supply were found to increase the prevalence of LBW infants. Implementing appropriate community-based intervention programmes will likely reduce the occurrence of LBW infants. PMID:25360709
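
    The odds ratios and confidence intervals above follow mechanically from the fitted multilevel logistic coefficients; the sketch below shows that arithmetic, with betas and standard errors back-calculated to roughly match the reported figures rather than taken from the paper.

```python
# OR = exp(beta); 95% CI = exp(beta +/- 1.96 * SE). The coefficients and
# standard errors below are reverse-engineered approximations, not the
# paper's actual estimates.

from math import exp, log

effects = {
    "rural dweller":              (log(1.43), 0.17),
    "poverty-concentrated comm.": (log(2.16), 0.26),
    "high safe-water coverage":   (log(0.74), 0.13),
}

for name, (beta, se) in effects.items():
    or_, lo, hi = exp(beta), exp(beta - 1.96 * se), exp(beta + 1.96 * se)
    print(f"{name}: OR {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```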

  4. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  5. Stratospheric Aerosol and Gas Experiment, SAGE III on ISS, An Earth Science Mission on the International Space Station, Schedule Risk Analysis, A Project Perspective

    NASA Technical Reports Server (NTRS)

    Bonine, Lauren

    2015-01-01

    The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project, highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk-informed decision making.

  6. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated by combining GIS data on loads, system response, and consequences, using event tree modelling for the risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from the risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact of local action planning on risk reduction through flood defences and improved warning communication schemes: societal flood risk (in terms of annual expected affected population) would be reduced by up to 51 % by combining structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 % of the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis that are replicable at regional and national scale.
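
    As a minimal illustration of the event-tree calculation described above, the sketch below combines hypothetical load probabilities, system-response branches, and affected-population consequences into an annual expected affected population, with a crude warning factor standing in for the non-structural measures; none of the numbers are Oliva's.

```python
# Minimal event-tree sketch (loads x system response x consequences);
# all probabilities and population figures are invented for illustration.

# (annual load probability, {response branch: (branch prob, affected pop)})
events = [
    (0.10, {"defences hold": (0.95, 50),  "defences fail": (0.05, 1200)}),
    (0.01, {"defences hold": (0.80, 400), "defences fail": (0.20, 6000)}),
]

def expected_affected(events, warning_factor=1.0):
    """Annual expected affected population; warning_factor < 1 models an
    improved warning scheme that reduces the exposed population."""
    total = 0.0
    for p_load, branches in events:
        for p_branch, affected in branches.values():
            total += p_load * p_branch * affected * warning_factor
    return total

base = expected_affected(events)
improved = expected_affected(events, warning_factor=0.49)  # ~51 % reduction
print(f"baseline: {base:.1f} persons/yr, with measures: {improved:.1f}")
```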

  7. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  8. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing if a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome. For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or

  9. DMAICR in an ergonomic risks analysis.

    PubMed

    Santos, E F; Lima, C R C

    2012-01-01

    The DMAICR problem-solving methodology is used throughout this paper to show how to implement ergonomics recommendations. The DMAICR method consists of the following six steps for solving ergonomic design problems. In step D, the project or situation to be assessed and its guiding objectives, known as the demand, are defined. Step M relates to the work, tasks, and organizational protocols, and includes the need for measurement. Step A is the analysis itself. Step I is the moment of improving, or implementing improvements. In step C, control, prevention of prospective troublesome situations and implementation of management controls keep the situation under control. R is report. Some relevant technical and conceptual aspects for the comparison of these methodologies are illustrated in this paper. The steps of DMAICR were carried out by a multifunctional (multi-professional and multi-disciplinary) team, termed the focus group, composed of selected members of the company and supported by experts in ergonomics.

  10. Human-centered risk management for medical devices - new methods and tools.

    PubMed

    Janß, Armin; Plogmann, Simon; Radermacher, Klaus

    2016-04-01

    Studies of adverse events with technical devices in the medical context have shown that in most cases non-usable interfaces are the cause of use deficiencies and therefore of potential harm to the patient and third parties. This is partially due to the lack of suitable methods for interlinking usability engineering and human-centered risk management. Especially regarding the early identification of human-induced errors and the systematic control of these failures, medical device manufacturers, and in particular developers, have to be supported in order to guarantee reliable design and error-tolerant human-machine interfaces (HMI). In this context, we developed the HiFEM methodology and a corresponding software tool (mAIXuse) for model-based human risk analysis. Based on a two-fold approach, HiFEM provides a task-type-sensitive modeling structure with integrated temporal relations in order to represent and analyze the use process in a detailed way. The approach can be used from early developmental stages up to the validation process. Results of a comparative study with the HiFEM method and a classical process failure mode and effect analysis (FMEA) show that the new modeling and analysis technique clearly outperforms the FMEA. In addition, we implemented a new method for systematic human risk control (mAIXcontrol). Accessing information from the method's knowledge base enables the operator to detect the most suitable countermeasures for a respective risk. Forty-one approved generic countermeasure principles have been indexed in a matrix as combinations of root causes and failures. The methodology has been tested in comparison to a conventional approach as well. Evaluation of the matrix and reassessment of the risk priority numbers by a blind expert demonstrate a substantial benefit of the new mAIXcontrol method.

  11. Perceived Risks Associated with Contraceptive Method Use among Men and Women in Ibadan and Kaduna, Nigeria.

    PubMed

    Schwandt, Hilary M; Skinner, Joanna; Hebert, Luciana E; Saad, Abdulmumin

    2015-12-01

    Research shows that side effects are often the most common reason for contraceptive non-use in Nigeria; however, research to date has not explored the underlying factors that influence risk and benefit perceptions associated with specific contraceptive methods in Nigeria. A qualitative study design using focus group discussions was used to explore social attitudes and beliefs about family planning methods in Ibadan and Kaduna, Nigeria. A total of 26 focus group discussions were held in 2010 with men and women of reproductive age, disaggregated by city, sex, age, marital status, neighborhood socioeconomic status, and--for women only--family planning experience. A discussion guide was used that included specific questions about the perceived risks and benefits associated with the use of six different family planning methods. A thematic content analytic approach guided the analysis. Participants identified a spectrum of risks encompassing perceived threats to health (both real and fictitious) and social concerns, as well as benefits associated with each method. By exploring Nigerian perspectives on the risks and benefits associated with specific family planning methods, programs aiming to increase contraceptive use in Nigeria can be better equipped to highlight recognized benefits, address specific concerns, and work to dispel misperceptions associated with each family planning method. PMID:27337851

  12. Pesticide residues in cashew apple, guava, kaki and peach: GC-μECD, GC-FPD and LC-MS/MS multiresidue method validation, analysis and cumulative acute risk assessment.

    PubMed

    Jardim, Andréia Nunes Oliveira; Mello, Denise Carvalho; Goes, Fernanda Caroline Silva; Frota Junior, Elcio Ferreira; Caldas, Eloisa Dutra

    2014-12-01

    A multiresidue method for the determination of 46 pesticides in fruits was validated. Samples were extracted with acidified ethyl acetate, MgSO4 and CH3COONa and cleaned up by dispersive SPE with PSA. The compounds were analysed by GC-FPD, GC-μECD or LC-MS/MS, with LOQs from 1 to 8 μg/kg. The method was used to analyse 238 kaki, cashew apple, guava, and peach fruit and pulp samples, which were also analysed for dithiocarbamates (DTCs) using a spectrophotometric method. Over 70% of the samples were positive, with DTC present in 46.5%, λ-cyhalothrin in 37.1%, and omethoate in 21.8% of the positive samples. GC-MS/MS confirmed the identities of the compounds detected by GC. None of the pesticides found in kaki, cashew apple and guava was authorised for these crops in Brazil. The risk assessment has shown that the cumulative acute intake of organophosphorus or pyrethroid compounds from the consumption of these fruits is unlikely to pose a health risk to consumers. PMID:24996324

  14. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties in the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  15. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy

    2016-04-01

    This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damage, using a physical model of solid-state lasers and predicted levels of ionizing radiation and space weather. The following main subjects are covered: (a) a solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented, with consideration of deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. The method is based on the Conditional Value-at-Risk measure (CVaR), i.e., the expected loss exceeding the Value-at-Risk (VaR). We propose a new dynamical-information approach for assessing the risk of radiation damage to laser elements by cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels, the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of space ionizing radiation influence on laser elements were developed for risk assessment in laser safety analysis. These are so-called 'black box' or 'input-output' models, which seek only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on laser systems and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of
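
    The CVaR and VaR measures named above have a standard Monte Carlo form; the sketch below computes both on a simulated lognormal loss distribution, which stands in for the report's laser-damage loss model.

```python
# VaR_a is the a-quantile of the loss distribution; CVaR_a is the mean
# loss beyond VaR_a. The lognormal losses here are a stand-in, not the
# report's actual damage model.

import numpy as np

rng = np.random.default_rng(0)
losses = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

alpha = 0.95
var = np.quantile(losses, alpha)        # Value-at-Risk
cvar = losses[losses >= var].mean()     # expected loss exceeding VaR
print(f"VaR_{alpha:.2f} = {var:.3f}, CVaR_{alpha:.2f} = {cvar:.3f}")
```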

  16. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  17. Analysis of Radiation Pneumonitis Risk Using a Generalized Lyman Model

    SciTech Connect

    Tucker, Susan L. Liu, H. Helen; Liao Zhongxing; Wei Xiong; Wang Shulian; Jin Hekun; Komaki, Ritsuko; Martel, Mary K.; Mohan, Radhe

    2008-10-01

    Purpose: To introduce a version of the Lyman normal-tissue complication probability (NTCP) model adapted to incorporate censored time-to-toxicity data and clinical risk factors and to apply the generalized model to analysis of radiation pneumonitis (RP) risk. Methods and Materials: Medical records and radiation treatment plans were reviewed retrospectively for 576 patients with non-small cell lung cancer treated with radiotherapy. The time to severe (Grade ≥3) RP was computed, with event times censored at last follow-up for patients not experiencing this endpoint. The censored time-to-toxicity data were analyzed using the standard and generalized Lyman models with patient smoking status taken into account. Results: The generalized Lyman model with patient smoking status taken into account produced NTCP estimates up to 27 percentage points different from the model based on dose-volume factors alone. The generalized model also predicted that 8% of the expected cases of severe RP were unobserved because of censoring. The estimated volume parameter for lung was not significantly different from n = 1, corresponding to mean lung dose. Conclusions: NTCP models historically have been based solely on dose-volume effects and binary (yes/no) toxicity data. Our results demonstrate that inclusion of nondosimetric risk factors and censored time-to-event data can markedly affect outcome predictions made using NTCP models.
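
    For reference, the standard (non-generalized) Lyman-Kutcher-Burman NTCP calculation underlying this analysis can be written in a few lines; the DVH, TD50, and m values below are illustrative, while n = 1 matches the paper's finding that the generalized EUD reduces to mean lung dose.

```python
# Standard LKB NTCP sketch; the paper's censoring and smoking-status
# extensions are not shown. Parameter values are illustrative.

from math import erf, sqrt

def gEUD(dose_bins, vol_fracs, n):
    """Generalized equivalent uniform dose from a DVH; n is the volume
    parameter (n = 1 reduces gEUD to the mean dose)."""
    return sum(v * d ** (1.0 / n) for d, v in zip(dose_bins, vol_fracs)) ** n

def lyman_ntcp(dose_bins, vol_fracs, TD50, m, n):
    t = (gEUD(dose_bins, vol_fracs, n) - TD50) / (m * TD50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # standard normal CDF

# Hypothetical three-bin lung DVH (doses in Gy, volume fractions sum to 1)
doses, vols = [5.0, 20.0, 40.0], [0.5, 0.3, 0.2]
print(f"NTCP = {lyman_ntcp(doses, vols, TD50=30.0, m=0.35, n=1.0):.3f}")
```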

  18. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that a combination of two convergent linear multistep methods or Runge-Kutta methods produces a new convergent method whose order equals the smaller of the two original methods' orders.
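
    The result is easy to verify numerically; in the sketch below (our test problem, not the paper's), alternating explicit Euler (order 1) with classical RK4 (order 4) on y' = -y gives errors that halve as the step size halves, i.e., the combined method has order 1, the smaller of the two.

```python
from math import exp

def euler(f, t, y, h):
    return y + h * f(t, y)

def rk4(f, t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def combined(f, y0, t_end, steps):
    """Alternate Euler and RK4 steps over [0, t_end]."""
    h, y, t = t_end / steps, y0, 0.0
    for i in range(steps):
        y = euler(f, t, y, h) if i % 2 == 0 else rk4(f, t, y, h)
        t += h
    return y

f = lambda t, y: -y
exact = exp(-1.0)
prev = None
for steps in (100, 200, 400, 800):
    err = abs(combined(f, 1.0, 1.0, steps) - exact)
    ratio = "" if prev is None else f"  halving ratio ~ {prev / err:.2f}"
    print(f"N={steps:4d}  error={err:.2e}{ratio}")
    prev = err
# The error halves as h halves (ratio ~ 2), confirming overall order 1.
```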

  19. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree ), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal ) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree ) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID

  20. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr
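
    For equilibrium passive samplers, the core Cfree estimate is a one-line partitioning calculation; the sketch below shows that common formulation (without the non-equilibrium corrections such studies also address), with all values hypothetical.

```python
# At equilibrium, Cfree = (mass in polymer) / (polymer volume * Kpw),
# where Kpw is the polymer-water partition coefficient. All numbers
# below are hypothetical.

n_sampler_ng = 250.0        # analyte mass accumulated in the polymer, ng
v_polymer_ml = 0.5          # polymer volume, mL
log_kpw = 4.2               # log10 polymer-water partition coefficient (L/L)

c_polymer = n_sampler_ng / v_polymer_ml     # ng/mL in the polymer
c_free = c_polymer / 10 ** log_kpw          # ng/mL freely dissolved
print(f"Cfree ~ {c_free * 1000:.3f} ng/L")
```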

  2. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…

  3. Convex geometry analysis method of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gong, Yanjun; Wang, XiChang; Qi, Hongxing; Yu, BingXi

    2003-06-01

    We present a matrix expression of the convex geometry analysis method for hyperspectral data using the linear mixing model, and establish a mathematical model of endmembers. A 30-band remote sensing image is used to test the model. The results of the analysis reveal that the method can address mixed-pixel questions. Targets smaller than a single ground pixel can be identified by applying the method.
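
    Under the linear mixing model, sub-pixel abundances can be recovered by nonnegativity-constrained least squares; the sketch below demonstrates this on random stand-in endmember spectra for a 30-band sensor, echoing but not reproducing the paper's image analysis.

```python
# Linear mixing model sketch: pixel = E @ abundances + noise, solved with
# nonnegative least squares. Endmember spectra here are random stand-ins.

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(5)
bands, n_end = 30, 3                      # 30-band sensor, 3 endmembers
E = rng.random((bands, n_end))            # endmember signature matrix
true_abund = np.array([0.6, 0.3, 0.1])    # sub-pixel abundances (sum to 1)
pixel = E @ true_abund + rng.normal(0, 0.005, bands)   # mixed pixel + noise

abund, residual = nnls(E, pixel)          # nonnegativity-constrained fit
abund /= abund.sum()                      # impose sum-to-one convexity
print(np.round(abund, 3))                 # ~ [0.6, 0.3, 0.1]
```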

  4. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets, and they need very advanced methods to measure and/or model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management, such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management; as such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing, such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary, this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or, as one of our consultants said: the fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
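
    Two of the valuation methods named here are easy to juxtapose; the sketch below prices the same hypothetical European call with the closed-form Black-Scholes formula and with Monte Carlo simulation under the same lognormal model.

```python
from math import exp, log, sqrt, erf
import random

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

def mc_call(S, K, T, r, sigma, n=200_000, seed=1):
    """Monte Carlo price under the same risk-neutral lognormal dynamics."""
    rng = random.Random(seed)
    payoff_sum = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        ST = S * exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * z)
        payoff_sum += max(ST - K, 0.0)
    return exp(-r * T) * payoff_sum / n

# Hypothetical contract parameters
S, K, T, r, sigma = 100.0, 105.0, 1.0, 0.03, 0.2
print(f"Black-Scholes: {black_scholes_call(S, K, T, r, sigma):.4f}")
print(f"Monte Carlo : {mc_call(S, K, T, r, sigma):.4f}")
```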

  5. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are also commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of the risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets, and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction.
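
    The combination of fault-tree risk quantification with CEA can be miniaturized as follows; the two-level source-to-tap tree, measure costs, and probabilities below are invented for illustration and do not reproduce the paper's model.

```python
# Toy source-to-tap fault tree plus cost-effectiveness ranking of
# risk-reduction measures. Structure and numbers are hypothetical.

def or_gate(*p):            # failure of any independent subsystem
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def system_failure(p_source, p_treatment, p_network):
    # a failure anywhere from source to tap propagates to the top event
    return or_gate(p_source, p_treatment, p_network)

base = dict(p_source=0.02, p_treatment=0.01, p_network=0.015)
p0 = system_failure(**base)

# measure: (annual cost, modified subsystem probabilities)
measures = {
    "extra treatment barrier": (120_000, {**base, "p_treatment": 0.002}),
    "network renovation":      (300_000, {**base, "p_network": 0.005}),
}

for name, (cost, probs) in measures.items():
    reduction = p0 - system_failure(**probs)
    print(f"{name}: risk -{reduction:.4f}/yr, "
          f"CER = {cost / reduction:,.0f} per unit risk reduced")
```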

  6. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed. It is concluded that these models can and should be improved, and a new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.

  7. Risk analysis of Finnish peacekeeping in Kosovo.

    PubMed

    Lehtomäki, Kyösti; Pääkkönen, Rauno J; Rantanen, Jorma

    2005-04-01

    The research team interviewed over 90 Finnish battalion members in Kosovo, visited 22 units or posts, registered its observations, and made any necessary measurements. Key persons were asked to list the most important risks for occupational safety and health in their area of responsibility. Altogether, 106 accidents and 40 cases of disease resulted in compensation claims in 2000. The risks to the peacekeeping force were about twice those of the permanent staff of military trainees in Finland. Altogether, 21 accidents or cases of disease resulted in sick leave for at least 3 months after service. One permanent injury resulted from an explosion. Biological, chemical, and physical factors caused 8 to 9 occupational illnesses each. Traffic accidents, operational factors, and munitions and mines were evaluated to be the three most important risk factors, followed by occupational hygiene, living conditions (mold, fungi, dust), and general hygiene. Possible fatal risks, such as traffic accidents and munitions and explosives, received a high ranking in both the subjective and the objective evaluations. One permanent injury resulted from an explosion, and two traffic accidents involved a fatality, although not of a peacekeeper. The reduction of sports and military training accidents, risk-control programs, and, for some tasks, better personal protection is considered a development challenge for the near future. PMID:15876212

  8. Balancing Precision and Risk: Should Multiple Detection Methods Be Analyzed Separately in N-Mixture Models?

    PubMed Central

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against

  9. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    USGS Publications Warehouse

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against

  10. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    PubMed

    Graves, Tabitha A; Royle, J Andrew; Kendall, Katherine C; Beier, Paul; Stetz, Jeffrey B; Macleod, Amy C

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against
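
    The basic single-method N-mixture likelihood behind these analyses marginalizes the latent site abundances; the sketch below fits it to simulated data by numerical optimization, omitting covariates and the dual-method structure, and does not use the grizzly bear dataset.

```python
# Single-method N-mixture sketch: N_i ~ Poisson(lambda),
# y_ij ~ Binomial(N_i, p); the likelihood sums the latent N up to a
# truncation bound K. Simulated data only.

import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(42)
true_lam, true_p, n_sites, n_visits, K = 4.0, 0.4, 150, 3, 60
N = rng.poisson(true_lam, n_sites)
y = rng.binomial(N[:, None], true_p, (n_sites, n_visits))

def neg_log_lik(theta):
    lam = np.exp(theta[0])                    # keep lambda > 0
    p = 1 / (1 + np.exp(-theta[1]))           # keep p in (0, 1)
    Ns = np.arange(K + 1)
    prior = stats.poisson.pmf(Ns, lam)        # P(N = k)
    ll = 0.0
    for i in range(n_sites):
        # P(y_i | N = k) for all k: product over repeat visits
        lik_k = np.prod(stats.binom.pmf(y[i][:, None], Ns[None, :], p), axis=0)
        ll += np.log(np.sum(lik_k * prior) + 1e-300)
    return -ll

res = optimize.minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
lam_hat = np.exp(res.x[0]); p_hat = 1 / (1 + np.exp(-res.x[1]))
print(f"lambda_hat = {lam_hat:.2f} (true 4.0), p_hat = {p_hat:.2f} (true 0.4)")
```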

  11. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  12. Cost Analysis: Methods and Realities.

    ERIC Educational Resources Information Center

    Cummings, Martin M.

    1989-01-01

    Argues that librarians need to be concerned with cost analysis of library functions and services because, in the allocation of resources, decision makers will favor library managers who demonstrate understanding of the relationships between costs and productive outputs. Factors that should be included in a reliable scheme for cost accounting are…

  13. Oil shale health and environmental risk analysis

    SciTech Connect

    Gratt, L.B.

    1983-04-01

    The potential human health and environmental risks of a hypothetical one-million-barrels-per-day oil shale industry have been analyzed to serve as an aid in the formulation and management of a program of environmental research. The largest uncertainties in expected fatalities are in the public sector, from air pollutants, although the occupational sector is estimated to have 60% more expected fatalities than the public sector. Occupational safety and illness have been analyzed for the oil shale fuel cycle from extraction to delivery of products for end use. Pneumoconiosis from the dust environment is the worker disease resulting in the greatest number of fatalities, followed by chronic bronchitis, internal cancer, and skin cancers, respectively. Research recommendations are presented for reducing the uncertainties in the risks analyzed and for filling data gaps to estimate other risks.

  14. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  15. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  16. A statistical method for studying correlated rare events and their risk factors

    PubMed Central

    Xue, Xiaonan; Kim, Mimi Y; Wang, Tao; Kuniholm, Mark H; Strickler, Howard D

    2016-01-01

    Longitudinal studies of rare events such as cervical high-grade lesions or colorectal polyps that can recur often involve correlated binary data. Risk factors for these events cannot be reliably examined using conventional statistical methods. For example, logistic regression models that incorporate generalized estimating equations often fail to converge or provide inaccurate results when analyzing data of this type. Although exact methods have been reported, they are complex and computationally difficult. The current paper proposes a mathematically straightforward and easy-to-use two-step approach involving (i) an additive model to measure associations between a rare or uncommon correlated binary event and potential risk factors and (ii) a permutation test to estimate the statistical significance of these associations. Simulation studies showed that the proposed method reliably tests and accurately estimates the associations of exposure with correlated binary rare events. This method was then applied to a longitudinal study of human leukocyte antigen (HLA) genotype and risk of cervical high grade squamous intraepithelial lesions (HSIL) among HIV-infected and HIV-uninfected women. Results showed statistically significant associations of two HLA alleles among HIV-negative but not HIV-positive women, suggesting that immune status may modify the HLA and cervical HSIL association. Overall, the proposed method avoids model nonconvergence problems and provides a computationally simple, accurate, and powerful approach for the analysis of risk factor associations with rare/uncommon correlated binary events. PMID:25854937
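
    The two-step logic (an additive subject-level association measure plus a permutation test) can be sketched on simulated correlated rare events as below; the paper's exact additive model is not reproduced.

```python
# Step (i): a simple difference in per-visit event rates between exposed
# and unexposed subjects, computed at the subject level so within-subject
# correlation is preserved. Step (ii): a permutation test that shuffles
# the subject-level exposure labels. Simulated data only.

import numpy as np

rng = np.random.default_rng(7)
n_subj, n_visits = 200, 5
exposed = rng.integers(0, 2, n_subj).astype(bool)
# rare correlated events: subject-specific risk, higher when exposed
subj_risk = np.where(exposed, 0.06, 0.02) * rng.gamma(2.0, 0.5, n_subj)
events = rng.random((n_subj, n_visits)) < subj_risk[:, None]

subj_rate = events.mean(axis=1)                 # one number per subject

def stat(labels):
    return subj_rate[labels].mean() - subj_rate[~labels].mean()

obs = stat(exposed)
null = np.array([stat(rng.permutation(exposed)) for _ in range(10_000)])
p_value = (np.sum(np.abs(null) >= abs(obs)) + 1) / (len(null) + 1)
print(f"rate difference = {obs:.4f}, permutation p = {p_value:.4f}")
```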

  17. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design of an expert-system computer program intended for use as a project-management tool has been announced, along with the development of a prototype of the program. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, somewhat conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  18. An ecological risk assessment method for species exposed to contaminant mixtures

    SciTech Connect

    Logan, D.T.; Wilson, H.T. )

    1995-02-01

    The method developed here provides a quantitative, objective measure of ecological risk for natural populations exposed to mixtures of chemical contaminants. It is founded on generally accepted risk assessment concepts: use of toxic units to assess the joint toxic effects of mixtures, and expression of ecological risk as a relationship between toxicological end points and estimated environmental concentrations. Toxicological end points may be regulatory levels with zero variance or species-dependent concentrations with estimates of variance. Risk is the probability that a linear combination of toxic units exceeds 1, which expresses the probability that a measurement end point will occur. The computations have three variations. One addresses concentration addition, in which chemicals act independently to produce similar biological effects. For noninteractive joint action with no addition, in which the biological response to the mixture is not significantly different from the response to the most toxic component, the method reduces to an analysis of extrapolation error. For other noninteractive joint action--antagonism, partial addition, and supra-addition--a correction factor similar to Konemann's mixture toxicity index is applied. An initial validation using published data indicated that increased in situ striped bass mortality was generally associated with elevated risk estimates. The method is applicable to many organisms and toxicant mixtures.
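
    For the concentration-addition case, the risk expression P(sum of toxic units > 1) lends itself to Monte Carlo estimation; the sketch below uses hypothetical lognormal distributions for both the environmental concentrations and the toxicological endpoints.

```python
# Toxic-unit mixture risk sketch (concentration addition): TU_i = C_i / EC_i
# and risk = P(sum_i TU_i > 1). Distribution parameters are hypothetical.

import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# (median C, geometric SD of C, median EC, geometric SD of EC), e.g. ug/L
chemicals = {
    "chem A": (2.0, 1.8, 12.0, 1.5),
    "chem B": (0.5, 2.0,  4.0, 1.4),
    "chem C": (1.0, 1.6, 20.0, 1.6),
}

tu_sum = np.zeros(n)
for c_med, c_gsd, ec_med, ec_gsd in chemicals.values():
    C = rng.lognormal(np.log(c_med), np.log(c_gsd), n)    # exposure
    EC = rng.lognormal(np.log(ec_med), np.log(ec_gsd), n) # endpoint
    tu_sum += C / EC

risk = np.mean(tu_sum > 1.0)   # P(mixture toxic units exceed 1)
print(f"P(sum TU > 1) = {risk:.3f}")
```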

  18. Hybrid methods for cybersecurity analysis

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  20. Analysis of driver casualty risk for different work zone types.

    PubMed

    Weng, Jinxian; Meng, Qiang

    2011-09-01

    Using driver casualty data from the Fatality Analysis Reporting System, this study examines driver casualty risk and investigates the risk contributing factors in construction, maintenance and utility work zones. The results of multiple t-tests show that driver casualty risk differs statistically depending on the work zone type. Moreover, construction work zones have the largest driver casualty risk, followed by maintenance and utility work zones. Three separate logistic regression models are developed to predict driver casualty risk for the three work zone types because of their unique features. Finally, the effects of risk factors on driver casualty risk for each work zone type are examined and compared. For all three work zone types, five significant risk factors (road alignment, truck involvement, most harmful event, vehicle age and notification time) are associated with increased driver casualty risk, while traffic control devices and restraint use are associated with reduced driver casualty risk. However, a notable finding is that three risk factors (light condition, gender and day of week) exhibit opposing effects on driver casualty risk in different types of work zones. This may largely be due to differences in work zone features and driver behavior across work zone types.
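
    A minimal sketch of one such model: a logistic regression of a casualty outcome on a few of the named factors, with coefficients read as odds ratios. The data and variable names are simulated stand-ins, not FARS records:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(3)
      n = 5_000
      # Simulated binary predictors: curved alignment, truck involved, restraint used
      X = rng.integers(0, 2, size=(n, 3)).astype(float)
      logit = -2.0 + 0.6 * X[:, 0] + 0.8 * X[:, 1] - 1.1 * X[:, 2]
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = driver casualty

      model = LogisticRegression().fit(X, y)
      for name, b in zip(["curved_alignment", "truck_involved", "restraint_used"],
                         model.coef_[0]):
          print(f"{name}: OR = {np.exp(b):.2f}")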

  1. HVAC fault tree analysis for WIPP integrated risk assessment

    SciTech Connect

    Kirby, P.; Iacovino, J.

    1990-01-01

    In order to evaluate the public health risk from operation of the Waste Isolation Pilot Plant (WIPP) due to potential radioactive releases, a probabilistic risk assessment of waste handling operations was conducted. One major aspect of this risk assessment involved fault tree analysis of the plant heating, ventilation, and air conditioning (HVAC) systems, which comprise the final barrier between waste handling operations and the environment. 1 refs., 1 tab.
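
    The abstract gives no tree details, but a fault tree of this kind reduces to AND/OR gate algebra over basic-event probabilities. A minimal sketch assuming independent basic events, with hypothetical HVAC failure probabilities:

      from math import prod

      def or_gate(probs):   # P(at least one occurs) for independent events
          return 1.0 - prod(1.0 - p for p in probs)

      def and_gate(probs):  # P(all occur) for independent events
          return prod(probs)

      # Hypothetical basic events: a release path opens if both redundant
      # filter banks fail AND ventilation is lost (fan, damper or power).
      filtration_fail = and_gate([1e-2, 1e-2])
      ventilation_loss = or_gate([1e-3, 5e-4, 2e-4])
      top_event = and_gate([filtration_fail, ventilation_loss])
      print(f"P(top event) = {top_event:.2e}")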

  2. Metabolic syndrome risk factors and dry eye syndrome: a Meta-analysis

    PubMed Central

    Tang, Ye-Lei; Cheng, Ya-Lan; Ren, Yu-Ping; Yu, Xiao-Ning; Shentu, Xing-Chao

    2016-01-01

    AIM To explore the relationship between metabolic risk factors and dry eye syndrome (DES). METHODS Studies on the association of metabolic syndrome risk factors (hypertension, hyperglycemia, obesity, and hyperlipidemia) with DES were retrieved from PubMed, Web of Science, and the Cochrane Library in December 2015. Odds ratios (ORs) with 95% confidence intervals (CIs) were pooled to evaluate the final relationship. Subgroup analyses were conducted according to the diagnostic criteria of DES. RESULTS Nine cross-sectional studies and three case-control studies were included in this Meta-analysis. The pooled results showed that people with hypertension, hyperglycemia, and hyperlipidemia had a higher risk of suffering from DES (P<0.05), especially the typical DES symptoms. On the other hand, obesity did not increase the risk of DES. CONCLUSION The present Meta-analysis suggests that all metabolic risk factors except obesity were risk factors for DES. PMID:27500114
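
    The abstract does not state the pooling model, but a common fixed-effects approach recovers each study's standard error from its reported 95% CI and pools log odds ratios by inverse variance. A sketch with hypothetical study values:

      import numpy as np

      def pool_fixed_effects(ors, lo, hi):
          """Inverse-variance fixed-effects pooling of study odds ratios;
          se is recovered from the 95% CI as (ln(hi) - ln(lo)) / (2 * 1.96)."""
          log_or = np.log(ors)
          se = (np.log(hi) - np.log(lo)) / (2 * 1.96)
          w = 1.0 / se**2
          pooled = np.sum(w * log_or) / np.sum(w)
          pooled_se = np.sqrt(1.0 / np.sum(w))
          return (np.exp(pooled),
                  np.exp(pooled - 1.96 * pooled_se),
                  np.exp(pooled + 1.96 * pooled_se))

      # Hypothetical studies of one risk factor (e.g. hypertension vs. DES):
      or_hat, ci_lo, ci_hi = pool_fixed_effects(
          ors=np.array([1.4, 1.2, 1.7]),
          lo=np.array([1.1, 0.9, 1.2]),
          hi=np.array([1.8, 1.6, 2.4]))
      print(f"pooled OR = {or_hat:.2f} (95% CI {ci_lo:.2f}-{ci_hi:.2f})")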

  3. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

    The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  4. Uncertainties in Cancer Risk Coefficients for Environmental Exposure to Radionuclides. An Uncertainty Analysis for Risk Coefficients Reported in Federal Guidance Report No. 13

    SciTech Connect

    Pawel, David; Leggett, Richard Wayne; Eckerman, Keith F; Nelson, Christopher

    2007-01-01

    Federal Guidance Report No. 13 (FGR 13) provides risk coefficients for estimation of the risk of cancer due to low-level exposure to each of more than 800 radionuclides. Uncertainties in risk coefficients were quantified in FGR 13 for 33 cases (exposure to each of 11 radionuclides by each of three exposure pathways) on the basis of sensitivity analyses in which various combinations of plausible biokinetic, dosimetric, and radiation risk models were used to generate alternative risk coefficients. The present report updates the uncertainty analysis in FGR 13 for the cases of inhalation and ingestion of radionuclides and expands the analysis to all radionuclides addressed in that report. The analysis indicates that most risk coefficients for inhalation or ingestion of radionuclides are determined within a factor of 5 or less by current information. That is, application of alternate plausible biokinetic and dosimetric models and radiation risk models (based on the linear, no-threshold hypothesis with an adjustment for the dose and dose rate effectiveness factor) is unlikely to change these coefficients by more than a factor of 5. In this analysis the assessed uncertainty in the radiation risk model was found to be the main determinant of the uncertainty category for most risk coefficients, but conclusions concerning the relative contributions of risk and dose models to the total uncertainty in a risk coefficient may depend strongly on the method of assessing uncertainties in the risk model.

  5. [The role of a specialised risk analysis group in the Veterinary Services of a developing country].

    PubMed

    Urbina-Amarís, M E

    2003-08-01

    Since the World Trade Organization (WTO) Agreement on the Application of Sanitary and Phytosanitary Measures was established, risk analysis in trade, and ultimately in Veterinary and Animal Health Services, has become strategically important. Irrespective of how it is conceived (as a discipline, approach, method, or process), every risk analysis in trade involves four periods or phases: risk identification, risk assessment, risk management, and risk information or communication. All veterinarians involved in a risk analysis unit must have in-depth knowledge of statistics and the epidemiology of transmissible diseases, as well as a basic knowledge of veterinary science, economics, mathematics, data processing and social communication, to enable them to work with professionals in these disciplines. Many developing countries do not have enough well-qualified professionals in these areas to support a risk analysis unit. This will need to be rectified by seeking strategic alliances with other public or private sectors that will provide the required support to run the unit properly. Due to the special nature of its risk analysis functions, its role in supporting decision-making, and the criteria of independence and transparency that are so crucial to its operations, the risk analysis unit should be positioned hierarchically close to the top management of the Veterinary Service. Due to the shortage of personnel in developing countries with the required training and scientific and technical qualifications, countries with organisations responsible for both animal and plant health protection would be advised to set up integrated plant and animal risk analysis units. In addition, these units could take charge of all activities relating to WTO agreements and regional agreements on animal and plant health management. PMID:15884594

  7. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  8. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  9. Dairy consumption and lung cancer risk: a meta-analysis of prospective cohort studies

    PubMed Central

    Yu, Yi; Li, Hui; Xu, Kaiwu; Li, Xin; Hu, Chunlin; Wei, Hongyan; Zeng, Xiaoyun; Jing, Xiaoli

    2016-01-01

    Background Lung cancer is the leading cause of cancer-related deaths worldwide. We conducted a meta-analysis to evaluate the relationship between dairy consumption and lung cancer risk. Methods The databases included EMBASE, Medline (PubMed), and Web of Science. The relationship between dairy consumption and lung cancer risk was analyzed by relative risk or odds ratio estimates with 95% confidence intervals (CIs). We identified eight prospective cohort studies, which amounted to 10,344 cases and 61,901 participants. Results For milk intake, the relative risk was 0.95 (95% CI: 0.76–1.15); heterogeneity was 70.2% (P=0.003). For total dairy product intake, the relative risk was 0.96 (95% CI: 0.89–1.03); heterogeneity was 68.4% (P=0.004). Conclusion There was no significant association between dairy consumption and lung cancer risk. PMID:26766916

  10. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess the combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.
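
    A toy sketch of the path-finding step: a directed influence graph is searched breadth-first from a hazard source, and every path reaching a vulnerable entity is reported as a hazard-vulnerability pair with its propagation path. All node names and edges are hypothetical:

      from collections import deque

      influences = {                      # edge: source -> entities it can influence
          "power_bus_short": ["avionics_rack", "pump_controller"],
          "pump_controller": ["coolant_loop"],
          "avionics_rack": ["nav_computer"],
          "coolant_loop": ["nav_computer"],
      }
      vulnerable = {"nav_computer", "coolant_loop"}

      def propagation_paths(source):
          """Return every cycle-free path from a hazard source to a vulnerable entity."""
          paths, queue = [], deque([[source]])
          while queue:
              path = queue.popleft()
              for nxt in influences.get(path[-1], []):
                  if nxt in path:          # avoid cycles
                      continue
                  if nxt in vulnerable:
                      paths.append(path + [nxt])
                  queue.append(path + [nxt])
          return paths

      for p in propagation_paths("power_bus_short"):
          print(" -> ".join(p))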

  11. Bisphosphonates and Risk of Cardiovascular Events: A Meta-Analysis

    PubMed Central

    Kim, Dae Hyun; Rogers, James R.; Fulchino, Lisa A.; Kim, Caroline A.; Solomon, Daniel H.; Kim, Seoyoung C.

    2015-01-01

    Background and Objectives Some evidence suggests that bisphosphonates may reduce atherosclerosis, while concerns have been raised about atrial fibrillation. We conducted a meta-analysis to determine the effects of bisphosphonates on total adverse cardiovascular (CV) events, atrial fibrillation, myocardial infarction (MI), stroke, and CV death in adults with or at risk for low bone mass. Methods A systematic search of MEDLINE and EMBASE through July 2014 identified 58 randomized controlled trials of more than 6 months' duration that reported CV events. Absolute risks and Mantel-Haenszel fixed-effects odds ratios (ORs) and 95% confidence intervals (CIs) of total CV events, atrial fibrillation, MI, stroke, and CV death were estimated. Subgroup analyses by follow-up duration, population characteristics, bisphosphonate types, and route were performed. Results Absolute risks over 25–36 months in bisphosphonate-treated versus control patients were 6.5% versus 6.2% for total CV events; 1.4% versus 1.5% for atrial fibrillation; 1.0% versus 1.2% for MI; 1.6% versus 1.9% for stroke; and 1.5% versus 1.4% for CV death. Bisphosphonate treatment up to 36 months did not have any significant effects on total CV events (14 trials; ORs [95% CI]: 0.98 [0.84–1.14]; I2 = 0.0%), atrial fibrillation (41 trials; 1.08 [0.92–1.25]; I2 = 0.0%), MI (10 trials; 0.96 [0.69–1.34]; I2 = 0.0%), stroke (10 trials; 0.99 [0.82–1.19]; I2 = 5.8%), and CV death (14 trials; 0.88 [0.72–1.07]; I2 = 0.0%), with little between-study heterogeneity. The risk of atrial fibrillation appears to be modestly elevated for zoledronic acid (6 trials; 1.24 [0.96–1.61]; I2 = 0.0%) but not for oral bisphosphonates (26 trials; 1.02 [0.83–1.24]; I2 = 0.0%). The CV effects did not vary by subgroups or study quality. Conclusions Bisphosphonates do not have beneficial or harmful effects on atherosclerotic CV events, but zoledronic acid may modestly increase the risk of atrial fibrillation. Given the large
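
    The Mantel-Haenszel fixed-effects estimator named above combines the trials' 2x2 tables directly. A minimal sketch with hypothetical event counts:

      def mantel_haenszel_or(tables):
          """Mantel-Haenszel pooled OR; each table is
          (events_trt, n_trt, events_ctl, n_ctl)."""
          num = den = 0.0
          for a, n1, c, n0 in tables:
              b, d, n = n1 - a, n0 - c, n1 + n0
              num += a * d / n
              den += b * c / n
          return num / den

      # Hypothetical atrial-fibrillation counts from three trials:
      trials = [(12, 3000, 9, 3000), (25, 3800, 20, 3900), (4, 900, 5, 880)]
      print(f"OR(MH) = {mantel_haenszel_or(trials):.2f}")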

  12. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. An Urban Flood Simulation Model (UFSM) and an Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, reducing it by 15.59%. However, the flood risk was reduced by only 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures and provide crucial information for strategy development and planning adaptation. PMID:27527202
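
    A sketch of how an S-shaped return-period/damage (R-D) curve converts into risk as expected annual damage by integrating damage over the annual exceedance probability p = 1/T. The logistic curve and the 15% damage-reduction factor below are hypothetical stand-ins, not the study's fitted values:

      import numpy as np

      def damage(T, d_max=5e9, k=0.08, T0=66.0):
          """Hypothetical S-shaped R-D curve: damage saturates at d_max."""
          return d_max / (1.0 + np.exp(-k * (T - T0)))

      def expected_annual_damage(dmg, t_min=1.0, t_max=1000.0, steps=10_000):
          T = np.linspace(t_min, t_max, steps)
          p = 1.0 / T                       # annual exceedance probability
          d = dmg(T)
          # trapezoidal rule; p decreases as T grows, hence p[:-1] - p[1:]
          return float(np.sum(0.5 * (d[1:] + d[:-1]) * (p[:-1] - p[1:])))

      ead_without = expected_annual_damage(damage)
      ead_with = expected_annual_damage(lambda T: 0.85 * damage(T))
      print(f"risk reduction = {1 - ead_with / ead_without:.2%}")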

  15. Analysis to support allergen risk management: Which way to go?

    PubMed

    Cucu, Tatiana; Jacxsens, Liesbeth; De Meulenaer, Bruno

    2013-06-19

    Food allergy represents an important food safety issue because of its potentially lethal effects; the only effective treatment is the complete removal of the allergen involved from the diet. However, due to the growing complexity of food formulations and food processing, foods may be unintentionally contaminated via allergen-containing ingredients or cross-contamination. This affects not only consumers' well-being but also food producers and the competent authorities involved in inspecting and auditing food companies. To address these issues, the food industry and control agencies rely on available analytical methods to quantify the amount of a particular allergenic commodity in a food and thus to decide upon its safety. However, no "gold standard" methods exist for the quantitative detection of food allergens. Nowadays mostly receptor-based methods, and in particular commercial kits, are used in routine analysis. However, upon evaluation of their performance, commercial assays often proved to be unreliable in processed foods, which is attributed to chemical changes in proteins that affect molecular recognition by the receptor used. Unfortunately, the analytical outcomes of other methods, among them chromatographic techniques combined with mass spectrometry as well as DNA-based methods, seem to be affected in a comparable way by food processing. Several strategies can be employed to improve the quantitative analysis of allergens in foods. Nevertheless, issues related to extractability and matrix effects remain a permanent challenge. In view of the presented results, it is clear that the food industry needs to continue to make extra efforts to provide accurate labeling and to reduce contamination with allergens to an acceptable level through the use of allergen risk management at the company level, which inevitably needs to be supported by a tailor-validated extraction and detection method.

  16. American Airlines Propeller STOL Transport Economic Risk Analysis

    NASA Technical Reports Server (NTRS)

    Ransone, B.

    1972-01-01

    A Monte Carlo risk analysis of the economics of STOL transports in air passenger traffic established the probability of achieving the expected internal rate of financial return, or better, in a hypothetical regular Washington/New York intercity operation.
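
    A compact way to reproduce this kind of analysis: for a conventional cash-flow profile (one sign change), the internal rate of return meets or beats a hurdle rate exactly when the NPV at that rate is non-negative, so the probability can be estimated by Monte Carlo without solving for the IRR in each trial. All figures below are hypothetical:

      import numpy as np

      rng = np.random.default_rng(7)

      def npv(rate, cashflows):
          years = np.arange(len(cashflows))
          return np.sum(cashflows / (1.0 + rate) ** years)

      hurdle, n_trials, horizon = 0.10, 20_000, 10   # 10% expected return, 10 years
      hits = 0
      for _ in range(n_trials):
          revenue = rng.normal(25e6, 8e6, horizon)   # uncertain net revenue per year
          cashflows = np.concatenate(([-120e6], revenue))  # year-0 fleet investment
          hits += npv(hurdle, cashflows) >= 0        # IRR >= hurdle <=> NPV(hurdle) >= 0
      print(f"P(meeting expected return) = {hits / n_trials:.2%}")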

  17. DOE 2009 Geothermal Risk Analysis: Methodology and Results (Presentation)

    SciTech Connect

    Young, K. R.; Augustine, C.; Anderson, A.

    2010-02-01

    This presentation summarizes the methodology and results for a probabilistic risk analysis of research, development, and demonstration work-primarily for enhanced geothermal systems (EGS)-sponsored by the U.S. Department of Energy Geothermal Technologies Program.

  18. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  19. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  20. Comparison of methods trial for high-risk HPV.

    PubMed

    Kurtycz, Daniel F I; Smith, Michele; He, Rong; Miyazaki, Kayo; Shalkham, John

    2010-02-01

    In efforts to improve service, we compared the performance of four methods of HPV detection: Invader HPV (Hologic), Hybrid Capture 2 (Qiagen), Inform HPV detection (Ventana), and standard PCR. Using blinded/de-identified cervical samples in PreservCyt (Hologic), we compared Ventana's Inform HPV test against Hologic's HPV Invader and PCR. In a separate evaluation, we compared Inform versus Invader versus hc2. Ventana employs in situ hybridization; Hologic's technology uses three specifically designed oligonucleotides and a fluorescent signal for detection. Qiagen's hc2 method incorporates enzyme-linked antibody detection of RNA-DNA hybrids. PCR testing was provided by Access Genetics (Minneapolis, MN). The United States Food and Drug Administration recently approved the Third Wave/Hologic Invader HPV high-risk test (rebranded as the Cervista HPV HR Test). In this small study, involving a few hundred tests, the Third Wave, Qiagen, and PCR tests were comparable. Kappa statistics comparing Third Wave to PCR and Third Wave to Qiagen were 0.88 and 0.74, respectively. Ventana's method did not correlate well with any of the other methods, with kappas ranging from a low of 0.25 versus Qiagen to 0.31 versus PCR. Kappa statistics measure correlation, not accuracy of measurement. Although we felt that the specificity of our original HPV method, Ventana Inform, was satisfactory and lowered our subsequent colposcopy rate, worries about its lower sensitivity caused us to look at other techniques. The other methods, PCR, hc2, and Invader, appeared comparable with one another in our series. We chose to implement the Third Wave test in our laboratory.
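
    The kappa statistics quoted above measure chance-corrected agreement between paired test results, not accuracy. A minimal sketch of Cohen's kappa for binary HPV calls (the specimen results are hypothetical):

      import numpy as np

      def cohens_kappa(x, y):
          x, y = np.asarray(x), np.asarray(y)
          po = np.mean(x == y)                            # observed agreement
          pe = (np.mean(x) * np.mean(y)                   # chance agreement:
                + (1 - np.mean(x)) * (1 - np.mean(y)))    # both +, both -
          return (po - pe) / (1 - pe)

      invader = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1, 1, 0]   # 1 = high-risk HPV detected
      pcr     = [1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1]
      print(f"kappa = {cohens_kappa(invader, pcr):.2f}")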

  1. Comparison of 3 methods for identifying dietary patterns associated with risk of disease.

    PubMed

    DiBello, Julia R; Kraft, Peter; McGarvey, Stephen T; Goldberg, Robert; Campos, Hannia; Baylin, Ana

    2008-12-15

    Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994-2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%-46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals. PMID:18945692

  2. Semi-Competing Risks Data Analysis: Accounting for Death as a Competing Risk When the Outcome of Interest Is Nonterminal.

    PubMed

    Haneuse, Sebastien; Lee, Kyu Ha

    2016-05-01

    Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die before experiencing a readmission event within the time frame of interest. Toward resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some nonterminal event (eg, readmission), the occurrence of which is subject to a terminal event (eg, death). Although several statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail several existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite end point and competing risks analyses. Throughout we illustrate the ideas and methods using data on N=49 763 Medicare beneficiaries hospitalized between 2011 and 2013 with a principal discharge diagnosis of heart failure.

  3. Environmental risk assessment in GMO analysis.

    PubMed

    Pirondini, Andrea; Marmiroli, Nelson

    2008-01-01

    Genetically modified or engineered organisms (GMOs, GEOs) are utilised in agriculture, expressing traits of interest, such as insect or herbicide resistance. Soybean, maize, cotton and oilseed rape are the GM crops with the largest acreage in the world. The distribution of GM acreage across countries is related to their different positions concerning the labelling of GMO products: based on the principle of substantial equivalence, or rather on the precautionary principle. The paper provides an overview of how the risks associated with the release of GMOs in the environment can be analysed and predicted, in view of a possible coexistence of GM and non-GM organisms in agriculture. Risk assessment procedures, both qualitative and quantitative, are compared in the context of application to GMOs, also considering legislation requirements (Directive 2001/18/EC). Criteria and measurable properties to assess harm to human health and environmental safety are listed, and the possible consequences are evaluated in terms of significance. Finally, a mapping of the possible risks deriving from GMO release is reported, focusing on gene transfer to related species, horizontal gene transfer, direct and indirect effects on non-target organisms, development of resistance in target organisms, and effects on biodiversity. PMID:19048472

  7. Supervised discretization can discover risk groups in cancer survival analysis.

    PubMed

    Gómez, Iván; Ribelles, Nuria; Franco, Leonardo; Alba, Emilio; Jerez, José M

    2016-11-01

    Discretization of continuous variables is a common practice in medical research to identify risk patient groups. This work compares the performance of gold-standard categorization procedures (the TNM+A protocol) with that of three supervised discretization methods from machine learning (CAIM, ChiM and DTree) in the stratification of patients with breast cancer. The performance of the discretization algorithms was evaluated based on the results obtained after applying standard survival analysis procedures such as Kaplan-Meier curves, Cox regression and predictive modelling. The results show that the application of alternative discretization algorithms could provide clinicians with valuable information for the diagnosis and outcome of the disease. Patient data were collected from the Medical Oncology Service of the Hospital Clínico Universitario (Málaga, Spain), considering a follow-up period from 1982 to 2008. PMID:27686699
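
    Of the three supervised methods, the tree-based one is the simplest to sketch: a depth-limited decision tree fit on a single continuous variable yields supervised cut points, and its leaves define candidate risk groups. A sketch using scikit-learn on simulated data (an analogue of the DTree idea, not the study's exact algorithm or data):

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      marker = rng.normal(50, 15, 500)                   # continuous variable
      outcome = rng.binomial(1, 1 / (1 + np.exp(-(marker - 60) / 5)))  # event flag

      tree = DecisionTreeClassifier(max_leaf_nodes=3, min_samples_leaf=50)
      tree.fit(marker.reshape(-1, 1), outcome)

      cuts = sorted(t for t in tree.tree_.threshold if t != -2)  # -2 marks leaf nodes
      groups = np.digitize(marker, cuts)                 # discretized risk groups
      print("cut points:", [round(c, 1) for c in cuts])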

  8. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  9. Genetic principles and methods in high-risk studies of schizophrenia.

    PubMed

    Cloninger, C R

    1987-01-01

    Recent advances in genetic epidemiology present excellent opportunities for future high-risk studies of schizophrenia. Improved methods are now available for specifying the natural boundaries of the schizophrenia spectrum and evaluating the mode of inheritance of schizophrenia and its biosocial risk factors. Path-analytic techniques permit the derivation of multifactor indices of both biological and social antecedents of schizophrenia. Powerful advances have been made in segregation analysis of pedigree data and in the power of linkage tests with DNA markers to confirm the presence of putative major loci that influence susceptibility to complex phenotypes like schizophrenia. The yield of information from high-risk samples is greatly increased when both longitudinal and pedigree analyses are combined.

  10. Analysis of Risk Management in Adapted Physical Education Textbooks

    ERIC Educational Resources Information Center

    Murphy, Kelle L.; Donovan, Jacqueline B.; Berg, Dominck A.

    2016-01-01

    Physical education teacher education (PETE) programs vary on how the topics of safe teaching and risk management are addressed. Common practices to cover such issues include requiring textbooks, lesson planning, peer teaching, videotaping, reflecting, and reading case law analyses. We used a mixed methods design to examine how risk management is…

  11. The semantic distinction between "risk" and "danger": a linguistic analysis.

    PubMed

    Boholm, Max

    2012-02-01

    The analysis combines frame semantic and corpus linguistic approaches in analyzing the role of agency and decision making in the semantics of the words "risk" and "danger" (both nominal and verbal uses). In frame semantics, the meanings of "risk" and of related words, such as "danger," are analyzed against the background of a specific cognitive-semantic structure (a frame) comprising frame elements such as Protagonist, Bad Outcome, Decision, Possession, and Source. Empirical data derive from the British National Corpus (100 million words). Results indicate both similarities and differences in use. First, both "risk" and "danger" are commonly used to represent situations having potential negative consequences as the result of agency. Second, "risk" and "danger," especially their verbal uses (to risk, to endanger), differ in agent-victim structure, i.e., "risk" is used to express that a person affected by an action is also the agent of the action, while "endanger" is used to express that the one affected is not the agent. Third, "risk," but not "danger," tends to be used to represent rational and goal-directed action. The results therefore to some extent confirm the analysis of "risk" and "danger" suggested by German sociologist Niklas Luhmann. As a point of discussion, the present findings arguably have implications for risk communication.

  12. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process.

  13. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

  14. Analysis of dysphagia risk using the modified dysphagia risk assessment for the community-dwelling elderly

    PubMed Central

    Byeon, Haewon

    2016-01-01

    [Purpose] The elderly are susceptible to dysphagia, and complications can be minimized if high-risk groups are screened in early stages and properly rehabilitated. This study provides basic material for the early detection and prevention of dysphagia by investigating the risks of dysphagia and related factors in community-dwelling elders. [Subjects and Methods] Participants included 325 community-dwelling elderly people aged 65 or older. The modified dysphagia risk assessment for the community-dwelling elderly was used to assess dysphagia risk. [Results] Approximately 52.6% (n=171) of participants belonged to the high-risk group for dysphagia. After adjusting for confounding variables, people aged 75+, who used dentures, and who needed partial help in daily living had a significantly higher risk of dysphagia. [Conclusion] It is necessary to develop guidelines for dysphagia for early detection and rehabilitation. PMID:27799680

  15. Environmental risk analysis of hazardous material rail transportation.

    PubMed

    Saat, Mohd Rapik; Werth, Charles J; Schaeffer, David; Yoon, Hongkyu; Barkan, Christopher P L

    2014-01-15

    An important aspect of railroad environmental risk management involves tank car transportation of hazardous materials. This paper describes a quantitative, environmental risk analysis of rail transportation of a group of light, non-aqueous-phase liquid (LNAPL) chemicals commonly transported by rail in North America. The Hazardous Materials Transportation Environmental Consequence Model (HMTECM) was used in conjunction with a geographic information system (GIS) analysis of environmental characteristics to develop probabilistic estimates of exposure to different spill scenarios along the North American rail network. The risk analysis incorporated the estimated clean-up cost developed using the HMTECM, route-specific probability distributions of soil type and depth to groundwater, annual traffic volume, railcar accident rate, and tank car safety features, to estimate the nationwide annual risk of transporting each product. The annual risk per car-mile (car-km) and per ton-mile (ton-km) was also calculated to enable comparison between chemicals and to provide information on the risk cost associated with shipments of these products. The analysis and the methodology provide a quantitative approach that will enable more effective management of the environmental risk of transporting hazardous materials.
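
    At its core, a route-level model of this kind sums, over track segments, traffic exposure x accident rate x conditional release probability x expected clean-up cost (the HMTECM models the clean-up cost itself in far more detail, from soil type and depth to groundwater). A stripped-down sketch with hypothetical segment values:

      # (car-miles/yr, accidents per car-mile, P(release | accident), clean-up $)
      segments = [
          (2.0e6, 1.5e-7, 0.05, 1.2e6),
          (5.5e6, 0.9e-7, 0.05, 3.0e6),
          (1.1e6, 2.3e-7, 0.08, 0.8e6),
      ]

      annual_risk = sum(m * r * p_rel * cost for m, r, p_rel, cost in segments)
      car_miles = sum(m for m, *_ in segments)
      print(f"annual risk = ${annual_risk:,.0f}; "
            f"per car-mile = ${annual_risk / car_miles:.6f}")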

  16. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    SciTech Connect

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester; Tuan Q. Tran; Erasmia Lois

    2010-06-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  17. Issues in benchmarking human reliability analysis methods : a literature review.

    SciTech Connect

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.; Hendrickson, Stacey M. Langfitt; Boring, Ronald L.

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  18. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    In most parts of Sweden, climate change is expected to result in increased precipitation and higher sea levels, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to these potential climate-related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for ex-ante analysis of landslide consequences, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of the probability and consequence of a (negative) event). The risk analysis is GIS-aided in presenting and visualising the risk, and uses existing databases to quantify the consequences in terms of ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of risk at the local level, to assess the need for measures to mitigate the risk. The costs and the environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Civil servants have therefore expressed a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  19. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  20. Risk Assessment of Infrastructure System of Systems with Precursor Analysis.

    PubMed

    Guo, Zhenyu; Haimes, Yacov Y

    2016-08-01

    Physical infrastructure systems are commonly composed of interconnected and interdependent subsystems, which in their essence constitute system of systems (S-o-S). System owners and policy researchers need tools to foresee potential emergent forced changes and to understand their impact so that effective risk management strategies can be developed. We develop a systemic framework for precursor analysis to support the design of an effective and efficient precursor monitoring and decision support system with the ability to (i) identify and prioritize indicators of evolving risks of system failure; and (ii) evaluate uncertainties in precursor analysis to support informed and rational decision making. This integrated precursor analysis framework is comprised of three processes: precursor identification, prioritization, and evaluation. We use an example of a highway bridge S-o-S to demonstrate the theories and methodologies of the framework. Bridge maintenance processes involve many interconnected and interdependent functional subsystems and decision-making entities and bridge failure can have broad social and economic consequences. The precursor analysis framework, which constitutes an essential part of risk analysis, examines the impact of various bridge inspection and maintenance scenarios. It enables policy researchers and analysts who are seeking a risk perspective on bridge infrastructure in a policy setting to develop more risk informed policies and create guidelines to efficiently allocate limited risk management resources and mitigate severe consequences resulting from bridge failures. PMID:27575259

  2. Computer-aided methods for evaluating cancer risk in miners due to radiation exposure.

    PubMed

    Domański, T; Kluszczyński, D; Chruścielewski, W; Olszewski, J

    1993-01-01

    The paper addresses a radiation hazard that occurs in a non-nuclear sector of industry, namely in non-uranium underground mines. In every type of underground mine, the hazard is caused by the naturally occurring radioactive noble gas radon (222Rn) and the radioactive products of its decay, 218Po, 214Pb and 214Bi/214Po, the so-called 'radon daughters', present in the mine air. The paper presents a concept for a reliable system of assessing miners' exposure through representative individual dosimetry, together with the principles of computer-aided methods for interpreting the dosimetry results and converting them into the expected risk of cancer caused by exposure at miners' workplaces. The representative individual dosimetry system, strengthened by computer-aided analysis of the results, provided essential information on radiation cancer risk for miners employed in coal mines, metal-ore mines and chemical raw material mines in Poland. The coefficient of annual cancer risk induction is 1.5 x 10(-4) year-1 for coal mines, 1.40 x 10(-4) year-1 for metal-ore mines and 1.5 x 10(-4) year-1 for chemical raw material mines. The radiation risk appears to be of the same magnitude as the conventional risk of loss of life in work-related accidents. The average Lost Life Expectancy coefficients for the radiation and conventional risks are 0.5 and 0.3 years per miner, respectively. PMID:8019199

  3. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Cheung, Kar-Ming

    2008-01-01

    This paper describes a systems engineering approach to resource planning by integrating mathematical modeling and constrained optimization, empirical simulation, and theoretical analysis techniques to generate an optimal task plan in the presence of uncertainties.

  4. Matrix methods for bare resonator eigenvalue analysis.

    PubMed

    Latham, W P; Dente, G C

    1980-05-15

    Bare resonator eigenvalues have traditionally been calculated using Fox and Li iterative techniques or the Prony method presented by Siegman and Miller. A theoretical framework for bare resonator eigenvalue analysis is presented. Several new methods are given and compared with the Prony method.

  5. Analysis of the height dependence of site-specific cancer risk in relation to organ mass

    PubMed Central

    2016-01-01

    Background Cancer risks at multiple cancer sites have been shown to correlate positively with height. An existing idea is that taller people have more cells and are thus more prone to mutations that can lead to cancer; the hypothesis is that cancer risk is proportional to organ mass. In this study we quantitatively test this hypothesis. Methods We perform an analysis of large-scale data on the height dependence of site-specific cancer risks. We also perform an analysis of the height dependence of measured organ masses. We then compare the cancer risk data with the expectations based on the organ mass hypothesis. Our study includes 16 cancer sites of women and 14 cancer sites of men. Results For the relative risk (RR) per 10 cm increase in height for cancer incidence, the averaged expected value is within the 95% confidence interval (CI) of the averaged cancer risk data for 8 out of the 15 cancer sites for which the comparison can be made. Also, a large proportion of the sex difference of cancer risks for the pancreas and lungs could come from the sex difference of the organ mass. Conclusions The hypothesis that cancer risk is proportional to organ mass partially explains the height dependence of cancer risks. It also helps explain the sex difference of cancer risks, especially for the pancreas and lungs. PMID:27047947

  6. Use of a risk assessment method to improve the safety of negative pressure wound therapy.

    PubMed

    Lelong, Anne-Sophie; Martelli, Nicolas; Bonan, Brigitte; Prognon, Patrice; Pineau, Judith

    2014-06-01

    To improve the safety of negative pressure wound therapy (NPWT), a working group of nurses, hospital pharmacists, physicians and hospital managers performed a risk analysis of the NPWT care process. The failure modes, effects and criticality analysis (FMECA) method was used for this analysis. Failure modes and their consequences were defined and classified as a function of their criticality to identify priority actions for improvement. In contrast to classical FMECA, the criticality index (CI) of each consequence was calculated by multiplying occurrence, severity and detection scores. We identified 13 failure modes, leading to 20 different consequences. The CI of the consequences was initially 712, falling to 357 after corrective measures were implemented. The major improvements proposed included the establishment of 6-monthly training cycles for nurses, physicians and surgeons and the introduction of computerised prescription for NPWT. The FMECA method also made it possible to prioritise actions as a function of the criticality ranking of consequences and was easily understood and used by the working group. This study is, to our knowledge, the first to use the FMECA method to improve the safety of NPWT.
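
    A minimal sketch of the criticality ranking described above, with CI = occurrence x severity x detection; the failure modes and scores below are hypothetical, not those of the study:

        # Minimal FMECA sketch: criticality index (CI) = occurrence x severity
        # x detection, as in the abstract. Failure modes and 1-10 scores are
        # hypothetical.
        failure_modes = [
            # (description, occurrence, severity, detection)
            ("Wrong pressure setting prescribed", 4, 8, 5),
            ("Dressing applied over unprotected organ", 2, 10, 6),
            ("Alarm ignored or misunderstood", 5, 7, 4),
        ]

        ranked = sorted(
            ((o * s * d, desc) for desc, o, s, d in failure_modes),
            reverse=True,
        )

        print("total CI:", sum(ci for ci, _ in ranked))
        for ci, desc in ranked:
            # highest CI = highest priority for corrective action
            print(f"CI={ci:4d}  {desc}")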

  7. Risk D&D Rapid Prototype: Scenario Documentation and Analysis Tool

    SciTech Connect

    Unwin, Stephen D.; Seiple, Timothy E.

    2009-05-28

    This report describes the process and methodology associated with a rapid prototype tool for integrating project risk analysis and health and safety risk analysis for decontamination and decommissioning (D&D) projects.

  8. Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing products, it is crucial to assure failure-free and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks in the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method is explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
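
    The PCA step can be sketched on a hypothetical component-by-failure-mode incidence matrix (the paper builds this matrix from NASA/NTSB accident report data):

        # Sketch of the PCA step: a component-by-failure-mode incidence matrix
        # is reduced to a low-dimensional representation. The matrix below is
        # hypothetical.
        import numpy as np

        # rows: components, columns: failure modes (counts from reports)
        X = np.array([
            [5, 0, 2, 1],
            [4, 1, 2, 0],
            [0, 6, 1, 3],
            [1, 5, 0, 4],
        ], dtype=float)

        Xc = X - X.mean(axis=0)              # center each failure-mode column
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        scores = U * S                       # components in transformed coordinates

        explained = S**2 / np.sum(S**2)
        print("variance explained per principal component:", np.round(explained, 3))
        print("2-D representation of each component:\n", np.round(scores[:, :2], 3))
        # Components with similar coordinates share failure-mode patterns.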

  9. Use of Monte Carlo methods in environmental risk assessments at the INEL: Applications and issues

    SciTech Connect

    Harris, G.; Van Horn, R.

    1996-06-01

    The EPA is increasingly considering the use of probabilistic risk assessment techniques as an alternative to, or refinement of, the current point estimate of risk. This report provides an overview of the probabilistic technique called Monte Carlo Analysis. Advantages and disadvantages of implementing a Monte Carlo analysis over a point estimate analysis for environmental risk assessment are discussed. The general methodology is provided along with an example of its implementation. A phased approach to risk analysis that allows iterative refinement of the risk estimates is recommended for use at the INEL.

  10. An introductory guide to uncertainty analysis in environmental and health risk assessment. Environmental Restoration Program

    SciTech Connect

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites.
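
    As a hedged illustration of the recommended numerical approach, a minimal Monte Carlo propagation sketch; the model form and parameter distributions are invented for this example:

        # Minimal Monte Carlo propagation sketch: subjective parameter
        # distributions are sampled and pushed through a simple risk model to
        # yield a confidence interval on risk. Model form and distributions
        # are illustrative only.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        intake = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)   # mg/kg-day
        slope  = rng.triangular(1e-3, 5e-3, 2e-2, size=n)             # risk per mg/kg-day

        risk = intake * slope   # simple linear low-dose model

        lo, med, hi = np.percentile(risk, [2.5, 50, 97.5])
        print(f"median risk {med:.2e}, 95% interval [{lo:.2e}, {hi:.2e}]")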

  11. An introductory guide to uncertainty analysis in environmental and health risk assessment

    SciTech Connect

    Hoffman, F.O.; Hammonds, J.S.

    1992-10-01

    To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites.

  12. Contract Negotiations Supported Through Risk Analysis

    NASA Astrophysics Data System (ADS)

    Rodrigues, Sérgio A.; Vaz, Marco A.; Souza, Jano M.

    Many clients view software as a commodity, so it is critical that IT sellers know how to build value into their offering to differentiate their service from all the others. Clients sometimes refuse to contract software development due to a lack of technical understanding or simply because they are afraid of IT contractual commitments. The IT negotiators who recognize the importance of this issue and the reason why it is a problem will be able to work to reach the commercial terms they want. Therefore, this chapter aims to stimulate IT professionals to improve their negotiation skills and presents a computational tool to support managers in getting the best out of software negotiations through the identification of contract risks.

  13. Germany wide seasonal flood risk analysis for agricultural crops

    NASA Astrophysics Data System (ADS)

    Klaus, Stefan; Kreibich, Heidi; Kuhlmann, Bernd; Merz, Bruno; Schröter, Kai

    2016-04-01

    In recent years, large-scale flood risk analysis and mapping has gained attention. Regional to national risk assessments are needed, for example, for national risk policy developments, for large-scale disaster management planning and in the (re-)insurance industry. Despite increasing requests for comprehensive risk assessments, some sectors have not received much scientific attention; one of these is the agricultural sector. In contrast to other sectors, agricultural crop losses depend strongly on the season. Flood probability also shows seasonal variation. Thus, the temporal superposition of high flood susceptibility of crops and high flood probability plays an important role for agricultural flood risk. To investigate this interrelation and provide a large-scale overview of agricultural flood risk in Germany, an agricultural crop loss model is used for crop susceptibility analyses, and Germany-wide seasonal flood-frequency analyses are undertaken to derive seasonal flood patterns. As a result, a Germany-wide map of agricultural flood risk is shown, as well as the crop type most at risk in a specific region. The risk maps may provide guidance for a federal state-wide coordinated designation of retention areas.
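
    The temporal superposition idea can be illustrated with a toy calculation in which expected annual loss accumulates only where monthly flood probability and crop susceptibility coincide; all values below are hypothetical:

        # Toy illustration of temporal superposition: annual agricultural
        # flood risk accumulates only in months where flood probability and
        # crop susceptibility are both high. All values are hypothetical.
        import numpy as np

        months = np.arange(12)
        p_flood = np.array([.08, .09, .10, .07, .04, .02,
                            .02, .03, .04, .05, .06, .07])   # monthly flood probability
        susceptibility = np.array([0, 0, .2, .5, .8, 1.0,
                                   1.0, .9, .6, .2, 0, 0])   # damage fraction if flooded
        crop_value = 1500.0                                  # EUR/ha, assumed

        monthly_risk = p_flood * susceptibility * crop_value
        print("expected annual loss: %.0f EUR/ha" % monthly_risk.sum())
        print("riskiest month:", int(months[monthly_risk.argmax()]) + 1)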

  14. Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.

    PubMed

    Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M

    2012-07-01

    This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming.
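
    A hedged sketch of the detrending step (synthetic data; the paper's error component implicit detrending is more elaborate): risk is compared as the standard deviation of residuals after removing a trend from each series:

        # Detrending sketch: remove a linear trend from each farm-level series
        # and compare the standard deviation of what remains. Data are
        # synthetic stand-ins for the panel data used in the paper.
        import numpy as np

        rng = np.random.default_rng(1)
        years = np.arange(2001, 2008)

        def detrended_sd(series):
            """Std. dev. of residuals after removing a linear trend."""
            coef = np.polyfit(years, series, deg=1)
            residuals = series - np.polyval(coef, years)
            return residuals.std(ddof=1)

        conventional_milk_price = 32 + 0.4 * (years - 2001) + rng.normal(0, 1.0, 7)
        organic_milk_price      = 38 + 0.4 * (years - 2001) + rng.normal(0, 2.0, 7)

        print("conventional price risk:", round(detrended_sd(conventional_milk_price), 2))
        print("organic price risk:     ", round(detrended_sd(organic_milk_price), 2))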

  15. How does scientific risk assessment of GM crops fit within the wider risk analysis?

    PubMed

    Johnson, Katy L; Raybould, Alan F; Hudson, Malcolm D; Poppy, Guy M

    2007-01-01

    The debate concerning genetically modified crops illustrates confusion between the role of scientists and that of wider society in regulatory decision making. We identify two fundamental misunderstandings, which, if rectified, would allow progress with confidence. First, scientific risk assessment needs to test well-defined hypotheses, not simply collect data. Second, risk assessments need to be placed in the wider context of risk analysis to enable the wider 'non-scientific' questions to be considered in regulatory decision making. Such integration and understanding is urgently required because the challenges to regulation will escalate as scientific progress advances.

  16. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle uncertainty with probability functions and linguistic terms. The linguistic terms add to the risk analysis the knowledge of experts on the investigated system or environment.
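
    A minimal fuzzy-set sketch of the uncertainty handling (the triangular membership functions and breakpoints are hypothetical, not taken from the paper):

        # Minimal fuzzy-set sketch: triangular membership functions turn a
        # numeric susceptibility estimate into degrees of membership in
        # linguistic risk terms. Breakpoints are hypothetical.

        def triangular(x, a, b, c):
            """Triangular membership function with feet a, c and peak b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        def classify(susceptibility):
            return {
                "low":    triangular(susceptibility, -0.1, 0.0, 0.4),
                "medium": triangular(susceptibility, 0.2, 0.5, 0.8),
                "high":   triangular(susceptibility, 0.6, 1.0, 1.1),
            }

        print(classify(0.65))  # partial membership in both 'medium' and 'high'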

  17. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  18. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies.
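
    To illustrate the loss-function idea, the classic quadratic Taguchi form is shown below; the paper uses revised Taguchi and modified inverted normal variants, and all constants are hypothetical:

        # Loss-function sketch: a process deviation from target maps to an
        # economic loss. Classic quadratic Taguchi form; the paper's revised
        # Taguchi / modified inverted normal variants differ. Constants are
        # hypothetical.

        def taguchi_loss(y, target, k):
            """Quadratic loss: zero at target, growing with squared deviation."""
            return k * (y - target) ** 2

        target_pressure = 10.0   # bar, assumed operating target
        k = 500.0                # USD per bar^2, assumed loss coefficient

        for deviation in (0.0, 0.5, 1.0, 2.0):
            y = target_pressure + deviation
            loss = taguchi_loss(y, target_pressure, k)
            print(f"deviation {deviation:+.1f} bar -> loss {loss:8.0f} USD")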

  19. Using fire tests for quantitative risk analysis

    SciTech Connect

    Ling, W.C.T.; Williamson, R.B.

    1980-03-01

    Fires can be considered a causal chain-of-events in which the growth and spread of fire may cause damage and injury if it is rapid enough to overcome the barriers placed in its way. Fire tests for fire resistance of the barriers can be used in a quantitative risk assessment. The fire growth and spread is modelled in a State Transition Model (STM). The fire barriers are presented as part of the Fire Protection Model (FPM) which is based on a portion of the NFPA Decision Tree. An Emergency Equivalent Network is introduced to couple the Fire Growth Model (FGM) and the FPM so that the spread of fire beyond the room-of-origin can be computed. An example is presented in which a specific building floor plan is analyzed to obtain the shortest expected time for fire to spread between two points. To obtain the probability and time for each link in the network, data from the results of fire tests were used. These results were found to be lacking and new standards giving better data are advocated.
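
    The shortest-expected-time computation over the coupled network can be sketched with Dijkstra's algorithm; the room-adjacency graph and barrier breach times below are hypothetical:

        # Network sketch: shortest expected fire-spread time between two
        # points, via Dijkstra's algorithm over a room-adjacency graph.
        # Barrier breach times (minutes) are hypothetical.
        import heapq

        graph = {  # room -> [(neighbor, expected breach time in minutes), ...]
            "origin room": [("corridor", 12.0), ("room B", 20.0)],
            "corridor":    [("room B", 6.0), ("stairwell", 9.0)],
            "room B":      [("stairwell", 15.0)],
            "stairwell":   [],
        }

        def shortest_time(graph, start, goal):
            pq, seen = [(0.0, start)], set()
            while pq:
                t, node = heapq.heappop(pq)
                if node == goal:
                    return t
                if node in seen:
                    continue
                seen.add(node)
                for nxt, dt in graph[node]:
                    heapq.heappush(pq, (t + dt, nxt))
            return float("inf")

        print("shortest expected spread time:",
              shortest_time(graph, "origin room", "stairwell"), "min")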

  20. [Establishment of Method for Health Risk Assessment of Pollutants from Fixed Sources].

    PubMed

    Chen, Qiang; Wu, Huan-bo

    2016-05-15

    A health risk assessment method for pollutants from fixed sources was developed by applying the AERMOD model in health risk assessment. The method can directly forecast the health risks of toxic pollutants from a source via a given exposure pathway. Using the established method, in combination with source data, the traditional health risk assessment method, and measured data on PAHs in inhalable particulate matter (PM₁₀) in Lanzhou, the health risks of polycyclic aromatic hydrocarbons (PAHs) and benzo[a]pyrene (BaP) in PM₁₀ from three thermal power plants, and the health risks of PAHs and BaP in PM₁₀ at the receptor point by inhalation exposure, were calculated for heating and non-heating seasons. Then the contribution rates of the health risk caused by the three power plants to the health risk at the receptor point were calculated. The results showed that the contribution rates were not associated with sex and age, but were associated with time period and risk type. The contribution rates in the non-heating seasons were greater than those in the heating seasons, and the contribution rates of the carcinogenic risk index were greater than those of the cancer risk value. The reliability of the established method was validated by comparison with the traditional method. The method is applicable to health risk assessment of toxic pollutants from any fixed source and to the environmental risk assessment component of environmental impact assessment. PMID:27506015

  1. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  2. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value.
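
    A hedged sketch of the hybrid fuzzy-random idea (a simplified stand-in for the paper's credibility-theoretic formulation; all load and friction values are hypothetical):

        # Hybrid sketch: random loads plus a fuzzy (triangular) friction
        # coefficient, propagated by Monte Carlo to a sliding-failure risk
        # ratio. Simplified stand-in for the paper's credibility formulation;
        # all numbers are hypothetical.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000

        weight = rng.normal(5.0e5, 2.5e4, n)             # kN, random vertical load
        thrust = rng.normal(3.0e5, 5.0e4, n)             # kN, random horizontal thrust
        friction = rng.triangular(0.55, 0.70, 0.85, n)   # fuzzy coefficient, sampled

        safety_factor = friction * weight / thrust
        risk_ratio = np.mean(safety_factor < 1.0)        # sliding-failure risk ratio
        print(f"estimated failure risk ratio: {risk_ratio:.4f}")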

  3. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio, described jointly by probability and possibility, is deficient in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. Stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to risk calculation for both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained reflects the influence of both sorts of uncertainty and is suitable as an index value. PMID:27386264

  4. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed into one currency. This allows the comparison of adverse and beneficial effects to be qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume. PMID:21679738

  5. State of the art in benefit-risk analysis: introduction.

    PubMed

    Verhagen, H; Tijhuis, M J; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken, G; Pohjola, M V; Tuomisto, J T; Ueland, Ø; White, B C; Holm, F

    2012-01-01

    Risk-taking is normal in everyday life if there are associated (perceived) benefits. Benefit-Risk Analysis (BRA) compares the risk of a situation to its related benefits and addresses the acceptability of the risk. Over the past years BRA in relation to food and food ingredients has gained attention. Food, and even the same food ingredient, may confer both beneficial and adverse effects. Measures directed at food safety may lead to suboptimal or insufficient levels of ingredients from a benefit perspective. In BRA, benefits and risks of food (ingredients) are assessed in one go and may conditionally be expressed into one currency. This allows the comparison of adverse and beneficial effects to be qualitative and quantitative. A BRA should help policy-makers to make more informed and balanced benefit-risk management decisions. Not allowing food benefits to occur in order to guarantee food safety is a risk management decision much the same as accepting some risk in order to achieve more benefits. BRA in food and nutrition is making progress, but difficulties remain. The field may benefit from looking across its borders to learn from other research areas. The BEPRARIBEAN project (Best Practices for Risk-Benefit Analysis: experience from out of food into food; http://en.opasnet.org/w/Bepraribean) aims to do so, by working together with Medicines, Food Microbiology, Environmental Health, Economics & Marketing-Finance and Consumer Perception. All perspectives are reviewed and subsequently integrated to identify opportunities for further development of BRA for food and food ingredients. Interesting issues that emerge are the varying degrees of risk that are deemed acceptable within the areas and the trend towards more open and participatory BRA processes. A set of 6 'state of the art' papers covering the above areas and a paper integrating the separate (re)views are published in this volume.

  6. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P.; Cowell, Andrew J.; Gregory, Michelle L.; Baddeley, Robert L.; Paulson, Patrick R.; Tratz, Stephen C.; Hohimer, Ryan E.

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  7. Population-standardized genetic risk score: the SNP-based method of choice for inherited risk assessment of prostate cancer

    PubMed Central

    Conran, Carly A; Na, Rong; Chen, Haitao; Jiang, Deke; Lin, Xiaoling; Zheng, S Lilly; Brendler, Charles B; Xu, Jianfeng

    2016-01-01

    Several different approaches are available to clinicians for determining prostate cancer (PCa) risk. The clinical validity of various PCa risk assessment methods utilizing single nucleotide polymorphisms (SNPs) has been established; however, these SNP-based methods have not been compared. The objective of this study was to compare the three most commonly used SNP-based methods for PCa risk assessment. Participants were men (n = 1654) enrolled in a prospective study of PCa development. Genotypes of 59 PCa risk-associated SNPs were available in this cohort. Three methods of calculating SNP-based genetic risk scores (GRSs) were used to evaluate individual disease risk: risk allele count (GRS-RAC), weighted risk allele count (GRS-wRAC), and population-standardized genetic risk score (GRS-PS). Mean GRSs were calculated, and performances were compared using the area under the receiver operating characteristic curve (AUC) and the positive predictive value (PPV). All SNP-based methods were found to be independently associated with PCa (all P < 0.05; hence their clinical validity). The mean GRSs in men with or without PCa using GRS-RAC were 55.15 and 53.46, respectively, using GRS-wRAC were 7.42 and 6.97, respectively, and using GRS-PS were 1.12 and 0.84, respectively (all P < 0.05 for differences between patients with or without PCa). All three SNP-based methods performed similarly in discriminating PCa from non-PCa based on AUC and in predicting PCa risk based on PPV (all P > 0.05 for comparisons between the three methods), and all three SNP-based methods had a significantly higher AUC than family history (all P < 0.05). Results from this study suggest that while the three most commonly used SNP-based methods performed similarly in discriminating PCa from non-PCa at the population level, GRS-PS is the method of choice for risk assessment at the individual level because its value (where 1.0 represents average population risk) can be easily interpreted regardless
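
    A sketch of the three score types for a few hypothetical SNPs (one common formulation; the study's exact definitions may differ):

        # Sketch of the three GRS variants for hypothetical SNPs. g = risk
        # allele count (0/1/2), OR = per-allele odds ratio, p = risk-allele
        # frequency in the reference population. The study used 59 PCa SNPs;
        # all values below are invented.
        import numpy as np

        g  = np.array([2, 1, 0, 1, 2])                 # individual's allele counts
        OR = np.array([1.3, 1.2, 1.1, 1.25, 1.15])     # per-allele odds ratios
        p  = np.array([0.35, 0.20, 0.45, 0.10, 0.30])  # population frequencies

        grs_rac  = g.sum()                  # GRS-RAC: raw risk-allele count
        grs_wrac = (g * np.log(OR)).sum()   # GRS-wRAC: log-OR-weighted count

        # GRS-PS: individual's relative risk divided by the population-average
        # relative risk at each SNP (so 1.0 = average population risk).
        pop_mean = (1 - p) ** 2 + 2 * p * (1 - p) * OR + p**2 * OR**2
        grs_ps = np.prod(OR**g / pop_mean)

        print(f"GRS-RAC  = {grs_rac}")
        print(f"GRS-wRAC = {grs_wrac:.3f}")
        print(f"GRS-PS   = {grs_ps:.3f}  (1.0 = average population risk)")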

  8. Credibility analysis of risk classes by generalized linear model

    NASA Astrophysics Data System (ADS)

    Erdemir, Ovgucan Karadag; Sucu, Meral

    2016-06-01

    In this paper, the generalized linear model (GLM) and credibility theory, which are frequently used in non-life insurance pricing, are combined for credibility analysis. Using the full credibility standard, the GLM is associated with the limited fluctuation credibility approach. Comparison criteria such as asymptotic variance and credibility probability are used to analyze the credibility of risk classes. An application is performed using one-year claim frequency data from a Turkish insurance company, and the results for credible risk classes are interpreted.
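
    A minimal limited-fluctuation sketch (the coupling with the GLM is not shown; the tolerance and probability below are conventional textbook choices, not necessarily those of the paper):

        # Limited-fluctuation sketch: full-credibility standard for claim
        # frequency (Poisson approximation) and the square-root rule for
        # partial credibility. The paper's GLM step is not shown.
        from scipy.stats import norm

        prob, k = 0.90, 0.05          # frequency within 5% with 90% probability
        z = norm.ppf((1 + prob) / 2)  # two-sided standard normal quantile
        n_full = (z / k) ** 2         # expected claims for full credibility

        for n_claims in (200, 500, 1082, 2000):
            Z = min(1.0, (n_claims / n_full) ** 0.5)   # credibility factor
            print(f"claims={n_claims:5d}  Z={Z:.3f}")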

  9. Ecological food web analysis for chemical risk assessment.

    PubMed

    Preziosi, Damian V; Pastorok, Robert A

    2008-12-01

    Food web analysis can be a critical component of ecological risk assessment, yet it has received relatively little attention among risk assessors. Food web data are currently used in modeling bioaccumulation of toxic chemicals and, to a limited extent, in the determination of the ecological significance of risks. Achieving more realism in ecological risk assessments requires new analysis tools and models that incorporate accurate information on key receptors in a food web paradigm. Application of food web analysis in risk assessments demands consideration of: 1) different kinds of food webs; 2) definition of trophic guilds; 3) variation in food webs with habitat, space, and time; and 4) issues for basic sampling design and collection of dietary data. The different kinds of food webs include connectance webs, materials flow webs, and functional (or interaction) webs. These three kinds of webs play different roles throughout various phases of an ecological risk assessment, but risk assessors have failed to distinguish among web types. When modeling food webs, choices must be made regarding the level of complexity for the web, assignment of species to trophic guilds, selection of representative species for guilds, use of average diets, the characterization of variation among individuals or guild members within a web, and the spatial and temporal scales/dynamics of webs. Integrating exposure and effects data in ecological models for risk assessment of toxic chemicals relies on coupling food web analysis with bioaccumulation models (e.g., Gobas-type models for fish and their food webs), wildlife exposure models, dose-response models, and population dynamics models. PMID:18703218

  10. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. in Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes, etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, and modeling approaches, as well as incoherencies that occur when combining all these different aspects. Based on this concept, a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future, further processes and scales can be included and the instrument thus adapted to any study site.

  11. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for the analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  12. Risk Analysis and Decision Making FY 2013 Milestone Report

    SciTech Connect

    Engel, David W.; Dalton, Angela C.; Dale, Crystal; Jones, Edward; Thompson, J.

    2013-06-01

    Risk analysis and decision making is one of the critical objectives of CCSI, which seeks to use information from science-based models with quantified uncertainty to inform decision makers who are making large capital investments. The goal of this task is to develop tools and capabilities to facilitate the development of risk models tailored for carbon capture technologies, quantify the uncertainty of model predictions, and estimate the technical and financial risks associated with the system. This effort aims to reduce costs by identifying smarter demonstrations, which could accelerate development and deployment of the technology by several years.

  13. Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.

    PubMed

    Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul

    2015-09-01

    Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases. PMID:24402720

  14. Comparative analysis of health risk assessments for municipal waste combustors

    SciTech Connect

    Levin, A.; Fratt, D.B.; Leonard, A.; Bruins, R.J.F.; Fradkin, L.

    1991-01-01

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. The article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most comprehensive methodologies. The analysis concentrates on stack emissions of noncriteria pollutants and is comparative rather than critical in nature. Overall, the risk assessment methodologies used were similar whereas the assumptions and input values used varied from study to study. Some of the variability results directly from differences in site-specific characteristics, but much of it is due to absence of data, lack of field validation, lack of specific guidelines from regulatory agencies, and reliance on professional judgment. The results indicate that carcinogenic risks are more significant than chronic non-carcinogenic risks. In most instances polychlorodibenzodioxins, polychlorodibenzofurans, and cadmium contribute more significantly to the total carcinogenic risk from MWC stack emissions than other contaminants. In addition, the contribution to total risk of all indirect routes of exposure (ingestion and dermal contact) exceeds that of the direct inhalation route for most studies reviewed.

  15. Comparative analysis of health risk assessments for municipal waste combustors.

    PubMed

    Levin, A; Fratt, D B; Leonard, A; Bruins, R J; Fradkin, L

    1991-01-01

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. This article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most comprehensive methodologies. The analysis concentrates on stack emissions of noncriteria pollutants and is comparative rather than critical in nature. Overall, the risk assessment methodologies used were similar whereas the assumptions and input values used varied from study to study. Some of this variability results directly from differences in site-specific characteristics, but much of it is due to absence of data, lack of field validation, lack of specific guidelines from regulatory agencies, and reliance on professional judgment. The results indicate that carcinogenic risks are more significant than chronic non-carcinogenic risks. In most instances polychlorodibenzodioxins, polychlorodibenzofurans, and cadmium contribute more significantly to the total carcinogenic risk from MWC stack emissions than other contaminants. In addition, the contribution to total risk of all indirect routes of exposure (ingestion and dermal contact) exceeds that of the direct inhalation route for most studies reviewed.

  16. Comparative analysis of health risk assessments for municipal waste combustors

    SciTech Connect

    Levin, A.; Fratt, D.B.; Leonard, A.; Bruins, R.J.; Fradkin, L. )

    1991-01-01

    Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. This article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most comprehensive methodologies. The analysis concentrates on stack emissions of noncriteria pollutants and is comparative rather than critical in nature. Overall, the risk assessment methodologies used were similar whereas the assumptions and input values used varied from study to study. Some of this variability results directly from differences in site-specific characteristics, but much of it is due to absence of data, lack of field validation, lack of specific guidelines from regulatory agencies, and reliance on professional judgment. The results indicate that carcinogenic risks are more significant than chronic non-carcinogenic risks. In most instances polychlorodibenzodioxins, polychlorodibenzofurans, and cadmium contribute more significantly to the total carcinogenic risk from MWC stack emissions than other contaminants. In addition, the contribution to total risk of all indirect routes of exposure (ingestion and dermal contact) exceeds that of the direct inhalation route for most studies reviewed. 42 refs.

  17. Risk analysis. HIV / AIDS country profile: Mozambique.

    PubMed

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system.

  18. Quantitative Risk Analysis of Obstacle Limitation Standards

    NASA Astrophysics Data System (ADS)

    Sandaradura, Amila Silva

    Obstacle limitation surfaces (OLS) are the main safeguard against objects that can pose a hazard to aircraft operations at and around airports. The standard dimensions of most of these surfaces were estimated using pilots' experience at the time they were included in the standards documents. As a result, some of these standards may have been overestimated while others may not provide an adequate level of safety. With airports moving to the Safety Management System (SMS) approach to design and operations safety, proper evaluation of the level of safety provided by OLS at specific sites becomes of great importance to airport operators. There is, however, no published evidence for the estimation of the safety level provided by the existing OLS standards. Moreover, the rationale used by ICAO to establish the existing OLS standards is not readily available in the standards documents. This study therefore attempts to collect actual flight path data using information provided by air traffic control radars and to construct a methodology to assess the probability of aircraft deviating from their intended/protected path. An extension of the developed methodology can be used to estimate the OLS dimensions that provide an acceptable safety level for aircraft operations. This will be helpful in estimating safe and efficient standard dimensions of the OLS and assessing the risk level posed by objects to aircraft operations around airports. In order to assess the existing standards and show the applications of the methodology, three case studies were conducted using aircraft data collected from Ottawa (CYOW), Calgary (CYYC) and Edmonton (CYEG) International Airports.

  19. Risk analysis. HIV / AIDS country profile: Mozambique.

    PubMed

    1996-12-01

    Mozambique's National STD/AIDS Control Program (NACP) estimates that, at present, about 8% of the population is infected with human immunodeficiency virus (HIV). The epidemic is expected to peak in 1997. By 2001, Mozambique is projected to have 1,650,000 HIV-positive adults 15-49 years of age, of whom 500,000 will have developed acquired immunodeficiency syndrome (AIDS), and 500,000 AIDS orphans. Incidence rates are highest in the country's central region, the transport corridors, and urban centers. The rapid spread of HIV has been facilitated by extreme poverty, the social upheaval and erosion of traditional norms created by years of political conflict and civil war, destruction of the primary health care infrastructure, growth of the commercial sex work trade, and labor migration to and from neighboring countries with high HIV prevalence. Moreover, about 10% of the adult population suffers from sexually transmitted diseases (STDs), including genital ulcers. NACP, created in 1988, is attempting to curb the further spread of HIV through education aimed at changing high-risk behaviors and condom distribution to prevent STD transmission. Theater performances and radio/television programs are used to reach the large illiterate population. The integration of sex education and STD/AIDS information in the curricula of primary and secondary schools and universities has been approved by the Ministry of Education. Several private companies have been persuaded to distribute condoms to their employees. Finally, the confidentiality of HIV patients has been guaranteed. In 1993, the total AIDS budget was US $1.67 million, 50% of which was provided by the European Union. The European Commission seeks to develop a national strategy for managing STDs within the primary health care system. PMID:12320532

  20. The impact of communicating genetic risks of disease on risk-reducing health behaviour: systematic review with meta-analysis

    PubMed Central

    Hollands, Gareth J; French, David P; Griffin, Simon J; Prevost, A Toby; Sutton, Stephen; King, Sarah

    2016-01-01

    Objective To assess the impact of communicating DNA based disease risk estimates on risk-reducing health behaviours and motivation to engage in such behaviours. Design Systematic review with meta-analysis, using Cochrane methods. Data sources Medline, Embase, PsycINFO, CINAHL, and the Cochrane Central Register of Controlled Trials up to 25 February 2015. Backward and forward citation searches were also conducted. Study selection Randomised and quasi-randomised controlled trials involving adults in which one group received personalised DNA based estimates of disease risk for conditions where risk could be reduced by behaviour change. Eligible studies included a measure of risk-reducing behaviour. Results We examined 10 515 abstracts and included 18 studies that reported on seven behavioural outcomes, including smoking cessation (six studies; n=2663), diet (seven studies; n=1784), and physical activity (six studies; n=1704). Meta-analysis revealed no significant effects of communicating DNA based risk estimates on smoking cessation (odds ratio 0.92, 95% confidence interval 0.63 to 1.35, P=0.67), diet (standardised mean difference 0.12, 95% confidence interval −0.00 to 0.24, P=0.05), or physical activity (standardised mean difference −0.03, 95% confidence interval −0.13 to 0.08, P=0.62). There were also no effects on any other behaviours (alcohol use, medication use, sun protection behaviours, and attendance at screening or behavioural support programmes) or on motivation to change behaviour, and no adverse effects, such as depression and anxiety. Subgroup analyses provided no clear evidence that communication of a risk-conferring genotype affected behaviour more than communication of the absence of such a genotype. However, studies were predominantly at high or unclear risk of bias, and evidence was typically of low quality. Conclusions Expectations that communicating DNA based risk estimates changes behaviour is not supported by existing evidence

  1. Developing New Tools and Methods for Risk Assessment

    EPA Science Inventory

    Traditionally, risk assessment for environmental chemicals is based upon epidemiological and/or animal toxicity data. Since the release of the National Academy of Sciences Toxicity in the 21st Century: A Vision and a Strategy (2007) and Science and Decisions: Advancing Risk Asses...

  2. A Bayesian approach to probabilistic sensitivity analysis in structured benefit-risk assessment.

    PubMed

    Waddingham, Ed; Mt-Isa, Shahrul; Nixon, Richard; Ashby, Deborah

    2016-01-01

    Quantitative decision models such as multiple criteria decision analysis (MCDA) can be used in benefit-risk assessment to formalize trade-offs between benefits and risks, providing transparency to the assessment process. There is however no well-established method for propagating uncertainty of treatment effects data through such models to provide a sense of the variability of the benefit-risk balance. Here, we present a Bayesian statistical method that directly models the outcomes observed in randomized placebo-controlled trials and uses this to infer indirect comparisons between competing active treatments. The resulting treatment effects estimates are suitable for use within the MCDA setting, and it is possible to derive the distribution of the overall benefit-risk balance through Markov Chain Monte Carlo simulation. The method is illustrated using a case study of natalizumab for relapsing-remitting multiple sclerosis.
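
    A minimal sketch of the propagation idea (normal draws stand in for real MCMC output; the criteria, weights and effect sizes are hypothetical, not those of the natalizumab case study):

        # MCDA propagation sketch: posterior draws for each benefit/risk
        # criterion are combined into a weighted score, giving a distribution
        # for the benefit-risk balance. Normal draws stand in for MCMC output;
        # weights and effect sizes are hypothetical.
        import numpy as np

        rng = np.random.default_rng(3)
        n_draws = 50_000

        # posterior draws of treatment effects on a 0-1 value scale
        relapse_reduction = rng.normal(0.68, 0.05, n_draws)   # benefit criterion
        serious_harm      = rng.normal(0.10, 0.03, n_draws)   # harm criterion
        tolerability      = rng.normal(0.75, 0.04, n_draws)   # secondary criterion

        weights = {"benefit": 0.5, "harm": 0.35, "tolerability": 0.15}

        balance = (weights["benefit"] * relapse_reduction
                   - weights["harm"] * serious_harm
                   + weights["tolerability"] * tolerability)

        lo, hi = np.percentile(balance, [2.5, 97.5])
        print(f"mean benefit-risk balance {balance.mean():.3f}, "
              f"95% interval [{lo:.3f}, {hi:.3f}]")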

  3. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  4. 2007 Wholesale Power Rate Case Initial Proposal : Risk Analysis Study.

    SciTech Connect

    United States. Bonneville Power Administration.

    2005-11-01

    The Federal Columbia River Power System (FCRPS), operated on behalf of the ratepayers of the PNW by BPA and other Federal agencies, faces many uncertainties during the FY 2007-2009 rate period. Among these uncertainties, the largest revolve around hydro conditions, market prices and river operations for fish recovery. In order to provide a high probability of making its U.S. Treasury payments, BPA performs a Risk Analysis as part of its rate-making process. In this Risk Analysis, BPA identifies key risks, models their relationships, and then analyzes their impacts on net revenues (total revenues less expenses). BPA subsequently evaluates in the ToolKit Model the Treasury Payment Probability (TPP) resulting from the rates, risks, and risk mitigation measures described here and in the Wholesale Power Rate Development Study (WPRDS). If the TPP falls short of BPA's standard, additional risk mitigation revenues, such as PNRR and CRAC revenues, are incorporated in the modeling in ToolKit until the TPP standard is met. Increased wholesale market price volatility and six years of drought have significantly changed the profile of risk and uncertainty facing BPA and its stakeholders. These present new challenges for BPA in its effort to keep its power rates as low as possible while fully meeting its obligations to the U.S. Treasury. As a result, the risk BPA faces in not receiving the level of secondary revenues that have been credited to power rates before receiving those funds is greater. In addition to market price volatility, BPA also faces uncertainty around the financial impacts of operations for fish programs in FY 2006 and in the FY 2007-2009 rate period. A new Biological Opinion or possible court-ordered change to river operations in FY 2006 through FY 2009 may reduce BPA's net revenues included in the Initial Proposal. Finally, the FY 2007-2009 risk analysis includes new operational risks as well as a more comprehensive analysis of non-operating risks. Both the operational

  5. [Competitive karate and the risk of HIV infection--review, risk analysis and risk minimizing strategies].

    PubMed

    Müller-Rath, R; Mumme, T; Miltner, O; Skobel, E

    2004-03-01

    Bleeding facial injuries are not uncommon in competitive karate. Nevertheless, the risk of an infection with HIV is extremely low. Guidelines for the prevention of HIV infections are presented. Especially in contact sports and martial arts, athletes, judges and staff have to recognize and follow these recommendations. Bleeding wounds of the hands due to contact with the opponent's teeth can be minimized by fist padding.

  6. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
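    A minimal numerical sketch of the hybrid idea, using synthetic Gaussian "spectra" rather than anything from the patent: an ordinary classical least squares (CLS) estimate is biased by an unmodeled interferent, and augmenting the design matrix with the interferent's known spectral shape corrects it.

    ```python
    # Hedged sketch: CLS calibration with known pure-component spectra, then
    # re-estimation with an added spectral shape for an interferent absent
    # from calibration. All spectra here are synthetic Gaussians.
    import numpy as np

    wl = np.linspace(0, 1, 200)
    gauss = lambda c, s: np.exp(-0.5 * ((wl - c) / s) ** 2)

    K = np.column_stack([gauss(0.3, 0.05), gauss(0.6, 0.05)])  # calibrated components
    shape = gauss(0.8, 0.07)                                   # non-calibrated interferent

    true_conc = np.array([1.0, 0.5])
    noise = np.random.default_rng(1).normal(0, 0.01, wl.size)
    mixture = K @ true_conc + 0.3 * shape + noise

    # Plain CLS: biased because the interferent is unmodeled.
    c_cls, *_ = np.linalg.lstsq(K, mixture, rcond=None)

    # Hybrid step: augment the design matrix with the known interferent shape.
    K_aug = np.column_stack([K, shape])
    c_hyb, *_ = np.linalg.lstsq(K_aug, mixture, rcond=None)

    print("CLS estimate:   ", np.round(c_cls, 3))      # biased
    print("Hybrid estimate:", np.round(c_hyb[:2], 3))  # close to (1.0, 0.5)
    ```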

  7. RiskChanges Spatial Decision Support system for the analysis of changing multi-hazard risk

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Zhang, Kaixi; Bakker, Wim; Andrejchenko, Vera; Berlin, Julian; Olyazadeh, Roya; Cristal, Irina

    2015-04-01

    Within the framework of the EU FP7 Marie Curie Project CHANGES and the EU FP7 Copernicus project INCREO, a spatial decision support system was developed with the aim to analyse the effect of risk reduction planning alternatives on reducing risk now and in the future, and to support decision makers in selecting the best alternatives. Central to the SDSS are the stakeholders. The envisaged users of the system are organizations that are involved in planning risk reduction measures and that have staff capable of visualizing and analyzing spatial data at a municipal scale. The SDSS should be able to function in different countries with different legal frameworks and with organizations with different mandates. These can be subdivided into civil protection organizations with the mandate to design disaster response plans, expert organizations with the mandate to design structural risk reduction measures (e.g. dams, dikes, check-dams etc.), and planning organizations with the mandate to make land development plans. The SDSS can be used in different ways: analyzing the current level of risk, analyzing the best alternatives for risk reduction, evaluating the consequences of possible future scenarios for risk levels, and evaluating how different risk reduction alternatives will lead to risk reduction under different future scenarios. The SDSS is developed based on open source software and follows open standards, for code as well as for data formats and service interfaces. The architecture of the system is modular, with the various parts loosely coupled, extensible, standards-based for interoperability, flexible and web-based. The Spatial Decision Support System is composed of a number of integrated components. The Risk Assessment component allows users to carry out spatial risk analysis, with different degrees of complexity, ranging from simple exposure (overlay of hazard and assets maps) to

  8. A novel risk-based analysis for the production system under epistemic uncertainty

    NASA Astrophysics Data System (ADS)

    Khalaj, Mehran; Khalaj, Fereshteh; Khalaj, Amineh

    2013-11-01

    Risk analysis of a production system, when actual and appropriate data are not available, can lead to wrong prediction of system parameters and wrong decision making. Under uncertainty, there are no wholly suitable measures for decision making; under epistemic uncertainty in particular, we are confronted by a lack of data. Therefore, in calculating system risk we encounter vagueness and must resort to methods that remain efficient for decision making. In this research, using the Dempster-Shafer method and a risk assessment diagram, the researchers have developed a better method of calculating tool-failure risk. Traditional statistical methods for characterizing and evaluating systems are not always appropriate, especially when enough data are not available. The goal of this research was to present a more modern, applied method for real-world organizations. The findings were applied in a case study, and an appropriate framework and constraints for tool risk were provided. The research presents a promising concept for calculating production-system risk, and its results show that under uncertainty, or in case of a lack of knowledge, the selection of an appropriate method will facilitate the decision-making process.
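    For readers unfamiliar with the Dempster-Shafer formalism mentioned above, the sketch below combines two illustrative mass assignments over a two-element frame of discernment using Dempster's rule; the masses are invented, not taken from the case study.

    ```python
    # Minimal sketch of Dempster's rule of combination for two evidence
    # sources about a tool-failure hypothesis; mass values are illustrative.
    from itertools import product

    # Frame of discernment {"fail", "ok"}; THETA represents total ignorance.
    FAIL, OK = frozenset({"fail"}), frozenset({"ok"})
    THETA = frozenset({"fail", "ok"})

    m1 = {FAIL: 0.6, OK: 0.1, THETA: 0.3}   # expert judgment
    m2 = {FAIL: 0.4, OK: 0.2, THETA: 0.4}   # sparse field data

    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb             # mass falling on the empty set

    combined = {k: v / (1.0 - conflict) for k, v in combined.items()}  # normalize
    belief_fail = combined.get(FAIL, 0.0)
    plausibility_fail = sum(v for k, v in combined.items() if k & FAIL)
    print(f"Bel(fail)={belief_fail:.3f}, Pl(fail)={plausibility_fail:.3f}")
    ```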

  9. Standardised survey method for identifying catchment risks to water quality.

    PubMed

    Baker, D L; Ferguson, C M; Chier, P; Warnecke, M; Watkinson, A

    2016-06-01

    This paper describes the development and application of a systematic methodology to identify and quantify risks in drinking water and recreational catchments. The methodology assesses microbial and chemical contaminants from both diffuse and point sources within a catchment using Escherichia coli, protozoan pathogens and chemicals (including fuel and pesticides) as index contaminants. Hazard source information is gathered by a defined sanitary survey process involving use of a software tool which groups hazards into six types: sewage infrastructure, on-site sewage systems, industrial, stormwater, agriculture and recreational sites. The survey estimates the likelihood of the site affecting catchment water quality, and the potential consequences, enabling the calculation of risk for individual sites. These risks are integrated to calculate a cumulative risk for each sub-catchment and the whole catchment. The cumulative risks process accounts for the proportion of potential input sources surveyed and for transfer of contaminants from upstream to downstream sub-catchments. The output risk matrices show the relative risk sources for each of the index contaminants, highlighting those with the greatest impact on water quality at a sub-catchment and catchment level. Verification of the sanitary survey assessments and prioritisation is achieved by comparison with water quality data and microbial source tracking. PMID:27280603
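    The survey software itself is not described in detail in this abstract, but the core arithmetic (site risk as likelihood times consequence, aggregated per sub-catchment and adjusted for survey coverage and upstream transfer) can be sketched as follows, with all scores and factors assumed for illustration.

    ```python
    # Hypothetical sketch of likelihood-consequence scoring and sub-catchment
    # aggregation; the 5-point scales, site data, and adjustment factors are
    # invented, not the paper's values.
    sites = [
        # (hazard type, likelihood 1-5, consequence 1-5, sub-catchment)
        ("on-site sewage", 4, 3, "upper"),
        ("stormwater",     2, 4, "upper"),
        ("agriculture",    5, 2, "lower"),
    ]
    surveyed_fraction = {"upper": 0.8, "lower": 0.5}  # share of sources surveyed

    site_risk = [(h, l * c, sub) for h, l, c, sub in sites]  # risk = likelihood x consequence

    cumulative = {}
    for h, r, sub in site_risk:
        cumulative[sub] = cumulative.get(sub, 0) + r
    # Scale up for unsurveyed sources, a crude stand-in for the paper's adjustment.
    cumulative = {sub: r / surveyed_fraction[sub] for sub, r in cumulative.items()}

    # Transfer upstream risk downstream with an assumed attenuation factor of 0.5.
    cumulative["lower"] += 0.5 * cumulative["upper"]
    print(cumulative)
    ```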

  10. Analysis of reticulocyte counts using various methods.

    PubMed

    McKenzie, S B; Gauger, C A

    1991-01-01

    The precision and accuracy of manual reticulocyte counts using the Miller disc reticle, another ruled reticle, and no reticle are compared with the reticulocyte results from the automated Hematrak 590 instrument. Two slides of each of 50 patient blood specimens were sent to the hematology laboratories of each of six participating hospitals. In addition to the between-method comparison (precision), the manual method results using the three different counting techniques were each compared with the Hematrak results to determine whether there were significant differences in reported results (accuracy). Statistical analysis revealed that the Miller disc method was the most precise and accurate manual method as compared with the Hematrak. Methods without a Miller disc reported significantly higher reticulocyte counts. Imprecision was also higher among non-Miller manual methods. By using the Miller disc, the accuracy and precision of manual methods may be raised to that of the automated Hematrak method. PMID:10149411

  11. Assessment of ecological risks at former landfill site using TRIAD procedure and multicriteria analysis.

    PubMed

    Sorvari, Jaana; Schultz, Eija; Haimi, Jari

    2013-02-01

    Old industrial landfills are important sources of environmental contamination in Europe, including Finland. In this study, we demonstrated the combination of the TRIAD procedure, multicriteria decision analysis (MCDA), and statistical Monte Carlo analysis for assessing the risks to terrestrial biota in a former landfill site contaminated by petroleum hydrocarbons (PHCs) and metals. First, we generated hazard quotients by dividing the concentrations of metals and PHCs in soil by the corresponding risk-based ecological benchmarks. Then we conducted ecotoxicity tests using five plant species, earthworms, and potworms, and determined the abundance and diversity of soil invertebrates from additional samples. We aggregated the results in accordance with the methods used in the TRIAD procedure, rated the assessment methods based on their performance against specific criteria, and weighted the criteria using two alternative weighting techniques to produce performance scores for each method. We faced problems in using the TRIAD procedure; for example, the results from the animal counts had to be excluded from the calculation of integrated risk estimates (IREs) because our reference soil sample showed the lowest biodiversity and abundance of soil animals. In addition, hormesis hampered the use of the results from the ecotoxicity tests. The final probabilistic IREs imply significant risks at all sampling locations. Although linking MCDA with TRIAD provided a useful means to study and consider the performance of the alternative methods in predicting ecological risks, some of the uncertainties involved remained outside the quantitative analysis. PMID:22762796
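    Two of the computational steps described, hazard quotients and weighted aggregation of lines of evidence, are simple enough to sketch; the concentrations, benchmarks, evidence scores, and weights below are invented and do not come from the study.

    ```python
    # Illustrative sketch of hazard quotients and a TRIAD-style weighted
    # aggregation of lines of evidence; all numbers are hypothetical.

    # Hazard quotient: measured concentration over a risk-based benchmark.
    conc =      {"zinc": 410.0, "PHC_C10_C40": 900.0}   # mg/kg soil (assumed)
    benchmark = {"zinc": 200.0, "PHC_C10_C40": 300.0}
    hq = {k: conc[k] / benchmark[k] for k in conc}
    print("hazard quotients:", hq)

    # Aggregation: each line of evidence scaled to a 0-1 risk index, then
    # combined with performance-based weights (both assumed here).
    lines_of_evidence = {"chemistry": 0.7, "ecotoxicity": 0.5, "ecology": 0.6}
    weights =           {"chemistry": 0.4, "ecotoxicity": 0.4, "ecology": 0.2}
    ire = sum(lines_of_evidence[k] * weights[k] for k in weights)
    print(f"integrated risk estimate: {ire:.2f}")
    ```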

  12. The Use of Object-Oriented Analysis Methods in Surety Analysis

    SciTech Connect

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  13. An Empirical Bayes Method for Multivariate Meta-analysis with an Application in Clinical Trials

    PubMed Central

    Chen, Yong; Luo, Sheng; Chu, Haitao; Su, Xiao; Nie, Lei

    2013-01-01

    We propose an empirical Bayes method for evaluating overall and study-specific treatment effects in multivariate meta-analysis with binary outcome. Instead of modeling transformed proportions or risks via commonly used multivariate general or generalized linear models, we directly model the risks without any transformation. The exact posterior distribution of the study-specific relative risk is derived. The hyperparameters in the posterior distribution can be inferred through an empirical Bayes procedure. As our method does not rely on the choice of transformation, it provides a flexible alternative to the existing methods and in addition, the correlation parameter can be intuitively interpreted as the correlation coefficient between risks. PMID:25089070
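    The paper's multivariate model and exact posterior are beyond a short example, but the flavor of empirical Bayes estimation of risks without transformation can be shown with a univariate beta-binomial stand-in, with hyperparameters fit by method of moments; the study counts below are invented.

    ```python
    # Simplified stand-in for modeling risks directly: empirical Bayes
    # shrinkage of study-specific event risks under a beta-binomial model.
    # The paper's actual model is multivariate; these data are invented.
    import numpy as np

    events = np.array([5, 12, 3, 20, 8])      # events per study arm
    n =      np.array([50, 80, 40, 100, 60])  # subjects per study arm
    p_hat = events / n

    # Method-of-moments estimates of Beta(alpha, beta) hyperparameters.
    mu, var = p_hat.mean(), p_hat.var(ddof=1)
    common = mu * (1 - mu) / var - 1
    alpha, beta = mu * common, (1 - mu) * common

    # Posterior mean risk per study via the conjugate beta-binomial update.
    posterior_mean = (events + alpha) / (n + alpha + beta)
    print("raw risks:     ", np.round(p_hat, 3))
    print("shrunken risks:", np.round(posterior_mean, 3))
    ```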

  14. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  15. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  16. Methods for assessing uncertainty in fundamental assumptions and associated models for cancer risk assessment.

    PubMed

    Small, Mitchell J

    2008-10-01

    The distributional approach for uncertainty analysis in cancer risk assessment is reviewed and extended. The method considers a combination of bioassay study results, targeted experiments, and expert judgment regarding biological mechanisms to predict a probability distribution for uncertain cancer risks. Probabilities are assigned to alternative model components, including the determination of human carcinogenicity, mode of action, the dosimetry measure for exposure, the mathematical form of the dose-response relationship, the experimental data set(s) used to fit the relationship, and the formula used for interspecies extrapolation. Alternative software platforms for implementing the method are considered, including Bayesian belief networks (BBNs) that facilitate assignment of prior probabilities, specification of relationships among model components, and identification of all output nodes on the probability tree. The method is demonstrated using the application of Evans, Sielken, and co-workers for predicting cancer risk from formaldehyde inhalation exposure. Uncertainty distributions are derived for maximum likelihood estimate (MLE) and 95th percentile upper confidence limit (UCL) unit cancer risk estimates, and the effects of resolving selected model uncertainties on these distributions are demonstrated, considering both perfect and partial information for these model components. A method for synthesizing the results of multiple mechanistic studies is introduced, considering the assessed sensitivities and selectivities of the studies for their targeted effects. A highly simplified example is presented illustrating assessment of genotoxicity based on studies of DNA damage response caused by naphthalene and its metabolites. The approach can provide a formal mechanism for synthesizing multiple sources of information using a transparent and replicable weight-of-evidence procedure. PMID:18844862
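    A toy version of the distributional approach can be written as an enumeration over uncertain model components, each with an assigned probability and a multiplicative effect on the unit risk estimate; the components, probabilities, and factors below are illustrative only, not the formaldehyde application.

    ```python
    # Hedged sketch: enumerate combinations of uncertain model components to
    # build a discrete distribution over unit-risk estimates. All values are
    # invented for illustration.
    from itertools import product

    # (choice, probability, multiplicative effect on the unit risk)
    carcinogen = [("human carcinogen", 0.7, 1.0), ("not carcinogenic", 0.3, 0.0)]
    dose_resp  = [("linear", 0.6, 1.0), ("sublinear", 0.4, 0.25)]
    extrapol   = [("body-weight scaling", 0.5, 1.0), ("surface-area scaling", 0.5, 1.8)]

    base_unit_risk = 1e-5  # hypothetical baseline unit cancer risk

    distribution = {}
    for (c, pc, fc), (d, pd, fd), (e, pe, fe) in product(carcinogen, dose_resp, extrapol):
        risk = base_unit_risk * fc * fd * fe
        distribution[risk] = distribution.get(risk, 0.0) + pc * pd * pe

    for risk, prob in sorted(distribution.items()):
        print(f"unit risk {risk:.2e}: probability {prob:.3f}")
    ```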

  17. Landslide hazard, vulnerability and risk assessment: methods, limits and challenges (Invited)

    NASA Astrophysics Data System (ADS)

    Guzzetti, F.

    2010-12-01

    Landslides are common and widespread geomorphological phenomena that contribute to shaping landscapes in all continents. Slope failures are caused by different climatic, meteorological and geophysical triggers and by multiple human activities, and pose a threat to the population, to private and public properties, and to the environment. The large variety of landslide phenomena makes it difficult to establish a single methodology to determine landslide hazard, to ascertain the vulnerability to landslides, and to evaluate landslide risk, at different spatial and temporal scales, and in diverse geomorphological settings. Establishing landslide hazard in a region requires deciding “where”, “when” and “how destructive” landslides are expected. Probabilistic models exist to determine landslide hazard, but these models work under general geomorphological assumptions that are difficult to prove locally. In general, the most difficult - and the most uncertain - component of a landslide hazard assessment is the determination of “when” landslides are expected. Methods based on statistical or deterministic thresholds, or on the analysis of time series of landslides, exist but have limitations related chiefly to the scarcity of data. In addition, validation of the results of a landslide hazard model is a challenging task with the information commonly available. Vulnerability is the degree of loss to a given element, or a set of elements at risk, resulting from the occurrence of a landslide. Standards for measuring the vulnerability to landslides have not been established, and catalogues listing information on landslide damage to different types of elements at risk are rare. Lack of information on landslide vulnerability limits our ability to ascertain landslide risk. Risk analysis aims to determine the probability that a specific hazard (an individual landslide or a group of landslides) will cause harm, and it investigates the relationships between the

  18. Methods for impact analysis of shipping containers

    SciTech Connect

    Nelson, T.A.; Chun, R.C.

    1987-11-01

    This report reviews methods for performing impact stress analyses of shipping containers used to transport spent fuel. The three methods discussed are quasi-static, dynamic lumped parameter, and dynamic finite element. These methods are used by industry for performing impact analyses for Safety Analysis Reports. The approach for each method is described, including assumptions, limitations, and modeling considerations. The effects of uncertainties in the modeling and analysis of casks are identified. Each of the methods uses linear elastic structural analysis principles. Methods for interfacing impact stresses with the design and load combinations criteria specified in Regulatory Guides 7.6 and 7.8 are outlined. The quasi-static method is based on D'Alembert's principle, substituting equivalent static forces for the inertial forces created by the impact. The lumped parameter method uses a discrete number of stiffness elements and masses to represent the cask during impact. The dynamic finite element method uses finite element techniques combined with time integration to analyze the cask impact. Each of these methods can provide an acceptable means, within certain limitations, for analyzing cask impact on unyielding surfaces. 25 refs., 23 figs.
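    Of the three methods, the lumped-parameter approach is the easiest to sketch: a single mass and a single stiffness integrated through impact with an unyielding surface. The parameters below (cask mass, limiter stiffness, 9 m drop velocity) are assumptions for illustration, not values from the report.

    ```python
    # Minimal single-degree-of-freedom sketch of the lumped-parameter method:
    # one mass (the cask) on one spring (the impact limiter), integrated
    # through contact with an unyielding surface. Parameters are illustrative.
    import numpy as np

    m = 50_000.0    # cask mass, kg
    k = 2.0e8       # impact-limiter stiffness, N/m
    v0 = 13.4       # impact velocity from a 9 m drop, m/s (approx sqrt(2*9.81*9))

    dt, x, v = 1e-5, 0.0, v0
    peak_force = 0.0
    while x >= 0.0:                 # contact lasts while the limiter is compressed
        a = -k * x / m              # spring deceleration
        v += a * dt                 # semi-implicit Euler integration
        x += v * dt
        peak_force = max(peak_force, k * x)

    # Closed form for comparison: F_peak = v0 * sqrt(k * m).
    print(f"simulated peak force: {peak_force / 1e6:.1f} MN")
    print(f"analytic peak force:  {v0 * np.sqrt(k * m) / 1e6:.1f} MN")
    ```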

  19. Downside Risk analysis applied to the Hedge Funds universe

    NASA Astrophysics Data System (ADS)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way around this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
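    As a concrete example of the indicators discussed, the sketch below computes the downside deviation below an investor's target and the corresponding Sortino-style ratio on simulated fat-tailed returns; the paper itself works with the Credit Suisse/Tremont index data, not this simulation.

    ```python
    # Sketch of two common Downside Risk quantities on a return series:
    # downside deviation below a target and the resulting Sortino-style
    # ratio. Returns are simulated with fat tails for illustration.
    import numpy as np

    rng = np.random.default_rng(3)
    returns = rng.standard_t(df=4, size=1200) * 0.01 + 0.004  # fat-tailed monthly returns
    target = 0.0                                              # investor's goal ("bad" = below)

    downside = np.minimum(returns - target, 0.0)
    downside_deviation = np.sqrt(np.mean(downside ** 2))
    sortino = (returns.mean() - target) / downside_deviation

    print(f"downside deviation: {downside_deviation:.4f}")
    print(f"Sortino-style ratio: {sortino:.2f}")
    ```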

  20. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of food supply safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user-friendly software
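    The relation commonly used to link these metrics at the product level is, in log10 units, H0 - ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, R the reductions and I the increases along the food chain. The numbers in the sketch below are invented for illustration.

    ```python
    # Illustrative check of the standard food-safety relation linking an FSO
    # to process performance: H0 - sum(R) + sum(I) <= FSO, all terms in
    # log10 CFU/g. Every number here is invented.
    H0 = 3.0                 # initial hazard level, log10 CFU/g
    reductions = [5.0]       # e.g., a 5-log thermal inactivation step
    increases = [0.5, 1.0]   # e.g., recontamination and growth during storage
    FSO = -1.0               # food safety objective at consumption, log10 CFU/g

    level_at_consumption = H0 - sum(reductions) + sum(increases)
    print(f"level at consumption: {level_at_consumption:.1f} log10 CFU/g")
    print("meets FSO" if level_at_consumption <= FSO else "fails FSO")
    ```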

  1. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    SciTech Connect

    Skandamis, Panagiotis N. Andritsos, Nikolaos Psomas, Antonios Paramythiotis, Spyridon

    2015-01-22

    The current status of food supply safety worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To that end, a series of new Risk Metrics has been established, as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum number of illnesses in a population per annum, is defined by quantitative risk assessments and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products has been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and of the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer) or introduced into user

  2. Comparative risk analysis of technological hazards (a review).

    PubMed Central

    Kates, R W; Kasperson, J X

    1983-01-01

    Hazards are threats to people and what they value and risks are measures of hazards. Comparative analyses of the risks and hazards of technology can be dated to Starr's 1969 paper [Starr, C. (1969) Science 165, 1232-1238] but are rooted in recent trends in the evolution of technology, the identification of hazard, the perception of risk, and the activities of society. These trends have spawned an interdisciplinary quasi profession with new terminology, methodology, and literature. A review of 54 English-language monographs and book-length collections, published between 1970 and 1983, identified seven recurring themes: (i) overviews of the field of risk assessment, (ii) efforts to estimate and quantify risk, (iii) discussions of risk acceptability, (iv) perception, (v) analyses of regulation, (vi) case studies of specific technological hazards, and (vii) agenda for research. Within this field, science occupies a unique niche, for many technological hazards transcend the realm of ordinary experience and require expert study. Scientists can make unique contributions to each area of hazard management but their primary contribution is the practice of basic science. Beyond that, science needs to further risk assessment by understanding the more subtle processes of hazard creation and by establishing conventions for estimating risk and for presenting and handling uncertainty. Scientists can enlighten the discussion of tolerable risk by setting risks into comparative contexts, by studying the process of evaluation, and by participating as knowledgeable individuals, but they cannot decide the issue. Science can inform the hazard management process by broadening the range of alternative control actions and modes of implementation and by devising methods to evaluate their effectiveness. PMID:6580625

  3. Rocky Flats Plant Live-Fire Range Risk Analysis Report

    SciTech Connect

    Nicolosi, S.L.; Rodriguez, M.A.

    1994-04-01

    The objective of the Live-Fire Range Risk Analysis Report (RAR) is to provide an authorization basis for operation as required by DOE 5480.16. The existing Live-Fire Range does not have a safety analysis-related authorization basis. EG&G Rocky Flats, Inc. has worked with DOE and its representatives to develop a format and content description for development of an RAR for the Live-Fire Range. Development of the RAR is closely aligned with development of the design for a baffle system to control risks from errant projectiles. DOE 5480.16 requires either an RAR or a safety analysis report (SAR) for live-fire ranges. An RAR rather than a SAR was selected in order to gain flexibility to more closely address the safety analysis and conduct of operation needs for a live-fire range in a cost-effective manner.

  4. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Gutanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Gutanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  5. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the “source-pathway-target” in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Gutanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Gutanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  6. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-02

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Gutanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Gutanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.

  7. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  8. Risk analysis and priority setting for environmental policy

    SciTech Connect

    Travis, C.C.

    1991-01-01

    There is a growing realization that the demand for funding to correct our nation's environmental problems will soon outstrip available resources. In the hazardous waste area alone, the estimated cost of remediating Superfund sites ranges from $32 billion to $80 billion. Numerous other areas are competing for these same financial resources. These include ozone depletion, global warming, the protection of endangered species and wetlands, toxic air pollution, carcinogenic pesticides, and urban smog. In response to this imbalance in the supply and demand for national funds, several political constituencies are calling for the use of risk assessment as a tool in the prioritization of research and budget needs. Comparative risk analysis offers a logical framework in which to organize information about complex environmental problems. Risk analysis allows policy analysts to make resource allocation decisions on the basis of scientific judgement rather than political expediency.

  9. Risk analysis and priority setting for environmental policy

    SciTech Connect

    Travis, C.C.

    1991-12-31

    There is a growing realization that the demand for funding to correct our nation's environmental problems will soon outstrip available resources. In the hazardous waste area alone, the estimated cost of remediating Superfund sites ranges from $32 billion to $80 billion. Numerous other areas are competing for these same financial resources. These include ozone depletion, global warming, the protection of endangered species and wetlands, toxic air pollution, carcinogenic pesticides, and urban smog. In response to this imbalance in the supply and demand for national funds, several political constituencies are calling for the use of risk assessment as a tool in the prioritization of research and budget needs. Comparative risk analysis offers a logical framework in which to organize information about complex environmental problems. Risk analysis allows policy analysts to make resource allocation decisions on the basis of scientific judgement rather than political expediency.

  10. Towards secure virtual directories : a risk analysis framework.

    SciTech Connect

    Claycomb, William R.

    2010-07-01

    Directory services are used by almost every enterprise computing environment to provide data concerning users, computers, contacts, and other objects. Virtual directories are components that provide directory services in a highly customized manner. Unfortunately, although the use of virtual directory services is widespread, an analysis of the risks posed by their unique position and architecture has not been completed. We present a detailed analysis of six attacks on virtual directory services, including steps for detection and prevention. We also describe various categories of attack risks, and discuss what is necessary to launch an attack on virtual directories. Finally, we present a framework for analyzing risks to individual enterprise computing virtual directory instances. We show how to apply this framework to an example implementation, and discuss the benefits of doing so.

  11. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, improvements to risk assessment and management have been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities in facilitating the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels across a spatio-temporal domain and can provide better and more complete
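    The damage-estimation core of a flood risk tool of this kind can be sketched with a depth-damage curve and an integration of damages over annual exceedance probabilities; the curve, scenarios, and asset value below are assumptions, not FloodRisk's actual data or API.

    ```python
    # Hedged sketch of expected annual damage (EAD) from flood scenarios via
    # a depth-damage curve; all data here are invented for illustration.
    import numpy as np

    # Depth-damage curve: fraction of asset value lost at a given depth (m).
    depth_pts = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
    damage_frac = np.array([0.0, 0.15, 0.40, 0.70, 0.95])

    def damage(depth, value):
        return value * np.interp(depth, depth_pts, damage_frac)

    # Scenarios: (annual exceedance probability, flood depth at the asset).
    scenarios = [(0.1, 0.4), (0.01, 1.2), (0.001, 2.5)]
    asset_value = 200_000.0

    # EAD by trapezoidal integration of damage over exceedance probability.
    p = np.array([pr for pr, _ in scenarios])[::-1]            # ascending probs
    d = np.array([damage(dep, asset_value) for _, dep in scenarios])[::-1]
    ead = float(np.sum((p[1:] - p[:-1]) * (d[1:] + d[:-1]) / 2.0))
    print(f"expected annual damage: ${ead:,.0f}")
    ```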

  12. School Health Promotion Policies and Adolescent Risk Behaviors in Israel: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Tesler, Riki; Harel-Fisch, Yossi; Baron-Epel, Orna

    2016-01-01

    Background: Health promotion policies targeting risk-taking behaviors are being implemented across schools in Israel. This study identified the most effective components of these policies influencing cigarette smoking and alcohol consumption among adolescents. Methods: Logistic hierarchical linear model (HLM) analysis of data for 5279 students in…

  13. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  14. Risk Factors for the Perpetration of Child Sexual Abuse: A Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Whitaker, Daniel J.; Le, Brenda; Hanson, R. Karl; Baker, Charlene K.; McMahon, Pam M.; Ryan, Gail; Klein, Alisa; Rice, Deborah Donovan

    2008-01-01

    Objectives: Since the late 1980s, there has been a strong theoretical focus on psychological and social influences of perpetration of child sexual abuse. This paper presents the results of a review and meta-analysis of studies examining risk factors for perpetration of child sexual abuse published since 1990. Method: Eighty-nine studies published…

  15. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  16. Mudflow Hazards in the Georgian Caucasus - Using Participatory Methods to Investigate Disaster Risk

    NASA Astrophysics Data System (ADS)

    Spanu, Valentina; McCall, Michael; Gaprindashvili, George

    2014-05-01

    …Disaster Risk Reduction/Management (DRR and DRM). Participatory surveys (and participatory monitoring) elicit local people's knowledge about the specifics of the hazard concerning frequency, timing, warning signals, rates of flow, spatial extent, etc. Significantly, only this local knowledge from informants can reveal essential information about the different vulnerabilities of people and places, and about any coping or adjustment mechanisms that local people have. The participatory methods employed in Mleta included historical discussions with key informants, village social transects, participatory mapping with children, semi-structured interviews with inhabitants, and VCA (Vulnerability & Capacity Analysis). The geomorphological map, produced on the basis of the local geology, was created with ArcGIS. This allowed the assessment of the areas at risk and the production of the corresponding maps. We adapted and tested the software programme CyberTracker, a digital method of field data collection, as a survey tool. Google Earth, OpenStreetMap, Virtual Earth and Ilwis were used for data processing.

  17. Characterization and evaluation of uncertainty in probabilistic risk analysis

    SciTech Connect

    Parry, G.W.; Winter, P.W.

    1981-01-01

    The sources of uncertainty in probabilistic risk analysis are discussed, using the event- and fault-tree methodology as an example. The role of statistics in quantifying these uncertainties is investigated. A class of uncertainties is identified which are, at present, unquantifiable using either classical or Bayesian statistics. It is argued that Bayesian statistics is the more appropriate vehicle for the probabilistic analysis of rare events, and a short review is given with some discussion on the representation of ignorance.

  18. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  19. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  20. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  1. RASOR Project: Rapid Analysis and Spatialisation of Risk, from Hazard to Risk using EO data

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto

    2016-04-01

    Over recent decades, there has been a dramatic rise in disasters and in their impact on human populations. Escalating complexity in our societies is making risks increasingly difficult to understand and is changing the ways in which hazards interact with each other. The Rapid Analysis and Spatialisation Of Risk (RASOR) project developed a multi-hazard risk analysis platform to support the full cycle of disaster management. RASOR provides up-to-date hazard information across floods and geohazards, up-to-date exposure data from known sources and newly-generated EO-based data, and quantitative characterisations of their vulnerabilities. RASOR also adapts the newly-developed 12 m resolution global TanDEM-X Digital Elevation Model (DEM) to risk management applications, using it as a base layer to develop specific disaster scenarios. RASOR overlays archived and near real-time very high resolution optical and radar satellite data, combined with in situ data, for both global and local applications. A scenario-driven query system allows users to project situations into the future and model multi-hazard risk both before and during an event. Applications to different case study sites are presented in order to illustrate the platform's potential.

  2. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specific developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injures at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference. PMID:15908107
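    The crossing of fragility curves with PSHA output reduces, in discrete form, to weighting the fragility by the annual occurrence rates of ground-motion levels. The lognormal fragility parameters and the hazard curve below are invented for illustration, not taken from the study.

    ```python
    # Hedged sketch: annual failure probability as the hazard-weighted
    # average of a lognormal fragility curve. All parameters are invented.
    import numpy as np
    from scipy.stats import norm

    # Fragility: P(tank failure | PGA = a), lognormal, median 0.4 g, beta 0.5.
    median_g, beta = 0.4, 0.5
    fragility = lambda a: norm.cdf(np.log(a / median_g) / beta)

    # PSHA output as annual exceedance rates at discrete PGA levels (assumed).
    pga = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
    exceed_rate = np.array([1e-1, 3e-2, 6e-3, 8e-4, 5e-5])

    # Occurrence rate of each PGA bin = difference of exceedance rates,
    # with each bin's rate attributed to its lower PGA level (crude binning).
    occ_rate = -np.diff(np.append(exceed_rate, 0.0))

    annual_p_fail = np.sum(occ_rate * fragility(pga))
    print(f"annual failure probability: {annual_p_fail:.2e}")
    ```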

  3. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  4. Safety risk analysis of an innovative environmental technology.

    PubMed

    Parnell, G S; Frimpon, M; Barnes, J; Kloeber, J M; Deckro, R E; Jackson, J A

    2001-02-01

    The authors describe a decision and risk analysis performed for the cleanup of a large Department of Energy mixed-waste subsurface disposal area governed by the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). In a previous study, the authors worked with the site decision makers, state regulators, and U.S. Environmental Protection Agency regional regulators to develop a CERCLA-based multiobjective decision analysis value model and used the model to perform a screening analysis of 28 remedial alternatives. The analysis results identified an innovative technology, in situ vitrification, with high effectiveness versus cost. Since this technology had not been used on this scale before, the major uncertainties were contaminant migration and pressure buildup. Pressure buildup was a particular concern due to the potential risks to worker safety. With the help of environmental technology experts, changes to the remedial alternative were identified to mitigate the concerns about contaminant migration and pressure buildup. The analysis results showed that the probability of an event posing a risk to worker safety had been significantly reduced. Based on these results, site decision makers have refocused their test program to examine in situ vitrification and have continued the use of the CERCLA-based decision analysis methodology to analyze remedial alternatives. PMID:11332543

  5. Determinants of contraceptive method among young women at risk for unintended pregnancy and sexually transmitted infections.

    PubMed

    Raine, Tina; Minnis, Alexandra M; Padian, Nancy S

    2003-07-01

    The objective of this study was to examine the relationship between contraceptive method choice, sexual risk and various demographic and social factors. Data were collected on 378 women aged 15-24 years, recruited from health clinics and through community outreach in Northern California. Logistic regression analysis was used to estimate the association of predictors with the contraceptive method used at last sex. Asian and Latina women were less likely to use any method. Women who were raised with a religion, or who thought they were infertile, were also less likely to use any method. Women with multiple partners were generally less likely to use any method, but were more likely to use barrier methods when they did use one. Few women (7%) were dual method users. Women appear to act in a rational fashion within their own social context and may use no method at all or use methods that are less effective for pregnancy prevention but offer more protection from sexually transmitted infections. PMID:12878282

  6. The Short- and Long-Term Risk of Stroke after Herpes Zoster: A Meta-Analysis

    PubMed Central

    Liu, Xuechun; Guan, Yeming; Hou, Liang; Huang, Haili; Liu, Hongjuan; Li, Chuanwen; Zhu, Yingying; Tao, Xingyong; Wang, Qingsong

    2016-01-01

    Background Accumulating evidence indicates that stroke risk may be increased following herpes zoster. The aim of this study is to perform a meta-analysis of the current literature to systematically analyze and quantitatively estimate the short- and long-term effects of herpes zoster on the risk of stroke. Methods Embase, PubMed and Cochrane Library databases were searched for relevant studies up to March 2016. Studies were selected for analysis based on specified inclusion and exclusion criteria. Relative risks with 95% confidence intervals (CI) were extracted to assess the association between herpes zoster and stroke. Results A total of 8 articles were included in our analysis. The present meta-analysis showed that the risks of stroke after herpes zoster were 2.36 (95% CI: 2.17–2.56) for the first 2 weeks, 1.56 (95% CI: 1.46–1.66) for the first month, 1.17 (95% CI: 1.13–1.22) for the first year, and 1.09 (95% CI: 1.02–1.16) for more than 1 year, respectively. Conclusion The results of our study demonstrated that herpes zoster was associated with a higher risk of stroke, but the risk decreased with time after herpes zoster. PMID:27768762
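    The pooling behind such estimates can be illustrated with a fixed-effect inverse-variance computation plus Cochran's Q and the I2 heterogeneity index; the four studies below are invented, not the eight analyzed in the paper.

    ```python
    # Sketch of inverse-variance pooling of relative risks with an I^2
    # heterogeneity check; study data are invented for illustration.
    import numpy as np

    log_rr = np.log(np.array([2.4, 2.1, 2.6, 2.3]))  # per-study relative risks
    se = np.array([0.10, 0.15, 0.20, 0.12])          # standard errors of log RR

    w = 1.0 / se**2                                  # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))

    # Cochran's Q and the I^2 index quantify between-study heterogeneity.
    Q = np.sum(w * (log_rr - pooled) ** 2)
    df = len(log_rr) - 1
    I2 = max(0.0, (Q - df) / Q) * 100.0

    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), I^2 = {I2:.0f}%")
    ```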

  7. Scientific commentary: Strategic analysis of environmental policy risks--heat maps, risk futures and the character of environmental harm.

    PubMed

    Prpich, G; Dagonneau, J; Rocks, S A; Lickorish, F; Pollard, S J T

    2013-10-01

    We summarise our recent efforts on the policy-level risk appraisal of environmental risks. These have necessitated working closely with policy teams and a requirement to maintain crisp and accessible messages for policy audiences. Our comparative analysis uses heat maps, supplemented with risk narratives, and employs the multidimensional character of risks to inform debates on the management of current residual risk and future threats. The policy research and ensuing analysis raises core issues about how comparative risk analyses are used by policy audiences, their validation and future developments that are discussed in the commentary below.

  8. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-01-01

    Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are the most widely used techniques for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy are discussed for each developed method; excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications on the application of GC and HPLC for the analysis of triazine herbicide residues in various samples.

  9. Systemic Lupus Erythematosus and Malignancy Risk: A Meta-Analysis

    PubMed Central

    Xu, Gaixiang; Liu, Ping; Meng, Haitao; Wang, Jinghan; Zhao, Xiaoying; Tang, Yongmin; Jin, Jie

    2015-01-01

    Background Pilot studies have estimated cancer incidence in patients with systemic lupus erythematosus (SLE). However, the results have been inconclusive. To ascertain the correlation between SLE and malignancy more comprehensively and precisely, we conducted a meta-analysis. Methods PubMed, the Cochrane Library and Embase databases were searched through June 2014 to identify observational studies evaluating the association between SLE and malignancy. The outcomes from these studies were measured as relative risks (RRs). A random- or fixed-effects model was chosen to calculate the pooled RR according to the heterogeneity test. Between-study heterogeneity was assessed by estimating the I2 index. Publication bias was assessed by Egger’s test. Results A total of 16 papers, including 59,662 SLE patients, were suitable for the meta-analysis. Of these papers, 15 reported RRs for overall malignancy, 12 for non-Hodgkin lymphoma (NHL) and lung cancer, 7 for bladder cancer, 6 for Hodgkin lymphoma (HL) and leukemia, 5 for skin melanoma and liver and thyroid cancers, 4 for multiple myeloma (MM) and esophageal and vaginal/vulvar cancers, and 3 for laryngeal and non-melanoma skin cancers. The pooled RRs were 1.28 (95% CI, 1.17–1.41) for overall cancer, 5.40 (95% CI, 3.75–7.77) for NHL, 3.26 (95% CI, 2.17–4.88) for HL, 2.01 (95% CI, 1.61–2.52) for leukemia, 1.45 (95% CI, 1.04–2.03) for MM, 4.19 (95% CI, 1.98–8.87) for laryngeal cancer, 1.59 (95% CI, 1.44–1.76) for lung cancer, 1.86 (95% CI, 1.21–2.88) for esophageal cancer, 3.21 (95% CI, 1.70–6.05) for liver cancer, 3.67 (95% CI, 2.80–4.81) for vaginal/vulvar cancer, 2.11 (95% CI, 1.12–3.99) for bladder cancer, 1.51 (95% CI, 1.12–2.03) for non-melanoma skin cancer, 1.78 (95% CI, 1.35–2.33) for thyroid cancer, and 0.65 (95% CI, 0.50–0.85) for skin melanoma. Only the meta-analyses of overall malignancy, NHL, and liver and bladder cancers produced substantial heterogeneity (I2 = 57.6%, 74.3%, 67.7% and 82.3%, respectively).

  10. Rationale and methods of the cardiometabolic valencian study (escarval-risk) for validation of risk scales in mediterranean patients with hypertension, diabetes or dyslipidemia

    PubMed Central

    2010-01-01

    Background The Escarval-Risk study aims to validate cardiovascular risk scales in patients with hypertension, diabetes or dyslipidemia living in the Valencia Community, a European Mediterranean region, based on data from an electronic health recording system, comparing predicted events with those observed during a 5-year follow-up. Methods/Design A prospective 5-year follow-up cohort study has been designed, including 25,000 patients with hypertension, diabetes and/or dyslipidemia seen in usual clinical practice. All information is registered in a unique electronic health recording system (ABUCASIS), which is the usual way of registering clinical practice in the Valencian Health System (primary and secondary care). The system covers about 95% of the population (nearly 5 million people). The system is linked with the mortality register, hospital discharge records, and prescription and insurance databases, in which each individual has a unique identification number. Diagnoses in clinical practice are always registered based on ICD-9. Occurrence of cardiovascular (CV) disease was the main outcome of interest. Risk survival analysis methods will be applied to estimate the cumulative incidence of developing CV events over time. Discussion The Escarval-Risk study will provide information to validate different cardiovascular risk scales in patients with hypertension, diabetes or dyslipidemia from a low-risk Mediterranean region, the Valencia Community. PMID:21092179

  11. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J.

    1993-01-01

    This special edition publication volume comprises papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25-29, 1993 in Denver, Colorado. The papers were prepared for presentation in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  12. ATMOS data processing and science analysis methods.

    PubMed

    Norton, R H; Rinsland, C P

    1991-02-01

    The ATMOS (atmospheric trace molecule spectroscopy) instrument, a high speed Fourier transform spectrometer operating in the middle IR (2.2-16 microm), recorded more than 1500 solar spectra at approximately 0.0105-cm(-1) resolution during its first mission onboard the shuttle Challenger in the spring of 1985. These spectra were acquired during high sun conditions for studies of the solar atmosphere and during low sun conditions for studies of the earth's upper atmosphere. This paper describes the steps by which the telemetry data were converted into spectra suitable for analysis, the analysis software and methods developed for the atmospheric and solar studies, and the ATMOS data analysis facility.

  13. Numerical analysis of the orthogonal descent method

    SciTech Connect

    Shokov, V.A.; Shchepakin, M.B.

    1994-11-01

    The author of the orthogonal descent method has been testing it since 1977. The results of these tests have only strengthened the need for further analysis and development of orthogonal descent algorithms for various classes of convex programming problems. Systematic testing of orthogonal descent algorithms, and comparison of the test results with other nondifferentiable optimization methods, was conducted at TsEMI RAN in 1991-1992.

  14. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  15. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes.
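
    The core screening idea, stripped of the paper's full likelihood machinery, is a one-sided test for excess homozygosity by descent at a marker. A toy sketch follows; the counts and the no-linkage expectation are invented.

```python
from scipy.stats import binomtest

# Toy illustration (not the authors' full likelihood method): among n
# trisomic affected individuals, k are homozygous by descent at a marker
# in the chromosomes inherited from the nondisjoining parent; test for
# excess over the assumed no-linkage expectation p0.
n, k = 60, 28        # hypothetical sample size and homozygote count
p0 = 0.30            # assumed expected homozygosity under no linkage

result = binomtest(k, n, p0, alternative="greater")
print(f"observed {k/n:.2f} vs expected {p0:.2f}, "
      f"one-sided p = {result.pvalue:.4f}")
```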

  16. Japanese Encephalitis Risk and Contextual Risk Factors in Southwest China: A Bayesian Hierarchical Spatial and Spatiotemporal Analysis

    PubMed Central

    Zhao, Xing; Cao, Mingqin; Feng, Hai-Huan; Fan, Heng; Chen, Fei; Feng, Zijian; Li, Xiaosong; Zhou, Xiao-Hua

    2014-01-01

    It is valuable to study the spatiotemporal pattern of Japanese encephalitis (JE) and its association with contextual risk factors in southwest China, the most endemic area of the country. Using data from 2004 to 2009, we applied GIS mapping and spatial autocorrelation analysis to the reported incidence data of JE in 438 counties in southwest China, finding that JE cases were not randomly distributed, and a Bayesian hierarchical spatiotemporal model identified the east part of southwest China as a high-risk area. Meanwhile, the Bayesian hierarchical spatial model for 2006 demonstrated a statistically significant association between JE and agricultural and climatic variables, including the proportion of rural population, the pig-to-human ratio, the monthly precipitation and the monthly mean minimum and maximum temperatures. Particular emphasis was placed on the time-lagged effect of the climatic factors. The regression method and the Spearman correlation analysis both identified a two-month lag for precipitation, while the regression method found a one-month lag for temperature. The results show that the high-risk area in the east part of southwest China may be connected to agricultural and climatic factors. Routine surveillance and the allocation of health resources should be given more attention in this area. Moreover, the meteorological variables might be considered as possible predictors of JE in southwest China. PMID:24739769

  17. Identifying High-Risk Populations of Tuberculosis Using Environmental Factors and GIS Based Multi-Criteria Decision Making Method

    NASA Astrophysics Data System (ADS)

    Rasam, A. R. Abdul; Shariff, N. M.; Dony, J. F.

    2016-09-01

    Development of an innovative method to enhance the detection of tuberculosis (TB) in Malaysia is the latest agenda of the Ministry of Health. Therefore, a geographical information system (GIS) based index model is proposed as an alternative method for defining potential high-risk areas of local TB cases at Section U19, Shah Alam. It adopts a spatial multi-criteria decision making (MCDM) method for ranking environmental risk factors of the disease on a standardised five-point scale. Scores 1 and 5 represent the lowest and highest risk of TB spread respectively, while scores from 3 to 5 are treated as potential risk levels. These standardised scores are then combined with expert-normalised weights (0 to 1) to calculate the overall index values and produce a TB risk-ranked map using GIS overlay analysis and weighted linear combination, as sketched below. It was found that 71.43% of the Section is a potential TB high-risk area, particularly in urban and densely populated settings. This prediction is also consistent with the actual cases recorded in 2015, with 76.00% accuracy. The GIS-based MCDM method has demonstrated analytical capability for targeting high-risk spots in the country's TB surveillance and monitoring system, but the result could be strengthened by applying other uncertainty assessment methods.
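
    A minimal sketch of the weighted-linear-combination step, assuming each risk factor has already been standardised to a 1-5 raster and given a normalised expert weight; the factor names, weights, and raster values are invented.

```python
import numpy as np

# Invented 1-5 standardised factor rasters and expert weights (sum to 1)
rng = np.random.default_rng(0)
layers = {
    "population_density": rng.integers(1, 6, size=(4, 4)),
    "housing_quality":    rng.integers(1, 6, size=(4, 4)),
    "proximity_to_cases": rng.integers(1, 6, size=(4, 4)),
}
weights = {"population_density": 0.5, "housing_quality": 0.2,
           "proximity_to_cases": 0.3}

# Weighted linear combination: overlay the weighted rasters cell by cell
index = sum(weights[name] * layers[name].astype(float) for name in layers)
high_risk = index >= 3.0          # scores of 3-5 treated as potential risk
print(index.round(2))
print(f"potential high-risk share: {high_risk.mean():.2%}")
```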

  18. Evaluation of association methods for analysing modifiers of disease risk in carriers of high-risk mutations.

    PubMed

    Barnes, Daniel R; Lee, Andrew; Easton, Douglas F; Antoniou, Antonis C

    2012-04-01

    There is considerable evidence indicating that disease risk in carriers of high-risk mutations (e.g. BRCA1 and BRCA2) varies by other genetic factors. Such mutations tend to be rare in the population and studies of genetic modifiers of risk have focused on sampling mutation carriers through clinical genetics centres. Genetic testing targets affected individuals from high-risk families, making ascertainment of mutation carriers non-random with respect to disease phenotype. Standard analytical methods can lead to biased estimates of associations. Methods proposed to address this problem include a weighted-cohort (WC) and retrospective likelihood (RL) approach. Their performance has not been evaluated systematically. We evaluate these methods by simulation and extend the RL to analysing associations of two diseases simultaneously (competing risks RL-CRRL). The standard cohort approach (Cox regression) yielded the most biased risk ratio (RR) estimates (relative bias-RB: -25% to -17%) and had the lowest power. The WC and RL approaches provided similar RR estimates, were least biased (RB: -2.6% to 2.5%), and had the lowest mean-squared errors. The RL method generally had more power than WC. When analysing associations with two diseases, ignoring a potential association with one disease leads to inflated type I errors for inferences with respect to the second disease and biased RR estimates. The CRRL generally gave unbiased RR estimates for both disease risks and had correct nominal type I errors. These methods are illustrated by analyses of genetic modifiers of breast and ovarian cancer risk for BRCA1 and BRCA2 mutation carriers. PMID:22714938

  19. Tropical cyclone risk analysis: a decisive role of its track

    NASA Astrophysics Data System (ADS)

    Chelsea Nam, C.; Park, Doo-Sun R.; Ho, Chang-Hoi

    2016-04-01

    The tracks of 85 tropical cyclones (TCs) that made landfall in South Korea during the period 1979-2010 are classified into four clusters using a fuzzy c-means clustering method (a minimal version is sketched below). The four clusters are characterized as 1) east-short, 2) east-long, 3) west-long, and 4) west-short, based on the moving routes around the Korean peninsula. We conducted a risk comparison analysis for these four clusters regarding their hazards, exposure, and damages. Here, hazard parameters are calculated independently from two different sources: the best-track data (BT) and the 60 weather stations over the country (WS). The results show distinct characteristics of the four clusters in terms of the hazard parameters and economic losses (EL), suggesting a clear track dependency in the overall TC risk. It appears that whether an "effective collision" occurred outweighs the intensity of the TC per se. The EL ranking did not agree with the BT parameters (maximum wind speed, central pressure, or storm radius), but matched the WS parameters (especially daily accumulated rainfall and TC-influenced period). The west-approaching TCs (i.e. the west-long and west-short clusters) generally recorded larger EL than the east-approaching TCs (i.e. the east-short and east-long clusters), although the east-long cluster is the strongest from the BT point of view. This can be explained through the spatial distribution of the WS parameters and the corresponding regional EL maps. West-approaching TCs brought heavy rainfall to the southern regions, aided by the topographic effect along their tracks and by their extended stay on the Korean Peninsula during extratropical transition, neither of which applied to the east-approaching TCs. On the other hand, some regions had EL that was not directly proportional to the hazards, which is partly attributed to spatial disparity in wealth and vulnerability. Correlation analysis also revealed the importance of rainfall, daily accumulated rainfall in particular.
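
    A minimal fuzzy c-means implementation, to make the clustering step concrete; the track features here are invented two-dimensional summaries (e.g. a mean track position), not the study's actual track data.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: X is (n_samples, n_features).
    Returns cluster centres and the (n_samples, c) membership matrix."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(X), c))
    u /= u.sum(axis=1, keepdims=True)             # memberships sum to 1
    for _ in range(iters):
        um = u ** m
        centres = um.T @ X / um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                     # avoid division by zero
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
    return centres, u

# 85 hypothetical tracks summarised by (mean lon, mean lat), 4 clusters
tracks = np.random.default_rng(1).random((85, 2)) * [10, 8] + [120, 30]
centres, u = fuzzy_cmeans(tracks, c=4)
print(centres.round(2))
print(u.argmax(axis=1)[:10])      # hard assignment of the first 10 tracks
```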

  20. Debris Flow Risk Management Framework and Risk Analysis in Taiwan, A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Tsao, Ting-Chi; Hsu, Wen-Ko; Chiou, Lin-Bin; Cheng, Chin-Tung; Lo, Wen-Chun; Chen, Chen-Yu; Lai, Cheng-Nong; Ju, Jiun-Ping

    2010-05-01

    Taiwan is located on a seismically active mountain belt between the Philippine Sea plate and the Eurasian plate. After the 1999 Chi-Chi earthquake (Mw=7.6), landslides and debris flows occurred frequently. In Aug. 2009, Typhoon Morakot struck Taiwan and numerous landslide and debris flow events, some with heavy fatalities, were observed. With limited resources, authorities should establish a disaster management system to cope with slope disaster risks more effectively. Since 2006, Taiwan's authority in charge of debris flow management, the Soil and Water Conservation Bureau (SWCB), has completed the basic investigation and data collection for 1,503 potential debris flow creeks around Taiwan. During 2008 and 2009, a debris flow quantitative risk analysis (QRA) framework, based on the landslide risk management framework of Australia, was proposed and applied to 106 creeks in the 30 villages with a history of debris flow hazards. Information and values for several types of elements at risk (bridges, roads, buildings and crops) were gathered and integrated into a GIS layer, with a vulnerability model applied to each element at risk. Through studying the historical hazard events of the 30 villages, numerical simulations of debris flow hazards with different magnitudes (5, 10, 25, 50, 100 and 200 year return periods) were conducted, and the economic losses and fatalities of each scenario were calculated for each creek. Taking the annual exceedance probability into account, the annual total risk of each creek was calculated (a simplified version of this aggregation is sketched below), and the results displayed on a debris flow risk map. The number of fatalities and their frequency were calculated, and F-N curves for the 106 creeks were produced. For the F-N curves, an individual risk to life of 1.0E-04 per year and a slope of 1, which match international standards, were considered to define acceptable risk. Applying the results of the 106 creeks to the F-N curve, they were divided into 3 categories: Unacceptable, ALARP (As Low As Reasonably Practicable) and Acceptable.
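
    A simplified sketch of annualising scenario risks and checking an F-N criterion; the return periods follow the abstract, but the losses and fatality counts are invented.

```python
import numpy as np

T = np.array([5, 10, 25, 50, 100, 200])          # return periods (years)
loss = np.array([0.1, 0.4, 1.2, 3.0, 6.5, 12.0]) # hypothetical losses (M$)
deaths = np.array([0, 0, 1, 2, 5, 9])            # hypothetical fatalities

# Annual probability that the year's governing event falls in each bin
p_exc = 1.0 / T
p_bin = np.append(p_exc[:-1] - p_exc[1:], p_exc[-1])

print(f"annual total risk: {np.sum(p_bin * loss):.3f} M$/yr")

# F-N check against the criterion line F = 1e-4 / N (slope 1 on log axes)
for N, F in zip(deaths, p_exc):
    if N > 0:
        verdict = "unacceptable" if F > 1e-4 / N else "ALARP/acceptable"
        print(f"N>={N}: F={F:.1e} -> {verdict}")
```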

  1. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment, methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community, probabilistic methods have historically been preferred to deterministic ones. Presently, a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with design values for extreme rainfall and floods. The object of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distributions), (ii) regional flood frequency analysis (regional Gumbel and GEV distributions), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. the Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. the Schadex method (Paquet et al., 2013, Garavaglia et al., 2010) and Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with return period, remaining relatively moderate up to 100-year return levels. Results and discussions are illustrated throughout with examples.
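
    As an illustration of family (i), a standard Gumbel flood frequency fit and its return levels; the annual maximum series here is synthetic, not from the EXTRAFLO catchments.

```python
import numpy as np
from scipy import stats

# Synthetic annual maximum discharges standing in for a gauged series
annual_max = stats.gumbel_r.rvs(loc=300.0, scale=80.0, size=50,
                                random_state=np.random.default_rng(42))

loc, scale = stats.gumbel_r.fit(annual_max)   # maximum-likelihood fit
for T in (10, 100, 1000):
    q = stats.gumbel_r.ppf(1 - 1 / T, loc, scale)   # T-year return level
    print(f"{T:>4d}-yr flood: {q:7.1f} m^3/s")
```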

  2. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
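
    A sketch of how expert percentile assessments can be propagated to cost/benefit uncertainty, in the spirit described above; the factor names, percentile values, and the benefit index formula are all invented.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

def sample_from_percentiles(p10, p50, p90):
    # Crude stand-in: treat the P10/P50/P90 judgments as the min, mode
    # and max of a triangular distribution.
    return rng.triangular(p10, p50, p90, size=N)

vehicle_cost = sample_from_percentiles(2_000, 3_500, 6_000)   # $/vehicle
roadway_cost = sample_from_percentiles(1.0, 2.5, 5.0)         # M$/lane-mile
capacity_gain = sample_from_percentiles(1.5, 2.2, 3.0)        # x baseline

# Toy benefit index: capacity gain per unit of normalised cost
benefit = capacity_gain / (vehicle_cost / 3_500 + roadway_cost / 2.5)
print("benefit index P10/P50/P90:",
      np.percentile(benefit, [10, 50, 90]).round(2))
```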

  3. Engine non-containment: UK risk assessment methods

    NASA Technical Reports Server (NTRS)

    Wallin, J. C.

    1977-01-01

    More realistic guideline data must be developed for use in aircraft design in order to comply with recent changes in British civil airworthiness requirements. Unrealistically pessimistic results were obtained when the methodology developed during the Concorde SST certification program was extended to assess catastrophic risks resulting from uncontained engine rotors.

  4. Space flight risk data collection and analysis project: Risk and reliability database

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The focus of the NASA 'Space Flight Risk Data Collection and Analysis' project was to acquire and evaluate space flight data with the express purpose of establishing a database containing measurements of specific risk assessment - reliability - availability - maintainability - supportability (RRAMS) parameters. The developed comprehensive RRAMS database will support the performance of future NASA and aerospace industry risk and reliability studies. One of the primary goals has been to acquire unprocessed information relating to the reliability and availability of launch vehicles and the subsystems and components thereof from the 45th Space Wing (formerly Eastern Space and Missile Command, ESMC) at Patrick Air Force Base. After evaluating and analyzing this information, it was encoded in terms of parameters pertinent to ascertaining reliability and availability statistics, and then assembled into an appropriate database structure.

  5. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
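
    A toy version of the optimization framing described above: grid-search the levee height and crown width that minimise annualised construction cost plus expected annual damage. The failure probability and cost functions are invented placeholders, not the paper's fragility curves.

```python
import numpy as np
from itertools import product

def annual_expected_cost(H, W, damage=50.0):
    """Annualised construction cost plus expected annual damage (M$/yr).
    All functional forms and constants are illustrative stand-ins."""
    p_overtop = np.exp(-1.2 * H)          # overtopping probability vs height
    p_seepage = 0.05 * np.exp(-0.3 * W)   # through-seepage fragility vs width
    construction = 0.8 * H + 0.4 * W      # annualised construction cost
    return construction + (p_overtop + p_seepage) * damage

heights = np.arange(1.0, 8.0, 0.5)        # m
widths = np.arange(2.0, 20.0, 1.0)        # m
H_opt, W_opt = min(product(heights, widths),
                   key=lambda hw: annual_expected_cost(*hw))
print(f"optimal height {H_opt} m, crown width {W_opt} m, "
      f"total cost {annual_expected_cost(H_opt, W_opt):.2f} M$/yr")
```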

  6. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus are disclosed for automatically detecting differences between similar but different states of a nonlinear process by monitoring nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  8. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  10. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  11. Research Methods Textbooks: An Objective Analysis.

    ERIC Educational Resources Information Center

    Jackson, Sherri L.; Lugo, Susan M.; Griggs, Richard A.

    2001-01-01

    Presents an analysis of undergraduate methods course textbooks (n=26) published in the United States with copyright dates from 1995-1999. Examines aspects of the textbooks, such as demographic qualities, use of pedagogical aids and illustrative material, and topic coverage. Includes the results in detail. (CMK)

  12. [Assessment of health risk in view of changes in labor code and methods used for protecting the environment].

    PubMed

    Indulski, J A; Rolecki, R

    1994-01-01

    In view of the present and proposed amendments to the Labor Code, and bearing in mind the anticipated harmonization of regulations in this area with those of the EEC, the authors emphasize the need for a well-developed methodology for assessing chemical safety in the occupational environment, with special reference to health effects in people exposed to chemicals. Methods for assessing health risk induced by work under conditions of exposure to chemicals were divided into: methods for assessing technological/processing risk, and methods for assessing health risk related to the toxic effects of chemicals. The need for developing means of risk communication, in order to secure proper risk perception among people exposed to chemicals and among the risk managers responsible for protection against chemical hazards, was also stressed. It is suggested that a centre for chemical substances be established to address all issues pertaining to human exposure to chemicals. The centre would be responsible, under the provisions of the Chemical Substances Act, for the qualitative and quantitative analysis of the present situation and for the development of guidelines on the assessment of health risk among persons exposed to chemicals.

  13. Risk Interfaces to Support Integrated Systems Analysis and Development

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria

    2016-01-01

    Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. This supports the development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time) and the development of tools for autonomy, needed for exploration (assessing and maintaining resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information and simple status based on the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.

  14. Patient-specific meta-analysis for risk assessment using multivariate proportional hazards regression

    PubMed Central

    Crager, Michael R.; Tang, Gong

    2015-01-01

    We propose a method for assessing an individual patient’s risk of a future clinical event using clinical trial or cohort data and Cox proportional hazards regression, combining the information from several studies using meta-analysis techniques. The method combines patient-specific estimates of the log cumulative hazard across studies, weighting by the relative precision of the estimates, using either fixed- or random-effects meta-analysis calculations. Risk assessment can be done for any future patient using a few key summary statistics determined once and for all from each study. Generalizations of the method to logistic regression and linear models are immediate. We evaluate the methods using simulation studies and illustrate their application using real data. PMID:26664111

  15. Deciding which chemical mixtures risk assessment methods work best for what mixtures

    SciTech Connect

    Teuschler, Linda K.

    2007-09-01

    The most commonly used chemical mixtures risk assessment methods involve simple notions of additivity and toxicological similarity. Newer methods are emerging in response to the complexities of chemical mixture exposures and effects. Factors based on both science and policy drive decisions regarding whether to conduct a chemical mixtures risk assessment and, if so, which methods to employ. Scientific considerations are based on positive evidence of joint toxic action, elevated human exposure conditions or the potential for significant impacts on human health. Policy issues include legislative drivers that may mandate action even though adequate toxicity data on a specific mixture may not be available and risk assessment goals that impact the choice of risk assessment method to obtain the amount of health protection desired. This paper discusses three important concepts used to choose among available approaches for conducting a chemical mixtures risk assessment: (1) additive joint toxic action of mixture components; (2) toxicological interactions of mixture components; and (3) chemical composition of complex mixtures. It is proposed that scientific support for basic assumptions used in chemical mixtures risk assessment should be developed by expert panels, risk assessment methods experts, and laboratory toxicologists. This is imperative to further develop and refine quantitative methods and provide guidance on their appropriate applications. Risk assessors need scientific support for chemical mixtures risk assessment methods in the form of toxicological data on joint toxic action for high priority mixtures, statistical methods for analyzing dose-response for mixtures, and toxicological and statistical criteria for determining sufficient similarity of complex mixtures.
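
    The dose-additivity assumption mentioned above is commonly operationalised as a Hazard Index, the sum of exposure-to-reference-dose ratios across mixture components. A minimal sketch; the chemicals, exposures, and reference doses are illustrative placeholders, not values from this paper.

```python
# Hazard Index under dose additivity: HI = sum(exposure_i / RfD_i).
# All numbers are illustrative placeholders (mg/kg-day).
exposure = {"toluene": 0.020, "xylene": 0.010, "ethylbenzene": 0.005}
rfd      = {"toluene": 0.080, "xylene": 0.200, "ethylbenzene": 0.100}

hazard_quotients = {c: exposure[c] / rfd[c] for c in exposure}
hi = sum(hazard_quotients.values())
print(hazard_quotients)
print(f"Hazard Index = {hi:.2f}" +
      ("  (HI > 1: potential concern)" if hi > 1 else ""))
```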

  16. Hybrid Safety Analysis Using Functional and Risk Decompositions

    SciTech Connect

    COOPER,J. ARLIN; JOHNSON,ALICE J.; WERNER,PAUL W.

    2000-07-15

    Safety analysis of complex systems depends on decomposing the systems into manageable subsystems, from which analysis can be rolled back up to the system level. The authors have found that there is no single best way to decompose; in fact hybrid combinations of decompositions are generally necessary to achieve optimum results. They are currently using two backbone coordinated decompositions--functional and risk, supplemented by other types, such as organizational. An objective is to derive metrics that can be used to efficiently and accurately aggregate information through analysis, to contribute toward assessing system safety, and to contribute information necessary for defensible decisions.

  17. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors (sketched below). The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the rasters and shapefiles should look like. The raster files contain the input risk factors, the calculation of the scaled risk factors, and the calculation of the combined risk factors. The shapefiles include the definition of the fairways, the definition of the US Census Places, the centers of the raster cells, and the locations of industries. The supporting information contains details of the calculations and processing used in generating the files. An image of a raster has the same name as the raster file, but with *.png instead of *.tif as the file ending. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile overlaid. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
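
    The three combination rules are simple cell-wise raster operations; the arrays below are synthetic stand-ins for the project's scaled risk-factor rasters.

```python
import numpy as np

# Synthetic scaled risk-factor rasters in [0, 1] (stand-ins for GeoTIFFs)
rng = np.random.default_rng(3)
reservoir, thermal, seismicity, utilization = rng.random((4, 5, 5))

combined_product = reservoir * thermal * seismicity * utilization
combined_sum = reservoir + thermal + seismicity + utilization
combined_min = np.minimum.reduce([reservoir, thermal,
                                  seismicity, utilization])

print(combined_sum.round(2))
print(combined_min.round(2))   # "weakest link" view of each cell
```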

  18. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
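
    A minimal sketch of one smoothing-based sensitivity measure in this spirit: LOESS-fit the output against each input and rank inputs by the variance their smooth explains. The model and samples are synthetic; the paper's procedures are more elaborate (stepwise, with several nonparametric regressors).

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(5)
n = 500
X = rng.uniform(-1, 1, size=(n, 3))                   # sampled inputs
y = np.sin(2 * X[:, 0]) + 0.3 * X[:, 1]**2 + 0.05 * rng.normal(size=n)

for j in range(X.shape[1]):
    yhat = lowess(y, X[:, j], frac=0.3, return_sorted=False)
    r2 = 1.0 - np.var(y - yhat) / np.var(y)           # variance explained
    print(f"x{j}: LOESS R^2 = {r2:.2f}")
```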

  19. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas.

    PubMed

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-11-13

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area.

  2. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective hazard analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and the Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For each method, fewer than half of the hazards it identified were also identified by the other. There was also little overlap between the results of the workshops and the risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used, and data from different sources should be integrated to give a comprehensive view of risk in a system.

  3. Egg consumption and breast cancer risk: a meta-analysis.

    PubMed

    Si, Ruohuang; Qu, Kunpeng; Jiang, Zebin; Yang, Xiaojun; Gao, Peng

    2014-05-01

    The relationship between egg consumption and breast cancer risk has been inconsistent, so it is necessary to conduct a meta-analysis to evaluate it. PubMed, EMBASE and ISI Web of Knowledge were searched to find cohort or case-control studies that evaluated the relationship between egg consumption and breast cancer risk. Comprehensive Meta-Analysis software was used to conduct the meta-analysis. Thirteen studies were included. The meta-analysis results showed that egg consumption was associated with increased breast cancer risk (RR 1.04, 95% CI 1.01-1.08). Subgroup analyses showed egg consumption was also associated with increased breast cancer risk in cohort studies (RR 1.04, 95% CI 1.00-1.08), among European populations (RR 1.05, 95% CI 1.01-1.09), Asian populations (RR 1.09, 95% CI 1.00-1.18), postmenopausal populations (RR 1.06, 95% CI 1.02-1.10), and those who consumed ≥2 but ≤5 eggs/week (RR 1.10, 95% CI 1.02-1.17), but not in case-control studies (RR 1.06, 95% CI 0.97-1.15), among American populations (RR 1.04, 95% CI 0.94-1.16), premenopausal populations (RR 1.04, 95% CI 0.98-1.11), or those who consumed ≥1 but <2 eggs/week (RR 1.04, 95% CI 0.97-1.11) or >5 eggs/week (RR 0.97, 95% CI 0.88-1.06). Egg consumption was associated with increased breast cancer risk among European, Asian and postmenopausal populations and those who consumed ≥2 but ≤5 eggs/week. PMID:24504557

  4. Association among Dietary Flavonoids, Flavonoid Subclasses and Ovarian Cancer Risk: A Meta-Analysis

    PubMed Central

    You, Ruxu; Yang, Yu; Liao, Jing; Chen, Dongsheng; Yu, Lixiu

    2016-01-01

    Background Previous studies have indicated that intake of dietary flavonoids or flavonoid subclasses is associated with ovarian cancer risk, but have presented controversial results. Therefore, we conducted a meta-analysis to derive a more precise estimation of these associations. Methods We performed a search in PubMed, Google Scholar and ISI Web of Science from their inception to April 25, 2015 to select studies on the association among dietary flavonoids, flavonoid subclasses and ovarian cancer risk. The information was extracted by two independent authors. We assessed the heterogeneity, sensitivity, publication bias and quality of the articles. A random-effects model was used to calculate the pooled risk estimates (a minimal sketch of this pooling follows). Results Five cohort studies and seven case-control studies were included in the final meta-analysis. We observed that intake of dietary flavonoids can decrease ovarian cancer risk, as demonstrated by the pooled RR (RR = 0.82, 95% CI = 0.68–0.98). In a subgroup analysis by flavonoid subtype, the ovarian cancer risk was also decreased for isoflavones (RR = 0.67, 95% CI = 0.50–0.92) and flavonols (RR = 0.68, 95% CI = 0.58–0.80), while there was no compelling evidence that consumption of flavones (RR = 0.86, 95% CI = 0.71–1.03) could decrease ovarian cancer risk, which partly explains the sources of heterogeneity. The sensitivity analysis indicated stable results, and no publication bias was observed based on the results of funnel plot analysis and Egger’s test (p = 0.26). Conclusions This meta-analysis suggested that consumption of dietary flavonoids and the subtypes isoflavones and flavonols has a protective effect against ovarian cancer, with a reduced risk of ovarian cancer, except for flavones consumption. Nevertheless, further investigations on a larger population covering more flavonoid subclasses are warranted. PMID:26960146
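
    A minimal sketch of random-effects pooling in the DerSimonian-Laird style commonly used for such analyses; the study-level log-RRs and standard errors are invented, not this paper's data.

```python
import numpy as np

def dersimonian_laird(log_rr, se):
    """DerSimonian-Laird random-effects pooling of log relative risks."""
    w = 1.0 / se**2
    mu_fixed = np.sum(w * log_rr) / np.sum(w)
    Q = np.sum(w * (log_rr - mu_fixed) ** 2)       # Cochran's Q
    df = len(log_rr) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)                  # between-study variance
    w_star = 1.0 / (se**2 + tau2)
    mu = np.sum(w_star * log_rr) / np.sum(w_star)
    se_mu = np.sqrt(1.0 / np.sum(w_star))
    ci = np.exp(mu + np.array([-1.96, 1.96]) * se_mu)
    i2 = max(0.0, (Q - df) / Q) if Q > 0 else 0.0  # I^2 heterogeneity
    return np.exp(mu), ci, i2

# Hypothetical study-level estimates (not the paper's data)
log_rr = np.log(np.array([0.75, 0.88, 0.70, 0.95, 0.80]))
se = np.array([0.10, 0.12, 0.15, 0.08, 0.20])
rr, ci, i2 = dersimonian_laird(log_rr, se)
print(f"pooled RR {rr:.2f}, 95% CI {ci.round(2)}, I^2 = {i2:.0%}")
```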

  5. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk.

  6. New Analysis Methods In Photon Correlation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Nash, P. J.; King, T. A.

    1983-06-01

    This paper describes the analysis of photon correlation spectroscopy decay curves by a significant new method based on the fitting of sums of positive exponentials (the S-exponential sum fitting method). The method fits a positive exponential sum to a given data set, providing a best weighted least squares fit. No initial setting of any of the parameters is required, and the number of exponential components does not have to be preset in the program but is determined by the number of components apparent above the noise level. Results are discussed for applications to scattering systems, which may be single- or multiple-component. Systems generating single, double or multiple exponential decay functions, derived from computer simulation or photon correlation experiments, are considered, and the fitting analysis is examined at varying noise levels.
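
    For contrast with the paper's method (which needs no preset component count), here is the conventional alternative it improves upon: a nonlinear least-squares fit of a fixed two-exponential model to a synthetic decay curve. Everything below is illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def two_exp(t, a1, k1, a2, k2):
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

# Synthetic noisy decay curve with two components
rng = np.random.default_rng(9)
t = np.linspace(0.0, 5.0, 200)
g = two_exp(t, 1.0, 0.8, 0.5, 4.0) + 0.01 * rng.normal(size=t.size)

popt, _ = curve_fit(two_exp, t, g, p0=[1.0, 1.0, 1.0, 5.0],
                    bounds=(0.0, np.inf))  # keep amplitudes/rates positive
print("fitted (a1, k1, a2, k2):", popt.round(3))
```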

  7. Groundwater-risk analysis of New York utilizing GIS technology

    NASA Astrophysics Data System (ADS)

    Hillenbrand, Charles John, III

    Using Geographic Information System (GIS) technology, data layers can be processed and analyzed to produce a regional groundwater-risk grid of New York State (NYS). GIS can be used to assess the potential to introduce contaminants at the ground surface, and to assess the potential for the contaminants to migrate through the vadose zone and be introduced to an aquifer at the water-table. The potential to introduce contaminants to the ground surface was assessed utilizing existing database information in combination with the United States Geological Survey (USGS) Multi-Resolution Land Classification (MRLC) land use grid. The databases allowed an analysis of contaminant association with Standard Industrial Classification (SIC) codes, risk evaluation of the contaminants using groundwater intake values protective of human health, the development of SIC code-risk values, the construction of a SIC code-risked facility point coverage, and the construction of a land use-risk grid; this grid assesses the potential to introduce contaminants to the ground surface. Aquifer susceptibility was determined by analyzing vadose zone residence time assuming saturated conditions. Vadose zone residence time is a measure of the vadose zone's ability to attenuate and retard the migration of contaminants. Existing data layers were processed to produce a depth to water-table (vadose zone thickness) grid. Existing GIS data layers of soil, surficial geology and bedrock geology, along with a review of the literature and pump/slug test data, enabled the creation of thickness, porosity and vertical hydraulic conductivity grids for the three considered components of the vadose zone. The average linear velocity was then calculated for each vadose zone component by dividing their hydraulic conductivity grid by their respective porosity grid. The thickness grid of each vadose zone component was then divided by their respective average linear velocity grid to produce vadose zone residence time grids. The sum of these component grids gives the total vadose-zone residence time.

  8. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken account of in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk analysis.

  10. Exploring Mexican adolescents' perceptions of environmental health risks: a photographic approach to risk analysis.

    PubMed

    Börner, Susanne; Albino, Juan Carlos Torrico; Caraveo, Luz María Nieto; Tejeda, Ana Cristina Cubillas

    2015-05-01

    The objective of this study was to explore Mexican adolescents' perceptions of environmental health risks in contaminated urban areas, and to test the environmental photography technique as a research tool for engaging adolescents in community-based health research. The study was conducted with 74 adolescents from two communities in the city of San Luis Potosi, Mexico. Participants were provided with disposable cameras and asked to take photographs of elements and situations which they believed affected their personal health both at home and outside their homes. They were also asked to describe each photograph in writing. Photographs and written explanations were analyzed by using quantitative and qualitative content analysis. Risk perception plays a crucial role in the development of Risk Communication Programs (RCPs) aimed at the improvement of community health. The photography technique opens up a promising field for environmental health research since it affords a realistic and concise impression of the perceived risks. Adolescents in both communities perceived different environmental health risks as detrimental to their well-being, e.g. waste, air pollution, and lack of hygiene. Yet, some knowledge gaps remain which need to be addressed. PMID:26017963

  11. Exploring Mexican adolescents' perceptions of environmental health risks: a photographic approach to risk analysis.

    PubMed

    Börner, Susanne; Albino, Juan Carlos Torrico; Caraveo, Luz María Nieto; Tejeda, Ana Cristina Cubillas

    2015-05-01

    The objective of this study was to explore Mexican adolescents' perceptions of environmental health risks in contaminated urban areas, and to test the environmental photography technique as a research tool for engaging adolescents in community-based health research. The study was conducted with 74 adolescents from two communities in the city of San Luis Potosi, Mexico. Participants were provided with disposable cameras and asked to take photographs of elements and situations which they believed affected their personal health both at home and outside their homes. They were also asked to describe each photograph in writing. Photographs and written explanations were analyzed by using quantitative and qualitative content analysis. Risk perception plays a crucial role in the development of Risk Communication Programs (RCPs) aimed at the improvement of community health. The photography technique opens up a promising field for environmental health research since it affords a realistic and concise impression of the perceived risks. Adolescents in both communities perceived different environmental health risks as detrimental to their well-being, e.g. waste, air pollution, and lack of hygiene. Yet, some knowledge gaps remain which need to be addressed.

  12. Risk analysis and Monte Carlo simulation applied to the generation of drilling AFE estimates

    SciTech Connect

    Peterson, S.K.; Murtha, J.A.; Schneider, F.F.

    1995-06-01

    This paper presents a method for developing an authorization-for-expenditure (AFE)-generating model and illustrates the technique with a specific offshore field development case study. The model combines Monte Carlo simulation and statistical analysis of historical drilling data to generate more accurate, risked, AFE estimates. In addition to the general method, two examples of making AFE time estimates for North Sea wells with the presented techniques are given.
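
    The abstract does not reproduce the model itself; the following is a minimal Python sketch of the general technique described: per-phase durations are drawn from distributions (here invented lognormal/triangular/exponential placeholders, not the authors' fitted North Sea data), summed per trial, and the risked AFE estimate is read off the simulated percentiles.

        # Monte Carlo AFE sketch; all distributions and rates are
        # illustrative assumptions, not the paper's calibrated inputs.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000  # Monte Carlo trials

        drilling = rng.lognormal(mean=np.log(30), sigma=0.25, size=n)   # days
        completion = rng.triangular(left=8, mode=12, right=20, size=n)  # days
        trouble = rng.exponential(scale=3.0, size=n)  # nonproductive time, days

        total_days = drilling + completion + trouble
        cost = total_days * 250_000.0                 # assumed spread rate, USD/day

        # Risked estimates are read off the simulated distribution.
        for p in (10, 50, 90):
            print(f"P{p}: {np.percentile(total_days, p):6.1f} days, "
                  f"${np.percentile(cost, p)/1e6:5.1f} MM")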

  13. Prenatal, Perinatal and Neonatal Risk Factors for Intellectual Disability: A Systematic Review and Meta-Analysis

    PubMed Central

    Qu, Yi; Mu, Dezhi

    2016-01-01

    Background The etiology of non-genetic intellectual disability (ID) is not fully known, and we aimed to identify the prenatal, perinatal and neonatal risk factors for ID. Method PubMed and Embase databases were searched for studies that examined the association between pre-, peri- and neonatal factors and ID risk (keywords "intellectual disability" or "mental retardation" or "ID" or "MR" in combination with "prenatal" or "pregnancy" or "obstetric" or "perinatal" or "neonatal"). The last search was updated on September 15, 2015. Summary effect estimates (pooled odds ratios) were calculated for each risk factor using random effects models, with tests for heterogeneity and publication bias. Results Seventeen studies with 55,344 patients and 5,723,749 control individuals were eligible for inclusion in our analysis, and 16 potential risk factors were analyzed. Ten prenatal factors (advanced maternal age, maternal black race, low maternal education, third or more parity, maternal alcohol use, maternal tobacco use, maternal diabetes, maternal hypertension, maternal epilepsy and maternal asthma), one perinatal factor (preterm birth) and two neonatal factors (male sex and low birth weight) were significantly associated with increased risk of ID. Conclusion This systematic review and meta-analysis provides a comprehensive evidence-based assessment of the risk factors for ID. Future studies are encouraged to focus on perinatal and neonatal risk factors and the combined effects of multiple factors. PMID:27110944
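
    As a concrete illustration of the pooling step named in the abstract, the sketch below implements DerSimonian-Laird random-effects combination of odds ratios in Python; the per-study data are invented for illustration, not drawn from the seventeen included studies.

        # DerSimonian-Laird random-effects pooling on the log odds-ratio
        # scale, with made-up study data.
        import numpy as np

        # Hypothetical per-study (OR, lower 95% CI, upper 95% CI).
        or_ci = [(1.8, 1.2, 2.7), (1.3, 0.9, 1.9), (2.1, 1.4, 3.2), (1.1, 0.8, 1.5)]

        y = np.array([np.log(o) for o, lo, hi in or_ci])                    # log ORs
        se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for _, lo, hi in or_ci])
        v = se ** 2

        w = 1 / v                                       # fixed-effect weights
        y_fixed = np.sum(w * y) / np.sum(w)
        Q = np.sum(w * (y - y_fixed) ** 2)              # heterogeneity statistic
        k = len(y)
        tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

        w_re = 1 / (v + tau2)                           # random-effects weights
        y_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1 / np.sum(w_re))

        print(f"pooled OR = {np.exp(y_re):.2f} "
              f"(95% CI {np.exp(y_re - 1.96*se_re):.2f}-{np.exp(y_re + 1.96*se_re):.2f}), "
              f"tau^2 = {tau2:.3f}, Q = {Q:.2f}")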

  14. Risk factors for rape re-victimisation: a retrospective analysis.

    PubMed

    Lurie, S; Boaz, M; Golan, A

    2013-11-01

    Sexual re-victimisation refers to a pattern in which the sexual assault victim has an increased risk of subsequent victimisation relative to an individual who was never victimised. The purpose of our study was to identify risk factors for a second rape, the most severe form of sexual re-victimisation. All rape victims treated at the First Regional Israeli Center for Sexual Assault Victims between October 2000 and July 2010 were included in this retrospective analysis. We compared characteristics of 53 rape victims who were victimised twice to those of 1,939 rape victims who were victimised once. We identified several risk factors for a second rape, which can be used in prevention programmes. These are: psychiatric background, history of social services involvement, adulthood, non-virginity and minority ethnicity.

  15. Risk assessment and its application to flight safety analysis

    SciTech Connect

    Keese, D.L.; Barton, W.R.

    1989-12-01

    Potentially hazardous test activities have historically been a part of Sandia National Laboratories' mission to design, develop, and test new weapons systems. These test activities include high-speed air drops for parachute development, sled tests for component and system level studies, multiple stage rocket experiments, and artillery firings of various projectiles. Due to the nature of Sandia's test programs, the risk associated with these activities can never be totally eliminated. However, a consistent set of policies should be available to provide guidance on the level of risk that is acceptable in these areas. This report presents a general set of guidelines for addressing safety issues related to rocket flight operations at Sandia National Laboratories. Although this report deals primarily with rocket flight safety, the same principles could be applied to other hazardous test activities. The basic concepts of risk analysis have a wide range of applications in many of Sandia's current operations. 14 refs., 1 tab.

  16. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    PubMed

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being.

  17. Risk Analysis as Regulatory Science: Toward The Establishment of Standards.

    PubMed

    Murakami, Michio

    2016-09-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional 'Standard I', which has a paternalistic orientation, and 'Standard II', established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751

  18. Risk Analysis as Regulatory Science: Toward The Establishment of Standards

    PubMed Central

    Murakami, Michio

    2016-01-01

    Understanding how to establish standards is essential for risk communication and also provides perspectives for further study. In this paper, the concept of risk analysis as regulatory science for the establishment of standards is demonstrated through examples of standards for evacuation and provisional regulation values in foods and drinking water. Moreover, academic needs for further studies related to standards are extracted. The concepts of the traditional ‘Standard I’, which has a paternalistic orientation, and ‘Standard II’, established through stakeholder consensus, are then systemized by introducing the current status of the new standards-related movement that developed after the Fukushima nuclear power plant accident, and the perspectives of the standards are discussed. Preparation of standards on the basis of stakeholder consensus through intensive risk dialogue before a potential nuclear power plant accident is suggested to be a promising approach to ensure a safe society and enhance subjective well-being. PMID:27475751

  19. The development of risk analysis: a personal perspective.

    PubMed

    Wilson, Richard

    2012-12-01

    This article reflects on my experiences observing and participating in the development of risk analysis for environmental and health hazards since the 1970s with emphasis on its critical role in informing decisions with potentially high consequences, even for very low probability events once ignored or simply viewed as "acts of God." I discuss how modern society wants to protect itself from hazards with limited or no immediate historical precedent such that prediction and protective actions must depend on models that offer varying degrees of reliability. I believe that we must invest in understanding risks and risk models to ensure health in the future and protect ourselves from large challenges, including climate change, whether anthropogenic or otherwise, terrorism, and perhaps even cosmic change. PMID:22563748

  20. An analysis method for control reconfigurability of linear systems

    NASA Astrophysics Data System (ADS)

    Wang, Dayi; Duan, Wenjie; Liu, Chengrui

    2016-01-01

    The reconfigurability of control systems is further investigated based on the function-objective model (FOM). The establishment of the FOM was published in the authors' earlier paper, which solved the problem of whether a system is reconfigurable without losing the desired control objective. Based on the FOM, the importance factor, the risk factor and the kth reconfigurability factor are proposed to evaluate the fault risks of all components and the system's reconfigurability under k faults. These factors show which components should be improved and which faults cannot be tolerated. The analysis results are very useful for enhancing the fault-tolerance performance of control systems by improving system designs. A satellite model is used to illustrate the proposed method.

  1. Integration of Gis-analysis and Atmospheric Modelling For Nuclear Risk and Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Rigina, O.; Baklanov, A.; Mahura, A.

    The paper is devoted to the problems of residential radiation risk and territorial vulnerability with respect to nuclear sites in Europe. The study suggests two approaches, based on an integration of GIS analysis and atmospheric modelling, to calculate radiation risk/vulnerability. First, modelling simulations were done for a number of case studies, based on real data, such as reactor core inventory and estimations from known accidents, for a number of typical meteorological conditions and different accident scenarios. Then, using these simulations and the population database as input data, the GIS analysis reveals the administrative units at the highest risk with respect to the mean individual and collective doses received by the population. Two alternative methods were then suggested to assess the probabilistic risk to the population in case of a severe accident at the Kola and Leningrad NPPs (as examples) based on social-geophysical factors: proximity to the accident site, population density and presence of critical groups, and the probabilities of wind trajectories and precipitation. The two latter probabilities were calculated by atmospheric trajectory models and statistical methods over many years of data. The GIS analysis was done for the Nordic countries as an example. GIS-based spatial analyses integrated with mathematical modelling allow a common methodological approach to be developed for complex assessment of regional vulnerability and residential radiation risk, by merging together the separate aspects: modelling of consequences, probabilistic analysis of atmospheric flows, dose estimation, etc. The approach was capable of creating risk/vulnerability maps of the Nordic countries and revealing the most vulnerable provinces with respect to the radiation risk sites.
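
    A toy, grid-based rendering of the probabilistic combination described above might look as follows in Python; the trajectory probability, precipitation probability, and population rasters are synthetic placeholders standing in for the outputs of the atmospheric trajectory models and the population database.

        # Toy residential-risk index: deposition likelihood weighted by
        # exposed population. All arrays are synthetic placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        shape = (50, 50)                              # raster over the study region

        p_trajectory = rng.uniform(0.0, 0.3, shape)   # P(plume passes over cell)
        p_precip = rng.uniform(0.1, 0.6, shape)       # P(precipitation | plume)
        population = rng.integers(0, 5000, shape)     # residents per cell

        risk = p_trajectory * p_precip * population
        risk_scaled = risk / risk.max()               # normalize to [0, 1] for mapping

        # Cells in the top decile would be flagged as the most vulnerable units.
        threshold = np.quantile(risk_scaled, 0.9)
        print("high-risk cells:", int((risk_scaled >= threshold).sum()))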

  2. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
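
    A minimal sketch of the quantitative step: Monte Carlo simulation of a response surface with noise, mapping the probability of failing a specification across two process parameters. The response model, parameter ranges, and specification limit below are invented for illustration; the actual study ties this to FMEA-screened parameters and Bayesian updating.

        # Monte Carlo probability-of-failure map over two hypothetical
        # process parameters (compression force, lubricant level).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 5_000

        def p_fail(force, lub):
            """P(dissolution < 80% released) under an assumed noisy response."""
            mean = 95 - 0.8 * (force - 10) ** 2 - 3.0 * lub   # invented surface
            dissolution = mean + rng.normal(0, 3, n)          # batch-to-batch noise
            return np.mean(dissolution < 80.0)

        forces = np.linspace(5, 15, 11)       # kN (assumed range)
        lubes = np.linspace(0.25, 2.0, 8)     # % w/w (assumed range)

        pof = np.array([[p_fail(f, lub) for lub in lubes] for f in forces])

        # A design space could be drawn where failure probability stays
        # below an acceptable level, e.g. 1%.
        ok = pof < 0.01
        print(f"{ok.sum()} of {ok.size} grid points meet the 1% criterion")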

  3. Data screening methods for baseline ecological risk assessments

    SciTech Connect

    Schmeising, L.M.

    1994-12-31

    In conducting ecological risk assessments (ERAs), it is commonplace to take a phased approach in assessing potential impacts to ecological receptors. The first phase, the baseline ecological risk assessment (BERA), often includes a component which involves the systematic screening of the analytical data for abiotic media (i.e., surface water, sediment, surface soil) versus available ecology-based criteria, standards, guidelines, and benchmark values. Examples of ecological benchmark values include applicable toxicity data, such as no-observed-effect levels (NOELs), lowest-observed-effect levels (LOELs), or lethal doses (LC50, LD50) for selected indicator species or surrogates. An additional step often included in the screening process is the calculation of ecological quotients (EQs), or environmental concentration/benchmark ratios. The intent of the data screening process in performing BERAs is to determine which contaminants at a site potentially pose a threat to ecological receptors. These contaminants, known as the ecological contaminants of concern (COCs), are retained for further, detailed evaluations in later phases of the risk assessment. Application of these screening methodologies is presented, along with examples of ecology-based criteria, standards, guidelines, and benchmark values.
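
    The screening calculation itself is simple; a minimal sketch with invented concentrations and benchmark values (not site data or regulatory criteria):

        # Ecological-quotient screen: EQ = measured concentration / benchmark;
        # analytes with EQ >= 1 are retained as contaminants of concern.
        benchmarks_ug_l = {"copper": 9.0, "zinc": 120.0, "cadmium": 0.25}     # illustrative
        surface_water_ug_l = {"copper": 14.2, "zinc": 88.0, "cadmium": 0.4}   # illustrative

        cocs = []
        for analyte, conc in surface_water_ug_l.items():
            eq = conc / benchmarks_ug_l[analyte]
            flag = "COC" if eq >= 1.0 else "screened out"
            print(f"{analyte:8s} EQ = {eq:5.2f}  ({flag})")
            if eq >= 1.0:
                cocs.append(analyte)

        print("retained for baseline risk assessment:", cocs)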

  4. Robotic Mars Sample Return: Risk Assessment and Analysis Report

    NASA Technical Reports Server (NTRS)

    Lalk, Thomas R.; Spence, Cliff A.

    2003-01-01

    A comparison of the risk associated with two alternative scenarios for a robotic Mars sample return mission was conducted. Two alternative mission scenarios were identified, the Jet Propulsion Laboratory (JPL) Reference Mission and a mission proposed by Johnson Space Center (JSC). The JPL mission was characterized by two landers and an orbiter, and a Mars orbit rendezvous to retrieve the samples. The JSC mission (Direct/SEP) involves a solar electric propulsion (SEP) return to Earth followed by a rendezvous with the Space Shuttle in Earth orbit. A qualitative risk assessment to identify and characterize the risks, and a risk analysis to quantify the risks, were conducted on these missions. Technical descriptions of the competing scenarios were developed in conjunction with NASA engineers and the sequence of events for each candidate mission was developed. Risk distributions associated with individual and combinations of events were consolidated using event tree analysis in conjunction with Monte Carlo techniques to develop probabilities of mission success for each of the various alternatives. The results were the probability of success of various end states for each candidate scenario. These end states ranged from complete success through various levels of partial success to complete failure. Overall probability of success for the Direct/SEP mission was determined to be 66% for the return of at least one sample and 58% for the JPL mission for the return of at least one sample cache. Values were also determined for intermediate events and end states as well as for the probability of violation of planetary protection. Overall mission planetary protection event probabilities of occurrence were determined to be 0.002% and 1.3% for the Direct/SEP and JPL Reference missions, respectively.
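
    The event-tree/Monte Carlo consolidation can be sketched as a chain of Bernoulli trials in which the mission ends at the first failed event; the event list and success probabilities below are invented placeholders, not the study's elicited risk distributions.

        # Event-tree Monte Carlo for a notional sample-return sequence.
        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # (event, assumed probability of success), in mission order.
        events = [("launch", 0.98), ("Mars landing", 0.90),
                  ("sample collection", 0.95), ("ascent/rendezvous", 0.92),
                  ("Earth return", 0.96)]

        # outcomes[i] = index of the first failed event; len(events) = full success.
        outcomes = np.full(n, len(events))
        for i, (_, p) in enumerate(events):
            alive = outcomes == len(events)            # trials with no failure yet
            failed_now = alive & (rng.random(n) > p)
            outcomes[failed_now] = i

        print(f"P(complete success) = {(outcomes == len(events)).mean():.3f}")
        for i, (name, _) in enumerate(events):
            print(f"P(failure at {name}) = {(outcomes == i).mean():.3f}")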

  5. Power System Transient Stability Analysis through a Homotopy Analysis Method

    SciTech Connect

    Wang, Shaobu; Du, Pengwei; Zhou, Ning

    2014-04-01

    As an important function of energy management systems (EMSs), online contingency analysis plays an important role in providing power system security warnings of instability. At present, N-1 contingency analysis still relies on time-consuming numerical integration. To save computational cost, the paper proposes a quasi-analytical method to evaluate transient stability through time-domain periodic solutions' frequency sensitivities against initial values. First, dynamic systems described in classical models are modified into damping-free systems whose solutions are either periodic or expanded (non-convergent). Second, because the sensitivities experience sharp changes when periodic solutions vanish and turn into expanded solutions, transient stability is assessed using the sensitivity. Third, homotopy analysis is introduced to extract frequency information and evaluate the sensitivities only from initial values so that time-consuming numerical integration is avoided. Finally, a simple case is presented to demonstrate application of the proposed method, and simulation results show that the proposed method is promising.

  6. Classification method for disease risk mapping based on discrete hidden Markov random fields.

    PubMed

    Charras-Garrido, Myriam; Abrial, David; Goër, Jocelyn De; Dachian, Sergueï; Peyrard, Nathalie

    2012-04-01

    Risk mapping in epidemiology enables areas with a low or high risk of disease contamination to be localized and provides a measure of risk differences between these regions. Risk mapping models for pooled data currently used by epidemiologists focus on the estimated risk for each geographical unit. They are based on a Poisson log-linear mixed model with a latent intrinsic continuous hidden Markov random field (HMRF) generally corresponding to a Gaussian autoregressive spatial smoothing. Risk classification, which is necessary to draw clearly delimited risk zones (in which protection measures may be applied), generally must be performed separately. We propose a method for direct classified risk mapping based on a Poisson log-linear mixed model with a latent discrete HMRF. The discrete hidden field (HF) corresponds to the assignment of each spatial unit to a risk class. The risk values attached to the classes are parameters and are estimated. When mapping risk using HMRFs, the conditional distribution of the observed field is modeled with a Poisson rather than a Gaussian distribution as in image segmentation. Moreover, abrupt changes in risk levels are rare in disease maps. The spatial hidden model should favor smoothed out risks, but conventional discrete Markov random fields (e.g. the Potts model) do not impose this. We therefore propose new potential functions for the HF that take into account class ordering. We use a Monte Carlo version of the expectation-maximization algorithm to estimate parameters and determine risk classes. We illustrate the method's behavior on simulated and real data sets. Our method appears particularly well adapted to localize high-risk regions and estimate the corresponding risk levels.
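
    The following sketch illustrates the key modeling ingredient described above, under two assumptions of mine: the ordering-aware pair potential is taken to penalize the absolute difference between neighboring class labels (one possible choice, not necessarily the authors'), and the Monte Carlo EM parameter estimation is omitted, with parameters simply fixed.

        # Discrete hidden field of ordered risk classes with a Poisson
        # likelihood; one Gibbs sweep updates each cell's class.
        import numpy as np

        rng = np.random.default_rng(2)
        K = 3                                   # number of risk classes
        risk = np.array([0.5, 1.0, 2.0])        # relative risk per class (assumed)
        beta = 1.2                              # spatial interaction strength (assumed)

        grid = rng.integers(0, K, size=(20, 20))         # initial class map
        expected = rng.uniform(5, 50, size=(20, 20))     # expected counts E_i
        y = rng.poisson(expected * risk[grid])           # synthetic disease counts

        def gibbs_sweep(z):
            """One in-place Gibbs update under the ordering-aware potential."""
            H, W = z.shape
            for i in range(H):
                for j in range(W):
                    nbrs = [z[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                            if 0 <= a < H and 0 <= b < W]
                    logp = np.empty(K)
                    for k in range(K):
                        prior = -beta * sum(abs(k - m) for m in nbrs)  # favors smooth,
                                                                       # ordered classes
                        mu = expected[i, j] * risk[k]
                        logp[k] = prior + y[i, j] * np.log(mu) - mu    # Poisson kernel
                    p = np.exp(logp - logp.max())
                    z[i, j] = rng.choice(K, p=p / p.sum())
            return z

        for _ in range(5):
            grid = gibbs_sweep(grid)
        print("class counts after 5 sweeps:", np.bincount(grid.ravel(), minlength=K))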

  7. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster, but with *.png instead of *.tif as the file ending. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile overlaid. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
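
    The three combination rules named above are straightforward to reproduce; this sketch applies them to stand-in numpy arrays in place of the project's scaled-risk-factor rasters (the repository linked above holds the actual code).

        # Product, sum, and minimum combinations of four scaled risk factors.
        import numpy as np

        rng = np.random.default_rng(3)
        shape = (100, 100)

        # Stand-ins for the scaled factors: reservoir quality, thermal
        # resource quality, seismicity risk, utilization, each in [0, 1].
        factors = rng.uniform(0, 1, size=(4,) + shape)

        combined_product = factors.prod(axis=0)
        combined_sum = factors.sum(axis=0)
        combined_min = factors.min(axis=0)      # weakest-link rule

        for name, arr in [("product", combined_product),
                          ("sum", combined_sum),
                          ("minimum", combined_min)]:
            print(f"{name:8s} range: {arr.min():.3f} - {arr.max():.3f}")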

  8. Aspirin use and risk of breast cancer: systematic review and meta-analysis of observational studies.

    PubMed

    Zhong, Shanliang; Chen, Lin; Zhang, Xiaohui; Yu, Dandan; Tang, Jinhai; Zhao, Jianhua

    2015-11-01

    Previous studies concerning the association between aspirin use and breast cancer risk yielded inconsistent results. We aimed to investigate the association by meta-analysis. PubMed and EMBASE were searched for relevant studies. We calculated the summary relative risks (RR) and 95% confidence intervals (CI) using random-effects models. Seventeen cohort studies and 15 case-control studies were included. The overall result showed that aspirin use decreased risk of breast cancer (RR, 0.90; 95% CI, 0.85-0.95). However, there was evidence of publication bias and heterogeneity, and the association disappeared after correction using the trim-and-fill method. When stratified by study design, a significant benefit for aspirin users was only found in population-based and hospital-based case-control studies but not in cohort or nested case-control studies. Further subgroup analyses showed that aspirin use could decrease risk of in situ breast tumors or hormone receptor-positive tumors and reduce risk of breast cancer in postmenopausal women. Aspirin use may not affect overall risk of breast cancer, but may decrease the risk of in situ breast tumors or hormone receptor-positive tumors and reduce the risk of breast cancer in postmenopausal women. Considering the significant between-study heterogeneity and publication bias, confirmation in future studies is essential. PMID:26315555
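
    The abstract's bias correction uses the trim-and-fill method; as a simpler illustration of a publication-bias check on the same kind of data, the sketch below runs Egger's regression (a nonzero intercept suggests funnel-plot asymmetry) on invented study effects.

        # Egger's regression test for small-study asymmetry, on synthetic
        # log relative risks; not the meta-analysis's actual data.
        import numpy as np

        rng = np.random.default_rng(5)
        k = 20
        se = rng.uniform(0.05, 0.4, k)              # study standard errors
        y = np.log(0.90) + rng.normal(0, se)        # observed log RRs (assumed truth 0.90)

        z = y / se                                  # standard normal deviates
        precision = 1 / se
        X = np.column_stack([np.ones(k), precision])

        beta, *_ = np.linalg.lstsq(X, z, rcond=None)
        resid = z - X @ beta
        s2 = resid @ resid / (k - 2)
        cov = s2 * np.linalg.inv(X.T @ X)
        t_intercept = beta[0] / np.sqrt(cov[0, 0])

        print(f"Egger intercept = {beta[0]:.3f} (t = {t_intercept:.2f}); "
              "|t| well above 2 would suggest asymmetry")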

  9. Cask crush pad analysis using detailed and simplified analysis methods

    SciTech Connect

    Uldrich, E.D.; Hawkes, B.D.

    1997-12-31

    A crush pad has been designed and analyzed to absorb the kinetic energy of a spent nuclear fuel shipping cask hypothetically dropped into a 44-ft-deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed either by hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach.
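
    The "hand method" referred to is essentially an energy balance; a back-of-envelope version with assumed numbers (not the FAST cask or pad values) shows the kind of check involved.

        # Energy-balance hand calculation: cask of mass m hits a
        # constant-crush-strength pad at velocity v; the pad must absorb
        # the kinetic energy while keeping deceleration under 100 g.
        g = 9.81                 # m/s^2
        m = 80_000.0             # cask mass, kg (assumed)
        v = 12.0                 # impact velocity at the pad, m/s (assumed; water
                                 # drag reduces this well below free-fall speed)
        a_limit = 100 * g        # deceleration limit from the analysis

        # Minimum crush stroke so that v^2 = 2*a*d stays within the limit.
        d_min = v ** 2 / (2 * a_limit)

        # Constant crush force and implied foam crush stress over an
        # assumed bearing area A.
        F = m * (a_limit + g)    # pad force at the 100 g limit
        A = 4.0                  # bearing area, m^2 (assumed)
        sigma = F / A            # required foam crush stress, Pa

        print(f"minimum crush stroke: {d_min*100:.1f} cm")
        print(f"foam crush stress at 100 g: {sigma/1e6:.1f} MPa")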

  10. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, the output of this calculation approach is an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
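
    In outline, the construct replaces a single Pc with a distribution by sampling the uncertain inputs; the sketch below does this for a 2D encounter-plane geometry with an invented miss vector, covariance, and hard-body radius, and is not the CARA algorithm itself.

        # Pc-with-input-uncertainty sketch: inner Monte Carlo estimates Pc
        # for given inputs; the outer loop samples the uncertain inputs.
        import numpy as np

        rng = np.random.default_rng(11)

        miss = np.array([120.0, 40.0])          # nominal miss vector, m (assumed)
        cov_nom = np.array([[8000.0, 1500.0],   # nominal combined covariance, m^2
                            [1500.0, 3000.0]])  # (assumed)

        def pc(cov, hbr, n=100_000):
            """Monte Carlo Pc: P(relative position within hard-body radius)."""
            pts = rng.multivariate_normal(miss, cov, size=n)
            return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

        # Outer loop: lognormal covariance scaling, uniform hard-body radius
        # (both distributions are illustrative assumptions).
        pcs = np.array([pc(cov_nom * rng.lognormal(0.0, 0.3),
                           rng.uniform(15.0, 25.0))
                        for _ in range(200)])

        print(f"point-estimate Pc: {pc(cov_nom, 20.0):.2e}")
        print(f"Pc distribution:   median {np.median(pcs):.2e}, "
              f"95th pct {np.percentile(pcs, 95):.2e}")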

  11. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L.

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
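
    At its simplest, the test described reduces to comparing the observed count of reduction-to-homozygosity events at a marker against its null expectation. A toy version using SciPy's binomial test follows; the null proportion and counts are invented, and the real test must account for meiosis I/II non-disjunction rates and marker informativeness.

        # Toy reduction-to-homozygosity linkage screen at one marker.
        from scipy.stats import binomtest

        n_informative = 60    # informative affected trisomic cases (invented)
        n_reduced = 28        # cases reduced to homozygosity (invented)
        p_null = 1 / 3        # assumed null reduction probability (illustrative)

        result = binomtest(n_reduced, n_informative, p_null, alternative="greater")
        print(f"observed reduction: {n_reduced}/{n_informative} = "
              f"{n_reduced / n_informative:.2f}")
        print(f"one-sided binomial p-value vs p0 = {p_null:.2f}: {result.pvalue:.4f}")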

  12. Biological risk factors for suicidal behaviors: a meta-analysis.

    PubMed

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09-1.81) and suicide death (wOR=1.28; CI: 1.13-1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias: cytokines (wOR=2.87; CI: 1.40-5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01-1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931

  13. Biological risk factors for suicidal behaviors: a meta-analysis.

    PubMed

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09-1.81) and suicide death (wOR=1.28; CI: 1.13-1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias: cytokines (wOR=2.87; CI: 1.40-5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01-1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors.

  14. Biological risk factors for suicidal behaviors: a meta-analysis

    PubMed Central

    Chang, B P; Franklin, J C; Ribeiro, J D; Fox, K R; Bentley, K H; Kleiman, E M; Nock, M K

    2016-01-01

    Prior studies have proposed a wide range of potential biological risk factors for future suicidal behaviors. Although strong evidence exists for biological correlates of suicidal behaviors, it remains unclear if these correlates are also risk factors for suicidal behaviors. We performed a meta-analysis to integrate the existing literature on biological risk factors for suicidal behaviors and to determine their statistical significance. We conducted a systematic search of PubMed, PsycInfo and Google Scholar for studies that used a biological factor to predict either suicide attempt or death by suicide. Inclusion criteria included studies with at least one longitudinal analysis using a biological factor to predict either of these outcomes in any population through 2015. From an initial screen of 2541 studies we identified 94 cases. Random effects models were used for both meta-analyses and meta-regression. The combined effect of biological factors produced statistically significant but relatively weak prediction of suicide attempts (weighted mean odds ratio (wOR)=1.41; CI: 1.09–1.81) and suicide death (wOR=1.28; CI: 1.13–1.45). After accounting for publication bias, prediction was nonsignificant for both suicide attempts and suicide death. Only two factors remained significant after accounting for publication bias—cytokines (wOR=2.87; CI: 1.40–5.93) and low levels of fish oil nutrients (wOR=1.09; CI: 1.01–1.19). Our meta-analysis revealed that currently known biological factors are weak predictors of future suicidal behaviors. This conclusion should be interpreted within the context of the limitations of the existing literature, including long follow-up intervals and a lack of tests of interactions with other risk factors. Future studies addressing these limitations may more effectively test for potential biological risk factors. PMID:27622931

  15. Risk assessment as an evolved threat detection and analysis process.

    PubMed

    Blanchard, D Caroline; Griebel, Guy; Pobbe, Roger; Blanchard, Robert J

    2011-03-01

    Risk assessment is a pattern of activities involved in detection and analysis of threat stimuli and the situations in which the threat is encountered. It is a core process in the choice of specific defenses, such as flight, freezing, defensive threat and defensive attack, that counter the threat and minimize the danger it poses. This highly adaptive process takes into account important characteristics, such as type and location (including distance from the subject) of the threat, as well as those (e.g. presence of an escape route or hiding place) of the situation, combining them to predict which specific defense is optimal with that particular combination of threat and situation. Risk assessment is particularly associated with ambiguity either of the threat stimulus or of the outcome of available defensive behaviors. It is also crucial in determining that threat is no longer present, permitting a return to normal, nondefensive behavior. Although risk assessment has been described in detail in rodents, it is also a feature of human defensive behavior, particularly in association with ambiguity. Rumination may be a specifically human form of risk assessment, more often expressed by women, and highly associated with anxiety. Risk assessment behaviors respond to drugs effective against generalized anxiety disorder; however, flight, a dominant specific defense in many common situations, shows a pharmacological response profile closer to that of panic disorder. Risk assessment and flight also appear to show some consistent differences in terms of brain regional activation patterns, suggesting a potential biological differentiation of anxiety and fear/panic systems. An especially intriguing possibility is that mirror neurons may respond to some of the same types of situational differences that are analyzed during risk assessment, suggesting an additional functional role for these neurons.

  16. Adapting Chemical Mixture Risk Assessment Methods to Assess Chemical and Non-Chemical Stressor Combinations

    EPA Science Inventory

    Presentation based on the following abstract: Chemical mixtures risk assessment methods are routinely used. To address combined chemical and nonchemical stressors, component-based approaches may be applicable, depending on the toxic action among diverse stressors. Such methods a...

  17. Plasma prolactin and breast cancer risk: a meta-analysis

    PubMed Central

    Wang, Minghao; Wu, Xiujuan; Chai, Fan; Zhang, Yi; Jiang, Jun

    2016-01-01

    Breast cancer is the most common cancer among women, and its incidence is on a constant rise. Previous studies suggest that higher levels of plasma prolactin are associated with an elevated risk of breast cancer; however, these results are contradictory and inconclusive. PubMed and Medline were used to search and identify published observational studies that assessed the relationship between plasma prolactin levels and the risk of breast cancer. The pooled relative risks (RRs) with 95% confidence intervals (CIs) were calculated using a fixed-effects or random-effects model. A total of 7 studies were included in our analysis. For the highest versus lowest levels of plasma prolactin, the pooled RR (95% CI) of breast cancer was 1.16 (1.04, 1.29). In subgroup analyses, we found a positive association between plasma prolactin levels and the risk of breast cancer among patients who were postmenopausal, ER+/PR+, or had in situ or invasive carcinoma. However, this positive association was not detected in premenopausal and ER-/PR- patients. In conclusion, the present study provides evidence supporting a significantly positive association between plasma prolactin levels and the risk of breast cancer. PMID:27184120

  18. Structural sensitivity analysis: Methods, applications and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. The techniques include a finite difference step size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Some of the critical needs in the structural sensitivity area are indicated along with plans for dealing with some of those needs.
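
    The step-size selection problem highlighted in this review is easy to see numerically: for a central difference, truncation error dominates at large h and round-off error at small h, and a selection algorithm of the kind reviewed would pick h near the error minimum rather than rely on a fixed user-supplied step. A short sweep over an arbitrary test function illustrates the trade-off.

        # Central-difference error vs step size for f = sin at x0 = 1.
        import numpy as np

        f = np.sin
        x0 = 1.0
        exact = np.cos(x0)                       # analytic derivative for comparison

        for h in 10.0 ** np.arange(-1, -13, -2):
            approx = (f(x0 + h) - f(x0 - h)) / (2 * h)
            print(f"h = {h:8.0e}   error = {abs(approx - exact):.3e}")

        # In double precision the error bottoms out near h ~ 1e-5 here:
        # larger h suffers truncation error, smaller h round-off.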

  19. Structural sensitivity analysis: Methods, applications, and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Some innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. These techniques include a finite-difference step-size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, a simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Finally, some of the critical needs in the structural sensitivity area are indicated along with Langley plans for dealing with some of these needs.

  20. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.