Science.gov

Sample records for risk analysis method

  1. Risk uncertainty analysis methods for NUREG-1150

    SciTech Connect

    Benjamin, U.S.; Boyd, G.J.

    1986-01-01

    Evaluation and display of risk uncertainties for NUREG-1150 constitute a principal focus of the Severe Accident Risk Rebaselining/Risk Reduction Program (SARRP). Some of the principal objectives of the uncertainty evaluation are: (1) to provide a quantitative estimate that reflects, for those areas considered, a credible and realistic range of uncertainty in risk; (2) to rank the various sources of uncertainty with respect to their importance for various measures of risk; and (3) to characterize the state of understanding of each aspect of the risk assessment for which major uncertainties exist. This paper describes the methods developed to fulfill these objectives.

  2. Meta-analysis methods for risk differences.

    PubMed

    Bonett, Douglas G; Price, Robert M

    2014-11-01

    The difference between two proportions, referred to as a risk difference, is a useful measure of effect size in studies where the response variable is dichotomous. Confidence interval methods based on a varying coefficient model are proposed for combining and comparing risk differences from multi-study between-subjects or within-subjects designs. The proposed methods are new alternatives to the popular constant coefficient and random coefficient methods. The proposed varying coefficient methods do not require the constant coefficient assumption of effect size homogeneity, nor do they require the random coefficient assumption that the risk differences from the selected studies represent a random sample from a normally distributed superpopulation of risk differences. The proposed varying coefficient methods are shown to have excellent finite-sample performance characteristics under realistic conditions. © 2013 The British Psychological Society.
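    The constant coefficient (inverse-variance) pooling that the proposed varying coefficient methods are offered as an alternative to can be sketched as follows. This is the textbook Wald approach with hypothetical study counts, not the authors' procedure:

```python
import math

def risk_difference(x1, n1, x2, n2):
    """Single-study risk difference p1 - p2 and its Wald variance."""
    p1, p2 = x1 / n1, x2 / n2
    return p1 - p2, p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2

def pooled_risk_difference(studies, z=1.96):
    """Constant coefficient (inverse-variance) pooled RD with a Wald CI."""
    rds, weights = [], []
    for x1, n1, x2, n2 in studies:
        rd, var = risk_difference(x1, n1, x2, n2)
        rds.append(rd)
        weights.append(1.0 / var)
    w_total = sum(weights)
    pooled = sum(r * w for r, w in zip(rds, weights)) / w_total
    half = z / math.sqrt(w_total)
    return pooled, pooled - half, pooled + half

# Hypothetical (events, n) pairs for treated vs control in three studies.
studies = [(15, 100, 25, 100), (10, 80, 18, 80), (30, 150, 40, 150)]
pooled, lo, hi = pooled_risk_difference(studies)
```

    Note that this pooling assumes effect-size homogeneity across studies, which is exactly the assumption the varying coefficient methods drop.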

  3. Common Methods for Security Risk Analysis

    DTIC Science & Technology

    2007-11-02

    Workshops was particularly influential among Canadian tool-designers in the late 1980s. These models generally favour a software tool solution simply... tools that have too small a market to justify extensive software development. Also, most of the risk management standards that came out at this... companies developing specialized risk analysis tools, such as the Vulcanizer project of DOMUS Software Inc. The latter incorporated fuzzy logic to

  4. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to quantitative risk analysis (QRA) research in recent years due to the increasingly severe disasters that have occurred in the process industries. Owing to the computational complexity, very few software packages, such as SAFETI, can really make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on flat terrain with no obstructions or concentration fluctuations, which is quite different from the real situation of a chemical process plant. All these models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as a computational fluid dynamics (CFD) model can overcome the previous limitations; however, it cannot resolve the complexity of the risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method is proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. It is believed that such a technique will not only apply to risk analysis at ground level, but will also extend to aerial, submarine, or space risk analyses in the near future.
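    The post-processing step that turns per-scenario consequence fields into an individual-risk field can be sketched generically: for each grid cell, sum scenario frequency times conditional fatality probability. The tiny grid and numbers below are invented for illustration, not taken from the study:

```python
def individual_risk_grid(scenarios):
    """Combine per-scenario CFD consequence fields into an individual-risk
    field: IR(cell) = sum over scenarios of frequency * P(fatality | cell)."""
    ir = {}
    for freq, fatality_field in scenarios:  # fatality_field: cell -> P(fatality)
        for cell, p in fatality_field.items():
            ir[cell] = ir.get(cell, 0.0) + freq * p
    return ir

# Hypothetical two-scenario example on a tiny 3D grid (cells as (i, j, k)):
scenarios = [
    (1e-4, {(0, 0, 0): 0.9, (1, 0, 0): 0.3}),
    (5e-4, {(0, 0, 0): 0.2, (0, 1, 0): 0.1}),
]
ir = individual_risk_grid(scenarios)
```

    Contouring `ir` at fixed levels (e.g. 1e-4, 1e-5 per year) over the 3D grid would yield the iso-surfaces the paper describes.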

  5. Comprehensive safeguards evaluation methods and societal risk analysis

    SciTech Connect

    Richardson, J.M.

    1982-03-01

    Essential capabilities of an integrated evaluation methodology for analyzing safeguards systems are discussed. Such a methodology must be conceptually meaningful, technically defensible, discriminating and consistent. A decomposition of safeguards systems by function is mentioned as a possible starting point for methodology development. The application of a societal risk equation to safeguards systems analysis is addressed. Conceptual problems with this approach are discussed. Technical difficulties in applying this equation to safeguards systems are illustrated through the use of confidence intervals, information content, hypothesis testing and ranking and selection procedures.

  6. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risks in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-02-01

    This paper discusses a new method developed to analyse flood risks in river deltas. Risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding, and because the effect of upstream breaches on downstream water levels and flood risks must be taken into account. A Monte Carlo based flood risk analysis framework for policy making was developed, which considers both storm surges and river flood waves and includes hydrodynamic interaction effects on flood risks. It was applied to analyse societal flood fatality risks (the probability of events with more than N fatalities) in the Rhine-Meuse delta.
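    The societal-risk measure used here, the probability of events with more than N fatalities, can be estimated directly from Monte Carlo output as an empirical F-N curve. A minimal sketch with a hypothetical synthetic event set, not the paper's hydrodynamic model:

```python
import random

def fn_curve(fatalities, thresholds):
    """Empirical exceedance probabilities P(N > n) from Monte Carlo samples."""
    m = len(fatalities)
    return {n: sum(1 for f in fatalities if f > n) / m for n in thresholds}

random.seed(42)
# Hypothetical synthetic event set: most floods cause few fatalities, rare ones many.
samples = [int(random.lognormvariate(2.0, 1.5)) for _ in range(10_000)]
curve = fn_curve(samples, [1, 10, 100, 1000])
```

    The per-event exceedance probabilities would be multiplied by the annual event frequency to express societal risk per year.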

  7. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risk in the Netherlands

    NASA Astrophysics Data System (ADS)

    de Bruijn, K. M.; Diermanse, F. L. M.; Beckers, J. V. L.

    2014-10-01

    This paper discusses a new method for flood risk assessment in river deltas. Flood risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding and the effect of upstream breaches on downstream water levels and flood risk must be taken into account. This paper presents a Monte Carlo-based flood risk analysis framework for policy making, which considers both storm surges and river flood waves and includes effects from hydrodynamic interaction on flood risk. It was applied to analyse societal flood fatality risk in the Rhine-Meuse delta.

  8. Simplified Plant Analysis Risk (SPAR) Human Reliability Analysis (HRA) Methodology: Comparisons with other HRA Methods

    SciTech Connect

    Byers, James Clifford; Gertman, David Ira; Hill, Susan Gardiner; Blackman, Harold Stabler; Gentillon, Cynthia Ann; Hallbert, Bruce Perry; Haney, Lon Nolan

    2000-08-01

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  9. Simplified plant analysis risk (SPAR) human reliability analysis (HRA) methodology: Comparisons with other HRA methods

    SciTech Connect

    J. C. Byers; D. I. Gertman; S. G. Hill; H. S. Blackman; C. D. Gentillon; B. P. Hallbert; L. N. Haney

    2000-07-31

    The 1994 Accident Sequence Precursor (ASP) human reliability analysis (HRA) methodology was developed for the U.S. Nuclear Regulatory Commission (USNRC) in 1994 by the Idaho National Engineering and Environmental Laboratory (INEEL). It was decided to revise that methodology for use by the Simplified Plant Analysis Risk (SPAR) program. The 1994 ASP HRA methodology was compared, by a team of analysts, on a point-by-point basis to a variety of other HRA methods and sources. This paper briefly discusses how the comparisons were made and how the 1994 ASP HRA methodology was revised to incorporate desirable aspects of other methods. The revised methodology was renamed the SPAR HRA methodology.

  10. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    NASA Astrophysics Data System (ADS)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial health effects. The yogurt production process is highly susceptible to failure because it involves bacteria and fermentation. For industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analysed. Risk analysis can identify the risks in detail, prevent them, and determine how they should be handled, so that the risks can be minimized. Therefore, this study analyses the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed six risks arising from equipment, raw material, and process variables. These include the critical risk of a lack of an aseptic process, specifically starter yogurt damaged by contamination with fungus or other bacteria, and a lack of equipment sanitation. The quantitative FTA results showed that the highest probability is that of a lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment; controlling the yogurt starter; and improving production planning and equipment sanitation using hot-water immersion.
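    The quantitative FTA step behind such a result can be sketched with the standard gate formulas for independent basic events. The basic-event probabilities below are hypothetical, not the study's data:

```python
def or_gate(probs):
    """P(at least one event occurs), assuming independent basic events."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all events occur), assuming independence."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical basic-event probabilities for a 'non-aseptic process' top event:
p_starter = and_gate([0.08, 0.5])           # contaminant present AND starter unprotected
p_sanitation = 0.02                         # inadequate equipment sanitation
p_top = or_gate([p_starter, p_sanitation])  # either path defeats asepsis
```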

  11. [Discussion on the building of post market risk analysis method in hemodialysis device].

    PubMed

    Xu, Honglei; Peng, Xiaolong; Tian, Xiaojun; Wang, Peilian

    2014-09-01

    This paper discussed the building of a post-market risk analysis method for hemodialysis devices from the standpoint of government supervision. It proposes practical research methods for post-market risk identification and estimation for hemodialysis devices, provides technical guidance for government to put risk management of hemodialysis devices into effect, and offers a reference for enterprises carrying out post-market risk evaluation of their own products.

  12. A comparative analysis of PRA and intelligent adversary methods for counterterrorism risk management.

    PubMed

    Merrick, Jason; Parnell, Gregory S

    2011-09-01

    In counterterrorism risk management decisions, the analyst can choose to represent terrorist decisions as defender uncertainties or as attacker decisions. We perform a comparative analysis of probabilistic risk analysis (PRA) methods including event trees, influence diagrams, Bayesian networks, decision trees, game theory, and combined methods on the same illustrative examples (container screening for radiological materials) to get insights into the significant differences in assumptions and results. A key tenet of PRA and decision analysis is the use of subjective probability to assess the likelihood of possible outcomes. For each technique, we compare the assumptions, probability assessment requirements, risk levels, and potential insights for risk managers. We find that assessing the distribution of potential attacker decisions is a complex judgment task, particularly considering the adaptation of the attacker to defender decisions. Intelligent adversary risk analysis and adversarial risk analysis are extensions of decision analysis and sequential game theory that help to decompose such judgments. These techniques explicitly show the adaptation of the attacker and the resulting shift in risk based on defender decisions. © 2011 Society for Risk Analysis.
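    The central difference, an attacker who adapts to defender decisions rather than acting as a fixed uncertainty, can be illustrated with a toy sequential game. All detection probabilities and consequence values are invented for illustration:

```python
# Hypothetical detection probabilities per defender option and attack path,
# and consequences if an attack succeeds (arbitrary damage units).
detect = {
    "screen_all":  {"port": 0.90, "border": 0.60},
    "screen_some": {"port": 0.50, "border": 0.40},
}
damage = {"port": 100.0, "border": 60.0}

def attacker_best_response(defense):
    """The attacker adapts: pick the path with the highest expected damage."""
    return max(damage, key=lambda a: (1 - detect[defense][a]) * damage[a])

def defender_optimal():
    """The defender anticipates the adaptation (sequential game backward induction)."""
    def loss(d):
        a = attacker_best_response(d)
        return (1 - detect[d][a]) * damage[a]
    return min(detect, key=loss)
```

    Note how each defender option shifts the attacker's choice of path, which is exactly the risk shift the paper says PRA with fixed attack probabilities cannot show.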

  13. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. 
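    The two scoring schemes can be compared mechanically. The failure-mode ratings and the simplified method's cutoffs below are hypothetical; only the RPN ≥ 300 criticality threshold is taken from the study:

```python
def rpn(occurrence, severity, detectability):
    """Traditional risk priority number."""
    return occurrence * severity * detectability

def simplified(occurrence, severity):
    """Illustrative high/medium/low mapping (assumed cutoffs)."""
    score = occurrence * severity
    if score >= 40:
        return "high"
    if score >= 15:
        return "medium"
    return "low"

failures = [(8, 9, 7), (3, 5, 4), (6, 8, 8), (2, 3, 5)]  # hypothetical (O, S, D)
critical_rpn = {f for f in failures if rpn(*f) >= 300}   # study's threshold
critical_simple = {f for f in failures if simplified(f[0], f[1]) == "high"}
agreement = len(critical_rpn & critical_simple) / len(critical_rpn)
```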

  14. Review of Research Trends and Methods in Nano Environmental, Health, and Safety Risk Analysis.

    PubMed

    Erbis, Serkan; Ok, Zeynep; Isaacs, Jacqueline A; Benneyan, James C; Kamarthi, Sagar

    2016-08-01

    Despite the many touted benefits of nanomaterials, concerns remain about their possible environmental, health, and safety (EHS) risks in terms of their toxicity, long-term accumulation effects, or dose-response relationships. The published studies on EHS risks of nanomaterials have increased significantly over the past decade and a half, with most focused on nanotoxicology. Researchers are still learning about the health consequences of nanomaterials and how to make environmentally responsible decisions regarding their production. This article characterizes the scientific literature on nano-EHS risk analysis to map the state-of-the-art developments in this field and chart guidance for future directions. First, an analysis of keyword co-occurrence networks is investigated for nano-EHS literature published in the past decade to identify the intellectual turning points and research trends in nanorisk analysis studies. The exposure groups targeted in emerging nano-EHS studies are also assessed. Systems engineering methods for risk, safety, uncertainty, and system reliability analysis are reviewed, followed by detailed descriptions of how these methods are applied to analyze nanomaterial EHS risks. Finally, the trends, methods, future directions, and opportunities of systems engineering methods in nano-EHS research are discussed. The analysis of nano-EHS literature presented in this article provides important insights on risk assessment and risk management tools associated with nanotechnology, nanomanufacturing, and nano-enabled products.

  15. A comparison of two prospective risk analysis methods: Traditional FMEA and a modified healthcare FMEA.

    PubMed

    Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya

    2016-12-01

    To examine the abilities of traditional failure mode and effects analysis (FMEA) and modified healthcare FMEA (m-HFMEA) scoring methods by comparing their degree of congruence in identifying high-risk failures. The authors applied the two prospective quality-management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed: based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine whether the results and rated risks matched. The authors' results showed 85% agreement between the FMEA and m-HFMEA approaches for the top 20 risks among SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA RPN scores. In the m-HFMEA analysis, once the risk score is determined, the failure mode is investigated more thoroughly on the basis of the established HFMEA Decision Tree™. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative because it allows the prioritization of high risks and mitigation measures. It is therefore a useful tool for prospective risk analysis in radiotherapy.
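    A minimal sketch of a two-index m-HFMEA matrix using the four categories named above. The 1-4 scales and cutoffs are assumptions, not the paper's published matrix:

```python
def m_hfmea_category(severity, frequency):
    """Map severity x frequency (each on an assumed 1-4 scale) to the four
    risk categories; the cutoffs here are illustrative only."""
    score = severity * frequency
    if score >= 12:
        return "very high"
    if score >= 8:
        return "high"
    if score >= 3:
        return "low"
    return "very low"

# In an m-HFMEA workflow, 'high' and 'very high' outcomes would trigger
# the additional decision-tree evaluation described above.
categories = [m_hfmea_category(s, f) for s, f in [(4, 4), (3, 3), (2, 2), (1, 1)]]
```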

  16. RAMPART™: Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    SciTech Connect

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.; Browitt, Robert D.

    2007-09-30

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  17. Barrier and operational risk analysis of hydrocarbon releases (BORA-Release). Part I. Method description.

    PubMed

    Aven, Terje; Sklet, Snorre; Vinnem, Jan Erik

    2006-09-21

    Investigations of major accidents show that technical, human, operational, as well as organisational factors influence the accident sequences. In spite of these facts, quantitative risk analyses of offshore oil and gas production platforms have focused on technical safety systems. This paper presents a method (called BORA-Release) for qualitative and quantitative risk analysis of the platform specific hydrocarbon release frequency. By using BORA-Release it is possible to analyse the effect of safety barriers introduced to prevent hydrocarbon releases, and how platform specific conditions of technical, human, operational, and organisational risk influencing factors influence the barrier performance. BORA-Release comprises the following main steps: (1) development of a basic risk model including release scenarios, (2) modelling the performance of safety barriers, (3) assignment of industry average probabilities/frequencies and risk quantification based on these probabilities/frequencies, (4) development of risk influence diagrams, (5) scoring of risk influencing factors, (6) weighting of risk influencing factors, (7) adjustment of industry average probabilities/frequencies, and (8) recalculation of the risk in order to determine the platform specific risk related to hydrocarbon release. The various steps in BORA-Release are presented and discussed. Part II of the paper presents results from a case study where BORA-Release is applied.
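    Step 7, adjusting industry-average frequencies by weighted risk-influencing-factor scores, might look roughly like this. The scoring scale and linear interpolation are assumptions for illustration, not the exact BORA-Release formula:

```python
def adjusted_frequency(f_industry, rifs):
    """Revise an industry-average release frequency with weighted
    risk-influencing-factor (RIF) scores. Scores run 1 (best practice)
    to 6 (worst); the best score maps to a factor of 0.5, the worst to
    2.0, linearly in between. Weights are assumed to sum to 1."""
    factor = sum(w * (0.5 + (s - 1) * (2.0 - 0.5) / 5.0) for w, s in rifs)
    return f_industry * factor

# Hypothetical (weight, score) pairs for three RIFs on one platform:
rifs = [(0.4, 2), (0.35, 4), (0.25, 5)]
f_adjusted = adjusted_frequency(1.0e-3, rifs)   # per-year industry average
```

    Here the weighted scores are worse than average, so the platform-specific frequency comes out above the industry figure.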

  18. Multi-factor Constrained Information Analysis Method for Landslide Hazard Risk

    NASA Astrophysics Data System (ADS)

    Tao, Kunwang; Wang, Liang; Qian, Xinlin

    2015-04-01

    Landslide hazards cause enormous damage to human life, property, and the environment. The most effective way to mitigate the effects of landslides is to evaluate their risk and take measures to avoid losses in advance. Various factors must be considered in landslide risk assessment, so the assessment involves great complexity and uncertainty. A multi-factor constrained method for landslide risk assessment is proposed and carried out in three steps: first, GIS technology is used to divide the analysis grid that serves as the base analysis unit; second, the available information (slope, lithology, faults, land use, etc.) is taken as the set of evaluation factors; finally, based on years of observed landslide data, the risk assessment is performed with a multi-factor constrained assessment model in which the weight of each factor is determined by an information model. The region of Gongliu, located in the Xinjiang Ili River basin at altitudes of 600 to 4000 m, was selected as the experimental area; its terrain is long from east to west and narrow from north to south, and its abundant rainfall causes frequent landslides. Using a 500 m * 500 m analysis grid covering the whole study area, a comprehensive assessment of landslide risk in the region was computed with the multi-factor constrained method, and a landslide hazard classification map was produced. From a statistical perspective, the proportion of landslide hazard points is 94.04% in the moderately high and high risk areas, 4.64% in the low risk zone, and 1.32% in the lowest risk zone. The results showed a high probability of landslides in the highly rated parts of the assessed region, which showed that
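    The information model commonly used to weight factor classes in such studies is the information value statistic: the log of landslide density in a factor class relative to the overall density. A sketch with hypothetical grid counts, not the study's inventory:

```python
import math

def information_value(slides_in_class, cells_in_class, slides_total, cells_total):
    """I = ln(landslide density in the factor class / overall density).
    Positive values mean the class favours landsliding."""
    return math.log((slides_in_class / cells_in_class) /
                    (slides_total / cells_total))

# Hypothetical counts on a 500 m grid: a steep-slope class vs a flat class.
iv_steep = information_value(60, 400, 151, 4000)
iv_flat = information_value(5, 1200, 151, 4000)
```

    Summing such values across factor layers per cell gives the susceptibility score that is then binned into risk classes.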

  19. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by the multiplication of crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of a Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology thus developed. A parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce the number of duplicated RPNs and yield a more accurate and reasonable risk assessment. As a result, the stability of the product and process can be assured.
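    The fuzzy-RPN idea can be sketched with triangular fuzzy numbers and centroid defuzzification. The linguistic ratings below are hypothetical, and the paper's full fuzzy rule base is not reproduced here:

```python
def fuzzy_multiply(a, b):
    """Approximate product of triangular fuzzy numbers (l, m, u)."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def centroid(tfn):
    """Defuzzify a triangular fuzzy number by its centroid."""
    return sum(tfn) / 3.0

# Hypothetical linguistic ratings for one failure mode (1-10 scale):
S = (6, 7, 8)   # 'high' severity
O = (4, 5, 6)   # 'moderate' occurrence
D = (7, 8, 9)   # 'hard to detect'
fuzzy_rpn = fuzzy_multiply(fuzzy_multiply(S, O), D)
crisp = centroid(fuzzy_rpn)
```

    Because two failure modes with equal crisp RPNs generally have different fuzzy RPNs, ties that plague the traditional ranking are broken.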

  20. Genotype relative risks: Methods for design and analysis of candidate-gene association studies

    SciTech Connect

    Shaid, D.J.; Sommer, S.S.

    1993-11-01

    Design and analysis methods are presented for studying the association of a candidate gene with a disease by using parental data in place of nonrelated controls. This alternative design eliminates spurious differences in allele frequencies between cases and nonrelated controls resulting from different ethnic origins and population stratification for these two groups. The authors present analysis methods which are based on two genetic relative risks: (1) the relative risk of disease for homozygotes with two copies of the candidate gene versus homozygotes without the candidate gene and (2) the relative risk for heterozygotes with one copy of the candidate gene versus homozygotes without the candidate gene. In addition to estimating the magnitude of these relative risks, likelihood methods allow specific hypotheses to be tested, namely, a test for overall association of the candidate gene with disease, as well as specific genetic hypotheses, such as dominant or recessive inheritance. Two likelihood methods are presented: (1) a likelihood method appropriate when Hardy-Weinberg equilibrium holds and (2) a likelihood method in which the authors condition on parental genotype data when Hardy-Weinberg equilibrium does not hold. The results for the relative efficiency of these two methods suggest that the conditional approach may at times be preferable, even when equilibrium holds. Sample-size and power calculations are presented for a multitiered design. Tier 1 detects the presence of an abnormal sequence for a postulated candidate gene among a small group of cases. Tier 2 tests for association of the abnormal variant with disease, such as by the likelihood methods presented. Tier 3 confirms positive results from Tier 2. Results indicate that required sample sizes are smaller when expression of disease is recessive, rather than dominant, and that, for recessive disease and large relative risks, necessary sample sizes may be feasible. 19 refs., 2 figs., 2 tabs.
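    Under Hardy-Weinberg equilibrium, the two genotype relative risks can be estimated crudely by comparing case genotype frequencies with HWE expectations. This illustrative estimator with invented counts is not the paper's likelihood method:

```python
def genotype_relative_risks(case_counts, q):
    """Crude genotype relative risks (psi2 for two risk-allele copies, psi1
    for one, each vs zero copies) from case genotype frequencies against
    Hardy-Weinberg expectations with risk-allele frequency q."""
    n = sum(case_counts.values())
    hwe = {"AA": q * q, "Aa": 2 * q * (1 - q), "aa": (1 - q) ** 2}
    enrich = {g: (case_counts[g] / n) / hwe[g] for g in hwe}
    return enrich["AA"] / enrich["aa"], enrich["Aa"] / enrich["aa"]

# Hypothetical case genotype counts and a 30% risk-allele frequency:
psi2, psi1 = genotype_relative_risks({"AA": 30, "Aa": 90, "aa": 80}, q=0.3)
```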

  1. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  2. Risk Analysis of Central Java Gas Transmission Pipeline by Risk-Based Inspection Method

    NASA Astrophysics Data System (ADS)

    Mediansyah; Haryadi, G. D.; Ismail, R.; Kim, S. J.

    2017-05-01

    During the operational period of a gas transmission pipeline, potential hazards were found that could result in pipeline failure; as a consequence, pipeline failures are occurring more and more often. Economic and environmental factors, as well as human life, must be considered among the current challenges to structural integrity and safety standards. Therefore, the reliability of the structural integrity and security of gas pipelines under various conditions, including the existence of defects, should be carefully evaluated. The results of this study are the steps for setting a risk level for each instrument using the Risk-Based Inspection API 581 standard; the subsequent results are recommended as an effective inspection plan according to risk level and remaining lifetime.
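    Two standard RBI calculations, a probability-of-failure x consequence-of-failure risk matrix and a remaining-life estimate from wall thinning, can be sketched as follows. The category cutoffs and numbers are assumptions; API RP 581's own tables should be used in practice:

```python
def risk_level(pof_cat, cof_cat):
    """Map probability-of-failure and consequence-of-failure categories
    (1 lowest .. 5 highest) to a risk level. Cutoffs are illustrative."""
    score = pof_cat * cof_cat
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium high"
    if score >= 4:
        return "medium"
    return "low"

def remaining_life(t_actual, t_min, corrosion_rate):
    """Remaining life (years) from the wall-thickness margin."""
    return (t_actual - t_min) / corrosion_rate

# Hypothetical segment: 9.5 mm measured, 6.0 mm minimum, 0.25 mm/yr thinning.
years = remaining_life(9.5, 6.0, 0.25)
```

    The risk level then sets inspection priority and interval, with the interval capped by the remaining-life estimate.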

  3. Risk-based analysis methods applied to nuclear power plant technical specifications

    SciTech Connect

    Wagner, D.P.; Minton, L.A.; Gaertner, J.P.

    1989-03-01

    A computer-aided methodology and practical applications of risk-based evaluation of technical specifications are described. The methodology, developed for use by the utility industry, is a part of the overall process of improving nuclear power plant technical specifications. The SOCRATES computer program uses the results of a probabilistic risk assessment or a system-level risk analysis to calculate changes in risk due to changes in the surveillance test interval and/or the allowed outage time stated in the technical specification. The computer program can accommodate various testing strategies (such as staggered or simultaneous testing) to allow modeling of component testing as it is carried out at the plant. The methods and computer program are an integral part of a larger decision process aimed at determining benefits from technical specification changes. These benefits can include cost savings to the utilities by reducing forced shutdowns and decreasing labor requirements for test and maintenance activities, with no adverse impacts on risk. The methodology and the SOCRATES computer program have been used extensively to evaluate several actual technical specifications in case studies. Summaries of these applications demonstrate the types of results achieved and the usefulness of the risk-based evaluation in improving the technical specifications.
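    The core sensitivity such a tool evaluates, risk versus surveillance test interval, can be approximated with the standard average-unavailability formula for a periodically tested standby component. The rates and intervals below are hypothetical:

```python
def mean_unavailability(failure_rate, test_interval, test_duration=0.0):
    """First-order average unavailability of a periodically tested
    standby component: q ~= lambda * T / 2 + tau / T."""
    return failure_rate * test_interval / 2.0 + (
        test_duration / test_interval if test_duration else 0.0)

q_monthly = mean_unavailability(1e-5, 730.0)     # hypothetical: test every 730 h
q_quarterly = mean_unavailability(1e-5, 2190.0)  # extended interval, 2190 h
risk_ratio = q_quarterly / q_monthly             # risk increase from the change
```

    Multiplying such unavailability changes by the component's risk importance gives the change in plant risk that is then weighed against testing costs.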

  4. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of the overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.

  5. Uncertainty analysis in regulatory programs: Application factors versus probabilistic methods in ecological risk assessments of chemicals

    SciTech Connect

    Moore, D.R.J.; Elliot, B.

    1995-12-31

    In assessments of toxic chemicals, sources of uncertainty may be dealt with by two basic approaches: application factors and probabilistic methods. In regulatory programs, the most common approach is to calculate a quotient by dividing the predicted environmental concentration (PEC) by the predicted no effects concentration (PNEC). PNECs are usually derived from laboratory bioassays, thus requiring the use of application factors to account for uncertainty introduced by the extrapolation from the laboratory to the field, and from measurement to assessment endpoints. Using this approach, often with worst-case assumptions about exposure and species sensitivities, the hope is that chemicals with a quotient of less than one will have a very low probability of causing adverse ecological effects. This approach has received widespread criticism recently, particularly because it tends to be overly conservative and does not adequately estimate the magnitude and probability of causing adverse effects. On the plus side, application factors are simple to use, accepted worldwide, and may be used with limited effects data in a quotient calculation. The alternative approach is to use probabilistic methods such as Monte Carlo simulation, Bayes' theorem or other techniques to estimate risk. Such methods often have rigorous statistical assumptions and may have large data requirements. Stating an effect in probabilistic terms, however, forces the identification of sources of uncertainty and quantification of their impact on risk estimation. In this presentation the authors discuss the advantages and disadvantages of using application factors and probabilistic methods in dealing with uncertainty in ecological risk assessments of chemicals. Based on this analysis, recommendations are presented to assist in choosing the appropriate approach for different types of regulatory programs dealing with toxic chemicals.
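
    The contrast between the two approaches can be sketched in a few lines. The deterministic quotient uses a point PEC, a no-observed-effect concentration (NOEC), and an application factor; the probabilistic alternative estimates an exceedance probability by Monte Carlo. All concentrations and distribution parameters below are invented for illustration.

```python
import random

random.seed(42)

# Deterministic quotient approach: PEC / PNEC with an application factor.
pec, noec, application_factor = 2.0, 50.0, 100.0
pnec = noec / application_factor
quotient = pec / pnec            # > 1 flags potential concern
print(quotient)                  # 4.0 with these illustrative inputs

# Probabilistic alternative: Monte Carlo estimate of the probability
# that a lognormally distributed exposure exceeds a lognormally
# distributed effect threshold (parameters are invented).
n = 100_000
exceed = sum(
    random.lognormvariate(mu=0.7, sigma=0.5)      # exposure
    > random.lognormvariate(mu=2.0, sigma=0.8)    # species sensitivity
    for _ in range(n)
)
print(exceed / n)                # estimated risk probability
```

    The quotient flags "concern" without saying how likely harm is; the Monte Carlo estimate states a probability directly, at the cost of needing full distributions.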

  6. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making.
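
    The scenario-tree step multiplication described above, extended by a trade-volume adjustment, can be sketched as follows (step names, probabilities, and the unit count are all hypothetical):

```python
# Toy import-risk scenario tree (all probabilities hypothetical):
# the per-unit probability of establishment is the product of the
# conditional step probabilities along the pathway.

steps = {
    "agent present in source population": 0.05,
    "agent survives processing":          0.20,
    "infected commodity evades border":   0.50,
    "susceptible host exposed":           0.10,
    "exposure leads to establishment":    0.30,
}

p_unit = 1.0
for step, p in steps.items():
    p_unit *= p
print(p_unit)  # probability per imported unit

# Taking the volume of trade into account (n independent units per
# year) via the complement rule:
n_units = 10_000
p_annual = 1.0 - (1.0 - p_unit) ** n_units
print(round(p_annual, 3))
```

    Even a tiny per-unit probability can imply a substantial annual likelihood at high trade volumes, which is why the abstract stresses that volume of trade must enter the assessment.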

  7. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

    Combined effects of socio-economic, environmental, technological and political factors impact fresh cooling water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from a thermoelectric systems operation and regulation perspective. A systematic approach to characterize a stream environment in relation to extreme drought occurrence, duration and deficit-volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water supply risk analysis and projection methods from a thermoelectric power systems operation and management perspective. The study defines thermoelectric drought as a shortage of cooling water due to stressed supply or beyond operable water temperature limits for an extended period of time requiring power plants to reduce production or completely shut down. It presents a thermoelectric drought risk characterization framework that considers heat content and water quantity facets of adequate water availability for uninterrupted operation of such plants and the safety of their surroundings. In addition, it outlines mechanisms to identify the rate of occurrence of the said droughts and stochastically quantify subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound nonhomogeneous Poisson process.
This study also demonstrates how
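
    A compound nonhomogeneous Poisson process of the general kind described can be simulated by thinning: generate candidate arrivals at a constant bounding rate, accept each with probability intensity(t)/lambda_max, and attach a random loss to each accepted event. The seasonal intensity function and lognormal loss distribution below are illustrative placeholders, not the study's fitted model.

```python
import math
import random

random.seed(1)

# Sketch of a compound nonhomogeneous Poisson process via thinning
# (illustrative intensity and loss distribution only).

def intensity(t_years):
    """Drought events per year; peaks once a year (hypothetical form)."""
    return 1.2 + 1.0 * math.sin(2.0 * math.pi * t_years)

LAMBDA_MAX = 2.2  # upper bound on intensity(), required for thinning

def simulate_losses(horizon_years):
    """Return (number of events, total loss) over the horizon."""
    t, total_loss, n_events = 0.0, 0.0, 0
    while True:
        t += random.expovariate(LAMBDA_MAX)       # candidate arrival
        if t > horizon_years:
            return n_events, total_loss
        if random.random() < intensity(t) / LAMBDA_MAX:  # accept (thinning)
            n_events += 1
            total_loss += random.lognormvariate(0.0, 1.0)  # event loss

events, loss = simulate_losses(horizon_years=30.0)
print(events, round(loss, 2))
```

    Repeating the simulation many times would yield the loss distribution from which risk measures for the sector could be read off.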

  8. Handbook of methods for risk-based analysis of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1996-09-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operations (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analyses and engineering judgments. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Risk Assessments (PRAs). The US Nuclear Regulatory Commission (USNRC) Office of Research sponsored research to develop systematic, risk-based methods to improve various aspects of TS requirements. A handbook of methods summarizing such risk-based approaches was completed in 1994. It is expected that this handbook will provide valuable input to NRC's present work in developing guidance for using PRA in risk-informed regulation. The handbook addresses reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), managing plant configurations, and scheduling maintenance.

  9. Methods of analysis for chemicals that disrupt cellular signaling pathways: risk assessment for potential endocrine disruptors.

    PubMed

    Umezawa, Yoshio; Ozawa, Takeaki; Sato, Moritoshi; Inadera, Hidekuni; Kaneko, Shuichi; Kunimoto, Manabu; Hashimoto, Shin-ichi

    2005-01-01

    Here we present a basic concept and several examples of methods of analysis for chemicals that disrupt cellular signaling pathways, in view of risk assessment for potential endocrine disrupting chemicals (EDCs). The key cellular signaling pathways include 1) ER/coactivator interaction, 2) AR translocation into the nucleus, 3) ER/NO/sGC/cGMP, 4) ER/Akt, 5) ER/Src, 6)ER/Src/Grb2, and 7) ER/Ca2+/CaM/CaMK pathways. These were visualized in relevant live cells using newly developed fluorescent and bioluminescent probes. Changes in cellular signals were thereby observed in nongenomic pathways of steroid hormones upon treatment of the target cells with steroid hormones and related chemicals. This method of analysis appears to be a rational approach to high-throughput prescreening (HTPS) of biohazardous chemicals, EDCs, in particular. Also described was the screening of gene expression by serial analysis of gene expression and gene chips upon applying EDCs to breast cancer cells, mouse livers, and human neuroblastoma NB-1 cells.

  10. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    PubMed

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently, and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity × Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various different tools, such as HAZOP, FMEA, etc.: our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis, therefore, focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were, thus, able to identify a number of critical factors such as the incoming gas lines and the ignition electrode.
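
    The FMEA criticality step can be sketched as a risk priority number ranking, where RPN = severity × occurrence × detection, each scored on a 1-10 scale (higher is worse). The failure modes and scores below are invented for illustration, not taken from the study.

```python
# Minimal FMEA-style ranking (illustrative failure modes and scores):
# RPN = severity * occurrence * detection, each scored 1-10.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("burner fails to ignite",        8, 5, 3),
    ("biomass feed line blockage",    5, 6, 4),
    ("cyclone seal leak (CO escape)", 9, 3, 6),
    ("bed temperature sensor drift",  6, 4, 7),
]

ranked = sorted(
    ((name, s * o * d) for name, s, o, d in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"{rpn:4d}  {name}")
```

    Note how a moderate-severity mode with poor detectability can outrank a higher-severity one, which is exactly why detection enters the criticality calculation.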

  11. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in the petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in the activities and carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company to meet rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  12. Risk analysis before launch

    NASA Astrophysics Data System (ADS)

    Behlert, Rene

    1988-08-01

    A quality methodology is proposed based on risk analysis and observation of technical facts. The procedures for the quantification of a risk are described and examples are given. A closed-loop quality analysis is described. Overall mission safety goals are described. The concept of maintenance is developed into evolutionary maintenance. It is shown that a large amount of data must be processed to apply the proposed methods. The use of computer data processing is required.

  13. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  14. A Refinement of Risk Analysis Procedures for Trichloroethylene Through the Use of Monte Carlo Method in Conjunction with Physiologically Based Pharmacokinetic Modeling

    DTIC Science & Technology

    1993-09-01

    This study refines risk analysis procedures for trichloroethylene (TCE) using a physiologically based pharmacokinetic (PBPK) model in conjunction... promulgate, and better present, more realistic standards.... Risk analysis, physiologically based pharmacokinetics, PBPK, trichloroethylene, Monte Carlo method.

  15. Risk Analysis

    NASA Technical Reports Server (NTRS)

    Morring, Frank, Jr.

    2004-01-01

    A National Academies panel says the Hubble Space Telescope is too valuable for gambling on a long-shot robotic mission to extend its service life, and urges a shuttle servicing mission instead. Directly contradicting Administrator Sean O'Keefe, who killed a planned fifth shuttle servicing mission to the telescope on grounds it was too dangerous for a human crew in the post-Challenger environment, the expert committee found that upgrades to shuttle safety actually should make it less hazardous to fly to the telescope than it was before Columbia was lost. Risks of a telescope-servicing mission are only marginally greater than those of the planned missions to the International Space Station (ISS) O'Keefe has authorized, the panel found. After comparing those risks to the dangers inherent in trying to develop a complex space robot in the 39 months remaining in the Hubble's estimated service life, the panel opted for the human mission to save "one of the major achievements of the American space program," in the words of Louis J. Lanzerotti, its chairman.

  16. Integrate life-cycle assessment and risk analysis results, not methods.

    PubMed

    Linkov, Igor; Trump, Benjamin D; Wender, Ben A; Seager, Thomas P; Kennedy, Alan J; Keisler, Jeffrey M

    2017-08-04

    Two analytic perspectives on environmental assessment dominate environmental policy and decision-making: risk analysis (RA) and life-cycle assessment (LCA). RA focuses on management of a toxicological hazard in a specific exposure scenario, while LCA seeks a holistic estimation of impacts of thousands of substances across multiple media, including non-toxicological and non-chemically deleterious effects. While recommendations to integrate the two approaches have remained a consistent feature of environmental scholarship for at least 15 years, the current perception is that progress is slow largely because of practical obstacles, such as a lack of data, rather than insurmountable theoretical difficulties. Nonetheless, the emergence of nanotechnology presents a serious challenge to both perspectives. Because the pace of nanomaterial innovation far outstrips acquisition of environmentally relevant data, it is now clear that a further integration of RA and LCA based on dataset completion will remain futile. In fact, the two approaches are suited for different purposes and answer different questions. A more pragmatic approach to providing better guidance to decision-makers is to apply the two methods in parallel, integrating only after obtaining separate results.

  17. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. Construction of an artificial chamber to maintain controlled environmental conditions and components/chemicals used in artificial soil formulation is briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si due to smaller cell size were eliminated on the mono-Si coupons with large cells to obtain highly repeatable measurements. This study indicates that the reflectance measurements between 600 and 700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. Defect chart, degradation rates (both string and module levels) and safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by means of ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  18. Variation in cancer risk estimates for exposure to powerline frequency electromagnetic fields: a meta-analysis comparing EMF measurement methods.

    PubMed

    Miller, M A; Murphy, J R; Miller, T I; Ruttenber, A J

    1995-04-01

    We used meta-analysis to synthesize the findings from eleven case-control studies on cancer risks in humans exposed to 50-60 Hertz powerline electromagnetic fields (EMFs). Pooled estimates of risk are derived for different EMF measurement methods and types of cancer. EMF measurement methods are classified as: wiring configuration codes, distance to power distribution equipment, spot measurements of magnetic fields, and calculated indices based on distance to power distribution equipment and historic load data. Pooled odds ratios depicting the risk of cancer by each measurement type are presented for all cancers combined, leukemia for all age groups and childhood leukemia. The wire code measurement technique was associated with a significantly increased risk for all three cancer types, while spot measures consistently showed non-significant odds ratios. Distance measures and the calculated indices produced risk estimates which were significant only for leukemia.
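
    The pooling step behind estimates like these is commonly a fixed-effect, inverse-variance average of log odds ratios; a minimal sketch with invented study data (not the eleven studies analyzed above):

```python
import math

# Fixed-effect (inverse-variance) pooling of odds ratios.
# Study data are invented for illustration.

studies = [  # (odds_ratio, standard_error_of_log_odds_ratio)
    (1.8, 0.30),
    (1.2, 0.25),
    (2.1, 0.40),
]

weights = [1.0 / se ** 2 for _, se in studies]
pooled_log_or = sum(
    w * math.log(or_) for (or_, _), w in zip(studies, weights)
) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
pooled_or = math.exp(pooled_log_or)

# 95% confidence interval on the pooled odds ratio.
lo = math.exp(pooled_log_or - 1.96 * pooled_se)
hi = math.exp(pooled_log_or + 1.96 * pooled_se)
print(round(pooled_or, 2), round(lo, 2), round(hi, 2))
```

    A pooled estimate is "significant" in the abstract's sense when the confidence interval excludes 1. Random-effects pooling, which allows between-study heterogeneity, would widen the interval.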

  19. Variation in cancer risk estimates for exposure to powerline frequency electromagnetic fields: A meta-analysis comparing EMF measurement methods

    SciTech Connect

    Miller, M.A.; Murphy, J.R.; Miller, T.I.; Ruttenber, A.J.

    1995-04-01

    We used meta-analysis to synthesize the findings from eleven case-control studies on cancer risks in humans exposed to 50-60 Hertz powerline electromagnetic fields (EMFs). Pooled estimates of risk are derived for different EMF measurement methods and types of cancer. EMF measurement methods are classified as: wiring configuration codes, distance to power distribution equipment, spot measurements of magnetic fields, and calculated indices based on distance to power distribution equipment and historic load data. Pooled odds ratios depicting the risk of cancer by each measurement type are presented for all cancers combined, leukemia for all age groups and childhood leukemia. The wire code measurement technique was associated with a significantly increased risk for all three cancer types, while spot measures consistently showed non-significant odds ratios. Distance measures and the calculated indices produced risk estimates which were significant only for leukemia. 24 refs., 6 tabs.

  20. New Methods for the Analysis of Heartbeat Behavior in Risk Stratification

    PubMed Central

    Glass, Leon; Lerma, Claudia; Shrier, Alvin

    2011-01-01

    Developing better methods for risk stratification for tachyarrhythmic sudden cardiac death remains a major challenge for physicians and scientists. Since the transition from sinus rhythm to ventricular tachycardia/fibrillation happens by different mechanisms in different people, it is unrealistic to think that a single measure will be adequate to provide a good index for risk stratification. We analyze the dynamical properties of ventricular premature complexes over 24 h in an effort to understand the underlying mechanisms of ventricular arrhythmias and to better understand the arrhythmias that occur in individual patients. Two-dimensional density plots, called heartprints, correlate characteristic features of the dynamics of premature ventricular complexes and the sinus rate. Heartprints show distinctive characteristics in individual patients. Based on a better understanding of the nature of transitions from sinus rhythm to sudden cardiac death and the mechanisms of arrhythmia prior to cardiac arrest, it should be possible to develop better methods for risk stratification. PMID:22144963
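
    A heartprint-style display is, at bottom, a two-dimensional density of a premature-beat feature against the sinus rate. A toy version with simulated data (purely illustrative, not the authors' algorithm or real Holter data):

```python
import random
from collections import Counter

random.seed(7)

# Toy "heartprint"-style 2D density: bin the coupling interval of a
# premature ventricular beat against the concurrent sinus rate.
# The simulated data are purely illustrative.

records = []
for _ in range(5000):
    sinus_rate = random.gauss(70.0, 8.0)   # beats per minute
    coupling = random.gauss(0.45, 0.05)    # seconds
    records.append((sinus_rate, coupling))

def bin2d(rate, coup, rate_width=5.0, coup_width=0.05):
    """Map a (rate, coupling) pair to integer 2D bin indices."""
    return (int(rate // rate_width), int(coup // coup_width))

density = Counter(bin2d(r, c) for r, c in records)
(rate_bin, coup_bin), count = density.most_common(1)[0]
print(rate_bin * 5.0, coup_bin * 0.05, count)  # densest region
```

    In a real heartprint the interesting structure is precisely the deviation of this density from a single blob, which differs from patient to patient.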

  1. Modelling childhood caries using parametric competing risks survival analysis methods for clustered data.

    PubMed

    Stephenson, J; Chadwick, B L; Playle, R A; Treasure, E T

    2010-01-01

    Caries in primary teeth is an ongoing issue in children's dental health. Its quantification is affected by clustering of data within children and the concurrent risk of exfoliation of primary teeth. This analysis of caries data of 103,776 primary molar tooth surfaces from a cohort study of 2,654 British children aged 4-5 years at baseline applied multilevel competing risks survival analysis methodology to identify factors significantly associated with caries occurrence in primary tooth surfaces in the presence of the concurrent risk of exfoliation, and assessed the effect of exfoliation on caries development. Multivariate multilevel parametric survival models were applied at surface level to the analysis of the sound-carious and sound-exfoliation transitions to which primary tooth surfaces are subject. Socio-economic class, fluoridation status and surface type were found to be the strongest predictors of primary caries, with the highest rates of occurrence and lowest median survival times associated with occlusal surfaces of children from poor socio-economic class living in non-fluoridated areas. The concurrent risk of exfoliation was shown to reduce the distinction in survival experience between different types of surfaces, and between surfaces of teeth from children of different socio-economic class or fluoridation status. Clustering of data had little effect on inferences of parameter significance.

  2. Method for improved prediction of bone fracture risk using bone mineral density in structural analysis

    NASA Technical Reports Server (NTRS)

    Cann, Christopher E. (Inventor); Faulkner, Kenneth G. (Inventor)

    1992-01-01

    A non-invasive in-vivo method of analyzing a bone for fracture risk includes obtaining data from the bone such as by computed tomography or projection imaging, which data represents a measure of bone material characteristics such as bone mineral density. The distribution of the bone material characteristics is used to generate a finite element method (FEM) mesh from which load capability of the bone can be determined. In determining load capability, the bone is mathematically compressed, and stress, strain, force, and force/area versus bone material characteristics are determined.

  3. Comparison of methods for estimating the attributable risk in the context of survival analysis.

    PubMed

    Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M

    2017-01-23

    The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage except for nonparametric methods at the end of follow-up for a sample size of 1,000 especially. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas semiparametric and parametric approaches that both relied on the proportional hazards assumption performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. 
In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears
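
    The "simpler method" based on baseline exposure prevalence and a Cox hazard ratio corresponds, in its basic form, to Levin's attributable-risk formula; a minimal sketch with invented numbers (not the E3N estimates):

```python
# Levin-style attributable risk from exposure prevalence and a
# relative-risk measure (here a Cox hazard ratio). Numbers invented.

def attributable_risk(prevalence, hazard_ratio):
    """Population fraction of cases attributable to the exposure."""
    excess = prevalence * (hazard_ratio - 1.0)
    return excess / (1.0 + excess)

# e.g. 30% of the cohort exposed, hazard ratio 1.5
ar = attributable_risk(0.30, 1.5)
print(round(ar, 3))  # prints 0.13
```

    Unlike the survival-function-based definitions the paper compares, this point estimate is constant over follow-up, which is one reason the time-varying semiparametric and parametric approaches are of interest.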

  4. Integrated seismic risk analysis using simple weighting method: the case of residential Eskişehir, Turkey

    NASA Astrophysics Data System (ADS)

    Pekkan, E.; Tun, M.; Guney, Y.; Mutlu, S.

    2015-06-01

    A large part of the residential areas in Turkey are at risk from earthquakes. The main factors that threaten residential areas during an earthquake are poor quality building stock and soil problems. Liquefaction, loss of bearing capacity, amplification, slope failure, and landslide hazards must be taken into account for residential areas that are close to fault zones and covered with younger sediments. Analyzing these hazards separately and then combining the analyses would ensure a more realistic risk evaluation according to population density than analyzing several risks based on a single parameter. In this study, an integrated seismic risk analysis of central Eskişehir was performed based on two earthquake related parameters, liquefaction and amplification. The analysis used a simple weighting method. Other earthquake-related problems such as loss of bearing capacity, landslides, and slope failures are not significant for Eskişehir because of the geological and the topographical conditions of the region. According to the integrated seismic risk analysis of the Eskişehir residential area, the populated area is found to be generally at medium to high risk during a potential earthquake.
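
    A simple weighting integration of the kind described can be sketched as a weighted overlay of per-cell hazard scores, scaled by relative population density. The weights, scores, and grid cells below are hypothetical, not the Eskişehir data.

```python
# Sketch of a simple weighting method for integrated seismic risk
# (weights, scores and cells are hypothetical): combine normalized
# liquefaction and amplification scores, then scale by population.

W_LIQUEFACTION, W_AMPLIFICATION = 0.5, 0.5

# Each grid cell: (liquefaction score 0-1, amplification score 0-1,
#                  relative population density 0-1)
cells = {
    "cell A": (0.8, 0.6, 0.9),
    "cell B": (0.3, 0.9, 0.4),
    "cell C": (0.1, 0.2, 0.7),
}

def risk(liq, amp, pop):
    """Weighted hazard score scaled by population density."""
    hazard = W_LIQUEFACTION * liq + W_AMPLIFICATION * amp
    return hazard * pop

for name, (liq, amp, pop) in cells.items():
    print(name, round(risk(liq, amp, pop), 3))
```

    Thresholding the combined score would then yield the low/medium/high risk zones of the kind reported for the residential area.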

  5. A decomposition-integration risk analysis method for real-time operation of a complex flood control system

    NASA Astrophysics Data System (ADS)

    Chen, Juan; Zhong, Ping-An; Zhang, Yu; Navar, David; Yeh, William W.-G.

    2017-03-01

    Risk analysis plays an important role in decision making for real-time flood control operation of complex flood control systems. A typical flood control system consists of reservoirs, river channels, and downstream control points. The system generally is characterized by nonlinearity and large scale. Additionally, the input variables are mostly stochastic. Because of the dimensionality problem, generally, it would not be possible to carry out risk analysis without decomposition. In this paper, we propose a decomposition-integration approach whereby the original complex flood control system is decomposed into a number of independent subsystems. We conduct risk analysis for each subsystem and then integrate the results by means of combination theory of stochastic processes. We evaluate the propagation of uncertainties through the complex flood control system and calculate the risk of reservoir overtopping, as well as the risk of flooding at selected downstream control points. We apply the proposed methodology to a flood control system in the middle reaches of the Huaihe River basin in China. The results show that the proposed method is practical and provides a way to estimate the risks in real-time flood control operation of a complex flood control system.

  6. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  7. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-04-01

    The Department of Energy's (DOE) non-nuclear facilities generally require only a qualitative accident analysis to assess facility risks in accordance with DOE Order 5481.1B, Safety Analysis and Review System. Achieving a meaningful qualitative assessment of risk necessarily requires the use of suitable non-numerical assessment criteria. Typically, the methods and criteria for assigning facility-specific accident scenarios to the qualitative severity and likelihood classification system in the DOE order require significant judgment in many applications. Systematic methods for more consistently assigning the total accident scenario frequency and associated consequences are required to substantiate and enhance future risk ranking between various activities at Sandia National Laboratories (SNL). SNL's Risk Management and National Environmental Policy Act (NEPA) Department has developed an improved methodology for performing qualitative risk assessments in accordance with the DOE order requirements. Products of this effort are an improved set of qualitative descriptions that permit (1) definition of the severity for both technical and programmatic consequences that may result from a variety of accident scenarios, and (2) qualitative representation of the likelihood of occurrence. These sets of descriptions are intended to facilitate proper application of DOE criteria for assessing facility risks.
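
    A qualitative severity/likelihood classification of the general kind such an order implies can be sketched as a small lookup; the category labels, scoring, and cutoffs below are illustrative assumptions, not the SNL or DOE criteria.

```python
# Minimal qualitative risk matrix (labels and cutoffs illustrative):
# map a severity category and a likelihood category to a risk class.

SEVERITY = ["negligible", "marginal", "critical", "catastrophic"]
LIKELIHOOD = ["unlikely", "possible", "likely"]

def risk_class(severity, likelihood):
    s = SEVERITY.index(severity)      # 0..3
    l = LIKELIHOOD.index(likelihood)  # 0..2
    score = s + l                     # simple additive ranking
    if score >= 4:
        return "high"
    if score >= 2:
        return "medium"
    return "low"

print(risk_class("critical", "likely"))        # prints high
print(risk_class("negligible", "possible"))    # prints low
print(risk_class("catastrophic", "unlikely"))  # prints medium
```

    The point of writing the mapping down explicitly is the abstract's point too: consistent, repeatable assignment of scenarios to classes instead of case-by-case judgment.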

  8. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    PubMed

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from the perspective of multi-sensor information fusion. By considering the non-exclusiveness between the fuzzy linguistic evaluations of failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and an existing method to show the effectiveness of the proposed model.
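The traditional RPN approach the abstract criticises multiplies three ordinal ratings. A minimal sketch (the 1-10 scales and the example ratings are hypothetical) also shows one of its known shortcomings, namely that very different rating combinations can collapse to the same priority:

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Traditional risk priority number: the product of the three 1-10 ratings."""
    return severity * occurrence * detection

# Two very different failure modes receive the same priority:
a = rpn(severity=9, occurrence=2, detection=2)  # safety-critical but rare
b = rpn(severity=2, occurrence=9, detection=2)  # trivial but frequent
print(a, b, a == b)  # -> 36 36 True
```

This ambiguity is one motivation for the fuzzy alternatives the paper pursues.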

  9. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions

    PubMed Central

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R

    2005-01-01

    Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process
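The quoted 59% figure follows directly from the two published criticality-index totals; a sketch using only those totals (the per-failure-mode CIs are not reproduced here):

```python
def risk_reduction(ci_old: float, ci_new: float) -> float:
    """Relative reduction in the summed criticality index."""
    return (ci_old - ci_new) / ci_old

# Sums of the criticality indices over the 18 identified failure modes.
old_total, new_total = 3415, 1397
print(f"{risk_reduction(old_total, new_total):.0%}")  # -> 59%
```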

  10. An Improved Breast Epithelial Sampling Method for Molecular Profiling and Biomarker Analysis in Women at Risk for Breast Cancer.

    PubMed

    Danforth, David N; Warner, Andrew C; Wangsa, Darawalee; Ried, Thomas; Duelli, Dominik; Filie, Armando C; Prindiville, Sheila A

    2015-01-01

    There is a strong need to define the molecular changes in normal at-risk breast epithelium to identify biomarkers and new targets for breast cancer prevention and to develop a molecular signature for risk assessment. Improved methods of breast epithelial sampling are needed to promote whole-genome molecular profiling, increase ductal epithelial cell yield, and reduce sample cell heterogeneity. We developed an improved method of breast ductal sampling with ductal lavage through a 22-gauge catheter and collection of ductal samples with a microaspirator. Women at normal risk or increased risk for breast cancer were studied. Ductal epithelial samples were analyzed for cytopathologic changes, cellular yield, epithelial cell purity, quality and quantity of DNA and RNA, and use in multiple downstream molecular applications. We studied 50 subjects, including 40 subjects at normal risk for breast cancer and 37 subjects with non-nipple aspirate fluid-yielding ducts. This method provided multiple 1.0 mL samples of high ductal epithelial cell content (median ≥8 samples per subject of ≥5,000 cells per sample) with 80%-100% epithelial cell purity. Extraction of a single intact ductal sample (fluid and cells) or the separate frozen cellular component provided DNA and RNA for multiple downstream studies, including quantitative reverse transcription-polymerase chain reaction (PCR) for microRNA, quantitative PCR for the human telomerase reverse transcriptase gene, whole-genome DNA amplification, and array comparative genomic hybridization analysis. An improved breast epithelial sampling method has been developed, which should significantly expand the acquisition and biomarker analysis of breast ductal epithelium in women at risk for breast cancer.

  11. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  13. Development of a method to predict crash risk using trend analysis of driver behavior changes over time.

    PubMed

    Murata, Atsuo; Fukuda, Kohei

    2016-01-01

    This study aimed at identifying and predicting, before a virtual accident actually occurs, the point in time with a high risk of such an accident, using changes in behavioral measures and subjective drowsiness ratings over time together with trend analysis of each behavioral measure. Behavioral measures such as neck bending angle and tracking error in steering maneuvering during the simulated driving task were recorded under the low-arousal condition of participants who had stayed up all night without sleeping. The trend analysis of each evaluation measure was conducted using a single regression model in which time was the independent variable and each measure of drowsiness the dependent variable. Applying this trend analysis technique to the experimental data, we proposed a method to predict in advance the point in time with a high risk of a virtual accident, that is, before the point at which the participant would have encountered a crucial accident had he or she continued driving (in a real-world driving environment, this corresponds to a crash). On the basis of applying the proposed trend analysis to the behavioral measures, we found that the approach could predict such high-risk points in time in advance. The proposed method is a promising technique for predicting in advance time zones with a potentially high risk (probability) of being involved in an accident due to drowsy driving and for warning drivers of such a drowsy and risky state.
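The trend-analysis step described above can be sketched as fitting a single linear regression of a drowsiness-related measure on time and extrapolating to a risk threshold. The sample data, threshold, and units below are invented for illustration, not taken from the study:

```python
import numpy as np

# Hypothetical behavioral measure (e.g. tracking error) sampled over time.
t = np.array([0, 5, 10, 15, 20, 25], dtype=float)          # minutes
tracking_error = np.array([1.0, 1.3, 1.5, 1.9, 2.1, 2.4])  # arbitrary units

# Single regression model: measure = slope * time + intercept.
slope, intercept = np.polyfit(t, tracking_error, 1)

# Extrapolate to an assumed high-risk threshold on the measure.
threshold = 3.0
t_risk = (threshold - intercept) / slope
print(f"predicted high-risk time: {t_risk:.1f} min")
```

In this framing, a warning could be issued as soon as the fitted trend projects a threshold crossing within some look-ahead window.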

  14. The "Dry-Run" Analysis: A Method for Evaluating Risk Scores for Confounding Control.

    PubMed

    Wyss, Richard; Hansen, Ben B; Ellis, Alan R; Gagne, Joshua J; Desai, Rishi J; Glynn, Robert J; Stürmer, Til

    2017-05-01

    A propensity score (PS) model's ability to control confounding can be assessed by evaluating covariate balance across exposure groups after PS adjustment. The optimal strategy for evaluating a disease risk score (DRS) model's ability to control confounding is less clear. DRS models cannot be evaluated through balance checks within the full population, and they are usually assessed through prediction diagnostics and goodness-of-fit tests. A proposed alternative is the "dry-run" analysis, which divides the unexposed population into "pseudo-exposed" and "pseudo-unexposed" groups so that differences on observed covariates resemble differences between the actual exposed and unexposed populations. With no exposure effect separating the pseudo-exposed and pseudo-unexposed groups, a DRS model is evaluated by its ability to retrieve an unconfounded null estimate after adjustment in this pseudo-population. We used simulations and an empirical example to compare traditional DRS performance metrics with the dry-run validation. In simulations, the dry run often improved assessment of confounding control, compared with the C statistic and goodness-of-fit tests. In the empirical example, PS and DRS matching gave similar results and showed good performance in terms of covariate balance (PS matching) and controlling confounding in the dry-run analysis (DRS matching). The dry-run analysis may prove useful in evaluating confounding control through DRS models. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling, and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria, and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  16. Severe COPD Exacerbation Risk and Long-Acting Bronchodilator Treatments: Comparison of Three Observational Data Analysis Methods.

    PubMed

    Roberts, Melissa H; Mapel, Douglas W; Borrego, Matthew E; Raisch, Dennis W; Georgopoulos, Larry; van der Goes, David

    2015-06-01

    Results from three observational methods for assessing effectiveness of long-acting bronchodilator therapies for reducing severe exacerbations of chronic obstructive pulmonary disease (COPD) were compared: intent-to-treat (ITT), as protocol (AP), and an as-treated analysis that utilized a marginal structural model (MSM) incorporating time-varying covariates related to treatment adherence and moderate exacerbations. Severe exacerbation risk was assessed over a 2-year period using claims data for patients aged ≥40 years who initiated long-acting muscarinic antagonist (LAMA), inhaled corticosteroid/long-acting beta-agonist (ICS/LABA), or triple therapy (LAMA + ICS/LABA). A total of 5475 COPD patients met inclusion criteria. Six months post-initiation, 53.5 % of patients discontinued using any therapy. The ITT analysis found an increased severe exacerbation risk for triple therapy treatment (hazard ratio [HR] 1.24; 95 % confidence interval [CI] 1.00-1.53). No increased risk was found in the AP (HR 1.00; 95 % CI 0.73-1.36), or MSM analyses (HR 1.11; 95 % CI 0.68-1.81). The MSM highlighted important associations among post-index events. Neglecting to adjust for treatment discontinuation may produce biased risk estimates. The MSM approach is a promising tool to compare chronic disease management by illuminating relationships between treatment decisions, adherence, patient choices, and outcomes.

  17. Assessing the Risk of Secondary Transfer Via Fingerprint Brush Contamination Using Enhanced Sensitivity DNA Analysis Methods.

    PubMed

    Bolivar, Paula-Andrea; Tracey, Martin; McCord, Bruce

    2016-01-01

    Experiments were performed to determine the extent of cross-contamination of DNA resulting from secondary transfer due to fingerprint brushes used on multiple items of evidence. Analysis of both standard and low copy number (LCN) STR was performed. Two different procedures were used to enhance sensitivity, post-PCR cleanup and increased cycle number. Under standard STR typing procedures, some additional alleles were produced that were not present in the controls or blanks; however, there was insufficient data to include the contaminant donor as a contributor. Inclusion of the contaminant donor did occur for one sample using post-PCR cleanup. Detection of the contaminant donor occurred for every replicate of the 31 cycle amplifications; however, using LCN interpretation recommendations for consensus profiles, only one sample would include the contaminant donor. Our results indicate that detection of secondary transfer of DNA can occur through fingerprint brush contamination and is enhanced using LCN-DNA methods. © 2015 American Academy of Forensic Sciences.

  18. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  19. Methods for risk-benefit and cost-benefit analysis of vaccinations.

    PubMed

    Wiedermann, G; Ambrosch, F

    1979-01-01

    The basic aim of vaccination is to prevent complications of the disease that are more severe than any that might be caused by the vaccine itself. Following this idea, formulae were developed for calculating the risk ratio Q and the risk difference D. The following parameters are considered in these formulae: risk of the disease (R), risk of vaccination (r), protection rate (p), and duration of protection (t). In addition, for calculations with partially vaccinated populations the immunization rate (I) has to be considered. A vaccination is beneficial if Q > 1.0 and D > 0. If vaccinations have proved to be valuable from the medical point of view, additional cost-benefit calculations may be of great importance for socio-economic considerations. Consequently, the above-mentioned formulae were modified for calculation of the benefit-cost ratio (Qc) and the benefit-cost difference (Dc) for monovalent as well as for bivalent vaccines.
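The abstract names the parameters R, r, p, and t but does not give the formulae themselves. The sketch below therefore uses an assumed illustrative reading (risk averted by vaccination relative to, and minus, the risk incurred), not the authors' published expressions:

```python
def risk_ratio_q(R: float, r: float, p: float, t: float) -> float:
    """Assumed form, for illustration only: risk averted relative to risk incurred."""
    return (R * p * t) / r

def risk_difference_d(R: float, r: float, p: float, t: float) -> float:
    """Assumed form, for illustration only: absolute risk averted minus vaccine risk."""
    return R * p * t - r

# Hypothetical numbers: annual disease risk 1e-3, vaccination risk 1e-5,
# 90% protection for 5 years.  Beneficial when Q > 1.0 and D > 0.
Q = risk_ratio_q(R=1e-3, r=1e-5, p=0.9, t=5)
D = risk_difference_d(R=1e-3, r=1e-5, p=0.9, t=5)
print(Q > 1.0 and D > 0)  # -> True
```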

  20. Risk: a multidisciplinary concept analysis.

    PubMed

    McNeill, Charleen

    2014-01-01

    To analyze the concept of risk utilizing Walker and Avant's method of analysis to determine a conceptual definition applicable within nursing and nursing research. The mental constructs and consequences of risk have a proactive connotation compared with the negative behaviors often identified as illustrations of risk. A new conceptual definition of risk provides insight into an understanding of risk regardless of discipline. Its application to the metaparadigm of nursing should be the impetus for action and education. Formalizing the mental constructs of the concept of risk in a clear manner facilitates the inclusion of its latent constructs in nursing research. © 2013 Wiley Periodicals, Inc.

  1. Methods for Multitemporal Analysis of Satellite Data Aimed at Environmental Risk Monitoring

    NASA Astrophysics Data System (ADS)

    Caprioli, M.; Scognamiglio, A.

    2012-08-01

    In recent years the topic of environmental monitoring has gained particular importance, owing in part to the diminishing short-term stability and predictability of climatic events. Facing this situation, often in terms of emergency, involves high and unpredictable costs for public agencies. Prevention of damage caused by natural disasters is not only a matter of weather forecasts; it requires constant attention and the practice of monitoring and control of human activity on the territory. In practice, the problem is not knowing whether and when an event will affect a given area, but recognizing the possible damage if the event happened, adopting adequate measures to reduce it to a minimum, and procuring the necessary tools for timely intervention. On the other hand, the surveying technologies should be as accurate and updatable as possible in order to guarantee high standards, which involves the analysis of a great amount of data. The management of such data requires integration and calculation systems with specialized software and fast, reliable connection and communication networks. To meet these requirements, current satellite technology, with recurrent data acquisition for the timely generation of cartographic products updated and coherent with the territorial investigation, offers the possibility of filling the temporal gap between the need for urgent information and official reference information. Among evolved image processing techniques, change detection analysis is useful for facilitating the identification of environmental temporal variations, contributing to reduced user intervention by means of process automation and progressively improving the qualitative and quantitative accuracy of results. The research investigates automatic methods for detecting land cover transformations by means of "change detection" techniques executable on satellite data that are heterogeneous in spatial and spectral resolution, with homogenization and registration in a unique

  2. Automated Method for Analysis of Mammographic Breast Density - A Technique for Breast Cancer Risk Estimation

    DTIC Science & Technology

    2006-07-01


  3. Utility Theory as a Method to Minimise the Risk in Deformation Analysis Decisions

    NASA Astrophysics Data System (ADS)

    Zhang, Yin; Neumann, Ingo

    2014-11-01

    Deformation monitoring usually focuses on the detection of whether the monitored objects satisfy the given properties (e.g. being stable or not), and makes further decisions to minimise the risks, for example, the consequences and costs in case of collapse of artificial objects and/or natural hazards. With this intention, a methodology relying on hypothesis testing and utility theory is reviewed in this paper. The main idea of utility theory is to judge each possible outcome with a utility value. The presented methodology makes it possible to minimise the risk of an individual monitoring project by considering the costs and consequences of all possible situations within the decision process. It is not the danger itself (that the monitored object may collapse) that can be reduced; rather, the risk (based on the utility values multiplied by the danger) can be described more appropriately, and therefore more valuable decisions can be made. In particular, the opportunity for the measurement process itself to minimise the risk is a key issue. In this paper, application of the methodology to two of the classical cases in hypothesis testing will be discussed in detail: 1) both probability density functions (pdfs) of tested objects under null and alternative hypotheses are known; 2) only the pdf under the null hypothesis is known and the alternative hypothesis is treated as the pure negation of the null hypothesis. Afterwards, a practical example in deformation monitoring is introduced and analysed. Additionally, the way in which the magnitudes of utility values (consequences of a decision) influence the decision will be considered and discussed at the end.
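The core idea above, judging a decision rule by the expected utility of its outcomes, can be sketched for the stable/unstable case. All probabilities and utility values below are hypothetical, chosen only to illustrate how a costly miss dominates the expected utility:

```python
# Error rates of the hypothesis test (hypothetical values).
alpha = 0.05   # P(reject H0 | H0 true): false alarm
beta = 0.20    # P(accept H0 | H1 true): missed deformation
p_h1 = 0.10    # assumed prior probability that real deformation is present

# Utilities (negative costs) of the four possible outcomes, arbitrary units.
u_true_accept = 0.0       # object stable, declared stable
u_false_alarm = -10.0     # object stable, needless intervention
u_miss = -1000.0          # deformation missed: possible collapse
u_true_reject = -50.0     # deformation detected, intervention cost

eu = (1 - p_h1) * ((1 - alpha) * u_true_accept + alpha * u_false_alarm) \
   + p_h1 * (beta * u_miss + (1 - beta) * u_true_reject)
print(f"expected utility of this decision rule: {eu:.2f}")
```

Comparing this expected utility across candidate test designs (e.g. different alpha, or a more precise measurement process that lowers beta) is what lets the measurement process itself minimise the risk.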

  4. Risk analysis highly valued.

    PubMed

    Gammelsaeter, Håkon; Ramstad, Jens Eirik; Røv, Ann Solberg; Walseth, Frode; Paulsen, Anne Margrethe

    2003-11-01

    It is felt that risk and vulnerability analysis is an excellent means of assessing and communicating risk and inconvenience related to extensive construction activities. The main reasons for this are: It uncovers the risks and inconveniences involved. Risk reducing and alert measures are identified. Preventive action and emergency plans are implemented. It is easy to learn. It is unbureaucratic. It promotes cross-professional communication. It distributes correct information very effectively.

  5. Recasting risk analysis methods in terms of object-oriented modeling techniques

    SciTech Connect

    Wyss, G.D.; Craft, R.L.; Vandewart, R.L.; Funkhouser, D.R.

    1998-08-01

    For more than two decades, risk analysts have relied on powerful logic-based models to perform their analyses. However, the applicability of these models has been limited because they can be complex and expensive to develop. Analysts must frequently start from scratch when analyzing a new (but similar) system because the understanding of how the system works exists only in the mind of the analyst and is only incompletely instantiated in the actual logic model. This paper introduces the notion of using explicit object-oriented system models, such as those embodied in computer-aided software engineering (CASE) tools, to document the analyst's understanding of the system and appropriately capture how the system works. It also shows that from these models, standard assessment products, such as fault trees and event trees, can be automatically derived.

  6. Risk analysis methodology survey

    NASA Technical Reports Server (NTRS)

    Batson, Robert G.

    1987-01-01

    NASA regulations require that formal risk analysis be performed on a program at each of several milestones as it moves toward full-scale development. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from simple checklists to complex network-based simulation, were surveyed. A Program Risk Analysis Handbook was prepared in order to provide both analyst and manager with a guide for selection of the most appropriate technique.

  7. Analysis of the LaSalle Unit 2 nuclear power plant: Risk Methods Integration and Evaluation Program (RMIEP). Volume 8, Seismic analysis

    SciTech Connect

    Wells, J.E.; Lappa, D.A.; Bernreuter, D.L.; Chen, J.C.; Chuang, T.Y.; Johnson, J.J.; Campbell, R.D.; Hashimoto, P.S.; Maslenikov, O.R.; Tiong, L.W.; Ravindra, M.K.; Kincaid, R.H.; Sues, R.H.; Putcha, C.S.

    1993-11-01

    This report describes the methodology used and the results obtained from the application of a simplified seismic risk methodology to the LaSalle County Nuclear Generating Station Unit 2. This study is part of the Level I analysis being performed by the Risk Methods Integration and Evaluation Program (RMIEP). Using the RMIEP-developed event and fault trees, the analysis resulted in a seismically induced core damage frequency point estimate of 6.0E-7/yr. This result, combined with the component importance analysis, indicated that system failures were dominated by random events. The dominant components included diesel generator failures (failure to swing, failure to start, failure to run after started) and the condensate storage tank.

  8. Application of a risk analysis method to different technologies for producing a monoclonal antibody employed in hepatitis B vaccine manufacturing.

    PubMed

    Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams

    2012-03-01

    The CB.Hep-1 monoclonal antibody (mAb) is used in manufacturing a recombinant hepatitis B vaccine that is included in a worldwide vaccination program against hepatitis B disease. This mAb is used as an immunoligand in one of the most efficient steps of the active pharmaceutical ingredient purification process. In this regard, Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. As main conclusions, FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The severity and occurrence analysis of the risks showed that very high-severity risks made up 31.0-38.7% of all risks, and that the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Number ranked the technologies in descending order as follows: transgenic plants (2636), ascites (2577), transgenic animals (2046), and hollow fiber bioreactors (1654), which also corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.

  9. Estimating erosion risk on forest lands using improved methods of discriminant analysis

    Treesearch

    J. Lewis; R. M. Rice

    1990-01-01

    A population of 638 timber harvest areas in northwestern California was sampled for data related to the occurrence of critical amounts of erosion (>153 m³ within 0.81 ha). Separate analyses were done for forest roads and logged areas. Linear discriminant functions were computed in each analysis to contrast site conditions at critical plots with randomly selected...

  10. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report focuses on new methods based on the following models: synergetic, physical, and computational. It concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered with respect to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for modeling and predicting the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering some deterministic and stochastic factors. The last approach concerns results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment which provides alignment of temperature fields in a product for the formation of equal strength of welded joints is

  11. Designing a Software for Flood Risk Assessment Based on Multi Criteria Decision Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters, not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that, of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul was flooded; the insurance sector received around 1,200 claims notices during that period, and insurance companies had to pay a total of $40 million in claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million for damages. To solve these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, software was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology, and land use, all extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest-neighbor classification by image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi Criteria Decision Analysis (MCDA) part of the software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Also, daily flood data were collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input in the InfoDif part of the software. Obtained results were verified using ground truth data and it has been clearly
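The SCS-CN step mentioned above is a standard USDA rainfall-runoff method. A minimal sketch follows; the curve number and rainfall depth in the usage comment are hypothetical, not values from the study:

```python
def scs_cn_runoff(p_mm: float, cn: float) -> float:
    """Direct runoff depth Q (mm) from rainfall P (mm) via the SCS Curve Number method.

    S = 25400/CN - 254 (mm, metric form); initial abstraction Ia = 0.2*S;
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, else Q = 0.
    """
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical event: 60 mm of rain on a basin with CN = 85.
print(round(scs_cn_runoff(60.0, 85.0), 1), "mm of direct runoff")
```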

  12. Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA’s Risk Management Analysis Tool and Associated Methods

    DTIC Science & Technology

    2012-01-01

    Phillips, Crosscope, and Geddes, 2008; Body and Marston, 2011); and other terrorism risk models under development at the Department of Homeland... Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA's Risk Management Analysis Tool and Associated Methods

  13. North energy system risk analysis features

    NASA Astrophysics Data System (ADS)

    Prokhorov, V. A.; Prokhorov, D. V.

    2015-12-01

    A risk indicator analysis for a decentralized energy system of the North was carried out. Based on an analysis of the damage caused by accidents at energy systems, the structure of risk indicators was selected, and a method for determining North energy system risk was proposed.

  14. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
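
    A minimal sketch of the distance computation behind MRISK, including the uncorrelated-dimensions special case noted in the abstract; the consequence scores below are invented for illustration.

```python
import numpy as np

# Hypothetical 2-D consequence scores for five risks (e.g. cost impact,
# schedule impact); the numbers are invented for illustration.
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2], [2.5, 5.0]])
mu = X.mean(axis=0)
S = np.cov(X, rowvar=False)          # covariance between consequence dimensions
S_inv = np.linalg.inv(S)

def mrisk_score(x):
    """Mahalanobis distance of a risk's consequence vector from the mean."""
    d = x - mu
    return float(np.sqrt(d @ S_inv @ d))

# With a *diagonal* covariance, the same formula collapses to Euclidean
# distance normalized by the per-dimension variance, as the abstract notes.
d = X[0] - mu
maha_diag = np.sqrt(d @ np.linalg.inv(np.diag(np.diag(S))) @ d)
eucl_norm = np.sqrt(np.sum(d**2 / np.diag(S)))
```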

  15. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, where China's key source water area, Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at the lower risk level (i.e., level II), two at the moderate risk level (i.e., level III), one at the higher risk level (i.e., level IV) and three at the highest risk level (i.e., level V). Also, risks of industrial discharge were higher than those of the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
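
    The entropy weight method used above to set indicator weights can be sketched as follows; the indicator matrix (four pollution sources by three risk indicators) is hypothetical.

```python
import numpy as np

# Hypothetical indicator matrix: 4 pollution sources x 3 risk indicators
# (e.g., discharge volume, toxicity, distance to reservoir); values invented.
X = np.array([
    [2.0, 0.5, 1.0],
    [4.0, 0.2, 3.0],
    [1.0, 0.9, 2.0],
    [3.0, 0.4, 4.0],
])

def entropy_weights(X):
    """Entropy weight method: indicators with more dispersion get more weight."""
    P = X / X.sum(axis=0)                     # column-normalized proportions
    k = 1.0 / np.log(X.shape[0])
    # 0 * log(0) is treated as 0 by convention
    e = -k * np.where(P > 0, P * np.log(np.where(P > 0, P, 1.0)), 0.0).sum(axis=0)
    d = 1.0 - e                               # degree of diversification
    return d / d.sum()

w = entropy_weights(X)
```

    The weights sum to one and feed directly into a weighted aggregation of the indicator scores for each source.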

  16. [Groundwater pollution risk mapping method].

    PubMed

    Shen, Li-na; Li, Guang-he

    2010-04-01

    Existing methods for groundwater vulnerability assessment do not take contamination source elements into account, and systematic, effective techniques and parameter systems for groundwater pollution risk mapping are presently lacking. In this paper, by analyzing the structure of the groundwater system and the characteristics of contaminant sources, and coupling intrinsic groundwater vulnerability with contaminant sources, integrated multi-index models were developed to evaluate the risk sources of groundwater contamination and to produce a groundwater pollution risk map. The models were applied to a large-scale karst groundwater source in northern China as a case study. The results indicated that overlaying the vulnerability assessment with pollution risk sources could effectively identify the high-risk regions of groundwater pollution, and the method may provide necessary support for the supervision of groundwater pollution.

  17. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  18. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-08-26

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.

  19. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation

  20. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  1. Bivariate hydrologic risk analysis based on a coupled entropy-copula method for the Xiangxi River in the Three Gorges Reservoir area, China

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Huang, K.; Li, Y. P.; Kong, X. M.

    2016-07-01

    In this study, a bivariate hydrologic risk framework is proposed based on a coupled entropy-copula method. In the proposed risk analysis framework, bivariate flood frequency would be analyzed for different flood variable pairs (i.e., flood peak-volume, flood peak-duration, flood volume-duration). The marginal distributions of flood peak, volume, and duration are quantified through both parametric (i.e., gamma, general extreme value (GEV), and lognormal distributions) and nonparametric (i.e., entropy) approaches. The joint probabilities of flood peak-volume, peak-duration, and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period to reflect the interactive effects of flood variables on the final hydrologic risk values. The proposed method is applied to the risk analysis for the Xiangxi River in the Three Gorges Reservoir area, China. The results indicate the entropy method performs best in quantifying the distribution of flood duration. Bivariate hydrologic risk would then be generated to characterize the impacts of flood volume and duration on the occurrence of a flood. The results suggest that the bivariate risk for flood peak-volume would not decrease significantly for the flood volume less than 1000 m3/s. Moreover, a flood in the Xiangxi River may last at least 5 days without significant decrease of the bivariate risk for flood peak-duration.
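
    The joint-return-period logic above can be illustrated with a Gumbel-Hougaard copula, a common choice for flood peak-volume pairs. The abstract does not name the copula family it fits, so the family, the dependence parameter, and the marginal probabilities below are all assumptions for illustration.

```python
import numpy as np

def gumbel_copula(u, v, theta=2.0):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence."""
    return np.exp(-(((-np.log(u))**theta + (-np.log(v))**theta)**(1.0/theta)))

def joint_return_periods(u, v, theta=2.0, mu=1.0):
    """'OR' and 'AND' joint return periods (mu = mean interarrival time, years)."""
    c = gumbel_copula(u, v, theta)
    t_or = mu / (1.0 - c)               # peak OR volume exceeds its threshold
    t_and = mu / (1.0 - u - v + c)      # peak AND volume exceed their thresholds
    return t_or, t_and

# Marginal non-exceedance probabilities for, e.g., the 50-year peak and volume.
u = v = 1.0 - 1.0 / 50.0
t_or, t_and = joint_return_periods(u, v)
```

    As expected, the "OR" return period is shorter than either marginal return period and the "AND" return period is longer, which is exactly the interactive effect the bivariate risk framework captures.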

  2. DWPF risk analysis summary

    SciTech Connect

    Shedrow, C.B.

    1990-10-01

    This document contains selected risk analysis data from Chapter 9 (Safety Analysis) of the Defense Waste Processing Facility Safety Analysis Report (DWPF SAR) and draft Addendum 1 to the Waste Tank Farms SAR. Although these data may be revised prior to finalization of the draft SAR and the draft addendum, they are presently the best available information and were therefore used in preparing the risk analysis portion of the DWPF Environmental Analysis (DWPF EA). This information has been extracted from those draft documents and approved under separate cover so that it can be used as reference material for the DWPF EA when it is placed in the public reading rooms. 9 refs., 4 tabs.

  3. A multifactorial analysis of obesity as CVD risk factor: Use of neural network based methods in a nutrigenetics context

    PubMed Central

    2010-01-01

    Background Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. Results PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulted predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy equal of 61.46% in the 3-CV testing sets. Conclusions The ANN

  4. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis.

  5. Initial Decision and Risk Analysis

    SciTech Connect

    Engel, David W.

    2012-02-29

    Decision and Risk Analysis capabilities will be developed for industry consideration and possible adoption within Year 1. These tools will provide a methodology for merging qualitative ranking of technology maturity and acknowledged risk contributors with quantitative metrics that drive investment decision processes. Methods and tools will be initially introduced as applications to the A650.1 case study, but modular spreadsheets and analysis routines will be offered to industry collaborators as soon as possible to stimulate user feedback and co-development opportunities.

  6. Risk analysis and meat hygiene.

    PubMed

    Hathaway, S C

    1993-12-01

    Meat hygiene consists of three major activities: post-mortem inspection; monitoring and surveillance for chemical hazards; and maintenance of good hygienic practice throughout all stages between slaughter and consumption of meat. Risk analysis is an applied science of increasing importance to these activities in the following areas: facilitating the distribution of pre-harvest, harvest and post-harvest inspection resources, proportional to the likelihood of public health and animal health hazards; establishing internationally-harmonized standards and specifications which are consistent and science-based; and improving the safety and wholesomeness of meat and meat products in local and international trade. Risk analysis, in one form or another, is well developed with respect to establishing standards and specifications for chemical hazards; methods for risk analysis of post-mortem meat inspection programmes are beginning to emerge. However, risk analysis of microbiological hazards in meat and meat products presents particular difficulties. All areas of application currently suffer from a lack of international agreement on risk assessment and risk management methodology.

  7. Qualitative methods for assessing risk

    SciTech Connect

    Mahn, J.A.; Hannaman, G.W.; Kryska, P.

    1995-03-01

    The purpose of this document is to describe a qualitative risk assessment process that supplements the requirements of DOE/AL 5481.1B. Although facility managers have a choice of assessing risk either quantitatively or qualitatively, trade-offs are involved in making the most appropriate choice for a given application. The results that can be obtained from a quantitative risk assessment are significantly more robust than those derived from a qualitative approach. However, the advantages of quantitative risk assessment are achieved at a greater expenditure of money, time and convenience. This document provides the elements of a framework for performing a much less costly qualitative risk assessment, while retaining the best attributes of quantitative methods. The approach discussed herein will: (1) provide facility managers with the tools to prepare consistent, site-wide assessments, and (2) aid the reviewers who may be tasked to evaluate the assessments. Added cost/benefit measures of the qualitative methodology include the identification of mechanisms for optimally allocating resources to minimize risk in an expeditious and fiscally responsible manner.

  8. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways, in urban and intercity locations alike, are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur, first from the highway design stage through construction and operation, and later during highway maintenance and repair, have to be taken into consideration. Assessment of the risks posed by such adverse situations is very important in terms of project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highway. In addition, the assets at risk and the impacts of the events must be examined and rated in their own right. Based on these activities, improvements against natural hazards and disasters will be identified using the Failure Mode and Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method to identify failure modes and their effects, prioritize them by failure rate and effect, and find the most economic and effective solution. Besides guiding the measures taken for the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in risk assessments. The most important of these dangers can be listed as follows: • Natural disasters 1. Meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.). 2. Geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.) • Human-originated disasters 1. Transport accidents (traffic accidents) originating from road surface defects (icing
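
    The FMEA prioritization mentioned above reduces to scoring each failure mode for severity, occurrence and detection, then sorting by Risk Priority Number. The highway failure modes and 1-10 ratings below are illustrative, not values from the study.

```python
# Minimal FMEA sketch for a highway-hazard setting; all entries hypothetical.
failure_modes = [
    # (description, severity, occurrence, detection)
    ("Flooding of roadway section",  9, 4, 3),
    ("Icing on road surface",        7, 6, 4),
    ("Landslide onto carriageway",  10, 2, 5),
    ("Pavement subsidence",          6, 3, 6),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher RPN = address first."""
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
```

    Note how a frequent, hard-to-detect hazard (icing) can outrank a more severe but rarer one (landslide), which is the kind of re-prioritization FMEA is meant to surface.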

  9. [Comparative analysis of two different methods for risk assessment of groundwater pollution: a case study in Beijing plain].

    PubMed

    Wang, Hong-na; He, Jiang-tao; Ma, Wen-jie; Xu, Zhen

    2015-01-01

    Groundwater contamination risk assessment is important for groundwater contamination prevention planning and for estimating groundwater exploitation potential. Recently, the UN assessment system and the WP assessment system have become focuses of international research. In both systems, the assessment framework and indices are drawn from five aspects: intrinsic vulnerability, aquifer storage, groundwater quality, groundwater resource protection zone and contamination load. But the five factors are combined in different ways. In order to expound the difference between the UN and WP assessment systems, and explain its main causes, both systems were applied to the Beijing Plain, China, and the resulting risk maps were compared. The results showed that both groundwater contamination risk assessment maps accorded with actual conditions and were similar in their spatial distribution trends. However, there were quite significant differences in the coverage areas at the same risk level. The comparison also revealed that, during system construction, the structural hierarchy, the overlay principles and the classification method may all affect the resulting groundwater contamination risk map. Both the UN and WP assessment systems were suitable for groundwater contamination risk assessment of the plain; however, their emphases differ.

  10. Comparison of nonlinear methods symbolic dynamics, detrended fluctuation, and Poincaré plot analysis in risk stratification in patients with dilated cardiomyopathy

    NASA Astrophysics Data System (ADS)

    Voss, Andreas; Schroeder, Rico; Truebner, Sandra; Goernig, Matthias; Figulla, Hans Reiner; Schirdewan, Alexander

    2007-03-01

    Dilated cardiomyopathy (DCM) has an incidence of about 20/100 000 new cases per annum and accounts for nearly 10 000 deaths per year in the United States. Approximately 36% of patients with dilated cardiomyopathy (DCM) suffer from cardiac death within five years after diagnosis. Currently applied methods for an early risk prediction in DCM patients are rather insufficient. The objective of this study was to investigate the suitability of short-term nonlinear methods symbolic dynamics (STSD), detrended fluctuation (DFA), and Poincaré plot analysis (PPA) for risk stratification in these patients. From 91 DCM patients and 30 healthy subjects (REF), heart rate and blood pressure variability (HRV, BPV), STSD, DFA, and PPA were analyzed. Measures from BPV analysis, DFA, and PPA revealed highly significant differences (p<0.0011) discriminating REF and DCM. For risk stratification in DCM patients, four parameters from BPV analysis, STSD, and PPA revealed significant differences between low and high risk (maximum sensitivity: 90%, specificity: 90%). These results suggest that STSD and PPA are useful nonlinear methods for enhanced risk stratification in DCM patients.
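
    The Poincaré plot descriptors (PPA) used in the study are easy to compute from a beat-to-beat interval series; the RR series below is synthetic, generated only to exercise the formulas.

```python
import numpy as np

def poincare_sd(rr):
    """SD1/SD2 descriptors of the Poincaré plot of successive RR intervals.
    SD1 captures short-term variability, SD2 long-term variability."""
    x, y = rr[:-1], rr[1:]                       # plot RR(n) against RR(n+1)
    sd1 = np.sqrt(np.var(y - x, ddof=1) / 2.0)   # spread across identity line
    sd2 = np.sqrt(np.var(y + x, ddof=1) / 2.0)   # spread along identity line
    return sd1, sd2

# Synthetic RR-interval series in milliseconds (slow drift + beat-to-beat noise).
rng = np.random.default_rng(0)
rr = 800.0 + np.cumsum(rng.normal(0, 5, 300)) * 0.1 + rng.normal(0, 20, 300)
sd1, sd2 = poincare_sd(rr)
```

    A useful sanity check is the identity SD1² + SD2² = var(RRₙ) + var(RRₙ₊₁), which follows directly from the definitions.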

  11. [METHODS AND TECHNOLOGIES OF HEALTH RISK ANALYSIS IN THE SYSTEM OF THE STATE MANAGEMENT UNDER ASSURANCE OF THE SANITATION AND EPIDEMIOLOGICAL WELFARE OF POPULATION].

    PubMed

    Zaĭtseva, N V; Popova, A Iu; Maĭ, I V; Shur, P Z

    2015-01-01

    The methodology of health risk analysis is, at the present stage of development of Russian society, in demand at all levels of state management. In conjunction with methods of mathematical modeling, spatial-temporal analysis and economic tools, risk assessment makes it possible, when analyzing a situation, to determine the level of safety of the population, workers and consumers, and to select priority resources and threat factors as points at which to apply effort. At the planning stage, risk assessment is a basis for establishing the most effective measures for minimizing hazards and dangers. At the realization stage, the methodology allows the efficiency of measures to be estimated; at the control and supervision phase, it permits priorities to be selected so that efforts are concentrated on the objects posing the maximal health risk to the population. Risk assessments, including elements of evolutionary modeling, are incorporated in the system of state hygienic regulation, the formation of the evidence base for harm to health, and the organization of control and supervisory activities. This allows the domestic legal framework to be harmonized with international legal requirements and ultimately enhances the credibility of Russian data on the safety of the environment, products and services. Further tasks include extending the methodology of health risk analysis to assuring the sanitary and epidemiological well-being and health of workers; developing the informational and analytical base, in particular establishing "exposure-response" models for different types and levels of exposure and risk contingents; enhancing the accuracy of exposure estimates; and improving the economic aspects of health risk analysis and the forecasting of measures aimed at mitigating the losses associated with the negative impact of manifold factors on the health of citizens.

  12. Risk Analysis Virtual ENvironment

    SciTech Connect

    2014-02-10

    RAVEN has 3 major functionalities: 1. Provides a Graphical User Interface for the pre- and post-processing of the RELAP-7 input and output. 2. Provides the capability to model nuclear power plant control logic for the RELAP-7 code and dynamic control of the accident scenario evolution. This capability is based on a software structure that realizes a direct connection between the RELAP-7 solver engine (MOOSE) and a Python environment where the variables describing the plant status are accessible in a scripting environment. RAVEN supports the generation of probabilistic scenario control by supplying a wide range of probability and cumulative distribution functions and their inverse functions. 3. Provides a general environment to perform probabilistic risk analysis for RELAP-7, RELAP-5 and any generic MOOSE-based application. The probabilistic analysis is performed by sampling the input space of the coupled code parameters and is enhanced by modern artificial intelligence algorithms that accelerate the identification of the areas of major risk (in the input parameter space). This environment also provides graphical visualization capability to analyze the outcomes. Among other approaches, the classical Monte Carlo and Latin hypercube sampling algorithms are available. To accelerate the convergence of the sampling methodologies, support vector machines, Bayesian regression, and stochastic collocation polynomial chaos are implemented. The same methodologies described here could be used to solve optimization and uncertainty propagation problems within the RAVEN framework.

  13. Different survival analysis methods for measuring long-term outcomes of Indigenous and non-Indigenous Australian cancer patients in the presence and absence of competing risks.

    PubMed

    He, Vincent Y F; Condon, John R; Baade, Peter D; Zhang, Xiaohua; Zhao, Yuejen

    2017-01-17

    Net survival is the most common measure of cancer prognosis and has been used to study differentials in cancer survival between ethnic or racial population subgroups. However, net survival ignores competing risks of death and so provides incomplete prognostic information for cancer patients and for comparisons of survival between populations with different all-cause mortality. Another prognosis measure, "crude probability of death", which takes competing risks of death into account, overcomes this limitation. Like net survival, it can be calculated using either life tables (the Cronin-Feuer method) or cause of death data (the Fine-Gray method). The aim of this study is two-fold: (1) to compare the multivariable results produced by different survival analysis methods; and (2) to compare the Cronin-Feuer method with the Fine-Gray method in estimating the cancer and non-cancer death probability of both Indigenous and non-Indigenous cancer patients and the Indigenous cancer disparities. Cancer survival was investigated for 9,595 people (18.5% Indigenous) diagnosed with cancer in the Northern Territory of Australia between 1991 and 2009. The Cox proportional hazards model along with Poisson and Fine-Gray regression were used in the multivariable analysis. The crude probabilities of cancer and non-cancer death were estimated in two ways: first, using cause of death data with the Fine-Gray method, and second, using life tables with the Cronin-Feuer method. Multivariable regression using relative survival, cause-specific survival, and competing risk analysis produced similar results. In the presence of competing risks, the Cronin-Feuer method produced similar results to Fine-Gray in the estimation of cancer death probabilities (higher Indigenous cancer death probabilities for all cancers) and non-cancer death probabilities (higher Indigenous non-cancer death probabilities for all cancers except lung cancer and head and neck cancers). Cronin-Feuer estimated much lower
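
    The crude probability of death discussed above has a simple nonparametric counterpart, the Aalen-Johansen cumulative incidence estimator; the toy cohort below is invented for illustration and is not the study's data or the authors' implementation:

```python
def cumulative_incidence(records, cause):
    """Aalen-Johansen estimate of the crude probability of death from one
    cause in the presence of competing risks.

    records: (time, event) pairs with event 0 = censored,
             1 = cancer death, 2 = non-cancer (competing) death
    Returns [(time, CIF)] evaluated at each distinct time.
    """
    records = sorted(records)
    surv = 1.0   # overall (all-cause) survival just before t
    cif = 0.0    # cumulative incidence of the cause of interest
    at_risk = len(records)
    out = []
    for t in sorted({t for t, _ in records}):
        d_any = sum(1 for ti, ei in records if ti == t and ei > 0)
        d_cause = sum(1 for ti, ei in records if ti == t and ei == cause)
        removed = sum(1 for ti, _ in records if ti == t)
        if at_risk > 0:
            cif += surv * d_cause / at_risk     # deaths from this cause
            surv *= 1.0 - d_any / at_risk       # deaths from any cause
        at_risk -= removed
        out.append((t, cif))
    return out

# Toy cohort: two cancer deaths, one competing death, one censored subject.
curve = cumulative_incidence([(1, 1), (2, 2), (3, 1), (4, 0)], cause=1)
```

    Unlike one minus net survival, the cause-specific incidences plus the overall survival sum to one, which is what makes the measure interpretable under competing risks.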

  14. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

    Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using a spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, will show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study
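
    The kernel density step mentioned above can be sketched with a hand-rolled Gaussian kernel; the sightings, coordinates, and bandwidth below are all invented, and a production GIS would use its own kernel density tools:

```python
import math

def kernel_density(points, query, bandwidth=50.0):
    """Gaussian kernel density of 2-D risk sightings at a query location
    (a minimal stand-in for a GIS kernel density surface)."""
    x0, y0 = query
    total = 0.0
    for x, y in points:
        d2 = (x - x0) ** 2 + (y - y0) ** 2
        total += math.exp(-d2 / (2 * bandwidth ** 2))
    return total / (len(points) * 2 * math.pi * bandwidth ** 2)

# Invented standing-water sightings clustered near a school at (0, 0),
# plus one scattered observation elsewhere (coordinates in metres).
sightings = [(5, -10), (12, 8), (-7, 3), (20, -15), (-3, 22), (400, 380)]
density_near = kernel_density(sightings, (0, 0))
density_far = kernel_density(sightings, (300, 300))
# The estimated risk surface peaks around the cluster of sightings.
```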

  15. Reinterpretation of the results of a pooled analysis of dietary carotenoid intake and breast cancer risk by using the interval collapsing method.

    PubMed

    Bae, Jong-Myon

    2016-01-01

    A pooled analysis of 18 prospective cohort studies, reported in 2012, evaluated carotenoid intakes and breast cancer risk defined by estrogen receptor (ER) and progesterone receptor (PR) status using the "highest versus lowest intake" method (HLM). By applying the interval collapsing method (ICM) to maximize the use of the estimated information, we reevaluated the results of the previous analysis in order to reinterpret the inferences made. In order to estimate the summary effect size (sES) and its 95% confidence interval (CI), meta-analyses with the random-effects model were conducted for adjusted relative risks and their 95% CIs from the second to the fifth interval according to five kinds of carotenoids and ER/PR status. The following new findings were identified: α-Carotene and β-cryptoxanthin have protective effects on overall breast cancer. All five kinds of carotenoids showed protective effects on ER- breast cancer. β-Carotene level increased the risk of ER+ or ER+/PR+ breast cancer. α-Carotene, β-carotene, lutein/zeaxanthin, and lycopene showed a protective effect on ER-/PR+ or ER-/PR- breast cancer. The new findings support the hypothesis that carotenoids showing anticancer effects through antioxidant function might reduce the risk of ER- breast cancer. Based on these findings, the modification of the effects of α-carotene, β-carotene, and β-cryptoxanthin should be evaluated according to PR and ER status.
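
    The random-effects pooling mentioned above is commonly done with the DerSimonian-Laird estimator over log relative risks; the three studies below are invented, and the CI-to-variance conversion assumes approximate normality on the log scale:

```python
import math

def random_effects_pool(rrs, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of adjusted relative risks.

    Variances of log(RR) are recovered from the reported 95% CI widths."""
    y = [math.log(r) for r in rrs]
    v = [((math.log(hi) - math.log(lo)) / (2 * 1.96)) ** 2
         for lo, hi in zip(ci_los, ci_his)]
    w = [1.0 / vi for vi in v]
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)   # between-study variance
    w_star = [1.0 / (vi + tau2) for vi in v]
    mu = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(mu), math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se)

# Three invented studies reporting adjusted RRs with 95% CIs:
sES, lo, hi = random_effects_pool([0.85, 0.92, 0.78],
                                  [0.70, 0.80, 0.60],
                                  [1.03, 1.06, 1.01])
```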

  16. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti.

    PubMed

    Curtis, Andrew; Blackburn, Jason K; Widmer, Jocelyn M; Morris, J Glenn

    2013-04-15

    Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using a spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach with three health risks mapped at the street scale for a coastal community in Haiti. Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, will show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these "hotspots". Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study sites through time to track spatio

  17. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2017-08-11

    Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
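
    A minimal sketch of the probability-times-loss idea, assuming a quadratic (Taguchi-style) loss on bottom-hole pressure deviation; the paper's actual loss functions, probability models, and numbers are not reproduced here:

```python
def consequence_loss(bhp, target, k=2.5e-3):
    """Quadratic (Taguchi-style) loss: consequence grows with the squared
    deviation of bottom-hole pressure (bhp) from its safe target value."""
    return k * (bhp - target) ** 2

def operational_risk(p_blowout, bhp, target):
    """Risk = probability of the top event x consequence loss."""
    return p_blowout * consequence_loss(bhp, target)

# As drilling progresses, bottom-hole pressure drifts from target while the
# kick/blowout probability rises, so the estimated risk grows step by step.
risks = [operational_risk(p, bhp, target=5000)
         for p, bhp in [(1e-5, 5000), (5e-5, 4800), (2e-4, 4500)]]
```

    Re-evaluating such a risk measure at each drilling step is what makes the analysis "dynamic" rather than a one-off design-stage number.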

  18. The Risks of Multimedia Methods

    PubMed Central

    Lenert, Leslie A.; Ziegler, Jennifer; Lee, Tina; Unfred, Christine; Mahmoud, Ramy

    2000-01-01

    an actor's gender may influence the willingness of viewers to gamble to gain health benefits (or risk attitude). Conclusions: Educators and researchers considering the use of multimedia methods for decision support need to be aware of the potential for the race and gender of patients or actors to influence preferences for health states and thus, potentially, medical decisions. PMID:10730601

  19. Assessing environmental risks for high intensity agriculture using the material flow analysis method--a case study of the Dongting Lake basin in South Central China.

    PubMed

    Yin, Guanyi; Liu, Liming; Yuan, Chengcheng

    2015-07-01

    This study primarily examined the assessment of environmental risk in high intensity agricultural areas. The Dongting Lake basin, one of the major grain-producing areas in China, was taken as a case study. Using data obtained from 1989 to 2012, we applied Material Flow Analysis (MFA) to show the material consumption, pollutant output and production storage in the agricultural-environmental system and assessed the environmental risk index on the basis of the MFA results. The results predict that the environmental quality of the Dongting Lake area will remain unsatisfactory for the foreseeable future. The direct material input (DMI) declined by 13.9%, the domestic processed output (DPO) increased by 28.21%, the intensity of material consumption (IMC) decreased by 36.7%, the intensity of material discharge (IMD) increased by 10%, the material productivity (MP) increased by 27 times, the environmental efficiency (EE) increased by 15.31 times, and the material storage (PAS) increased by 0.23%. The DMI and DPO were higher in rural places on the edge of cities, whereas the risk from urban agriculture has risen because DMI and DPO are increasing faster in cities than in the counties. The composite environmental risk index increased from 0.33 to 0.96, indicating that the total environmental risk changed gradually but seriously during the 24 years assessed. The driving factors that affect environmental risk in high intensity agriculture can be divided into five classes: social, economic, human, natural and disruptive incidents. This study discussed a number of effective measures for protecting the environment while ensuring food production yields. Additional research in other areas and certain improvements of this method in future studies may be necessary to develop a more effective method of managing and controlling agricultural-environmental interactions.
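
    The MFA indicators named above are ratios of aggregate flows; a sketch with invented figures (not the Dongting Lake data), using the usual per-capita and per-output definitions:

```python
# Illustrative material-flow indicators (figures invented; definitions
# follow common MFA practice, not necessarily this study's exact ones).
dmi = 120.0           # direct material input: fertiliser, feed, seed, fuel
dpo = 35.0            # domestic processed output: pollutants discharged
output_value = 80.0   # value of agricultural production
population = 6.0      # population of the study area

imc = dmi / population      # intensity of material consumption
imd = dpo / population      # intensity of material discharge
mp = output_value / dmi     # material productivity
ee = output_value / dpo     # environmental efficiency
```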

  20. Modified risk graph method using fuzzy rule-based approach.

    PubMed

    Nait-Said, R; Zidani, F; Ouzraoui, N

    2009-05-30

    The risk graph is one of the most popular methods used to determine the safety integrity level (SIL) for safety instrumented functions. However, the conventional risk graph, as described in the IEC 61508 standard, is subjective and suffers from an interpretation problem of risk parameters. Thus, it can lead to inconsistent outcomes that may result in conservative SILs. To overcome this difficulty, a modified risk graph using a fuzzy rule-based system is proposed. This novel version of the risk graph uses fuzzy scales to assess risk parameters, and calibration may be made by varying risk parameter values. Furthermore, the outcomes, which are numerical values of the risk reduction factor (the inverse of the probability of failure on demand), can be compared directly with those given by quantitative and semi-quantitative methods such as fault tree analysis (FTA), quantitative risk assessment (QRA) and layers of protection analysis (LOPA).
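
    The fuzzy rule-based idea can be sketched with triangular membership functions and weighted-average defuzzification to a risk reduction factor; the scales, rules, and values below are illustrative, not the calibrated parameters of the paper or of IEC 61508:

```python
def tri(x, a, b, c):
    """Triangular membership on [a, c], peaking at b (assumes a < b < c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Illustrative fuzzy scales: consequence and frequency scores run 0-10.
consequence = {"minor": (0, 1, 4), "serious": (2, 5, 8), "extensive": (6, 9, 10)}
frequency = {"rare": (0, 1, 4), "frequent": (3, 7, 10)}

# Each rule maps a (consequence, frequency) pair to a crisp risk reduction
# factor (RRF), the inverse of the tolerable probability of failure on demand.
rules = [("minor", "rare", 10), ("minor", "frequent", 100),
         ("serious", "rare", 100), ("serious", "frequent", 1000),
         ("extensive", "rare", 1000), ("extensive", "frequent", 10000)]

def risk_reduction_factor(c_val, f_val):
    """Sugeno-style inference: fire every rule with min() as fuzzy AND,
    then defuzzify as the firing-strength-weighted average of rule RRFs."""
    num = den = 0.0
    for c_label, f_label, rrf in rules:
        strength = min(tri(c_val, *consequence[c_label]),
                       tri(f_val, *frequency[f_label]))
        num += strength * rrf
        den += strength
    return num / den if den else 0.0
```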

  1. Compendium on Risk Analysis Techniques

    DTIC Science & Technology

    The evolution of risk analysis in the materiel acquisition process is traced from the Secretary Packard memorandum to current AMC guidance. Risk analysis is defined and many of the existing techniques are described in light of this definition and their specific role in program management and

  2. [The cascade scheme as a methodical platform for analysis of health risks in space flight and partially and fully analog conditions].

    PubMed

    Ushakov, I B; Poliakov, A V; Usov, V M

    2011-01-01

    Space anthropoecology, a subsection of human ecology, studies various aspects of physiological, psychological, social and professional adaptation to the extreme environment of space flight and to human life and work in partially and fully analogous conditions on Earth. Both space flight and simulated extreme conditions are marked by high human safety standards and a substantial analytic base that supports on-line analysis of the torrent of information. Evaluation of, and management response to, emerging undesired developments, aimed at curbing their impact on the functioning of the crew-vehicle-environment system and on human health, draws on the complete wealth of knowledge about risks to human health and performance. Spacecrew safety issues are tackled by experts of many specialties, which emphasizes the importance of integrated methodical approaches to risk estimation and mitigation, the setting up of barriers to adverse trends in human physiology and psychology in challenging conditions, and the minimization of delayed effects on professional longevity and of disorders in behavioral reactions.

  3. Relative risk regression analysis of epidemiologic data.

    PubMed

    Prentice, R L

    1985-11-01

    Relative risk regression methods are described. These methods provide a unified approach to a range of data analysis problems in environmental risk assessment and in the study of disease risk factors more generally. Relative risk regression methods are most readily viewed as an outgrowth of Cox's regression and life model. They can also be viewed as a regression generalization of more classical epidemiologic procedures, such as that due to Mantel and Haenszel. In the context of an epidemiologic cohort study, relative risk regression methods extend conventional survival data methods and binary response (e.g., logistic) regression models by taking explicit account of the time to disease occurrence while allowing arbitrary baseline disease rates, general censorship, and time-varying risk factors. This latter feature is particularly relevant to many environmental risk assessment problems wherein one wishes to relate disease rates at a particular point in time to aspects of a preceding risk factor history. Relative risk regression methods also adapt readily to time-matched case-control studies and to certain less standard designs. The uses of relative risk regression methods are illustrated and the state of development of these procedures is discussed. It is argued that asymptotic partial likelihood estimation techniques are now well developed in the important special case in which the disease rates of interest have interpretations as counting process intensity functions. Estimation of relative risk processes corresponding to disease rates falling outside this class has, however, received limited attention. The general area of relative risk regression model criticism has, as yet, not been thoroughly studied, though a number of statistical groups are studying such features as tests of fit, residuals, diagnostics and graphical procedures. Most such studies have been restricted to exponential form relative risks, as have simulation studies of relative risk estimation
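
    The partial-likelihood machinery underlying relative risk regression can be sketched for a single covariate; this toy uses Breslow handling of ties, invented data, and a crude grid search in place of Newton-Raphson:

```python
import math

def neg_log_partial_likelihood(beta, times, events, x):
    """Negative Cox log partial likelihood for one covariate (Breslow ties).

    The baseline disease rate cancels out: each event contributes the log
    ratio of its own risk score exp(beta * x) to the sum of risk scores of
    all subjects still at risk at the event time."""
    ll = 0.0
    for ti, di, xi in zip(times, events, x):
        if di:  # an observed event, not a censoring
            risk_sum = sum(math.exp(beta * xj)
                           for tj, xj in zip(times, x) if tj >= ti)
            ll += beta * xi - math.log(risk_sum)
    return -ll

# Toy cohort in which larger x loosely means earlier disease onset.
times = [2, 3, 5, 8, 12, 13]
events = [1, 1, 1, 1, 0, 1]        # 0 = censored
x = [1.0, 0.0, 1.0, 0.5, 1.0, 0.0]
beta_hat = min((b / 100 for b in range(-300, 301)),
               key=lambda b: neg_log_partial_likelihood(b, times, events, x))
```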

  4. Risk/benefit analysis

    SciTech Connect

    Crouch, E.A.C.; Wilson, R.

    1982-01-01

    The Reagan administration is intent on rolling back regulations it considers unwise to give new life to American industry, but regulations were instituted to protect individuals against long-term hazards. The authors believe these hazards must be assessed before a regulation is modified, suspended, or implemented. They point out the problems inherent in defining, perceiving, and estimating risk. Throughout, they combine theoretical discussions with actual case studies covering the risk associated with nuclear power plants, saccharin use, mass chest radiography, and others. They believe that risk assessment should be distinct from decision making, with the risk assessor supplying clear and objective information about hazards and the probability of damage as well as pointing out the uncertainties to policy makers. 149 references, 29 figures, 8 tables.

  5. Risk analysis and management

    NASA Technical Reports Server (NTRS)

    Smith, H. E.

    1990-01-01

    Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.

  6. Object Oriented Risk Analysis Workshop

    NASA Astrophysics Data System (ADS)

    Pons, M. Güell I.; Jaboyedoff, M.

    2009-04-01

    In the framework of the RISET Project (Interfaculty Network of Support to Education and Technology), an educational tool for introducing risk analysis has been developed. This workshop carries a group of students, in a role-play game, through a step-by-step process of risk identification and quantification. The aim is to assess risk in a characteristic alpine village regarding natural hazards (rockfall, snow avalanche, flooding…), oriented to affected objects such as buildings and infrastructures. The workshop contains the following steps: 1. Planning of the study and definition of stakeholders; 2. Hazard identification; 3. Risk analysis; 4. Risk assessment; 5. Proposition of mitigation measures; 6. Risk management and cost-benefit analysis. During the process, information related to past events and useful concepts is provided in order to bring up discussion and decision making. The risk matrix and other graphical tools give a visual representation of the risk level and help to prioritize countermeasures. At the end of the workshop, there is the possibility to compare the results between different groups and print out a summarizing report. This approach provides a rapid and comprehensible risk evaluation. The workshop is accessible from the internet and will be used for educational purposes at bachelor and master level as well as by external persons dealing with risk analysis.
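
    The risk matrix used in such exercises typically reduces to mapping likelihood-consequence scores to qualitative levels and ranking hazards by score; the hazards, scores, and cut-offs below are invented:

```python
def risk_level(likelihood, consequence):
    """Map 1-5 likelihood and consequence scores to a qualitative level
    (the cut-offs are illustrative, not from the workshop material)."""
    score = likelihood * consequence
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"

# Hypothetical hazards for an alpine village: (likelihood, consequence).
hazards = {"rockfall": (4, 3), "snow avalanche": (2, 5), "flooding": (3, 2)}
ranked = sorted(hazards, key=lambda h: hazards[h][0] * hazards[h][1],
                reverse=True)          # order in which to fund countermeasures
levels = {name: risk_level(*hazards[name]) for name in ranked}
```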

  7. Fire Risk Implications in Safety Analysis Reports

    SciTech Connect

    Blanchard, A.

    1999-03-31

    Fire can be a significant risk for facilities that store and handle radiological material. Such events must be evaluated as part of a comprehensive safety analysis. SRS has been developing methods to evaluate radiological fire risk in such facilities. These methods, combined with the analysis techniques proposed by DOE-STD-3009-94, have provided a better understanding of how fire risks in nuclear facilities should be managed. To ensure that these new insights are properly disseminated, the DOE Savannah River Office and the Defense Nuclear Facility Safety Board (DNFSB) requested that Westinghouse Savannah River Company (WSRC) prepare this paper.

  8. Budget Risk & Prioritization Analysis Tool

    SciTech Connect

    Carlos Castillo, Jerel Nelson

    2010-12-31

    BRPAtool performs the following:
    • Assists managers in making solid decisions on what scope/activities to reduce and/or eliminate to meet constrained budgets, based on multiple risk factors
    • Enables analysis of different budget scenarios
    • Can analyze risks and cost for each activity based on technical, quantifiable risk criteria and management-determined risks
    • Real-time analysis
    • Enables managers to determine the multipliers and where funding is best applied
    • Promotes solid budget defense

  9. The Components of Microbiological Risk Analysis.

    PubMed

    Liuzzo, Gaetano; Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-02-03

    The paper describes the process of risk analysis from a food safety perspective. The steps of risk analysis, defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication), are analysed. The different components of risk assessment, risk management and risk communication are further described.

  10. The Components of Microbiological Risk Analysis

    PubMed Central

    Bentley, Stefano; Giacometti, Federica; Serraino, Andrea

    2015-01-01

    The paper describes the process of risk analysis from a food safety perspective. The steps of risk analysis, defined as a process consisting of three interconnected components (risk assessment, risk management, and risk communication), are analysed. The different components of risk assessment, risk management and risk communication are further described. PMID:27800384

  11. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  12. At-Risk Youngsters: Methods That Work.

    ERIC Educational Resources Information Center

    Obiakor, Festus E.

    This paper examines problems faced by youngsters at risk of failure in school, and discusses methods for helping them succeed in educational programs. At-risk youngsters confront many problems in school and in mainstream society, and are frequently misidentified, misdiagnosed, and improperly instructed. Problems faced by at-risk youngsters…

  13. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  14. Confronting deep uncertainties in risk analysis.

    PubMed

    Cox, Louis Anthony

    2012-10-01

    How can risk analysts help to improve policy and decision making when the correct probabilistic relation between alternative acts and their probable consequences is unknown? This practical challenge of risk management with model uncertainty arises in problems from preparing for climate change to managing emerging diseases to operating complex and hazardous facilities safely. We review constructive methods for robust and adaptive risk analysis under deep uncertainty. These methods are not yet as familiar to many risk analysts as older statistical and model-based methods, such as the paradigm of identifying a single "best-fitting" model and performing sensitivity analyses for its conclusions. They provide genuine breakthroughs for improving predictions and decisions when the correct model is highly uncertain. We demonstrate their potential by summarizing a variety of practical risk management applications.
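
    One of the robust-decision ideas reviewed in this literature, minimax regret, can be sketched in a few lines; the policies, scenarios, and losses below are invented:

```python
# With no defensible probabilities over scenarios (deep uncertainty),
# compare policies by worst-case regret rather than expected loss.
losses = {                      # policy -> loss under scenarios S0, S1, S2
    "do nothing":    [0, 120, 400],
    "partial hedge": [30, 60, 150],
    "full hedge":    [80, 80, 90],
}
n_scenarios = len(next(iter(losses.values())))
# Best achievable loss in each scenario, over all policies:
best = [min(row[s] for row in losses.values()) for s in range(n_scenarios)]
# A policy's regret is its worst shortfall from that scenario-wise best.
regret = {policy: max(l - b for l, b in zip(row, best))
          for policy, row in losses.items()}
robust_choice = min(regret, key=regret.get)   # minimax-regret policy
```

    Note how the hedging policy wins here even though "do nothing" is best in the most benign scenario; that robustness-over-optimality trade is the point of such methods.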

  15. Psychiatrists' follow-up of identified metabolic risk: a mixed-method analysis of outcomes and influences on practice

    PubMed Central

    Patterson, Sue; Freshwater, Kathleen; Goulter, Nicole; Ewing, Julie; Leamon, Boyd; Choudhary, Anand; Moudgil, Vikas; Emmerson, Brett

    2016-01-01

    Aims and method To describe and explain psychiatrists' responses to metabolic abnormalities identified during screening. We carried out an audit of clinical records to assess rates of monitoring and follow-up practice. Semi-structured interviews with 36 psychiatrists followed by descriptive and thematic analyses were conducted. Results Metabolic abnormalities were identified in 76% of eligible patients screened. Follow-up, recorded for 59%, was variable but more likely with four or more abnormalities. Psychiatrists endorse guidelines but ambivalence about responsibility, professional norms, resource constraints and skills deficits as well as patient factors influences practice. Therapeutic optimism and desire to be a ‘good doctor’ supported comprehensive follow-up. Clinical implications Psychiatrists are willing to attend to physical healthcare, and obstacles to recommended practice are surmountable. Psychiatrists seek consensus among stakeholders about responsibilities and a systemic approach addressing the social determinants of health inequities. Understanding patients' expectations is critical to promoting best practice. PMID:27752343

  16. Risk-stratified imputation in survival analysis.

    PubMed

    Kennedy, Richard E; Adragni, Kofi P; Tiwari, Hemant K; Voeks, Jenifer H; Brott, Thomas G; Howard, George

    2013-08-01

    Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment. Risk-stratified imputation is intended for categorical covariates and may be sensitive to the width of the matching window if continuous covariates are used. The use of the risk
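
    The stratified sampling step of the algorithm can be sketched as drawing replacement event times from completed subjects in the same (treatment, covariate) stratum; the field names and toy cohort below are invented, not the authors' implementation:

```python
import random

def risk_stratified_impute(subjects, seed=1):
    """Impute outcomes for censored subjects from completed subjects in the
    same (treatment, stratum) pool, sketching risk-stratified imputation.

    subjects: dicts with keys "time", "event" (1 = observed, 0 = censored),
    "treatment" and "stratum"; a record stays censored when no completed
    subject with a later event time shares its stratum."""
    rng = random.Random(seed)
    completed = {}
    for s in subjects:
        if s["event"]:
            key = (s["treatment"], s["stratum"])
            completed.setdefault(key, []).append(s["time"])
    result = []
    for s in subjects:
        if not s["event"]:
            pool = [t for t in completed.get((s["treatment"], s["stratum"]), [])
                    if t >= s["time"]]   # replacements must still be at risk
            if pool:
                s = dict(s, time=rng.choice(pool), event=1)
        result.append(s)
    return result

cohort = [
    {"time": 5, "event": 1, "treatment": "A", "stratum": "high"},
    {"time": 7, "event": 1, "treatment": "A", "stratum": "high"},
    {"time": 2, "event": 0, "treatment": "A", "stratum": "high"},
    {"time": 4, "event": 0, "treatment": "B", "stratum": "high"},
]
imputed = risk_stratified_impute(cohort)
```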

  17. Failure analysis of pinch-torsion tests as a thermal runaway risk evaluation method of Li-Ion Cells

    SciTech Connect

    Xia, Yuzhi; Li, Dr. Tianlei; Ren, Prof. Fei; Gao, Yanfei; Wang, Hsin

    2014-01-01

    Recently a pinch-torsion test was developed for safety testing of Li-ion batteries (Ren et al., J. Power Source, 2013). It has been demonstrated that this test can generate small internal short-circuit spots in the separator in a controllable and repeatable manner. In the current research, the failure mechanism is examined by numerical simulations and comparisons to experimental observations. Finite element models are developed to evaluate the deformation of the separators under both pure pinch and pinch-torsion loading conditions. It is discovered that the addition of the torsion component significantly increases the maximum principal strain, which is believed to induce the internal short circuit. In addition, the applied load in the pinch-torsion test is significantly less than in the pure pinch test, thus dramatically improving the applicability of this method to ultra-thick batteries which otherwise require heavy loads in excess of machine capability. It is further found that separator failure is achieved in the early stage of torsion (within a few degrees of rotation). The effect of the coefficient of friction on the maximum principal strain is also examined.

  18. Adversarial risk analysis for counterterrorism modeling.

    PubMed

    Rios, Jesus; Rios Insua, David

    2012-05-01

    Recent large-scale terrorist attacks have raised interest in models for resource allocation against terrorist threats. The unifying theme in this area is the need to develop methods for the analysis of allocation decisions when risks stem from the intentional actions of intelligent adversaries. Most approaches to these problems have a game-theoretic flavor although there are also several interesting decision-analytic-based proposals. One of them is the recently introduced framework for adversarial risk analysis, which deals with decision-making problems that involve intelligent opponents and uncertain outcomes. We explore how adversarial risk analysis addresses some standard counterterrorism models: simultaneous defend-attack models, sequential defend-attack-defend models, and sequential defend-attack models with private information. For each model, we first assess critically what would be a typical game-theoretic approach and then provide the corresponding solution proposed by the adversarial risk analysis framework, emphasizing how to coherently assess a predictive probability model of the adversary's actions, in a context in which we aim at supporting decisions of a defender versus an attacker. This illustrates the application of adversarial risk analysis to basic counterterrorism models that may be used as basic building blocks for more complex risk analysis of counterterrorism problems.
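
    The defender's side of the simultaneous defend-attack model reduces to maximising expected utility against a predictive distribution over attacker actions; the actions, probabilities, and utilities below are invented, not the paper's:

```python
# Adversarial risk analysis sketch: rather than solving for a game
# equilibrium, the defender assesses a predictive distribution over the
# attacker's actions and maximises expected utility against it.
p_attack = {"site A": 0.5, "site B": 0.3, "no attack": 0.2}
utility = {   # defender's utility for each (defence, attack) pair
    "harden A":    {"site A": -10, "site B": -60, "no attack": -5},
    "harden B":    {"site A": -70, "site B": -15, "no attack": -5},
    "harden both": {"site A": -20, "site B": -25, "no attack": -12},
}
expected = {d: sum(p_attack[a] * u for a, u in row.items())
            for d, row in utility.items()}
best_defence = max(expected, key=expected.get)
```

    The hard part in practice, as the abstract stresses, is assessing `p_attack` coherently from a model of the adversary's own decision problem rather than assuming it.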

  19. Sexual and injection-related risks in Puerto Rican-born injection drug users living in New York City: A mixed-methods analysis

    PubMed Central

    2011-01-01

    Background These data were collected as part of the National HIV Behavioral Surveillance (NHBS) study. NHBS is a cross-sectional study to investigate HIV behavioral risks among core risk groups in 21 U.S. cities with the highest HIV/AIDS prevalence. This analysis examines data from the NHBS data collection cycle with IDU conducted in New York City in 2009. We explored how the recency of migration from Puerto Rico (PR) to New York City (NYC) impacts both syringe sharing and unprotected sex among injection drug users (IDU) currently living in NYC. Methods We used a mixed-methods approach to examine differences in risk between US-born IDU, PR IDU who migrated to NYC more than three years ago (non-recent migrants), and PR IDU who migrated in the last three years (recent migrants). Respondent-driven sampling (RDS) was used to recruit the sample (n = 514). In addition, qualitative individual and group interviews with recent PR migrants (n = 12) and community experts (n = 2) allowed for an in-depth exploration of the IDU migration process and the material and cultural factors behind continued risk behaviors in NYC. Results In multiple logistic regression controlling for confounding factors, recent migrants were significantly more likely to report unprotected sexual intercourse with casual or exchange partners (adjusted odds ratio [AOR]: 2.81; 95% confidence intervals [CI]: 1.37-5.76) and receptive syringe sharing (AOR = 2.44; 95% CI: 1.20-4.97) in the past year, compared to US-born IDU. HIV and HCV seroprevalence were highest among non-recent migrants. Qualitative results showed that risky injection practices are partly based on cultural norms acquired while injecting drugs in Puerto Rico. These same results also illustrate how homelessness influences risky sexual practices. Conclusions Poor material conditions (especially homelessness) may be key in triggering risky sexual practices. Cultural norms (ingrained while using drugs in PR) around injection drug use are

  20. Recursive Partitioning Method on Competing Risk Outcomes

    PubMed Central

    Xu, Wei; Che, Jiahua; Kong, Qin

    2016-01-01

    In some cancer clinical studies, researchers are interested in exploring the risk factors associated with competing risk outcomes such as recurrence-free survival. We develop a novel recursive partitioning framework on competing risk data for both prognostic and predictive model construction. We define specific splitting rules, a pruning algorithm, and a final tree selection algorithm for the competing risk tree models. The methodology is flexible in that it can incorporate both the semiparametric Cox proportional hazards model and a parametric competing risk model. Both prognostic and predictive tree models are developed to adjust for potential confounding factors. Extensive simulations show that our methods have well-controlled type I error and robust power performance. Finally, we apply both the Cox proportional hazards model and a flexible parametric model for prognostic tree development in a retrospective clinical study of oropharyngeal cancer patients. PMID:27486300
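
    A competing-risk tree's splitting rules typically compare event-specific cumulative incidence between candidate daughter nodes. As a hedged illustration of that building block (not the authors' actual algorithm), the nonparametric Aalen-Johansen estimate of a cumulative incidence function can be sketched as:

```python
# Sketch: Aalen-Johansen cumulative incidence function (CIF) under competing
# risks -- the event-specific quantity a competing-risk tree could compare
# between daughter nodes. Pure-Python illustration, not the paper's code.

def cumulative_incidence(times, events, cause, t):
    """CIF for `cause` at time t. events: 0 = censored, 1, 2, ... = event types."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0      # all-cause survival just before the current event time
    cif = 0.0
    at_risk = n
    i = 0
    while i < n and data[i][0] <= t:
        ti = data[i][0]
        d_cause = d_any = censored = 0
        while i < n and data[i][0] == ti:   # handle ties at time ti
            if data[i][1] == cause:
                d_cause += 1
            if data[i][1] != 0:
                d_any += 1
            else:
                censored += 1
            i += 1
        cif += surv * d_cause / at_risk     # mass assigned to this cause at ti
        surv *= 1 - d_any / at_risk         # all-cause Kaplan-Meier update
        at_risk -= d_any + censored
    return cif
```

    With `times = [1, 2, 3, 4]` and `events = [1, 2, 0, 1]`, the cause-1 and cause-2 incidences at t = 4 are 0.75 and 0.25, summing to 1.0 once every subject has failed.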

  1. A Course of Instruction in Risk Analysis.

    DTIC Science & Technology

    Contents: Risk analysis course schedule; Problems and perspectives - an introduction to a course of instruction in risk analysis; Analytical...techniques; Overview of the process of risk analysis; Network analysis; RISCA: USALMC's network analyzer program; Case studies in risk analysis; Armored...vehicle launched bridge (AVLB); Micom-air defense missile warhead/fuze subsystem performance; Helicopter performance risk analysis; High performance fuze

  2. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  3. Risk-driven security testing using risk analysis with threat modeling approach.

    PubMed

    Palanivel, Maragathavalli; Selvadurai, Kanmani

    2014-01-01

    Security testing is a process of determining risks present in the system states and protecting them from vulnerabilities. However, security testing does not give due importance to threat modeling and risk analysis simultaneously, which affects the confidentiality and integrity of the system. Risk analysis includes identification, evaluation, and assessment of risks. Threat modeling identifies the threats associated with the system. Risk-driven security testing uses risk analysis results in test case identification, selection, and assessment to prioritize and optimize the testing process. The threat modeling approach STRIDE is generally used to identify both technical and non-technical threats present in the system. Thus, a security testing mechanism based on risk analysis results using the STRIDE approach has been proposed for identifying high-risk states. Risk metrics considered for testing include risk impact, risk possibility, and risk threshold. The risk threshold value is directly proportional to risk impact and risk possibility. Risk-driven security testing results in a reduced test suite, which in turn reduces test case selection time. Risk analysis optimizes the test case selection and execution process. For experimentation, the system models LMS, ATM, OBS, OSS, and MTRS are considered. The performance of the proposed system is analyzed using Test Suite Reduction Rate (TSRR) and FSM coverage. TSRR varies from 13.16 to 21.43%, whereas FSM coverage is achieved up to 91.49%. The results show that the proposed method, combining risk analysis with threat modeling, identifies states with high risks and improves testing efficiency.
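
    The prioritization idea, a risk score proportional to impact and possibility with a threshold cutoff, can be sketched as follows; the state names and metric values are hypothetical, not taken from the paper:

```python
# Illustrative sketch of risk-driven test-suite reduction: score each model
# state by impact x possibility and keep only states at or above a threshold,
# highest risk first. Names and numbers are hypothetical.

def risk_score(impact, possibility):
    return impact * possibility

def reduce_test_suite(states, threshold):
    """states: list of (name, impact, possibility) with values in [0, 1].
    Returns the names of high-risk states, highest risk first."""
    scored = [(risk_score(i, p), name) for name, i, p in states]
    return [name for score, name in sorted(scored, reverse=True)
            if score >= threshold]
```

    For example, with states `[("login", 0.9, 0.8), ("logout", 0.2, 0.1), ("transfer", 0.7, 0.9)]` and threshold 0.5, only `login` (0.72) and `transfer` (0.63) survive the reduction.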

  4. Association analysis of the dopamine D2 receptor gene in Tourette's syndrome using the haplotype relative risk method

    SciTech Connect

    Noethen, M.M.; Cichon, S.; Propping, P.

    1994-09-15

    Comings et al. have recently reported a highly significant association between Tourette's syndrome (TS) and a restriction fragment length polymorphism (RFLP) of the dopamine D2 receptor gene (DRD2) locus. The A1 allele of the DRD2 Taq I RFLP was present in 45% of the Tourette patients compared with 25% of controls. We tried to replicate this finding by using the haplotype relative risk (HRR) method for association analysis. This method overcomes a major problem of conventional case-control studies, where undetected ethnic differences between patients and controls may result in a false-positive finding, by using parental alleles not inherited by the proband as control alleles. Sixty-one nuclear families encompassing an affected child and parents were typed for the DRD2 Taq I polymorphism. No significant differences in DRD2 A1 allele frequency were observed between TS probands, sub-populations of probands classified according to tic severity, or parental control alleles. Our data do not support the hypothesis that the DRD2 locus may act as a modifying gene in the expression of the disorder in TS probands. 40 refs., 1 tab.
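
    The core of the HRR design, using parental alleles not transmitted to the affected child as internal controls, can be sketched in a simplified carrier-frequency form (illustrative only; this is not the study's actual statistical procedure):

```python
# Hedged sketch of the haplotype relative risk (HRR) idea: for each family
# trio, the alleles transmitted to the affected child act as "cases" and the
# non-transmitted parental alleles act as matched internal "controls".
# Simplified to a carrier-frequency risk ratio for illustration.

def hrr(trios, risk_allele):
    """trios: list of (transmitted_alleles, non_transmitted_alleles) per
    family, each a list of two parental alleles.
    Returns (case_carrier_freq, control_carrier_freq, risk_ratio)."""
    t_carrier = nt_carrier = 0
    n = len(trios)
    for transmitted, non_transmitted in trios:
        t_carrier += risk_allele in transmitted
        nt_carrier += risk_allele in non_transmitted
    case_freq = t_carrier / n
    control_freq = nt_carrier / n
    return case_freq, control_freq, case_freq / control_freq
```

    A risk ratio near 1 over the sampled families, as reported in this study for the A1 allele, argues against an association.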

  5. Draft Waste Management Programmatic Environmental Impact Statement for managing treatment, storage, and disposal of radioactive and hazardous waste. Volume 3, Appendix A: Public response to revised NOI; Appendix B: Environmental restoration; Appendix C: Environmental impact analysis methods; Appendix D: Risk

    SciTech Connect

    1995-08-01

    Volume three contains appendices for the following: Public comments on DOE's proposed revisions to the scope of the waste management programmatic environmental impact statement; Environmental restoration sensitivity analysis; Environmental impacts analysis methods; and Waste management facility human health risk estimates.

  6. Comparison of Intake Gate Closure Methods at Lower Granite, Little Goose, Lower Monumental, and McNary Dams Using Risk-Based Analysis

    SciTech Connect

    Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.; Mara, Neil L.; Phan, Hahn K.; Bardy, David M.; Hollenbeck, Robert E.

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.

  8. Russian risk assessment methods and approaches

    SciTech Connect

    Dvorack, M.A.; Carlson, D.D.; Smith, R.E.

    1996-07-01

    One of the benefits resulting from the collapse of the Soviet Union is the increased dialogue currently taking place between American and Russian nuclear weapons scientists in various technical arenas. One of the arenas currently being investigated involves collaborative studies which illustrate how risk assessment is perceived and utilized in the Former Soviet Union (FSU). The collaborative studies indicate that, while similarities exist with respect to some methodologies, the assumptions and approaches in performing risk assessments were, and still are, somewhat different in the FSU than in the US. The purpose of this paper is to highlight the present knowledge of risk assessment methodologies and philosophies within the two largest nuclear weapons laboratories of the Former Soviet Union, Arzamas-16 and Chelyabinsk-70. Furthermore, this paper addresses the relative progress of new risk assessment methodologies, such as fuzzy logic, within the framework of current risk assessment methods at these two institutes.

  9. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based on the material behavior level, where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale, where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results demonstrate that the method is mature and can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantage in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.
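
    The propagation step described above can be illustrated with a minimal Monte Carlo sketch; the limit state, distributions, and numbers below are hypothetical, not those of the paper:

```python
# Minimal Monte Carlo sketch of uncertainty propagation: sample "primitive"
# variables (material strength, applied stress) from their scatter and count
# how often the limit state g = strength - stress is violated, giving a
# structural failure probability. All distributions/values are illustrative.
import random

def failure_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        strength = rng.gauss(400.0, 25.0)   # material strength, MPa
        stress = rng.gauss(300.0, 30.0)     # applied stress, MPa
        if stress >= strength:              # limit state violated
            failures += 1
    return failures / n_samples
```

    With these notional distributions the analytic failure probability is about 0.005; the sampled estimate converges to it as `n_samples` grows.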

  10. Intelligent adversary risk analysis: a bioterrorism risk management model.

    PubMed

    Parnell, Gregory S; Smith, Christopher M; Moxley, Frederick I

    2010-01-01

    The tragic events of 9/11 and the concerns about the potential for a terrorist or hostile state attack with weapons of mass destruction have led to an increased emphasis on risk analysis for homeland security. Uncertain hazards (natural and engineering) have been successfully analyzed using probabilistic risk analysis (PRA). Unlike uncertain hazards, terrorists and hostile states are intelligent adversaries who can observe our vulnerabilities and dynamically adapt their plans and actions to achieve their objectives. This article compares uncertain hazard risk analysis with intelligent adversary risk analysis, describes the intelligent adversary risk analysis challenges, and presents a probabilistic defender-attacker-defender model to evaluate the baseline risk and the potential risk reduction provided by defender investments. The model includes defender decisions prior to an attack; attacker decisions during the attack; defender actions after an attack; and the uncertainties of attack implementation, detection, and consequences. The risk management model is demonstrated with an illustrative bioterrorism problem with notional data.
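
    The defender-attacker-defender structure can be sketched as a tiny minimax computation; the defenses, agents, costs, and losses below are notional, echoing the article's own use of notional data:

```python
# Toy sketch of the defender-attacker-defender idea: the defender chooses an
# investment first; the intelligent adversary then picks the attack that
# maximizes expected consequence given that investment; the defender is scored
# on investment cost plus worst-case loss. All names and numbers are notional.

def optimal_defense(defenses, attacks, cost, loss):
    """defenses/attacks: option lists; cost: defense -> investment cost;
    loss: (defense, attack) -> expected consequence.
    Returns (best_defense, total_risk)."""
    best = None
    for d in defenses:
        worst = max(loss[(d, a)] for a in attacks)  # adversary adapts
        total = cost[d] + worst
        if best is None or total < best[1]:
            best = (d, total)
    return best
```

    Note how the adversary's `max` step distinguishes this from an uncertain-hazard PRA, where nature does not adapt to the defender's investment.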

  11. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    SciTech Connect

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  13. Acute and Chronic Risk Preceding Suicidal Crises Among Middle-Aged Men Without Known Mental Health and/or Substance Abuse Problems: An Exploratory Mixed-Methods Analysis.

    PubMed

    Schiff, Lara B; Holland, Kristin M; Stone, Deborah M; Logan, J; Marshall, Khiya J; Martell, Brandi; Bartholow, Brad

    2015-01-01

    Suicides among men aged 35-64 years increased by 27% between 1999 and 2013, yet little research exists to examine the nature of the suicide risk within this population. Many men do not seek help if they have mental health problems and suicides may occur in reaction to stressful circumstances. We examined the precipitating circumstances of 600 suicides without known mental health or substance abuse (MH/SA) problems and with a recent crisis. Whether these suicides occurred within the context of an acute crisis only or in the context of chronic circumstances was observed. Using data from the National Violent Death Reporting System and employing mixed-methods analysis, we examined the circumstances and context of a census of middle-aged male suicides (n = 600) in seven states between 2005 and 2010. Precipitating circumstances among this group involved intimate partner problems (IPP; 58.3%), criminal/legal problems (50.7%), job/financial problems (22.5%), and health problems (13.5%). Men with IPP and criminal/legal issues were more likely than men with health and/or job/financial issues to experience suicide in the context of an acute crisis only. Suicides occurring in reaction to an acute crisis only or in the context of acute and chronic circumstances lend themselves to opportunities for intervention. Further implications are discussed.

  14. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  15. [Study on the risk assessment method of regional groundwater pollution].

    PubMed

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, which included: regional groundwater specific vulnerability assessment, regional pollution source characteristics assessment, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with the multi-index comprehensive method, the risk was characterized with the Spatial Analysis tools of ArcMap, and a new method to evaluate regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. The vulnerability index of groundwater in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risk; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high and unevenly distributed, concentrated north of the Anjia-Xuejia-Zhenglu line, in the city center, and in the southeast, where human activities are more intense and pollution sources are dense.
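
    The coupling of the three sub-assessments by a multi-index comprehensive method can be sketched as a weighted aggregation, under the assumption of normalized sub-scores; the weights below are illustrative, not the paper's:

```python
# Hedged sketch of multi-index comprehensive coupling: combine three
# normalized sub-scores (specific vulnerability, pollution-source hazard,
# health risk of featured pollutants) into one regional risk index per
# spatial unit. Weights are illustrative assumptions.

def groundwater_risk(vulnerability, source_hazard, health_risk,
                     weights=(0.4, 0.3, 0.3)):
    scores = (vulnerability, source_hazard, health_risk)
    assert all(0.0 <= s <= 1.0 for s in scores), "sub-scores normalized to [0, 1]"
    return sum(w * s for w, s in zip(weights, scores))
```

    Computing this index cell by cell over a grid, then mapping it (e.g. with ArcMap spatial analysis), yields the kind of uneven regional risk surface the study describes.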

  16. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  18. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2017-09-07

    To solve real-life problems (such as those related to technology, health, security, or climate change) and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  19. A classification scheme for risk assessment methods.

    SciTech Connect

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects: level of detail and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses; those arrangements shift gradually as one moves through the table, each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method'. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows: in Section 2 we provide context for this report

  20. Nano risk analysis: advancing the science for nanomaterials risk management.

    PubMed

    Shatkin, Jo Anne; Abbott, Linda Carolyn; Bradley, Ann E; Canady, Richard Alan; Guidotti, Tee; Kulinowski, Kristen M; Löfstedt, Ragnar E; Louis, Garrick; MacDonell, Margaret; Maynard, Andrew D; Paoli, Greg; Sheremeta, Lorraine; Walker, Nigel; White, Ronald; Williams, Richard

    2010-11-01

    Scientists, activists, industry, and governments have raised concerns about health and environmental risks of nanoscale materials. The Society for Risk Analysis convened experts in September 2008 in Washington, DC to deliberate on issues relating to the unique attributes of nanoscale materials that raise novel concerns about health risks. This article reports on the overall themes and findings of the workshop, uncovering the underlying issues for each of these topics that become recurring themes. The attributes of nanoscale particles and other nanomaterials that present novel issues for risk analysis are evaluated in a risk analysis framework, identifying challenges and opportunities for risk analysts and others seeking to assess and manage the risks from emerging nanoscale materials and nanotechnologies. Workshop deliberations and recommendations for advancing the risk analysis and management of nanotechnologies are presented.

  1. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power law dependence on frequency over a selected frequency range such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure fluctuations in the intervals are used to assess risk of an adverse clinical event.
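
    The power-law fitting step lends itself to a short sketch: ordinary least squares on log power versus log frequency over the selected band, whose slope is the power-law exponent. The interbeat-interval power spectrum is assumed precomputed here; this is a generic illustration, not the patented implementation:

```python
# Sketch: fit S(f) ~ f**beta over a frequency band by linear regression in
# log-log space. `freqs`/`powers` are assumed to come from a precomputed
# power spectrum of the interbeat-interval series.
import math

def power_law_exponent(freqs, powers, f_lo=1e-4, f_hi=1e-2):
    pts = [(math.log10(f), math.log10(p))
           for f, p in zip(freqs, powers) if f_lo <= f <= f_hi]
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    # least-squares slope of log(power) on log(frequency)
    return (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))
```

    For a synthetic spectrum with `powers[i] = freqs[i] ** -1.5`, the fitted exponent recovers -1.5 exactly.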

  2. Risk stratification for arrhythmic death in an emergency department cohort: a new method of nonlinear PD2i analysis of the ECG

    PubMed Central

    Skinner, James E; Meyer, Michael; Dalsey, William C; Nester, Brian A; Ramalanjaona, George; O’Neil, Brian J; Mangione, Antoinette; Terregino, Carol; Moreyra, Abel; Weiss, Daniel N; Anchin, Jerry M; Geary, Una; Taggart, Pamela

    2008-01-01

    Heart rate variability (HRV) reflects both cardiac autonomic function and risk of sudden arrhythmic death (AD). Indices of HRV based on linear stochastic models are independent risk factors for AD in postmyocardial infarction (MI) cohorts. Indices based on nonlinear deterministic models have a higher sensitivity and specificity for predicting AD in retrospective data. A new nonlinear deterministic model, the automated Point Correlation Dimension (PD2i), was prospectively evaluated for prediction of AD. Patients were enrolled (N = 918) in 6 emergency departments (EDs) upon presentation with chest pain and being determined to be at risk of acute MI (AMI) >7%. Brief digital ECGs (>1000 heartbeats, ∼15 min) were recorded and automated PD2i results obtained. Out-of-hospital AD was determined by modified Hinkle-Thaler criteria. All-cause mortality at 1 year was 6.2%, with 3.5% being ADs. Of the AD fatalities, 34% were without previous history of MI or diagnosis of AMI. The PD2i prediction of AD had sensitivity = 96%, specificity = 85%, negative predictive value = 99%, and relative risk >24.2 (p ≤ 0.001). HRV analysis by the time-dependent nonlinear PD2i algorithm can accurately predict risk of AD in an ED cohort and may have both life-saving and resource-saving implications for individual risk assessment. PMID:19209249

  3. Flood hazard energy in urban areas: a new integrated method for flood risk analysis in synthesizing interactions with urban boundary layer

    NASA Astrophysics Data System (ADS)

    Park, S. Y.; Schmidt, A.

    2015-12-01

    Since urban physical characteristics (such as morphology and land-use/land-cover) are different from those of nature, altered interactions between the surface and atmosphere (especially the urban boundary layer, UBL) or surface and subsurface can affect the hydrologic behavior and hence the flood hazards. In this research we focus on three main aspects of the urban surface/atmosphere interactions that affect flood hazard: the urban heat island (UHI) effect, increased surface roughness, and accumulated aerosols. These factors, along with the uncertainties in quantifying them, make risk analysis intractable. In order to perform a risk analysis, the impact of these components needs to be mapped to a variable that can be mathematically described in a risk-analysis framework. We propose defining hazard energy as a surrogate for the combined effect of these three components. Perturbations that can change the hazard energy come from diverse sources in urban areas, and these somewhat disconnected factors can be combined through the energy concept to characterize the impacts of urban areas in risk assessment. This approach synthesizes across hydrological and hydraulic processes in the UBL, land surface, subsurface, and sewer network, scrutinizing energy exchange across places. It extends our understanding not only of the influence of cities on local climate in rural areas and at larger scales, but also of how cities and nature interact and affect each other.

  4. Medicare's risk-adjusted capitation method.

    PubMed

    Grimaldi, Paul L

    2002-01-01

    Since 1997, the method used to establish capitation rates for Medicare beneficiaries who are members of risk-bearing managed care plans has undergone several important developments, including the factoring of beneficiary health status into the rate-setting calculations. These changes were expected to increase the number of participating health plans, accelerate Medicare enrollment growth, and slice Medicare spending.

  5. Recent methods for assessing osteoporosis and fracture risk.

    PubMed

    Imai, Kazuhiro

    2014-01-01

    In the management and treatment of osteoporosis, the target is to assess fracture risk and the end-point is to prevent fractures. Traditionally, measurement of bone mineral density (BMD) by dual energy X-ray absorptiometry (DXA) has been the standard method for diagnosing osteoporosis, in addition to assessing fracture risk and therapeutic effects. Quantitative computed tomography (QCT) can quantify volumetric BMD, and cancellous bone can be measured independently of surrounding cortical bone and aortic calcification. Hip structure analysis (HSA) is a method using the DXA scan image and provides useful data for assessing hip fracture risk. Recently, new tools to assess osteoporosis and fracture risk have been developed. One of the recent advances has been the development of the FRAX (Fracture Risk Assessment Tool), which is helpful in conveying fracture risk to patients and providing treatment guidance to clinicians. Another advance is the finite element (FE) method based on data from computed tomography (CT), which is useful for assessing bone strength, fracture risk, and therapeutic effects on osteoporosis. In selecting the most appropriate drug for osteoporosis treatment, assessment by bone metabolic markers is an important factor. In this review, recent patents for assessing osteoporosis and fracture risk are discussed.

  6. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multi-factor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.
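    The abstract reports a failure risk of about 0.0001 obtained by probabilistic simulation. As a generic illustration of the underlying idea (not the authors' code; the load and strength distributions below are assumptions), a Monte Carlo failure-probability estimate looks like:

```python
import math
import random

def estimate_failure_probability(n_trials=100_000, seed=1):
    """Monte Carlo structural reliability: a trial fails when a randomly
    sampled load exceeds a randomly sampled strength. The distributions
    here are illustrative stand-ins, not the engine or laminate model."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_trials):
        strength = rng.gauss(100.0, 5.0)   # assumed capacity, arbitrary units
        load = rng.gauss(70.0, 10.0)       # assumed demand, arbitrary units
        if load >= strength:
            failures += 1
    return failures / n_trials

p_f = estimate_failure_probability()
```

    Resolving risks as small as 1e-4 by plain sampling requires many trials, which is why the paper's "updated simulation scheme" and similar variance-reduction methods matter in practice.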

  7. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is a multiscale, multifunctional and it is based on the most elemental level. A multifactor interaction model is used to describe the material properties which are subsequently evaluated probabilistically. The metallic structure is a two rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled by finite element. The solution method for structural responses is obtained by an updated simulation scheme. The results show that the risk for the two rotor engine is about 0.0001 and the composite built-up structure is also 0.0001.

  8. Modified risk evaluation method. Revision 1

    SciTech Connect

    Udell, C.J.; Tilden, J.A.; Toyooka, R.T.

    1993-08-01

    The purpose of this paper is to provide a structured and cost-oriented process to determine risks associated with nuclear material and other security interests. Financial loss is a continuing concern for US Department of Energy contractors. In this paper risk is equated with uncertainty of cost impacts to material assets or human resources. The concept provides a method for assessing the effectiveness of an integrated protection system, which includes operations, safety, emergency preparedness, and safeguards and security. The concept is suitable for application to sabotage evaluations. The protection of assets is based on risk associated with cost impacts to assets and the potential for undesirable events. This will allow managers to establish protection priorities in terms of the cost and the potential for the event, given the current level of protection.
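    The paper equates risk with uncertainty of cost impacts, so protection priorities can be ranked by expected cost impact. A minimal sketch under that reading; the assets and numbers are hypothetical:

```python
def rank_by_risk(assets):
    """Rank protection priorities by expected cost impact:
    risk = P(undesirable event) * cost impact, per the paper's framing."""
    return sorted(assets, key=lambda a: a["p_event"] * a["cost_impact"], reverse=True)

# Hypothetical assets for illustration only
assets = [
    {"name": "vault",   "p_event": 0.001, "cost_impact": 50_000_000},
    {"name": "archive", "p_event": 0.02,  "cost_impact": 1_000_000},
    {"name": "lab",     "p_event": 0.05,  "cost_impact": 200_000},
]
ranking = [a["name"] for a in rank_by_risk(assets)]
```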

  9. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail has become a basic requirement of daily life for correspondence between individuals. Accordingly, the messages, the e-mail servers and clients, and the correspondence exchanged between different people must have acceptable security for users to trust the technology. In the information age, many financial and non-financial transactions are conducted electronically and data are exchanged via the internet, so theft and manipulation of data can impose exorbitant costs in terms of integrity, finance, politics, economics, and culture. E-mail correspondence is no exception, and its security is very important. Our review found no existing method focused specifically on risk assessment of e-mail systems. We therefore examine assessment methods for other systems, along with their strengths and weaknesses, and then adapt Convery's method for network risk assessment to assess e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  10. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  11. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  12. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... sensitive personal information, the risk analysis must also contain operational recommendations for responding to the data breach. Each risk analysis, regardless of findings and operational...

  13. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used, and the resulting hazard levels are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
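    The 500-year return period used in the GAR analysis maps to exceedance probabilities via the standard Poisson-process relation; this is generic hazard arithmetic, not specific to the study:

```python
import math

def exceedance_probability(return_period_years, exposure_years):
    """Probability of at least one event during `exposure_years`, assuming
    a Poisson process with annual rate 1 / return_period_years."""
    rate = 1.0 / return_period_years
    return 1.0 - math.exp(-rate * exposure_years)

# Chance of at least one 500-year event over a 50-year planning horizon
p_50 = exceedance_probability(500, 50)
```

    Over a 50-year horizon a 500-year event still carries roughly a one-in-ten chance of occurring, which is why such long return periods remain relevant for planning.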

  14. Screening of risk from patient manual handling with MAPO method.

    PubMed

    Battevi, Natale; Menoni, Olga

    2012-01-01

    International standards outline the steps required for risk assessment, involving first hazard identification, then risk evaluation and finally, if necessary, a full risk assessment. To check the appropriateness of the MAPO approach to evaluating risk from manual patient handling, a cross-sectional study was carried out to test the relationship between this new risk assessment model and the occurrence of acute low back pain. After proper training, the MAPO screening method was applied in 31 wards of geriatric hospitals, covering 411 exposed subjects. At the same time, health data were collected on the occurrence of low back pain episodes during the last year, both in the exposed group and in an external reference group (n = 237). Risk and clinical assessment data were supervised and checked by the EPM research unit. Logistic analysis was used to evaluate the relationship between the risk index and acute low back pain. The relationship between acute low back pain episodes and levels of the MAPO screening index, investigated only among exposed subjects who reported working at least 30 hours per week (n = 178), showed clearly positive trends. The study results indicate that MAPO screening may be a useful tool to estimate the risk from manual patient handling.

  15. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), a moment when many people were not in their residences but in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. Particularly in the context of global (climate) change, risk analysis is a research field of increasing importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability have, however, generally lagged behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies of small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Accounting for the spatio-temporal dynamics of human vulnerability therefore aligns with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster: low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to high earthquake risk, and an earthquake can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map and only considers resident population from the census as a proxy for human exposure.
In the present work we map and analyze the spatio-temporal distribution of

  16. Barriers to uptake among high-risk individuals declining participation in lung cancer screening: a mixed methods analysis of the UK Lung Cancer Screening (UKLS) trial.

    PubMed

    Ali, Noor; Lifford, Kate J; Carter, Ben; McRonald, Fiona; Yadegarfar, Ghasem; Baldwin, David R; Weller, David; Hansell, David M; Duffy, Stephen W; Field, John K; Brain, Kate

    2015-07-14

    The current study aimed to identify the barriers to participation among high-risk individuals in the UK Lung Cancer Screening (UKLS) pilot trial. The UKLS pilot trial is a randomised controlled trial of low-dose CT (LDCT) screening that has recruited high-risk people using a population approach in the Cambridge and Liverpool areas. High-risk individuals aged 50-75 years were invited to participate in UKLS. Individuals were excluded if an LDCT scan had been performed within the last year, if they were unable to provide consent, or if LDCT screening could not be carried out due to coexisting comorbidities. Statistical associations between individual characteristics and UKLS uptake were examined using multivariable regression modelling. In those who completed a non-participation questionnaire (NPQ), thematic analysis of free-text data was undertaken to identify reasons for not taking part, with subsequent exploratory linkage of key themes to risk factors for non-uptake. Comparative data were available from 4061 high-risk individuals who consented to participate in the trial and 2756 who declined participation. Of those declining participation, 748 (27.1%) completed an NPQ. Factors associated with non-uptake included: female gender (OR=0.64, p<0.001), older age (OR=0.73, p<0.001), current smoking (OR=0.70, p<0.001), lower socioeconomic group (OR=0.56, p<0.001) and higher affective risk perception (OR=0.52, p<0.001). Among non-participants who provided a reason, two main themes emerged, reflecting practical and emotional barriers. Smokers were more likely to report emotional barriers to participation. A profile of risk factors for non-participation in lung screening has emerged, with underlying reasons largely relating to practical and emotional barriers. Strategies for engaging high-risk, hard-to-reach groups are critical for the equitable uptake of a potential future lung cancer screening programme. The UKLS trial was registered with the International Standard
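    The odds ratios reported above come from multivariable logistic regression, where the OR for a predictor is simply the exponential of its fitted coefficient. As a reminder of that relation (the coefficient below is back-calculated for illustration, not taken from the paper):

```python
import math

def odds_ratio(coefficient):
    """Odds ratio for a one-unit change in a logistic-regression predictor:
    OR = exp(beta). An OR below 1 indicates reduced odds of uptake."""
    return math.exp(coefficient)

# A coefficient of about -0.446 corresponds to an OR near 0.64,
# the kind of value reported for female gender in the abstract.
or_example = odds_ratio(-0.446)
```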

  17. General Risk Analysis Methodological Implications to Explosives Risk Management Systems,

    DTIC Science & Technology

    An investigation sponsored by the National Science Foundation has produced as one of its results a survey and evaluation of risk analysis methodologies...This paper presents some implications of the survey to risk analysis and decision making for explosives hazards such as may ultimately be

  18. Low-thrust mission risk analysis.

    NASA Technical Reports Server (NTRS)

    Yen, C. L.; Smith, D. B.

    1973-01-01

    A computerized multi-stage failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to Comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that the system component failure rate is the limiting factor in attaining high mission reliability. It is also shown, however, that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
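    A stripped-down sketch of the kind of multi-stage failure simulation the abstract describes; the stage durations and failure rate are invented for illustration, not the study's thruster data:

```python
import math
import random

def mission_success_probability(stage_durations, failure_rate,
                                n_trials=50_000, seed=7):
    """Simulate a mission as sequential burn stages; each stage survives
    with probability exp(-failure_rate * duration), i.e. a constant-rate
    (exponential) failure model. All numbers are illustrative."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_trials):
        ok = all(rng.random() < math.exp(-failure_rate * d)
                 for d in stage_durations)
        successes += ok
    return successes / n_trials

# Three hypothetical burn stages (durations in arbitrary time units)
p = mission_success_probability([100.0, 200.0, 50.0], failure_rate=0.0005)
```

    A real analysis, as the abstract notes, would also model retargeting after partial failures rather than treating any stage failure as mission loss.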

  19. Risk factors analysis of consecutive exotropia

    PubMed Central

    Gong, Qianwen; Wei, Hong; Zhou, Xu; Li, Ziyuan; Liu, Longqian

    2016-01-01

    To evaluate clinical factors associated with the onset of consecutive exotropia (XT) following esotropia surgery. Using a retrospective nested case-control design, we reviewed the medical records of 193 patients who had undergone initial esotropia surgery between 2008 and 2015 and had follow-up longer than 6 months. The probable risk factors were evaluated between groups 1 (consecutive XT) and 2 (non-consecutive exotropia). The Pearson chi-square test and Mann–Whitney U test were used for univariate analysis, and a conditional logistic regression model was applied to explore the potential risk factors of consecutive XT. Consecutive exotropia occurred in 23 (11.9%) of 193 patients. Patients who had undergone large bilateral medial rectus recession (BMR) (P = 0.017) had a high risk of developing consecutive XT. Oblique dysfunction (P = 0.001) and adduction limitation (P < 0.001) were associated with a high risk of consecutive XT, which was confirmed in the conditional logistic regression analysis. In addition, a large amount of BMR (6 mm or more) was associated with a higher incidence of adduction limitation (P = 0.045). The surgical methods and preoperative factors did not appear to influence the risk of developing consecutive XT (P > 0.05). The amount of surgery could be optimized to reduce the risk of consecutive XT. The presence of oblique overaction and postoperative adduction limitation may be associated with a high risk of consecutive XT, which may require close supervision or even earlier operative intervention. PMID:27977611

  20. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach to measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. Portfolio risk, on the other hand, is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility has been the major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
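    The extreme-risk measure f(4) is a conditional expectation over the lower tail of the return distribution. An empirical analogue can be sketched as follows; the sample returns and the tail-partition point are illustrative assumptions, not the article's data:

```python
def lower_tail_expectation(returns, alpha=0.05):
    """Empirical analogue of the PMRM's f(4): the mean return conditional
    on falling in the lower alpha-tail of the sample. The partition choice
    (alpha) is an assumption of this sketch."""
    ordered = sorted(returns)
    k = max(1, int(len(ordered) * alpha))   # number of tail observations
    tail = ordered[:k]
    return sum(tail) / len(tail)

# Illustrative monthly returns only
sample = [0.02, -0.01, 0.03, -0.15, 0.01, 0.00, -0.04, 0.02, 0.05, -0.08,
          0.01, 0.02, -0.02, 0.03, 0.04, -0.12, 0.00, 0.01, 0.02, -0.03]
f4 = lower_tail_expectation(sample, alpha=0.10)
```

    Unlike volatility, which averages over the whole distribution, this statistic responds only to the worst outcomes, which is exactly what makes it useful for the crash scenarios the article describes.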

  1. Loss Exposure and Risk Analysis Methodology (LERAM) Project Database Design.

    DTIC Science & Technology

    1996-06-01

    MISREPS) to more capably support system safety engineering concepts such as hazard analysis and risk management. As part of the Loss Exposure and Risk ... Analysis Methodology (LERAM) project, the research into the methods which we employ to report, track, and analyze hazards has resulted in a series of low

  2. [Risk sharing methods in middle income countries].

    PubMed

    Inotai, András; Kaló, Zoltán

    2012-01-01

    The pricing strategy of innovative medicines is based on the therapeutic value in the largest pharmaceutical markets. The cost-effectiveness of new medicines with value based ex-factory price is justifiable. Due to the international price referencing and parallel trade the ex-factory price corridor of new medicines has been narrowed in recent years. Middle income countries have less negotiation power to change the narrow drug pricing corridor, although their fair intention is to buy pharmaceuticals at lower price from their scarce public resources compared to higher income countries. Therefore the reimbursement of new medicines at prices of Western-European countries may not be justifiable in Central-Eastern European countries. Confidential pricing agreements (i.e. confidential price discounts, claw-back or rebate) in lower income countries of the European Union can alleviate this problem, as prices of new medicines can be adjusted to local purchasing power without influencing the published ex-factory price and so the accessibility of patients to these drugs in other countries. In order to control the drug budget payers tend to apply financial risk sharing agreements for new medicines in more and more countries to shift the consequences of potential overspending to pharmaceutical manufacturers. The major paradox of financial risk-sharing schemes is that increased mortality, poor persistence of patients, reduced access to healthcare providers, and no treatment reduce pharmaceutical spending. Consequently, payers have started to apply outcome based risk sharing agreements for new medicines recently to improve the quality of health care provision. Our paper aims to review and assess the published financial and outcome based risk sharing methods. Introduction of outcome based risk-sharing schemes can be a major advancement in the drug reimbursement strategy of payers in middle income countries. These schemes can help to reduce the medical uncertainty in coverage

  3. RAMS (Risk Analysis - Modular System) methodology

    SciTech Connect

    Stenner, R.D.; Strenge, D.L.; Buck, J.W.

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and "what if" questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  4. Economic Risk Analysis: Using Analytical and Monte Carlo Techniques.

    ERIC Educational Resources Information Center

    O'Donnell, Brendan R.; Hickner, Michael A.; Barna, Bruce A.

    2002-01-01

    Describes the development and instructional use of a Microsoft Excel spreadsheet template that facilitates analytical and Monte Carlo risk analysis of investment decisions. Discusses a variety of risk assessment methods followed by applications of the analytical and Monte Carlo methods. Uses a case study to illustrate use of the spreadsheet tool…
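    The spreadsheet exercise described can be reproduced in a few lines of code; the initial investment, discount rate, and cash-flow distributions below are assumptions for illustration, not the article's case study:

```python
import random

def npv(rate, cashflows):
    """Net present value of cashflows indexed from year 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def monte_carlo_npv(n_trials=20_000, seed=3):
    """Distribution of project NPV with uncertain annual revenue.
    All figures are illustrative stand-ins for a spreadsheet model."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_trials):
        revenue = [rng.gauss(30_000, 8_000) for _ in range(5)]  # assumed
        results.append(npv(0.10, [-100_000] + revenue))
    results.sort()
    p_loss = sum(r < 0 for r in results) / n_trials
    return results[n_trials // 2], p_loss   # median NPV, probability of loss

median_npv, p_loss = monte_carlo_npv()
```

    The analytical counterpart propagates means and variances through the NPV formula directly; comparing the two, as the article suggests, is a good classroom check on the simulation.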

  5. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  7. METHODS OF MAGNETOTELLURIC ANALYSIS

    DTIC Science & Technology

    Magnetotelluric prospecting is a method of geophysical exploration that makes use of the fluctuations in the natural electric and magnetic fields...function of the conductivity structure of the earth’s substrata. This report describes some new methods for analyzing and interpreting magnetotelluric

  8. A comparison of radiological risk assessment methods for environmental restoration

    SciTech Connect

    Dunning, D.E. Jr.; Peterson, J.M.

    1993-09-01

    Evaluation of risks to human health from exposure to ionizing radiation at radioactively contaminated sites is an integral part of the decision-making process for determining the need for remediation and selecting remedial actions that may be required. At sites regulated under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA), a target risk range of 10⁻⁴ to 10⁻⁶ incremental cancer incidence over a lifetime is specified by the US Environmental Protection Agency (EPA) as generally acceptable, based on the reasonable maximum exposure to any individual under current and future land use scenarios. Two primary methods currently being used in conducting radiological risk assessments at CERCLA sites are compared in this analysis. Under the first method, the radiation dose equivalent (i.e., Sv or rem) to the receptors of interest over the appropriate period of exposure is estimated and multiplied by a risk factor (cancer risk/Sv). Alternatively, incremental cancer risk can be estimated by combining the EPA's cancer slope factors (previously termed potency factors) for radionuclides with estimates of radionuclide intake by ingestion and inhalation, as well as radionuclide concentrations in soil that contribute to external dose. The comparison of the two methods has demonstrated that resulting estimates of lifetime incremental cancer risk under these different methods may differ significantly, even when all other exposure assumptions are held constant, with the magnitude of the discrepancy depending upon the dominant radionuclides and exposure pathways for the site. The basis for these discrepancies, the advantages and disadvantages of each method, and the significance of the discrepant results for environmental restoration decisions are presented.
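    The two methods compared reduce to different multiplications. Schematically (all coefficients below are placeholders chosen for illustration, not EPA's published slope factors or an endorsed dose-to-risk value):

```python
def risk_dose_method(dose_sv, risk_factor_per_sv=0.05):
    """Method 1: lifetime dose equivalent (Sv) times a cancer risk factor
    (risk per Sv). The 0.05/Sv figure is an illustrative nominal value."""
    return dose_sv * risk_factor_per_sv

def risk_slope_factor_method(intakes, slope_factors):
    """Method 2: sum over radionuclides of intake (pCi) times slope factor
    (lifetime risk per pCi). Values used below are placeholders only."""
    return sum(intakes[n] * slope_factors[n] for n in intakes)

r1 = risk_dose_method(0.002)   # e.g. a 2 mSv lifetime dose
r2 = risk_slope_factor_method({"Cs-137": 5e4, "Sr-90": 1e4},
                              {"Cs-137": 3e-11, "Sr-90": 5e-11})
```

    Because the two methods bundle dosimetry and risk coefficients differently, they need not agree for a given site, which is precisely the discrepancy the paper quantifies.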

  9. Improving causal inferences in risk analysis.

    PubMed

    Cox, Louis Anthony Tony

    2013-10-01

    Recent headlines and scientific articles projecting significant human health benefits from changes in exposures too often depend on unvalidated subjective expert judgments and modeling assumptions, especially about the causal interpretation of statistical associations. Some of these assessments are demonstrably biased toward false positives and inflated effects estimates. More objective, data-driven methods of causal analysis are available to risk analysts. These can help to reduce bias and increase the credibility and realism of health effects risk assessments and causal claims. For example, quasi-experimental designs and analysis allow alternative (noncausal) explanations for associations to be tested, and refuted if appropriate. Panel data studies examine empirical relations between changes in hypothesized causes and effects. Intervention and change-point analyses identify effects (e.g., significant changes in health effects time series) and estimate their sizes. Granger causality tests, conditional independence tests, and counterfactual causality models test whether a hypothesized cause helps to predict its presumed effects, and quantify exposure-specific contributions to response rates in differently exposed groups, even in the presence of confounders. Causal graph models let causal mechanistic hypotheses be tested and refined using biomarker data. These methods can potentially revolutionize the study of exposure-induced health effects, helping to overcome pervasive false-positive biases and move the health risk assessment scientific community toward more accurate assessments of the impacts of exposures and interventions on public health.
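    One of the methods named, Granger-style causality testing, asks whether a hypothesized cause improves prediction of its presumed effect beyond what the effect's own history provides. A self-contained sketch on simulated data (ordinary least squares via the normal equations; no formal significance test, just the residual-variance comparison at the method's core):

```python
import random

def ols_rss(X, y):
    """Residual sum of squares from ordinary least squares, solved via the
    normal equations with a tiny Gaussian-elimination step (adequate for a
    handful of predictors)."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                       # forward elimination
        for j in range(i + 1, k):
            f = A[j][i] / A[i][i]
            A[j] = [a - f * c for a, c in zip(A[j], A[i])]
            b[j] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):             # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return sum((yi - sum(bi * xi for bi, xi in zip(beta, row))) ** 2
               for row, yi in zip(X, y))

# Simulated data in which x truly drives y with a one-step lag
rng = random.Random(0)
x = [rng.gauss(0, 1) for _ in range(300)]
y = [0.0]
for t in range(1, 300):
    y.append(0.3 * y[t - 1] + 0.8 * x[t - 1] + rng.gauss(0, 0.5))

restricted = [[1.0, y[t - 1]] for t in range(1, 300)]      # y's own past only
full = [[1.0, y[t - 1], x[t - 1]] for t in range(1, 300)]  # plus lagged x
rss_restricted = ols_rss(restricted, y[1:])
rss_full = ols_rss(full, y[1:])
# x "Granger-causes" y when its lag clearly reduces residual variance
x_helps = rss_full < rss_restricted
```

    A real analysis would add an F-test on the RSS reduction and consider multiple lags, as packaged implementations do; the point here is only the structure of the comparison.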

  10. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
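    The components the author argues for reporting are simple proportions of up- and down-reclassified subjects computed separately within events and nonevents. A minimal sketch with illustrative data:

```python
def nri_components(old_risk, new_risk, event):
    """Event and nonevent components of the Net Reclassification Index.
    `event[i]` is True if subject i had the outcome; risks are the two
    models' predicted probabilities. Data below are illustrative only."""
    up = [n > o for o, n in zip(old_risk, new_risk)]
    down = [n < o for o, n in zip(old_risk, new_risk)]
    ev = [i for i, e in enumerate(event) if e]
    ne = [i for i, e in enumerate(event) if not e]
    nri_events = (sum(up[i] for i in ev) - sum(down[i] for i in ev)) / len(ev)
    nri_nonevents = (sum(down[i] for i in ne) - sum(up[i] for i in ne)) / len(ne)
    return nri_events, nri_nonevents   # the overall NRI is their sum

# Illustrative predicted risks from a baseline and an expanded model
old = [0.10, 0.20, 0.40, 0.30, 0.05, 0.50]
new = [0.30, 0.15, 0.60, 0.10, 0.02, 0.70]
evt = [True, True, True, False, False, False]
ev_comp, ne_comp = nri_components(old, new, evt)
```

    Reporting the two components separately, as the author recommends, shows whether apparent improvement comes from better ranking of events, of nonevents, or both.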

  11. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    PubMed Central

    MacDonell, Margaret M.; Haroun, Lynne A.; Teuschler, Linda K.; Rice, Glenn E.; Hertzberg, Richard C.; Butler, James P.; Chang, Young-Soo; Clark, Shanna L.; Johns, Alan P.; Perry, Camarie S.; Garcia, Shannon S.; Jacobi, John H.; Scofield, Marcienne A.

    2013-01-01

    The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities. PMID:23762048

  12. Risks, Benefits, and Importance of Collecting Sexual Orientation and Gender Identity Data in Healthcare Settings: A Multi-Method Analysis of Patient and Provider Perspectives.

    PubMed

    Maragh-Bass, Allysha C; Torain, Maya; Adler, Rachel; Schneider, Eric; Ranjit, Anju; Kodadek, Lisa M; Shields, Ryan; German, Danielle; Snyder, Claire; Peterson, Susan; Schuur, Jeremiah; Lau, Brandyn; Haider, Adil H

    2017-04-01

    Research suggests that LGBT populations experience barriers to healthcare. Organizations such as the Institute of Medicine recommend routine documentation of sexual orientation (SO) and gender identity (GI) in healthcare to reduce LGBT disparities. We explore patient views regarding the importance of SO/GI collection, and patient and provider views on risks and benefits of routine SO/GI collection in various settings. We surveyed LGBT/non-LGBT patients and providers on their views on SO/GI collection. Weighted data were analyzed with descriptive statistics; content analysis was conducted with open-ended responses. One-half of the 1516 patients and 60% of 429 providers were female; 64% of patients and 71% of providers were White. Eighty percent of providers felt that collecting SO data would offend patients, whereas only 11% of patients reported that they would be offended. Patients rated it as more important for primary care providers to know the SO of all patients compared with emergency department (ED) providers knowing the SO of all patients (41.3% vs. 31.6%; P < 0.001). Patients commonly perceived individualized care as an SO/GI disclosure benefit, whereas providers perceived patient-provider interaction improvement as the main benefit. Patient comments cited bias/discrimination risk most frequently (49.7%; N = 781), whereas provider comments cited patient discomfort/offense most frequently (54.5%; N = 433). Patients see the importance of SO/GI more in primary care than ED settings. However, many LGBT patients seek ED care due to factors including lack of insurance; the ED may therefore represent an initial point of contact for SO/GI collection, and patient-centered approaches to collecting SO/GI are needed. Patients and providers differed in perceived risks and benefits of routine SO/GI collection. Provider training in LGBT health may address patients' bias/discrimination concerns, and ultimately reduce LGBT health disparities.

  13. Complementing Gender Analysis Methods.

    PubMed

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start from the premise that men and women are equal and should be treated equally. These frameworks emphasize equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle that puts men and women in competing roles; thus, real equality will never be achieved. In contrast to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach to gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender into gender analysis. It is based on a perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued from the existing literature and an anecdote of observations made by the author. While criticizing equality theory, the author offers equity theory for resolving gender conflict by using the concepts of social and psychological capital.

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    SciTech Connect

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-19

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the plant: corrosion damage can force the HRSG, and with it the power plant, to stop operating, and can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment following standard API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing the risk assessment activities.
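    Category pairs like "4C" combine a probability category (1-5) with a consequence category (A-E) on a risk matrix. The sketch below shows the lookup idea; the banding thresholds here are illustrative assumptions, not the official API 581 matrix:

```python
def risk_band(prob_cat: int, cons_cat: str) -> str:
    """Illustrative semi-quantitative risk banding in the style of API 581:
    combine a probability category (1-5) with a consequence category (A-E).
    The thresholds below are a sketch, not the standard's actual matrix."""
    score = prob_cat + "ABCDE".index(cons_cat) + 1  # ranges from 2 (1A) to 10 (5E)
    if score <= 4:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium-high"
    return "high"

print(risk_band(4, "C"))  # medium-high
print(risk_band(3, "C"))  # medium
```

    With this banding, the abstract's cells come out as stated: 4C is medium-high and 3C is medium, while the corners 1A and 5E are low and high respectively.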

  15. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the plant: corrosion damage can force the HRSG, and with it the power plant, to stop operating, and can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment following standard API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing the risk assessment activities.

  16. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

    Zwieterin, M H; van Gerwen, S J

    2000-07-15

    The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability, while the quantitative effect of various phenomena is unknown. Sensitivity analysis can therefore be a useful tool to determine the main risk-determining phenomena, as well as the aspects that chiefly determine the inaccuracy of the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants of risk. A second, worst-case analysis prevents exceptional but relevant cases from being overlooked; it finds relevant process steps in worst-case situations and shows the relevance of variations in the factors for risk. The third, stochastic analysis studies the effects of variations in the factors on the variability of risk estimates. Care must be taken that the assumptions made, as well as the results, are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask a lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment for determining critical aspects and the effects of variations.
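    The stochastic stage described above can be sketched as a Monte Carlo rank-correlation screen: sample the uncertain inputs, propagate them through the risk chain, and see which input's variation dominates the variability of the output. The risk chain and parameter values below are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
# Hypothetical microbial risk chain: initial contamination, growth, inactivation.
log_n0 = rng.normal(2.0, 1.0, n)   # log10 initial count (most uncertain)
growth = rng.normal(1.0, 0.3, n)   # log10 growth during storage
kill   = rng.normal(3.0, 0.2, n)   # log10 reduction from cooking
log_dose = log_n0 + growth - kill  # log10 dose at consumption

def rank_corr(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

for name, v in [("initial count", log_n0), ("growth", growth), ("inactivation", kill)]:
    print(f"{name:14s} rank corr with dose: {rank_corr(v, log_dose):+.2f}")
# initial count dominates; inactivation correlates negatively with dose
```

    Inputs with the largest absolute rank correlation are the ones whose uncertainty most needs narrowing, which is exactly the question the stochastic stage is meant to answer.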

  17. A GIS-based method for flood risk assessment

    NASA Astrophysics Data System (ADS)

    Kalogeropoulos, Kleomenis; Stathopoulos, Nikos; Psarogiannis, Athanasios; Penteris, Dimitris; Tsiakos, Chrisovalantis; Karagiannopoulou, Aikaterini; Krikigianni, Eleni; Karymbalis, Efthimios; Chalkias, Christos

    2016-04-01

    Floods are global physical hazards with negative environmental and socio-economic impacts at local and regional scales. Technological evolution during the last decades, especially in the field of geoinformatics, has offered new advantages for hydrological modelling. This study uses this technology to quantify flood risk. The study area is an ungauged catchment; using mostly GIS-based hydrological and geomorphological analysis together with a GIS-based distributed Unit Hydrograph model, a series of outcomes was produced. More specifically, this paper examines the behaviour of the Kladeos basin (Peloponnese, Greece) using real rainfall data as well as hypothetical storms. The hydrological analysis was carried out using a Digital Elevation Model of 5x5 m pixel size, while the quantitative drainage basin characteristics were calculated and studied in terms of stream order and its contribution to flooding. Unit Hydrographs are, as is known, useful when data are lacking, and in this work, based on the time-area method, a sequence of flood risk assessments was made using GIS technology. Essentially, the proposed methodology estimates parameters such as discharge and flow velocity in order to quantify flood risk. Keywords: flood risk assessment quantification; GIS; hydrological analysis; geomorphological analysis.
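    The time-area method referenced above amounts to convolving effective rainfall with the basin's time-area histogram (the area contributing flow at each travel-time step). A minimal sketch with hypothetical numbers, not the Kladeos data:

```python
import numpy as np

# Hypothetical time-area histogram: km^2 of basin contributing at each
# time step (as derived in GIS from flow-travel-time zones); sums to 20 km^2.
area_km2 = np.array([2.0, 5.0, 8.0, 4.0, 1.0])
# Hypothetical effective (excess) rainfall per time step, in mm.
rain_mm = np.array([0.0, 10.0, 20.0, 5.0])

# Discharge hydrograph = convolution of rainfall excess with the histogram.
# 1 mm over 1 km^2 = 1000 m^3; divide by the step length to get m^3/s.
step_s = 3600.0
q_m3s = np.convolve(rain_mm, area_km2) * 1000.0 / step_s
print(q_m3s.round(1))  # peak discharge is 62.5 m^3/s at step 4
```

    The convolution conserves volume: total discharge times the step length equals total rainfall times total area, a quick sanity check for any time-area implementation.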

  18. A Comparison of Disease Risk Analysis Tools for Conservation Translocations.

    PubMed

    Dalziel, Antonia Eleanor; Sainsbury, Anthony W; McInnes, Kate; Jakob-Hoff, Richard; Ewen, John G

    2017-03-01

    Conservation translocations are increasingly used to manage threatened species and restore ecosystems. Translocations increase the risk of disease outbreaks in the translocated and recipient populations. Qualitative disease risk analyses have been used as a means of assessing the magnitude of any effect of disease and the probability of the disease occurring in association with a translocation. Multiple alternative qualitative disease risk analysis packages are currently available to practitioners. Here we compare the ease of use, expertise required, transparency, and results from three different qualitative disease risk analyses, using a translocation of the endangered New Zealand passerine, the hihi (Notiomystis cincta), as a model. We show that the three methods use fundamentally different approaches to define hazards. Different methods are used to produce estimations of the risk from disease, and the estimations differ for the same hazards. Transparency of the process varies between methods, from no referencing or explanation of the evidence used to justify decisions, through to full documentation of resources, decisions, and assumptions made. Evidence to support decisions on the estimation of risk from disease is important so that knowledge acquired in the future, for example from translocation outcomes, can be used to improve risk estimation for later translocations. The information documenting each disease risk analysis differs, along with the emphasis of the questions asked within each package. The expertise required to commence a disease risk analysis varies; an action flow chart tailored for the non-wildlife-health specialist is included in one method, but completion of the disease risk analysis requires wildlife health specialists with epidemiological and pathological knowledge in all three methods. We show that disease risk analysis package choice may play a greater role in the overall risk estimation of the effect of disease on animal populations.

  19. [Screening of patient manual handling risk using the MAPO method].

    PubMed

    Battevi, N; Menoni, Olga; Alvarez-Casado, E

    2012-01-01

    International standards draw attention to the steps that risk assessment should follow: first identify hazards, then proceed to risk evaluation and lastly, if necessary, to a full risk assessment. The same logic also applies to risk assessment of manual patient handling. To check the appropriateness of the MAPO approach to "risk evaluation" of manual patient handling, a cross-sectional study was carried out to examine the relationship between this new risk assessment model (MAPO screening) and the occurrence of acute low back pain. After proper training, the MAPO screening method was used to assess the risk of manual handling of patients in 31 wards, covering 411 exposed subjects employed in geriatric hospitals belonging to UNEBA (National Union of Institutions and Social Welfare Initiatives) of the Veneto Region. At the same time, health data were collected on the occurrence of low back pain episodes during the last year, both in the group of exposed subjects and in an external reference group (237 subjects). Risk and clinical assessment data were verified and checked by the EPM research unit. Logistic analysis was used to evaluate the relationship between the MAPO screening risk index and acute low back pain. Investigating the relationship between acute low back pain episodes and levels of the MAPO screening index, carried out only on exposed subjects who reported working at least 30 hours per week (N=178), showed clearly positive trends: for MAPO screening index exposure levels between 1.51 and 5, the OR was roughly double (OR=2.22; 95% CI 0.88-5.63), whereas for index levels exceeding 5, the OR was about 4 (OR=3.77; 95% CI 1.33-10.74). These results did not change significantly when the analysis was corrected for confounding factors such as gender and age class. The results of the study indicate that the proposed method, "MAPO screening", can be a useful tool to estimate the risk due to manual handling of patients and can also be used to test the efficacy of preventive
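    The odds ratios above come from fitting logistic models to the study data. As a minimal illustration of how an odds ratio and its Wald confidence interval arise from a 2x2 table (the counts below are hypothetical, not the study's):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 24/60 exposed subjects vs. 20/100 referents with
# acute low back pain in the past year.
print(odds_ratio_ci(24, 36, 20, 80))
```

    A CI that spans 1 (like the study's 0.88-5.63 for the middle exposure band) means the elevated OR is not statistically significant at the 5% level, while the highest band's 1.33-10.74 excludes 1.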

  20. Speciation analysis, bioavailability and risk assessment of trace metals in herbal decoctions using a combined technique of in Vitro digestion and biomembrane filtration as sample pretreatment method.

    PubMed

    Li, Shun-Xing; Lin, Lu-xiu; Lin, Jing; Zheng, Feng-Ying; Wang, Qing-Xiang; Weng, Wen

    2010-01-01

    Sample preparation is the first crucial step in the speciation analysis, bioavailability and risk assessment of trace metals in plant samples such as herb and vegetables. Two bionic technologies titled 'in vitro digestion' and 'extraction with biomembrane' were developed for pre-treatment of herbal decoction. The decoctions of Aconiteum carmichaeli and Paeonia lactiflora were digested at body temperature, at the acidity of the stomach or intestine and with inorganic and organic materials (digestive enzymes were included for whole-bionic and excluded for semi-bionic) found in the stomach or intestine. Being similar to the biomembrane between the gastrointestinal tract and blood vessels, monolayer liposome was used as a biomembrane model. Affinity-monolayer liposome metals and water-soluble metals were used for speciation analysis and bioavailability assessment of copper and zinc in herbal decoction. In the decoction of Aconiteum carmichaeli and Paeonia lactiflora, Zn was mainly absorbed in the intestine and Cu was mainly absorbed by both stomach and intestine. The safe dosage for males and females is below 257.1 g/day Aconiteum carmichaeli and 529.4 g/day Paeonia lactiflora. Copyright © 2010 John Wiley & Sons, Ltd.

  1. Critical review of methods for risk ranking of food related hazards, based on risks for human health.

    PubMed

    van der Fels-Klerx, H J; van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'Agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2016-02-08

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food science, environmental science, and the socio-economic sciences. The review used a predefined search protocol and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and each risk ranking method was characterized. The methods were then clustered - based on their characteristics - into eleven method categories: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health-adjusted life years, multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques, and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded that there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.
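    Of the categories listed, the "scoring method" is the simplest to sketch: rank hazards by a weighted sum of criteria scores. The hazards, criteria, and weights below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical criteria weights (sum to 1) and hazard scores on a 1-5 scale.
weights = {"severity": 0.5, "incidence": 0.3, "exposure": 0.2}
hazards = {
    "Salmonella":   {"severity": 3, "incidence": 5, "exposure": 4},
    "Aflatoxin B1": {"severity": 5, "incidence": 2, "exposure": 2},
    "Acrylamide":   {"severity": 2, "incidence": 3, "exposure": 5},
}

def score(h):
    """Weighted sum of a hazard's criteria scores."""
    return sum(weights[c] * h[c] for c in weights)

# Rank hazards from highest to lowest score.
for name, crit in sorted(hazards.items(), key=lambda kv: -score(kv[1])):
    print(f"{name:12s} {score(crit):.1f}")
```

    The ranking is only as defensible as the weights, which is one reason the review finds no single best method: scoring is transparent and cheap, but weight choices embed value judgments that fuller methods (e.g. multi-criteria decision analysis) make explicit.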

  2. Carbon Fiber Risk Analysis. [conference

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.

  3. Advances in Risk Analysis with Big Data.

    PubMed

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  4. Revised Methods for Worker Risk Assessment

    EPA Pesticide Factsheets

    EPA is updating and changing the way it approaches pesticide risk assessments. This new approach will result in more comprehensive and consistent evaluation of potential risks of food use pesticides, non-food use pesticides, and occupational exposures.

  5. Methods of Cosmochemical Analysis

    NASA Astrophysics Data System (ADS)

    Lahiri, S.; Maiti, M.

    Some radionuclides, like 10Be (T 1/2 = 1.5 Ma), 14C (T 1/2 = 5,730 years), 26Al (T 1/2 = 0.716 Ma), 53Mn (T 1/2 = 3.7 Ma), 60Fe (T 1/2 = 1.5 Ma), 146Sm (T 1/2 = 103 Ma), 182Hf (T 1/2 = 9 Ma), and 244Pu (T 1/2 = 80 Ma), are either being produced continuously by the interaction of cosmic rays (CR) or might have been produced in supernovae millions of years ago. Analysis of these radionuclides at the ultratrace scale has a strong influence on almost all branches of science, from archaeology to biology and from nuclear physics to astrophysics. However, measuring these radionuclides through their decay properties is a borderline problem because of their scarcity in natural archives and their long half-lives. The one and only way seemed to be that of mass measurement. Accelerator mass spectrometry (AMS) is best suited for this purpose. Apart from AMS, other mass measurement techniques like inductively coupled plasma-mass spectrometry (ICP-MS), thermal ionization mass spectrometry (TIMS), resonant laser ionization mass spectrometry (RIMS), and secondary ionization mass spectrometry (SIMS) have also been used, with more limited sensitivity.

  6. Socioeconomic Methods in Educational Analysis.

    ERIC Educational Resources Information Center

    Weber, William H., III

    This book explores the possibilities in a new approach to educational analysis--a fusion of methods drawn from economics, sociology, and social psychology. The author combines his explanation of socioeconomic analysis with the presentation of several examples that illustrate the application of his method to different analytical problems. The book…

  7. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  8. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  9. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico

    PubMed Central

    Goldenberg, Shira; Strathdee, Steffanie A.; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J.; Patterson, Thomas L.

    2011-01-01

    In 2008, 400 males ≥ 18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental HIV vulnerability among male clients of FSWs in Tijuana, Mexico, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients’ perspectives on venue-based risks. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients’ narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. PMID:21396875

  10. How important are venue-based HIV risks among male clients of female sex workers? A mixed methods analysis of the risk environment in nightlife venues in Tijuana, Mexico.

    PubMed

    Goldenberg, Shira M; Strathdee, Steffanie A; Gallardo, Manuel; Nguyen, Lucie; Lozada, Remedios; Semple, Shirley J; Patterson, Thomas L

    2011-05-01

    In 2008, 400 males ≥18 years old who paid or traded for sex with a female sex worker (FSW) in Tijuana, Mexico, in the past 4 months completed surveys and HIV/STI testing; 30 also completed qualitative interviews. To analyze environmental sources of HIV vulnerability among male clients of FSWs in Tijuana, we used mixed methods to investigate correlates of clients who met FSWs in nightlife venues and clients' perspectives on venue-based HIV risk. Logistic regression identified micro-level correlates of meeting FSWs in nightlife venues, which were triangulated with clients' narratives regarding macro-level influences. In a multivariate model, offering increased pay for unprotected sex and binge drinking were micro-level factors that were independently associated with meeting FSWs in nightlife venues versus other places. In qualitative interviews, clients characterized nightlife venues as high risk due to the following macro-level features: social norms dictating heavy alcohol consumption; economic exploitation by establishment owners; and poor enforcement of sex work regulations in nightlife venues. Structural interventions in nightlife venues are needed to address venue-based risks. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Selecting high-risk micro-scale enterprises using a qualitative risk assessment method.

    PubMed

    Kim, Hyunwook; Park, Dong-Uk

    2006-01-01

    Micro-scale enterprises (MSEs) with fewer than 5 employees have been covered by the scheme of regular workplace environmental inspection and medical health examination since 2002 in Korea. Due to limited resources and the vast number of enterprises to be covered, there is an urgent need to focus these efforts on only the high-risk MSEs. To identify them, a qualitative risk assessment methodology was developed, combining the hazardous nature of chemicals and exposure potentials as modeled by the HSE with the risk categorization technique of the AIHA. A Risk Index (RI) was determined by combining characteristics specific to the chemicals and the scale of their use. The method was applied to 514 MSEs selected from a random sample of 4000 MSEs. A total of 170 of the 514 MSEs studied were included in the final analysis. The current status and characteristics of the MSEs were identified, and an RI was assigned to the chemicals in each industry. Based on the distribution of RIs, the high-risk MSEs were selected. These include: wood and products of wood; chemicals and chemical products; basic metals; other machinery and equipment; motor vehicle, trailer and semi-trailer manufacturing; and furniture manufacturing. Since these MSEs are high-risk, more attention should be focused on them. This method can be applied to other workplaces with no previous history of quantitative workplace inspection.
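    The hazard-times-scale-of-use construction of a qualitative Risk Index can be sketched as below; the bands, scales, and cut-offs here are illustrative assumptions, not the paper's actual scheme:

```python
def risk_index(hazard_band: int, use_scale: int):
    """Illustrative qualitative Risk Index in the spirit of the HSE/AIHA
    approach described above: hazard band of the chemical (1 = low toxicity
    .. 4 = high) multiplied by scale of use (1 = grams .. 3 = tonnes).
    Bands and cut-offs are a sketch, not the paper's scheme."""
    ri = hazard_band * use_scale
    if ri >= 8:
        return ri, "high risk: prioritise for inspection"
    if ri >= 4:
        return ri, "medium risk"
    return ri, "low risk"

print(risk_index(4, 3))  # (12, 'high risk: prioritise for inspection')
print(risk_index(1, 2))  # (2, 'low risk')
```

    Multiplying (rather than adding) the two axes means an enterprise only lands in the top band when a genuinely hazardous chemical is also used at scale, which is the screening behaviour such indices are designed for.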

  12. Resource allocation using risk analysis

    SciTech Connect

    Bott, T. F.; Eisenhawer, S. W.

    2003-01-01

    Allocating limited resources among competing priorities is an important problem in management. In this paper we describe an approach to resource allocation using risk as a metric. We call this approach the Logic-Evolved Decision (LED) approach because we use logic-models to generate an exhaustive set of competing options and to describe the often highly complex model used for evaluating the risk reduction achieved by different resource allocations among these options. The risk evaluation then proceeds using probabilistic or linguistic input data.

  13. Risking basin analysis: Procedures, methods and case studies in the Arctic National Wildlife Refuge, Alaska, and the Gulf of Mexico, Mexico

    NASA Astrophysics Data System (ADS)

    Rocha Legorreta, Francisco Javier

    Integrated basin analysis was conducted using a state-of-the-art code developed for Excel, interfacing with the Monte Carlo risking program Crystal Ball®, in order to perform a total uncertainty analysis with as many uncertain inputs as required and as many outputs of interest as needed, without increasing the computer time involved. Two main examples using the code are described: the first uses synthetic information and the second uses real but minimal information from the Arctic National Wildlife Refuge (ANWR) Area 1002 (undeformed area, Brookian Sequence), Alaska, USA. In both examples, the code serves to identify which input parameters (ranging from uncertain data, uncertain thermal history, uncertain permeability, uncertain fracture coefficients for rocks, uncertain geochemical kinetics, and uncertain kerogen amounts and types per formation, through to uncertain volumetric factors) contribute most to the uncertainty in any of the selected outputs. Relative importance, relative contribution, and relative sensitivity are examined to illustrate when individual parameters need to have their ranges of uncertainty narrowed in order to reduce the range of uncertainty of particular outputs. Relevant results from the ANWR case study give forecasts of around 2.3 Bbbl of available oil charge; for gas, the maximum available charge reached 46 Bm3. These ranges differ considerably from previous assessments because the variables used are influenced mainly by the input data, the equation parameters, and intrinsic assumptions. As part of future research, the third section of this work describes current and prospective areas for gas in the Mexican basins. The main point is to identify the advances and the important role of the Mexican gas industry as part of a future strategic investment opportunity.

  14. Risk analysis and risk management in an uncertain world.

    PubMed

    Kunreuther, Howard

    2002-08-01

    The tragic attacks of September 11 and the bioterrorist threats with respect to anthrax that followed have raised a set of issues regarding how we deal with events where there is considerable ambiguity and uncertainty about the likelihood of their occurrence and their potential consequences. This paper discusses how one can link the tools of risk assessment and our knowledge of risk perception to develop risk management options for dealing with extreme events. In particular, it suggests ways that the members of the Society for Risk Analysis can apply their expertise and talent to the risks associated with terrorism and discusses the changing roles of the public and private sectors in dealing with extreme events.

  15. Carbon Fiber Risk Analysis: Conclusions

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    It was concluded that preliminary estimates indicate that the public risk due to accidental release of carbon fiber from air transport aircraft is small. It was also concluded that further work is required to increase confidence in these estimates.

  16. Supply-Chain Risk Analysis

    DTIC Science & Technology

    2016-06-07

    Excerpt recovered from briefing slides: a security score upon first submission (3/1/2010) measured against the CWE/SANS Top-25 errors; SQL database query output (all records with ID = 48983...); exploitable design or coding errors, with very little data for software supply chains; software supply-chain complexity, in which a composite inherits risk from any point; relative effort; operational capabilities; knowledge of supplier capabilities; knowledge of product attributes; supply-chain risk categories.

  17. Occupational safety and HIV risk among female sex workers in China: A mixed-methods analysis of sex-work harms and mommies

    PubMed Central

    Yi, Huso; Zheng, Tiantian; Wan, Yanhai; Mantell, Joanne E.; Park, Minah; Csete, Joanne

    2013-01-01

    Female sex workers (FSWs) in China are exposed to multiple work-related harms that increase HIV vulnerability. Using mixed methods, we explored the social-ecological aspects of sexual risk among 348 FSWs in Beijing. Sex-work harms were assessed by property stolen, being underpaid or not paid at all, verbal and sexual abuse, forced drinking, and forced sex more than once. The majority (90%) reported at least one type of harm, 38% received harm protection from ‘mommies’ (i.e., managers), and 32% reported unprotected sex with clients. In multivariate models, unprotected sex was significantly associated with longer involvement in sex work, greater exposure to harms, and no protection from mommies. Mommies’ protection moderated the effect of sex-work harms on unprotected sex with clients. Our ethnography indicated that mommies played a core role in sex-work networks. Such networks provide a basis for social capital; they are not only economically profitable but also protect FSWs from sex-work harms. Effective HIV prevention interventions for FSWs in China must address the occupational safety and health of FSWs by facilitating social capital and protective agency (e.g., mommies) in the sex-work industry. PMID:22375698

  18. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

    This chapter describes traditional methods for the analysis of minerals involving titrimetric and colorimetric procedures and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric analysis (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral takes part in an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they are now little used in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for the analysis of minerals of major nutritional or food processing concern are used for illustrative purposes. For additional examples of traditional methods, refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to stay within the range of analytical performance. For the analytical requirements for specific foods, see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
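    The arithmetic behind a titrimetric determination like those the chapter covers can be sketched in a few lines; the reagent, volumes, and concentrations below are invented examples, not values from the chapter:

```python
# Hypothetical worked example of endpoint stoichiometry for a titration
# (e.g., chloride by argentometric titration); all numbers are invented.
def titration_molarity(titrant_molarity, titrant_ml, sample_ml, mole_ratio=1.0):
    """Analyte molarity from the titrant volume consumed at the endpoint."""
    moles_titrant = titrant_molarity * titrant_ml / 1000.0
    return moles_titrant * mole_ratio / (sample_ml / 1000.0)

# e.g., 15.0 mL of 0.1 M AgNO3 titrating chloride in a 25.0 mL sample (1:1)
conc = titration_molarity(0.1, 15.0, 25.0)   # mol/L chloride
mg_per_l = conc * 35.45 * 1000.0             # as mass concentration (mg/L)
```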

  19. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the problem that stress-vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the research emphasis chosen. There were also differences between the ranking of the three representative contaminants' hazards and the rankings of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculated results. In addition, normalizing the three properties by rank order and unifying the quantified results would scale the relative characteristics of the different representative contaminants up or down.
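    A minimal sketch of the analytic-hierarchy-process step the abstract mentions, assuming a hypothetical 3×3 pairwise-comparison matrix for the three contaminant properties (the judgments and property names are invented):

```python
import numpy as np

# Hypothetical pairwise comparisons of three properties (say, toxicity vs.
# mobility vs. emission quantity) on Saaty's 1-9 scale; not the paper's data.
A = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 3.0],
              [1 / 5, 1 / 3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))          # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()                           # normalized priority weights

CI = (eigvals[k].real - 3) / (3 - 1)      # consistency index
CR = CI / 0.58                            # 0.58 = random index for n = 3
```

    A consistency ratio CR below 0.1 is the usual acceptance threshold for the judgment matrix.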

  20. Risk analysis of colorectal cancer incidence by gene expression analysis

    PubMed Central

    Shangkuan, Wei-Chuan; Lin, Hung-Che; Chang, Yu-Tien; Jian, Chen-En; Fan, Hueng-Chuen; Chen, Kang-Hua; Liu, Ya-Fang; Hsu, Huan-Ming; Chou, Hsiu-Ling; Yao, Chung-Tay

    2017-01-01

    Background Colorectal cancer (CRC) is one of the leading cancers worldwide. Several studies have performed microarray data analyses for cancer classification and prognostic analyses. Microarray assays also enable the identification of gene signatures for molecular characterization and treatment prediction. Objective Microarray gene expression data from the online Gene Expression Omnibus (GEO) database were used to distinguish colorectal cancer from normal colon tissue samples. Methods We collected microarray data from the GEO database to establish colorectal cancer microarray gene expression datasets for a combined analysis. Using the Prediction Analysis for Microarrays (PAM) method and the GSEA MSigDB resource, we analyzed the 14,698 genes that were identified through an examination of their expression values between normal and tumor tissues. Results Ten genes (ABCG2, AQP8, SPIB, CA7, CLDN8, SCNN1B, SLC30A10, CD177, PADI2, and TGFBI) were found to be good indicators of the candidate genes that correlate with CRC. From these selected genes, an average of six significant genes were obtained using the PAM method, with an accuracy rate of 95%. The results demonstrate the potential of utilizing a model with the PAM method for data mining. After a detailed review of the published reports, the results confirmed that the screened candidate genes are good indicators for cancer risk analysis using the PAM method. Conclusions Six genes were selected with 95% accuracy to effectively classify normal and colorectal cancer tissues. We hope that these results will provide the basis for new research projects in clinical practice that aim to rapidly assess colorectal cancer risk using microarray gene expression analysis. PMID:28229027
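    The PAM method used above is based on nearest shrunken centroids; a toy numpy sketch of that idea on synthetic data (not the GEO datasets or the genes named in the abstract) might look like:

```python
import numpy as np

# Toy sketch of the nearest-shrunken-centroid idea behind PAM; the data are
# synthetic, with 5 artificial "marker genes" shifted in the tumor class.
rng = np.random.default_rng(1)
n_genes = 50
normal = rng.normal(0.0, 1.0, (20, n_genes))
tumor = rng.normal(0.0, 1.0, (20, n_genes))
tumor[:, :5] += 3.0                     # shift 5 marker genes in tumors

X = np.vstack([normal, tumor])
y = np.array([0] * 20 + [1] * 20)

overall = X.mean(axis=0)
centroids = np.array([X[y == c].mean(axis=0) for c in (0, 1)])

# Soft-threshold shrinkage: class centroids close to the overall centroid
# collapse onto it, deselecting uninformative genes.
delta = 1.0
d = centroids - overall
shrunk = overall + np.sign(d) * np.maximum(np.abs(d) - delta, 0.0)

def predict(x):
    """Class of the nearest shrunken centroid."""
    return int(np.argmin(((shrunk - x) ** 2).sum(axis=1)))

acc = np.mean([predict(x) == c for x, c in zip(X, y)])
selected = np.where(np.abs(shrunk - overall).sum(axis=0) > 0)[0]  # surviving genes
```

    Only the genes whose shrunken centroids differ from the overall centroid survive, which is how PAM ends up with a small signature such as the six genes reported.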

  1. Risk Based Requirements for Long Term Stewardship: A Proof-of-Principle Analysis of an Analytic Method Tested on Selected Hanford Locations

    SciTech Connect

    GM Gelston; JW Buck; LR Huesties; MS Peffers; TB Miley; TT Jarvis; WB Andrews

    1998-12-03

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies and remedies will last a shorter period of time than the waste itself, and the resulting contamination will remain hazardous.

  2. Risk based requirements for long term stewardship: A proof-of-principle analysis of an analytic method tested on selected Hanford locations

    SciTech Connect

    Jarvis, T.T.; Andrews, W.B.; Buck, J.W.

    1998-03-01

    Since 1989, the Department of Energy's (DOE) Environmental Management (EM) Program has managed the environmental legacy of US nuclear weapons production, research and testing at 137 facilities in 31 states and one US territory. The EM program has conducted several studies on the public risks posed by contaminated sites at these facilities. In Risks and the Risk Debate [DOE, 1995a], the Department analyzed the risks at sites before, during, and after remediation work by the EM program. The results indicated that aside from a few urgent risks, most hazards present little inherent risk because physical and active site management controls limit both the releases of site contaminants, and public access to these hazards. Without these controls, these sites would pose greater risks to the public. Past risk reports, however, provided little information about post-cleanup risk, primarily because of uncertainty about future site uses and site characteristics at the end of planned cleanup activities. This is of concern because in many cases current cleanup technologies, and remedies, will last a shorter period of time than the waste itself and the resulting contamination will remain hazardous.

  3. [Profitability analysis of clinical risk management].

    PubMed

    Banduhn, C; Schlüchtermann, J

    2013-05-01

    Medical treatment entails many risks. Increasingly, the negative impact of these risks on patients' health is revealed and corresponding cases are reported to hospital insurers. A systematic clinical risk management can reduce these risks. This analysis is designed to demonstrate the financial profitability of implementing a clinical risk management program. The decision analysis of clinical risk management draws on information from published articles and studies, publicly available data from the Federal Statistical Office, and expert interviews, and was conducted for two scenarios. The two scenarios correspond to the maximum and minimum values of preventable adverse events reported in Germany. The planning horizon was a 1-year period. The analysis was performed from a hospital's perspective. Subsequently, a threshold analysis of the reduction of preventable adverse events as an effect of clinical risk management was executed. Furthermore, a static capital budgeting over a 5-year period was added, complemented by a risk analysis. Given the stated assumptions, the implementation of clinical risk management would save about 53 000 € or 175 000 €, respectively, for an average hospital within the first year. Only if the reduction of preventable adverse events is as low as 5.6 or 2.8%, respectively, would the implementation of clinical risk management produce losses. According to a comprehensive risk simulation, this happens in fewer than one out of 1 million cases. The investment in clinical risk management, based on a 5-year period and an interest rate of 5%, pays off at about 81 000 € or 211 000 € annually, respectively. The implementation of clinical risk management in a hospital pays off within the first year. In subsequent years the surplus is even higher due to the elimination of implementation costs. © Georg Thieme Verlag KG Stuttgart · New York.
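    The static capital budgeting described can be sketched as a simple net-present-value calculation; the implementation cost below is a hypothetical placeholder, since the abstract reports only the savings figures and the 5%/5-year frame:

```python
# Back-of-envelope sketch of static capital budgeting over 5 years at 5%.
# The implementation cost is invented; savings use the upper scenario figure.
def npv(rate, cashflows):
    """Net present value; cashflows[0] occurs at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

implementation_cost = 100_000       # one-off cost in year 0 (hypothetical)
annual_saving = 175_000             # upper-scenario saving per year (EUR)
flows = [-implementation_cost] + [annual_saving] * 5
value = npv(0.05, flows)            # positive NPV: the investment pays off
```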

  4. Bridging the two cultures of risk analysis

    SciTech Connect

    Jasanoff, S. )

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant for the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  5. Preference Functions for Spatial Risk Analysis.

    PubMed

    Keller, L Robin; Simon, Jay

    2017-09-07

    When outcomes are defined over a geographic region, measures of spatial risk regarding these outcomes can be more complex than traditional measures of risk. One of the main challenges is the need for a cardinal preference function that incorporates the spatial nature of the outcomes. We explore preference conditions that will yield the existence of spatial measurable value and utility functions, and discuss their application to spatial risk analysis. We also present a simple example on household freshwater usage across regions to demonstrate how such functions can be assessed and applied. © 2017 Society for Risk Analysis.

  6. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false Limitation of risk: Protective... BUSINESS WITH THE UNITED STATES § 223.11 Limitation of risk: Protective methods. The limitation of risk... may underwrite a risk on any bond or policy, the amount of which does not exceed their aggregate...

  7. Probabilistic methods for rotordynamics analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
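    A plain Monte Carlo stand-in for the probability-of-instability computation can illustrate the idea (the paper itself uses fast probability integration and adaptive importance sampling; the parameter distributions below are invented):

```python
import numpy as np

# Illustrative sketch, not the paper's method. For m*x'' + c*x' + k*x = 0 the
# Routh-Hurwitz criterion reduces to c > 0 and k > 0; instability means either
# condition fails. Sample uncertain parameters and count failures.
rng = np.random.default_rng(2)
n = 200_000
c = rng.normal(0.5, 0.3, n)   # uncertain damping (hypothetical distribution)
k = rng.normal(2.0, 0.5, n)   # uncertain stiffness (hypothetical distribution)

p_unstable = float(np.mean((c <= 0) | (k <= 0)))
```

    Importance sampling replaces this brute-force count when the instability probability is small enough that plain sampling rarely hits the failure region.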

  8. Intelligent Adversary Risk Analysis: A Bioterrorism Risk Management Model (PREPRINT)

    DTIC Science & Technology

    2009-02-20

    Society for Risk Analysis, February 20, 2009. 1. Intelligent adversary risk analysis is different than hazard risk analysis. Risk analysis has

  9. Initial Risk Analysis and Decision Making Framework

    SciTech Connect

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.

  10. Dealing with Uncertainty in Chemical Risk Analysis

    DTIC Science & Technology

    1988-12-01

    DEALING WITH UNCERTAINTY IN CHEMICAL RISK ANALYSIS. Thesis, David S. Clement, Captain, USAF, AFIT/GOR/MA/88D-2. Approved for public release; distribution unlimited. Presented to the Faculty

  11. Novel application of statistical methods for analysis of multiple toxicants identifies DDT as a risk factor for early child behavioral problems.

    PubMed

    Forns, Joan; Mandal, Siddhartha; Iszatt, Nina; Polder, Anuschka; Thomsen, Cathrine; Lyche, Jan Ludvig; Stigum, Hein; Vermeulen, Roel; Eggesbø, Merete

    2016-11-01

    The aim of this study was to assess the association between postnatal exposure to multiple persistent organic pollutants (POPs) measured in breast milk samples and early behavioral problems using statistical methods to deal with correlated exposure data. We used data from the Norwegian HUMIS study. We measured concentrations of 24 different POPs in human milk from 612 mothers (median collection time: 32 days after delivery), including 13 polychlorinated biphenyls (PCB) congeners, 6 polybrominated diphenyl ethers (PBDE) congeners and five organochlorine compounds. We assessed child behavioral problems at 12 and 24 months using the infant toddler symptom checklist (ITSC). Higher score in ITSC corresponds to more behavioral problems. First we performed principal component analysis (PCA). Then two variable selection methods, elastic net (ENET) and Bayesian model averaging (BMA), were applied to select any toxicants associated with behavioral problems. Finally, the effect size of the selected toxicants was estimated using multivariate linear regression analyses. p,p'-DDT was associated with behavioral problems at 12 months in all the applied models. Specifically, the principal component composed of organochlorine pesticides was significantly associated with behavioral problems and both ENET and BMA identified p,p'-DDT as associated with behavioral problems. Using a multiple linear regression model an interquartile increase in p,p'-DDT was associated with a 0.62 unit increase in ITSC score (95% CI 0.45, 0.79) at 12 months, corresponding to more behavioral problems. The association was modified by maternal education: the effect of p,p'-DDT was strongest in women with lower education (β=0.59; 95%CI: 0.38, 0.81) compared to the mother with higher education (β=0.14; 95%CI: -0.05, 0.34) (p-value for interaction=0.089). At 24 months, neither selection method consistently identified any toxicant associated with behavioral problems. 
Within a mixture of 24 toxicants measured in

  12. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision making environment sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  13. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  14. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  15. Transportation scenarios for risk analysis.

    SciTech Connect

    Weiner, Ruth F.

    2010-09-01

    Transportation risk, like any risk, is defined by the risk triplet: what can happen (the scenario), how likely it is (the probability), and the resulting consequences. This paper evaluates the development of transportation scenarios, the associated probabilities, and the consequences. The most likely radioactive materials transportation scenario is routine, incident-free transportation, which has a probability indistinguishable from unity. Accident scenarios in radioactive materials transportation are of three different types: accidents in which there is no impact on the radioactive cargo, accidents in which some gamma shielding may be lost but there is no release of radioactive material, and accidents in which radioactive material may potentially be released. Accident frequencies, obtainable from recorded data validated by the U.S. Department of Transportation, are considered equivalent to accident probabilities in this study. Probabilities of different types of accidents are conditional probabilities, conditional on an accident occurring, and are developed from event trees. Development of all of these probabilities and the associated highway and rail accident event trees are discussed in this paper.
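    The event-tree bookkeeping described above can be sketched in a few lines; the accident frequency and branch probabilities are invented for illustration:

```python
# Hypothetical per-shipment accident frequency and conditional branch
# probabilities; real values come from DOT-validated accident records.
p_accident = 1e-4
branches = {
    "no_cargo_impact": 0.90,        # accident, radioactive cargo unaffected
    "shielding_loss_only": 0.08,    # some gamma shielding lost, no release
    "potential_release": 0.02,      # radioactive material may be released
}

# Unconditional scenario probabilities: conditional branch times frequency.
scenario_prob = {name: p_accident * p for name, p in branches.items()}
total = sum(scenario_prob.values())  # recovers p_accident (branches sum to 1)
```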

  16. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...

  17. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...

  18. Cassini nuclear risk analysis with SPARRC

    NASA Astrophysics Data System (ADS)

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  19. Cassini nuclear risk analysis with SPARRC

    SciTech Connect

    Ha, Chuong T.; Deane, Nelson A.

    1998-01-15

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis.

  20. Cassini nuclear risk analysis with SPARRC

    SciTech Connect

    Ha, C.T.; Deane, N.A.

    1998-01-01

    The nuclear risk analysis of the Cassini mission is one of the most comprehensive risk analyses ever conducted for a space nuclear mission. The complexity of postulated accident scenarios and source term definitions, from launch to Earth swingby, has necessitated an extensive series of analyses in order to provide best-estimates of potential consequence results and bounding uncertainty intervals. The Space Accident Radiological Release and Consequence (SPARRC) family of codes, developed by Lockheed Martin to analyze polydispersed source terms and a combination of different atmospheric transport patterns, have been used for the Cassini Final Safety Analysis Report (FSAR). By identifying dominant contributors, the nuclear risk of each mission segment is understood with a high level of confidence. This paper provides the overall analysis process and insights developed from the risk analysis. © 1998 American Institute of Physics.

  1. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  2. RISK ANALYSIS, ANALYSIS OF VARIANCE: GETTING MORE FROM OUR DATA

    USDA-ARS?s Scientific Manuscript database

    Analysis of variance (ANOVA) and regression are common statistical techniques used to analyze agronomic experimental data and determine significant differences among yields due to treatments or other experimental factors. Risk analysis provides an alternate and complementary examination of the same...

  3. Comparison of Hartmann analysis methods.

    PubMed

    Canovas, Carmen; Ribak, Erez N

    2007-04-01

    Analysis of Hartmann-Shack wavefront sensors for the eye is traditionally performed by locating and centroiding the sensor spots. These centroids provide the gradient, which is integrated to yield the ocular aberration. Fourier methods can replace the centroid stage, and Fourier integration can replace the direct integration. The two steps, demodulation and integration, can be combined to retrieve the wavefront directly, all in the Fourier domain. We applied this full Fourier analysis to circular apertures and real images, and performed a comparison between it and previous methods of convolution, interpolation, and Fourier demodulation. We also compared it with a centroid method, which yields the Zernike coefficients of the wavefront. The best performance was achieved for ocular pupils with a small boundary slope or regions far from the boundary, with acceptable results for images missing part of the pupil. The other Fourier analysis methods had much higher tolerance to noncentrosymmetric apertures.
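    The traditional centroiding stage that the Fourier methods replace can be sketched in a few lines of numpy; the single-lenslet spot image below is synthetic:

```python
import numpy as np

# Synthetic single-lenslet spot; the real pipeline repeats this for every
# subaperture and converts centroid shifts into local wavefront slopes.
def centroid(img):
    """Intensity-weighted (row, col) centroid of a 2-D spot image."""
    ys, xs = np.indices(img.shape)
    total = img.sum()
    return float((ys * img).sum() / total), float((xs * img).sum() / total)

ys, xs = np.indices((24, 24))
spot = np.exp(-((ys - 12.0) ** 2 + (xs - 8.5) ** 2) / (2 * 2.0 ** 2))
cy, cx = centroid(spot)   # recovers the spot position
```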

  4. [Study on application of two risk assessment methods in coal dust occupational health risk assessment].

    PubMed

    Wu, B; Zhang, Y L; Chen, Y Q

    2017-04-20

    Objective: To evaluate the applicability of the quantitative grading method (GBZ/T 229.1-2010) and the occupational hazard risk index method in coal-dust occupational health risk assessment. Methods: Taking 4 coal mines as the objects of risk assessment, we conducted occupational health field testing and investigation. Based on the two risk assessment methods, we analysed the health risk levels of 20 occupations exposed to coal dust in the workplace. Results: Coal-dust working posts had different risk levels in the 4 coal mines; the posts with higher risk levels were mainly concentrated in the underground workplaces, especially in the coal-mining and tunneling systems. The two risk assessments showed that the risk levels of coal-mining machine drivers and tunneling machine drivers were the highest. The risk levels of coal-dust working posts given by the two methods had no significant difference (P>0.05) and were highly correlated (r=0.821, P<0.001). The evaluation results of the two methods were supported by the field investigation and the literature. Conclusion: Both risk assessment methods can be used in coal-dust occupational health risk assessment.

  5. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis swept the world, the real estate industry also faced enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of numerous scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application, providing an effective means of real estate investment risk assessment and offering investors guidance on risk factors and forecasts.
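
    In its simplest form, the fuzzy comprehensive evaluation the authors advocate is the composition of a factor-weight vector with a fuzzy membership matrix. A minimal sketch; the factors, weights, and membership degrees below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Factor weights for three hypothetical risk factors
# (market risk, policy risk, financial risk); weights sum to 1.
w = np.array([0.5, 0.3, 0.2])

# Membership matrix R: row i gives the degree to which factor i
# belongs to each evaluation grade (low, medium, high risk).
R = np.array([
    [0.1, 0.3, 0.6],   # market risk
    [0.2, 0.5, 0.3],   # policy risk
    [0.6, 0.3, 0.1],   # financial risk
])

# Weighted-average composition operator: B = w . R
B = w @ R
grades = ["low", "medium", "high"]
verdict = grades[int(np.argmax(B))]
print(B, verdict)
```

The grade with the largest composite membership is taken as the overall evaluation; other composition operators (e.g. max-min) are also used in the fuzzy-evaluation literature.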

  6. Fuzzy risk analysis for nuclear safeguards

    SciTech Connect

    Zardecki, A.

    1993-01-01

    Analysis of a safeguards system based on the notion of fuzzy sets and linguistic variables addresses concerns such as complexity and inherent imprecision in estimating the possibility of loss or compromise. The automated risk analysis allows the risk to be determined for an entire system based on estimates for the lowest-level components and the component proportions. In addition, for each component (asset) the most effective combination of protection mechanisms against a given set of threats is determined. A distinction between bare and featured risk is made.

  7. Alcohol Consumption and Gastric Cancer Risk: A Meta-Analysis

    PubMed Central

    Ma, Ke; Baloch, Zulqarnain; He, Ting-Ting; Xia, Xueshan

    2017-01-01

    Background We sought to determine by meta-analysis the relationship between drinking alcohol and the risk of gastric cancer. Material/Methods A systematic Medline search was performed to identify all published reports of drinking alcohol and the associated risk of gastric cancer. Initially we retrieved 2,494 studies, but after applying inclusion and exclusion criteria, only ten studies were found to be eligible for our meta-analysis. Results Our meta-analysis showed that alcohol consumption elevated the risk of gastric cancer with an odds ratio (OR) of 1.39 (95% CI 1.20–1.61). Additionally, subgroup analysis showed that only a nested case-control report from Sweden did not support this observation. Subgroup analysis of moderate drinking and heavy drinking also confirmed that drinking alcohol increased the risk of gastric cancer. Publication bias analysis (Begg’s and Egger’s tests) showed p values were more than 0.05, suggesting that the 10 articles included in our analysis did not have a publication bias. Conclusions The results from this meta-analysis support the hypothesis that alcohol consumption can increase the risk of gastric cancer; suggesting that effective moderation of alcohol drinking may reduce the risk of gastric cancer. PMID:28087989
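
    The pooled odds ratio above comes from standard inverse-variance pooling of log odds ratios. A generic fixed-effect sketch; the study ORs and confidence limits below are made-up illustrations, not the ten studies in this meta-analysis, and real meta-analyses often add a random-effects model (e.g. DerSimonian-Laird) when heterogeneity is present:

```python
import math

def pooled_or(or_ci_list):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each entry is (OR, lower 95% CI, upper 95% CI)."""
    num = den = 0.0
    for or_, lo, hi in or_ci_list:
        log_or = math.log(or_)
        # 95% CI width on the log scale -> standard error
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * log_or
        den += w
    pooled_log = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(pooled_log - 1.96 * se_pooled),
          math.exp(pooled_log + 1.96 * se_pooled))
    return math.exp(pooled_log), ci

# Hypothetical study results: (OR, 95% CI lower, 95% CI upper)
studies = [(1.50, 1.10, 2.05), (1.20, 0.90, 1.60), (1.45, 1.05, 2.00)]
print(pooled_or(studies))
```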

  8. The application of risk analysis in aquatic animal health management.

    PubMed

    Peeler, E J; Murray, A G; Thebault, A; Brun, E; Giovaninni, A; Thrush, M A

    2007-09-14

    assessment. Risk analysis has improved decision making in aquatic animal health management by providing a transparent method for using the available scientific information. The lack of data is the main constraint to the application of risk analysis in aquatic animal health. The identification of critical parameters is an important output from risk analysis models which should be used to prioritise research.

  9. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
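
    The boundary-setting step described above, K-means clustering of indicator values into risk classes, can be sketched with a small one-dimensional Lloyd's-algorithm implementation. The indicator scores below are illustrative, not the 37-site data set:

```python
import numpy as np

def kmeans_1d(values, k, iters=100):
    """Lloyd's algorithm on 1-D data; returns sorted centroids and the
    class boundaries taken as midpoints between adjacent centroids."""
    x = np.sort(np.asarray(values, dtype=float))
    # initialise centroids at evenly spaced quantiles
    centroids = np.quantile(x, np.linspace(0, 1, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centroids[None, :]), axis=1)
        new = np.array([x[labels == j].mean() if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    centroids = np.sort(centroids)
    boundaries = (centroids[:-1] + centroids[1:]) / 2
    return centroids, boundaries

# Hypothetical normalised scores of one indicator at 12 landfill sites
scores = [0.05, 0.08, 0.10, 0.12, 0.45, 0.50,
          0.52, 0.55, 0.88, 0.90, 0.92, 0.95]
cents, bounds = kmeans_1d(scores, k=3)
print(cents, bounds)   # three class centres, two class boundaries
```

In the paper's full workflow the per-indicator classes are then combined with PCA-derived weights into an overall risk rank.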

  10. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills

    NASA Astrophysics Data System (ADS)

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.

  11. Starlink corn: a risk analysis.

    PubMed Central

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted. PMID:11781159

  12. Starlink corn: a risk analysis.

    PubMed

    Bucchini, Luca; Goldman, Lynn R

    2002-01-01

    Modern biotechnology has dramatically increased our ability to alter the agronomic traits of plants. Among the novel traits that biotechnology has made available, an important group includes Bacillus thuringiensis-derived insect resistance. This technology has been applied to potatoes, cotton, and corn. Benefits of Bt crops, and biotechnology generally, can be realized only if risks are assessed and managed properly. The case of Starlink corn, a plant modified with a gene that encodes the Bt protein Cry9c, was a severe test of U.S. regulatory agencies. The U.S. Environmental Protection Agency had restricted its use to animal feed due to concern about the potential for allergenicity. However, Starlink corn was later found throughout the human food supply, resulting in food recalls by the Food and Drug Administration and significant disruption of the food supply. Here we examine the regulatory history of Starlink, the assessment framework employed by the U.S. government, assumptions and information gaps, and the key elements of government efforts to manage the product. We explore the impacts on regulations, science, and society and conclude that only significant advances in our understanding of food allergies and improvements in monitoring and enforcement will avoid similar events in the future. Specifically, we need to develop a stronger fundamental basis for predicting allergic sensitization and reactions if novel proteins are to be introduced in this fashion. Mechanisms are needed to assure that worker and community aeroallergen risks are considered. Requirements are needed for the development of valid assays so that enforcement and post market surveillance activities can be conducted.

  13. Risk Analysis Training within the Army: Current Status, Future Trends,

    DTIC Science & Technology

    risk analysis. Since risk analysis training in the Army is...become involved in risk analysis training. He reviews all risk analysis-related training done in any course at the Center. Also provided is information...expected to use the training. Then the future trend in risk analysis training is presented. New courses, course changes, and hardware/software changes that will make risk analysis more palatable are

  14. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
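
    The article's worked examples use MATLAB and R; the same embarrassingly parallel pattern can be sketched in Python, where independent Monte Carlo replicates are mapped across worker processes. The toy loss model below is a hypothetical stand-in for a real risk simulation:

```python
import random
from multiprocessing import Pool

def simulate_once(seed):
    """One replicate of a toy risk simulation: annual loss as the sum
    of randomly occurring failure events with lognormal severities."""
    rng = random.Random(seed)
    n_events = sum(rng.random() < 0.1 for _ in range(365))
    return sum(rng.lognormvariate(0, 1) for _ in range(n_events))

def run_serial(n):
    return [simulate_once(s) for s in range(n)]

def run_parallel(n, workers=4):
    # Embarrassingly parallel: replicates are independent, so they can
    # be mapped across processes with no inter-worker communication.
    with Pool(workers) as pool:
        return pool.map(simulate_once, range(n))

if __name__ == "__main__":
    serial = run_serial(1000)
    parallel = run_parallel(1000)
    # Per-replicate seeding keeps serial and parallel results identical.
    assert serial == parallel
    print(sum(parallel) / len(parallel))
```

Seeding each replicate by its index, rather than sharing one random stream, is what makes the parallel run reproducible and identical to the serial one.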

  15. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
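
    The combination of fault tree logic and Monte Carlo simulation can be sketched for a toy source-to-tap system: uncertain basic-event probabilities are sampled, propagated through AND/OR gates, and summarised as a distribution of top-event probability. The tree structure and numbers below are illustrative assumptions, not the Swedish case study:

```python
import random

random.seed(1)

def top_event_prob(p_source, p_treat1, p_treat2, p_dist):
    """Supply failure occurs if the source fails OR both redundant
    treatment steps fail OR distribution fails (independent events)."""
    p_treat = p_treat1 * p_treat2                       # AND gate
    p_ok = (1 - p_source) * (1 - p_treat) * (1 - p_dist)
    return 1 - p_ok                                     # OR gate

# Uncertainty in each basic event expressed as a uniform range,
# a simple stand-in for expert-judgement distributions.
N = 100_000
samples = sorted(
    top_event_prob(random.uniform(0.001, 0.01),
                   random.uniform(0.01, 0.05),
                   random.uniform(0.01, 0.05),
                   random.uniform(0.002, 0.02))
    for _ in range(N)
)
mean = sum(samples) / N
p5, p95 = samples[int(0.05 * N)], samples[int(0.95 * N)]
print(f"top event: mean={mean:.4f}, 90% interval=({p5:.4f}, {p95:.4f})")
```

Multiplying the top-event probability by downtime and customers affected would give a CML-style risk measure as used in the study.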

  16. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    2009-01-18

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program (The Program). The analysis is a task by Princeton Energy Resources International, LLC, in support of the National Renewable Energy Laboratory on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE).
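
    The Program metric referenced, levelized cost of energy, is discounted lifetime cost divided by discounted lifetime generation. A minimal sketch with illustrative plant numbers (not values from the report):

```python
def lcoe(capex, annual_opex, annual_mwh, lifetime_years, discount_rate):
    """Levelized cost of energy: discounted costs / discounted output."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, lifetime_years + 1))
    energy = sum(annual_mwh / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
    return costs / energy   # $/MWh

# Hypothetical geothermal plant
print(round(lcoe(capex=150e6, annual_opex=4e6, annual_mwh=350_000,
                 lifetime_years=30, discount_rate=0.07), 2))
```

Propagating R&D-driven uncertainty in capex, opex, or output through this formula is what turns technical risk into an LCOE distribution.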

  17. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
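
    The core factorization step, alternating least squares of D = CS^T, can be sketched with numpy; here nonnegativity is imposed by clipping after each unconstrained solve, a common simplification of the constrained solve, and the patent's weighting step is omitted:

```python
import numpy as np

def als_factor(D, k, iters=200, seed=0):
    """Approximate D ~= C @ S.T with nonnegative C (concentrations)
    and S (spectral shapes) via alternating least squares."""
    rng = np.random.default_rng(seed)
    n_chan = D.shape[1]
    S = rng.random((n_chan, k))
    for _ in range(iters):
        # Solve for C with S fixed, then clip to >= 0
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
        # Solve for S with C fixed, then clip to >= 0
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    return C, S

# Synthetic data: two "components" mixed at random concentrations
rng = np.random.default_rng(42)
S_true = rng.random((50, 2))     # 2 spectral shapes over 50 channels
C_true = rng.random((200, 2))    # concentrations at 200 pixels
D = C_true @ S_true.T

C, S = als_factor(D, k=2)
rel_err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(rel_err)
```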

  18. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed by incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HE safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to traditional port risk analysis methods, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) on a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real-time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.
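
    For contrast, the traditional FMEA baseline that the FRBN-ER approach improves upon scores each hazardous event by a risk priority number, RPN = severity x occurrence x detectability. A minimal sketch; the container-terminal events and ratings are hypothetical:

```python
# Traditional FMEA scoring: RPN = severity * occurrence * detection,
# each rated on a 1-10 scale (events and ratings are hypothetical).
hazardous_events = {
    "crane collision":        {"S": 8, "O": 3, "D": 4},
    "container stack fall":   {"S": 9, "O": 2, "D": 3},
    "vehicle-worker contact": {"S": 9, "O": 4, "D": 5},
    "cargo fire":             {"S": 10, "O": 1, "D": 2},
}

def rpn(scores):
    return scores["S"] * scores["O"] * scores["D"]

ranking = sorted(hazardous_events,
                 key=lambda e: rpn(hazardous_events[e]), reverse=True)
for event in ranking:
    print(event, rpn(hazardous_events[event]))
```

The crisp 1-10 ratings and the multiplicative RPN are exactly the rigidities the fuzzy rule base and evidential aggregation are designed to relax.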

  19. Campylobacter detection along the food chain--towards improved quantitative risk analysis by live/dead discriminatory culture-independent methods.

    PubMed

    Stingl, Kerstin; Buhler, Christiane; Krüger, Nora-Johanna

    2015-01-01

    Death, although absolute in its consequence, is not measurable by an absolute parameter in bacteria. Viability assays address different aspects of life, e.g. the capability to form a colony on an agar plate (CFU), metabolic properties or membrane integrity. For food safety, presence of infectious potential is the relevant criterion for risk assessment, currently only partly reflected by the quantification of CFU. It will be necessary for future improved risk assessment, in particular when fastidious bacterial pathogens are implicated, to enhance the informative value of CFU. This might be feasible by quantification of the number of intact and potentially infectious Campylobacter, impermeable to the DNA intercalating dye propidium monoazide (PMA). The latter are quantifiable by the combination of PMA with real-time PCR, although thorough controls have to be developed for standardization and the circumvention of pitfalls. Under consideration of different physiological states of the food-borne pathogen, we provide an overview of current and future suitable detection/quantification targets along the food chain, including putative limitations of detection.
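
    Quantification by real-time PCR rests on a standard curve relating the cycle threshold (Ct) to log10 genome copies; comparing PMA-treated and untreated samples then estimates the membrane-intact fraction. A minimal sketch with a hypothetical slope and intercept (a slope near -3.32 corresponds to roughly 100% amplification efficiency):

```python
# Hypothetical standard curve: Ct = INTERCEPT + SLOPE * log10(copies)
SLOPE = -3.32
INTERCEPT = 38.0

def copies_from_ct(ct):
    """Invert the standard curve to estimate genome copies."""
    return 10 ** ((ct - INTERCEPT) / SLOPE)

def intact_fraction(ct_pma, ct_total):
    """Membrane-intact fraction: PMA-treated vs. untreated signal."""
    return copies_from_ct(ct_pma) / copies_from_ct(ct_total)

ct_total, ct_pma = 22.0, 24.5   # hypothetical readings
print(copies_from_ct(ct_total), intact_fraction(ct_pma, ct_total))
```

A PMA Ct shifted a few cycles above the untreated Ct indicates that only a fraction of the detected genomes came from intact, potentially infectious cells.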

  20. Risk-Based Explosive Safety Analysis

    DTIC Science & Technology

    2016-11-30

    safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the... based analysis of scenario 2 would likely determine that the hazard of death or injury to any single person is low due to the separation distance

  1. Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy

    NASA Technical Reports Server (NTRS)

    Edwards, Michelle

    2010-01-01

    Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.

  2. A risk analysis model for radioactive wastes.

    PubMed

    Külahcı, Fatih

    2011-07-15

    Hazardous wastes affect natural environmental systems to a significant extent, and therefore, it is necessary to control their harm through risk analysis. Herein, an effective risk methodology is proposed by considering their uncertain behaviors on stochastic, statistical and probabilistic bases. The basic element is the attachment of a convenient probability distribution function (pdf) to a given waste quality measurement sequence. In this paper, (40)K contaminant measurements are adapted for risk assessment application after derivation of the necessary fundamental formulations. The spatial contaminant distribution of (40)K is presented in the form of maps and three-dimensional surfaces.
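
    The basic element described, attaching a pdf to a measurement sequence and deriving risk from it, can be sketched by fitting a lognormal distribution to activity readings and computing an exceedance probability. The (40)K readings and the threshold below are simulated, hypothetical values:

```python
import math
import random

random.seed(7)

# Hypothetical (40)K activity concentrations (Bq/kg), lognormal-like
readings = [math.exp(random.gauss(6.0, 0.4)) for _ in range(200)]

# Fit a lognormal via sample moments on the log scale
logs = [math.log(v) for v in readings]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / (len(logs) - 1))

def exceedance(threshold):
    """Risk as P(X > threshold) under the fitted lognormal."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

risk = exceedance(800.0)   # hypothetical regulatory threshold
print(f"mu={mu:.3f}, sigma={sigma:.3f}, P(X>800)={risk:.3f}")
```

Mapping this exceedance probability at each sampling location is what produces the kind of spatial risk surfaces the paper presents.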

  3. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2016-11-28

    Recent studies showed that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, in our case study an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks.
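
    The first of the three decision models, expected utility theory, can be sketched for a single household weighing a flood-proofing investment. The CRRA utility form and all numbers are illustrative assumptions, not the article's calibration (and the article's other agents additionally apply prospect-theoretic probability weighting):

```python
def crra_utility(wealth, rho=1.5):
    """Constant relative risk aversion utility, rho != 1."""
    return wealth ** (1 - rho) / (1 - rho)

def expected_utility(wealth, p_flood, damage, invest_cost, damage_reduction):
    """Expected utility of investing vs. not investing in a
    loss-reducing measure; returns (eu_invest, eu_no_invest)."""
    eu_no = (p_flood * crra_utility(wealth - damage)
             + (1 - p_flood) * crra_utility(wealth))
    eu_yes = (p_flood * crra_utility(
                  wealth - invest_cost - damage * (1 - damage_reduction))
              + (1 - p_flood) * crra_utility(wealth - invest_cost))
    return eu_yes, eu_no

eu_yes, eu_no = expected_utility(wealth=100_000, p_flood=0.01,
                                 damage=50_000, invest_cost=400,
                                 damage_reduction=0.8)
print("invest" if eu_yes > eu_no else "do not invest")
```

Running this rule for every household agent each model year, with flood probabilities updated by the environment, is the essence of coupling behavior to flood risk.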

  4. Research on the method of information system risk state estimation based on clustering particle filter

    NASA Astrophysics Data System (ADS)

    Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua

    2017-05-01

    With the purpose of reinforcing correlation analysis of risk assessment threat factors, a dynamic assessment method for safety risks based on particle filtering is proposed, which takes threat analysis as its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining them with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: by clustering all particles and taking each centroid as the representative of its cluster, the computational load is reduced. Empirical results indicate that the method can reasonably capture the mutual dependence and influence among risk elements. Under the circumstance of limited information, it provides a scientific basis for formulating a risk management control strategy.
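
    A minimal numeric sketch of the two ingredients: a bootstrap particle filter tracking a scalar latent "risk level" from noisy threat-indicator readings, followed by a k-means compression of the particle cloud into a few weighted centroids. The dynamics, noise levels, and the compression scheme are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- bootstrap particle filter for a scalar latent risk level ---
true_risk, n = 0.7, 1000
particles = rng.uniform(0.0, 1.0, n)      # initial belief over [0, 1]
weights = np.full(n, 1.0 / n)

for _ in range(30):
    # predict: random-walk system dynamics
    particles = np.clip(particles + rng.normal(0, 0.02, n), 0, 1)
    # update: weight by likelihood of a noisy indicator reading
    obs = true_risk + rng.normal(0, 0.05)
    weights *= np.exp(-0.5 * ((obs - particles) / 0.05) ** 2)
    weights /= weights.sum()
    # resample to avoid weight degeneracy
    idx = rng.choice(n, n, p=weights)
    particles, weights = particles[idx], np.full(n, 1.0 / n)

estimate = float(particles.mean())

# --- k-means compression: few weighted centroids represent the cloud ---
k = 5
centroids = particles[rng.choice(n, k, replace=False)].copy()
for _ in range(20):
    labels = np.argmin(np.abs(particles[:, None] - centroids[None, :]), axis=1)
    for j in range(k):
        if (labels == j).any():
            centroids[j] = particles[labels == j].mean()
cluster_w = np.array([(labels == j).mean() for j in range(k)])
compressed = float(np.sum(centroids * cluster_w))

print(estimate, compressed)
```

Because each centroid is its cluster's mean, the weighted centroid estimate reproduces the full-cloud mean while carrying only k representatives forward.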

  5. Analysis of labour risks in the Spanish industrial aerospace sector.

    PubMed

    Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael

    2016-01-01

    Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were defined and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern, accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis, which can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.

  6. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and [gamma]-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV), as well as high-energy [gamma] rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The [gamma]-ray portion of each spectrum is analyzed by a standard Ge [gamma]-ray analysis program. This method can be applied to any analysis involving x- and [gamma]-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the [gamma]-ray analysis and accommodated during the x-ray analysis.
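
    The linear-least-squares spectral fitting step can be sketched generically: a measured spectrum is modeled as a linear combination of known component response shapes, and the component amplitudes are recovered by least squares. The Gaussian line shapes, channel positions, and amplitudes below are synthetic illustrations, not the patent's L x-ray library:

```python
import numpy as np

def fit_components(references, spectrum):
    """Linear least-squares fit of a measured spectrum as a
    combination of known component response shapes."""
    A = np.column_stack(references)      # (n_channels, n_components)
    amps, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
    return amps

def gaussian_peak(x, center, width):
    return np.exp(-0.5 * ((x - center) / width) ** 2)

# Hypothetical response shapes: two x-ray lines plus flat background
x = np.arange(256, dtype=float)
refs = [gaussian_peak(x, 80, 4), gaussian_peak(x, 150, 5), np.ones_like(x)]

# Synthetic measured spectrum: known amplitudes plus counting noise
rng = np.random.default_rng(3)
true_amps = np.array([120.0, 45.0, 10.0])
measured = sum(a * r for a, r in zip(true_amps, refs))
measured = measured + rng.normal(0, 1.0, x.size)

print(fit_components(refs, measured))   # close to [120, 45, 10]
```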

  7. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J.; Putnam, Marie H.; Killian, E. Wayne; Helmer, Richard G.; Kynaston, Ronnie L.; Goodwin, Scott G.; Johnson, Larry O.

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and .gamma.-ray emitting activation and fission products present. The samples are counted with a large-area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV), as well as high-energy .gamma. rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The .gamma.-ray portion of each spectrum is analyzed by a standard Ge .gamma.-ray analysis program. This method can be applied to any analysis involving x- and .gamma.-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the .gamma.-ray analysis and accommodated during the x-ray analysis.

  8. State of the art in benefit-risk analysis: medicines.

    PubMed

    Luteijn, J M; White, B C; Gunnlaugsdóttir, H; Holm, F; Kalogeras, N; Leino, O; Magnússon, S H; Odekerken, G; Pohjola, M V; Tijhuis, M J; Tuomisto, J T; Ueland, Ø; McCarron, P A; Verhagen, H

    2012-01-01

    Benefit-risk assessment in medicine has been a valuable tool in the regulation of medicines since the 1960s. Benefit-risk assessment takes place in multiple stages during a medicine's life-cycle and can be conducted in a variety of ways, using methods ranging from qualitative to quantitative. Each benefit-risk assessment method is subject to its own specific strengths and limitations. Despite its widespread and long-time use, benefit-risk assessment in medicine is subject to debate and suffers from a number of limitations and is currently still under development. This state of the art review paper will discuss the various aspects and approaches to benefit-risk assessment in medicine in a chronological pathway. The review will discuss all types of benefit-risk assessment a medicinal product will undergo during its lifecycle, from Phase I clinical trials to post-marketing surveillance and health technology assessment for inclusion in public formularies. The benefit-risk profile of a drug is dynamic and differs for different indications and patient groups. In the end of this review we conclude benefit-risk analysis in medicine is a developed practice that is subject to continuous improvement and modernisation. Improvement not only in methodology, but also in cooperation between organizations can improve benefit-risk assessment.

  9. The risk management implications of NUREG--1150 methods and results

    SciTech Connect

    Camp, A.L.; Maloney, K.J.; Sype, T.T.

    1989-09-01

    This report describes the potential uses of NUREG-1150 and similar Probabilistic Risk Assessments (PRAs) in NRC and industry risk management programs. NUREG-1150 uses state-of-the-art PRA techniques to estimate the risk from five nuclear power plants. The methods and results produced in NUREG-1150 provide a framework within which current risk management strategies can be evaluated, and future risk management programs can be developed and assessed. While the development of plant-specific risk management strategies is beyond the scope of this document, examples of the use of the NUREG-1150 framework for identifying and evaluating risk management options are presented. All phases of risk management from prevention of initiating events through reduction of offsite consequences are discussed, with particular attention given to the early phase of accidents. 14 refs., 9 figs., 28 tabs.

  10. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, there are three common types of methods used for development of vulnerability functions of different elements at risk: empirical, analytical and expert estimations. The paper addresses the empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as the statistical data on building behavior during strong earthquakes presented in the different seismic intensity scales, are used to verify the regional parameters of mathematical models in order to simulate physical and economic vulnerability for different building types classified according to seismic scale MMSK-86. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. In order to estimate expected damage states to buildings and constructions in the case of earthquakes according to OSR-97B (return period T=1,000 years) within big cities and towns, they were divided into unit sites and their coordinates were represented as dots located in the centers of unit sites. Then the indexes obtained for each unit site were summed up. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. The hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  11. [New methods for studying drug associated risk: experience of the Toulouse Regional Pharmacovigilance Center].

    PubMed

    Montastruc, Jean-Louis; Bagheri, Haleh; Lacroix, Isabelle; Olivier, Pascale; Durrieu, Genevieve; Damase-Michel, Christine; Lapeyre-Mestre, Maryse

    2005-03-01

    This review examines the methods used to study adverse drug reactions (ADRs) and to quantify drug risk in pharmacovigilance. Besides analysis of spontaneous reports (the basic and universal alert method in pharmacovigilance), epidemiological methods can also be used. Classical methods are based on intensive recording, case-control or cohort studies, and clinical trials. Other more recent methods have been applied to pharmacovigilance. Using recent personal data, the authors present and discuss these new methods. Case/non-case studies quantify drug risk based on a pharmacovigilance database, and are used as signals in pharmacovigilance. Analysis of laboratory data can also be used to evaluate biological ADRs such as elevated liver enzyme and creatine phosphokinase levels. Cross-linkage studies can be used to minimize the consequences of under-notification in pharmacovigilance. Finally, ADR perception analysis is a useful method to evaluate the perceived social and medical importance of drug risks in different populations (public, pharmacists, general practitioners, medical specialists).
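
The case/non-case approach mentioned above is usually implemented as a disproportionality analysis on a pharmacovigilance database. A minimal sketch, with entirely invented report counts, computing the reporting odds ratio (ROR) and its approximate 95% confidence interval from a 2x2 table of reports:

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """Case/non-case disproportionality from a 2x2 report table:
    a = reports with drug AND reaction, b = drug without the reaction,
    c = reaction without the drug,    d = neither.
    Returns (ROR, 95% CI lower bound, 95% CI upper bound)."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(ROR)
    lo = math.exp(math.log(ror) - 1.96 * se)
    hi = math.exp(math.log(ror) + 1.96 * se)
    return ror, lo, hi

# Fabricated counts for illustration only.
ror, lo, hi = reporting_odds_ratio(20, 80, 100, 1800)
print(round(ror, 2))  # 4.5
print(lo > 1.0)       # a CI excluding 1 is read as a signal
```

A lower confidence bound above 1 is conventionally treated as a potential signal warranting clinical review, not as proof of causality.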

  12. Soil and waste analysis for environmental risk assessment in France.

    PubMed

    Sterckeman, T; Gomez, A; Ciesielski, H

    1996-01-19

    In France today, analysis of soil and waste after digestion by strong acids is a technique used for the estimation of environmental risks due to soil pollution and spreading of wastes on cultivated soils. The technique of digestion by strong acid accounts for total or 'near total' content of As, Cd, Cr, Cu, Hg, Ni, Pb, Se and Zn. Risk management based on these methods aims to minimize the risks, since the concentration limits are derived from the geochemical levels. However, this method of analysis gives no idea of the extent to which elements are really transferable or bioavailable. Analytical methods based on partial extraction are used to discern deficiencies in B, Cu, Fe, Mn and Zn in soil. These extractions are carried out using boiling water and EDTA or DTPA solutions. The extraction methods have been standardized for use in agriculture, but have not been tested for assessing the risks due to the pollution by trace elements. One partial extraction method has been standardized for the analysis of wastes. It uses successive water extractions. Researchers have studied different partial extraction methods for estimating the bioavailability of mineral pollutants. Some of them gave results which correlated well with the amounts taken up by plants. However, at present, no general frame of reference has yet been established for the interpretation of results on a broad scale.

  13. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research attempts to present a methodology for dynamic risk assessment and management of rockbursts in D&B tunnels. The basic idea of dynamic risk assessment and management of rockbursts is determined, and methods associated with each step in the rockburst risk assessment and management process are given, respectively. The main parts include a microseismic method for early warning of the occurrence probability of rockburst, an estimation method that aims to assess the potential consequences of rockburst risk, an evaluation method that utilizes a new quantitative index considering both occurrence probability and consequences for determining the level of rockburst risk, and dynamic updating of the assessment. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management of rockburst. Using the proposed method, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real-time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified on cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a length of 11.6 km in total for D&B tunnels).
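
The evaluation step described above combines occurrence probability and consequences into a single quantitative index. A hedged sketch of that general idea (the index form, thresholds, and labels here are illustrative assumptions, not the paper's actual index):

```python
# Toy risk index: probability times normalised consequence, binned
# into qualitative levels. All numbers below are invented.

LEVELS = [(0.05, "low"), (0.2, "moderate"), (0.5, "high"), (1.01, "extreme")]

def rockburst_risk(prob: float, consequence: float) -> tuple:
    """prob in [0,1]; consequence normalised to [0,1]. Index = P * C."""
    index = prob * consequence
    for bound, label in LEVELS:
        if index < bound:
            return index, label
    return index, "extreme"

# Warned occurrence probability 0.6, moderate consequence score 0.5.
idx, level = rockburst_risk(0.6, 0.5)
print(idx, level)  # 0.3 high
```

Because both inputs can be re-estimated as excavation advances, such an index naturally supports the dynamic updating the abstract describes.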

  14. [Statistical prediction methods in violence risk assessment and its application].

    PubMed

    Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song

    2013-06-01

    How to improve violence risk assessment is an urgent global problem. As a necessary part of risk assessment, statistical methods have remarkable impacts and effects. In this study, prediction methods used in violence risk assessment are reviewed from a statistical point of view: Logistic regression as an example of multivariate statistical models, the decision tree model as an example of data mining techniques, and neural network models as an example of artificial intelligence technology. This study provides data intended to contribute to further research on violence risk assessment.
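
Logistic regression, cited above as the multivariate example, can be sketched in a few lines. This toy gradient-descent implementation and its fabricated feature rows are purely illustrative and not a validated risk instrument:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression."""
    w = [0.0] * (len(X[0]) + 1)            # w[0] is the intercept
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi          # gradient of log-loss
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Entirely fabricated rows: [prior_incidents, clinician_score].
X = [[0, 0.1], [0, 0.3], [1, 0.5], [2, 0.8], [3, 0.9], [0, 0.2]]
y = [0, 0, 1, 1, 1, 0]
w = fit_logistic(X, y)
print(predict(w, [3, 0.9]) > 0.5, predict(w, [0, 0.1]) < 0.5)  # True True
```

In real studies the model is fitted on large cohorts and evaluated with discrimination and calibration measures, not on six hand-made rows.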

  15. The dissection of risk: a conceptual analysis.

    PubMed

    O'Byrne, Patrick

    2008-03-01

    Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has been largely assumed to be inherently neutral - meaning that its definition and discovery are seen as objective and impartial, and that risk avoidance is natural and logical. Such an oversight in evaluation requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly in relation to those practices surrounding bio-political nursing care, such as public health, as well as other more trendy nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk', and expose it as one mechanism of maintaining prescribed/proscribed social practices. An analysis of risk thereby expands the definitions and roles of the discipline and profession of nursing from being solely dedicated to patient care to include, in addition, its functions as a governmental body that unwittingly maintains hegemonic infrastructures.

  16. Multivariate analysis methods for spectroscopic blood analysis

    NASA Astrophysics Data System (ADS)

    Wood, Michael F. G.; Rohani, Arash; Ghazalah, Rashid; Vitkin, I. Alex; Pawluczyk, Romuald

    2012-01-01

    Blood tests are an essential tool in clinical medicine with the ability to diagnose or monitor various diseases and conditions; however, the complexities of these measurements currently restrict them to a laboratory setting. P&P Optica has developed and currently produces patented high performance spectrometers and is developing a spectrometer-based system for rapid reagent-free blood analysis. An important aspect of this analysis is the need to extract the analyte-specific information from the measured signal such that the analyte concentrations can be determined. To this end, advanced chemometric methods are currently being investigated and have been tested using simulated spectra. A blood plasma model was used to generate Raman, near infrared, and optical rotatory dispersion spectra with glucose as the target analyte. The potential of combined chemometric techniques, where multiple spectroscopy modalities are used in a single regression model to improve the prediction ability, was investigated using unfold partial least squares and multiblock partial least squares. Results show improvement in the predictions of glucose levels using the combined methods and demonstrate potential for multiblock chemometrics in spectroscopic blood analysis.

  17. Method for Analyzing District Level IAI Data Bases to Identify Learning Opportunity Risks.

    ERIC Educational Resources Information Center

    Milazzo, Patricia; And Others

    A learning opportunity risk is defined as an absence of instruction or insufficient attention to proficiency at an early grade of instruction in a subject matter which will generate serious learning problems in later grades. A method for identifying such risks has been derived from analysis of district-level Instructional Accomplishment…

  18. Flow methods in chiral analysis.

    PubMed

    Trojanowicz, Marek; Kaniewska, Marzena

    2013-11-01

    The methods used for the separation and analytical determination of individual isomers are based on interactions with substances exhibiting optical activity. The currently used methods for the analysis of optically active compounds are primarily high-performance separation methods, such as gas and liquid chromatography using chiral stationary phases or chiral selectors in the mobile phase, and highly efficient electromigration techniques, such as capillary electrophoresis using chiral selectors. Chemical sensors and biosensors may also be designed for the analysis of optically active compounds. As enantiomers of the same compound are characterised by almost identical physico-chemical properties, their differentiation/separation in one-step unit operation in steady-state or dynamic flow systems requires the use of highly effective chiral selectors. Examples of such determinations are reviewed in this paper, based on 105 references. The greatest successes for isomer determination involve immunochemical interactions, enantioselectivity of the enzymatic biocatalytic processes, and interactions with ion-channel receptors or molecularly imprinted polymers. Conducting such processes under dynamic flow conditions may significantly enhance the differences in the kinetics of such processes, leading to greater differences in the signals recorded for enantiomers. Such determinations in flow conditions are effectively performed using surface-plasmon resonance and piezoelectric detections, as well as using common spectroscopic and electrochemical detections.

  19. Reliability/Risk Methods and Design Tools for Application in Space Programs

    NASA Technical Reports Server (NTRS)

    Townsend, John S.; Smart, Christian

    1999-01-01

    Since 1984 NASA has funded several major programs to develop reliability/risk methods and tools for engineers to apply in the design and assessment of aerospace hardware. Two probabilistic software tools that show great promise for practical application are the finite element code NESSUS and the system risk analysis code QRAS. This paper examines NASA's past, present, and future directions in reliability and risk engineering applications. Both the NESSUS and QRAS software tools are detailed.

  20. An approximate method for determining of investment risk

    NASA Astrophysics Data System (ADS)

    Slavkova, Maria; Tzenova, Zlatina

    2016-12-01

    In this work, a method for determining investment risk across all economic states is considered. It is connected to matrix games with two players. A definition of risk in a matrix game is introduced and three properties are proven. An appropriate example is considered.
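
The abstract does not spell out its definition of risk, but a common way to define the risk of an investment strategy in a matrix game against economic states is Savage regret: the payoff shortfall versus the best strategy in each state. A sketch under that assumption, with invented payoffs:

```python
# Savage-regret sketch. Rows: investment strategies; columns: economic
# states (boom, flat, recession). All payoffs are invented.

def regret_matrix(payoff):
    """regret[i][j] = (best payoff in state j) - payoff[i][j]."""
    best = [max(row[j] for row in payoff) for j in range(len(payoff[0]))]
    return [[best[j] - row[j] for j in range(len(row))] for row in payoff]

def minimax_regret_choice(payoff):
    """Pick the strategy whose worst-case regret is smallest."""
    regrets = regret_matrix(payoff)
    worst = [max(row) for row in regrets]
    i = min(range(len(worst)), key=worst.__getitem__)
    return i, worst[i]

payoff = [[12, 4, -6],
          [ 8, 5,  0],
          [ 3, 3,  2]]
print(minimax_regret_choice(payoff))  # (1, 4)
```

Here the middle strategy wins: it is never the best, but its maximum regret across states is the smallest.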

  1. Voltametric analysis apparatus and method

    SciTech Connect

    Almon, A.C.

    1991-12-31

    An apparatus and method are disclosed for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced evenly apart from each other and from the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  2. Voltametric analysis apparatus and method

    DOEpatents

    Almon, Amy C.

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  3. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  4. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    SciTech Connect

    Lowry, Peter P.; Wagner, Katie A.

    2015-04-01

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  5. Compounding conservatisms: EPA's health risk assessment methods

    SciTech Connect

    Stackelberg, K. von; Burmaster, D.E.

    1993-03-01

    Superfund conjures up images of hazardous waste sites, which EPA is spending billions of dollars to remediate. One of the law's most worrisome effects is that it drains enormous economic resources without returning commensurate benefits. In a Sept. 1, 1991, front page article in The New York Times, experts argued that most health dangers at Superfund sites could be eliminated for a fraction of the billions that will be spent cleaning up the 1,200 high-priority sites across the country. Even EPA has suggested that the Superfund program may receive disproportionate resources, compared with other public health programs, such as radon in houses, the diminishing ozone layer and occupational diseases. Public opinion polls over the last decade consistently have mirrored the public's vast fear of hazardous waste sites, a fear as great as that held for nuclear power plants. Fear notwithstanding, the high cost of chosen remedies at given sites may have less to do with public health goals than with the method EPA uses to translate them into acceptable contaminant concentrations in soil, groundwater and other environmental media.

  6. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models, ranging from global asset exposure to global flood hazard models, were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL).
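
The multiple regression step described above can be sketched with ordinary least squares on event-level predictors, solving the normal equations directly. The predictor columns and numbers below are fabricated; the coefficients come out exactly because the toy target is constructed to be linear in the predictors:

```python
# OLS via the normal equations X'X b = X'y, pure Python.

def solve(A, rhs):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [rhs[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[c][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def ols(X, y):
    X = [[1.0] + row for row in X]          # prepend intercept column
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Fabricated columns: hazard intensity, log exposure; target built as
# y = 0.5 + 0.2 * intensity + 0.4 * log_exposure.
X = [[5.0, 2.0], [6.0, 3.0], [7.0, 2.5], [8.0, 4.0], [6.5, 3.5]]
y = [2.3, 2.9, 2.9, 3.7, 3.2]
b = ols(X, y)
print([round(v, 6) for v in b])  # [0.5, 0.2, 0.4]
```

Real analyses of course add many more drivers (poverty, governance, remoteness) and work with thousands of events rather than five.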

  7. Multi-SNP Haplotype Analysis Methods for Association Analysis.

    PubMed

    Stram, Daniel O

    2017-01-01

    Haplotype analysis forms the basis of much of genetic association analysis using both related and unrelated individuals (we concentrate on unrelated). For example, haplotype analysis indirectly underlies the SNP imputation methods that are used for testing trait associations with known but unmeasured variants and for performing collaborative post-GWAS meta-analysis. This chapter is focused on the direct use of haplotypes in association testing. It reviews the rationale for haplotype-based association testing, discusses statistical issues related to haplotype uncertainty that affect the analysis, then gives practical guidance for testing haplotype-based associations with phenotype or outcome trait, first of candidate gene regions and then for the genome as a whole. Haplotypes are interesting for two reasons: first, they may be in closer LD with a causal variant than any single measured SNP, and therefore may enhance the coverage value of the genotypes over single SNP analysis. Second, haplotypes may themselves be the causal variants of interest, and some solid examples of this have appeared in the literature. This chapter discusses three possible approaches to incorporation of SNP haplotype analysis into generalized linear regression models: (1) a simple substitution method involving imputed haplotypes, (2) simultaneous maximum likelihood (ML) estimation of all parameters, including haplotype frequencies and regression parameters, and (3) a simplified approximation to full ML for case-control data. Examples of the various approaches for a haplotype analysis of a candidate gene are provided. We compare the behavior of the approximation-based methods and argue that in most instances the simpler methods hold up well in practice. We also describe the practical implementation of haplotype risk estimation genome-wide and discuss several shortcuts that can be used to speed up otherwise potentially very intensive computational requirements.
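
The maximum-likelihood treatment of haplotype uncertainty mentioned above rests on estimating haplotype frequencies from unphased genotypes, classically via an EM algorithm. A minimal two-SNP sketch on toy genotype data (not taken from the chapter):

```python
from itertools import product

# The four possible two-SNP haplotypes, coded by minor-allele presence.
HAPS = [(0, 0), (0, 1), (1, 0), (1, 1)]

def compatible_pairs(geno):
    """All ordered haplotype pairs whose allele counts sum to the genotype."""
    return [(h1, h2) for h1, h2 in product(HAPS, repeat=2)
            if all(h1[k] + h2[k] == geno[k] for k in range(2))]

def em_haplotype_freqs(genotypes, iters=100):
    f = {h: 0.25 for h in HAPS}                  # start uniform
    for _ in range(iters):
        counts = {h: 0.0 for h in HAPS}
        for g in genotypes:                      # E-step: split each
            pairs = compatible_pairs(g)          # genotype over its
            probs = [f[a] * f[b] for a, b in pairs]   # compatible pairs
            tot = sum(probs)
            for (a, b), p in zip(pairs, probs):
                counts[a] += p / tot
                counts[b] += p / tot
        n = sum(counts.values())                 # M-step: renormalise
        f = {h: c / n for h, c in counts.items()}
    return f

# Toy subjects: (minor-allele count at SNP1, at SNP2). Only (1, 1),
# the double heterozygote, is phase-ambiguous.
genos = [(0, 0), (1, 1), (1, 1), (2, 2), (1, 0), (0, 0)]
f = em_haplotype_freqs(genos)
print(max(f, key=f.get))  # (0, 0)
```

The resulting frequencies (or per-subject expected haplotype dosages) are what the substitution and ML regression approaches then build on.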

  8. Advanced Risk Analysis for High-Performing Organizations

    DTIC Science & Technology

    2006-01-01

    The operational environment for many types of organizations is changing. Changes in operational environments are driving the need for advanced risk analysis techniques. Many types of risk prevalent in today's operational environments (e.g., event risks, inherited risk) are not readily identified using traditional risk analysis techniques. Mission Assurance Analysis Protocol (MAAP) is one technique that high performers can use to identify and mitigate the risks arising from operational complexity.

  9. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the available basic data for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
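
The individual-risk outcome of the quantitative method is conventionally computed as the sum, over accident scenarios such as those listed above, of scenario frequency times conditional lethality at a given location. A sketch with invented frequencies and probabilities:

```python
# Individual risk at one location near a pipeline. All numbers invented.

scenarios = [
    # (name, annual frequency, P(fatality | scenario) at this location)
    ("toxic gas diffusion", 2e-5, 0.1),
    ("jet flame",           1e-5, 0.3),
    ("fireball",            5e-6, 0.8),
    ("UVCE",                1e-6, 0.9),
]

individual_risk = sum(freq * p for _, freq, p in scenarios)
print(f"{individual_risk:.2e} /year")  # 9.90e-06 /year
```

Repeating this sum over a grid of locations yields individual-risk contours; weighting by population instead gives the societal (F-N) risk curve.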

  10. Risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions.

    PubMed

    Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song

    2017-06-03

    Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and it showed that this region had a relatively high risk for groundwater contamination from pesticides, and that the pesticide application method was the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible, valid, and able to provide reference data related to the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;00:000-000. © 2017 SETAC.
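
The multiplicative total-risk characterization described above can be sketched as follows. Here a geometric mean is used so the product stays on the per-indicator scale; the indicator scores, the 1..4 discretisation, and the class boundaries are invented, not the paper's K-means calibration:

```python
import math

def region_risk(indicator_values):
    """Geometric-mean risk in [1, 4] from per-indicator scores 1..4
    (the paper's multiplication, rescaled for readability)."""
    product = math.prod(indicator_values)
    return product ** (1.0 / len(indicator_values))

def risk_class(score):
    """Bin the combined score into the 4 classes, low to high."""
    for bound, label in [(1.75, "low"), (2.5, "medium"),
                         (3.25, "high"), (4.01, "very high")]:
        if score < bound:
            return label

# Invented scores for 7 of the 17 indicators of a hypothetical region.
region = [3, 4, 2, 3, 4, 3, 2]
print(risk_class(region_risk(region)))  # high
```

Because the combination is a product, any single indicator scored near the bottom of its scale pulls the whole region down, which is what lets the method expose the dominant risk factor.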

  11. Risk analysis for confined space entries: Critical analysis of four tools applied to three risk scenarios.

    PubMed

    Burlet-Vienney, Damien; Chinniah, Yuvin; Bahloul, Ali; Roberge, Brigitte

    2016-01-01

    Investigation reports of fatal confined space accidents nearly always point to a problem of identifying or underestimating risks. This paper compares 4 different risk analysis tools developed for confined spaces by applying them to 3 hazardous scenarios. The tools were namely 1. a checklist without risk estimation (Tool A), 2. a checklist with a risk scale (Tool B), 3. a risk calculation without a formal hazard identification stage (Tool C), and 4. a questionnaire followed by a risk matrix (Tool D). Each tool's structure and practical application were studied. Tools A and B gave crude results comparable to those of more analytic tools in less time. Their main limitations were lack of contextual information for the identified hazards and greater dependency on the user's expertise and ability to tackle hazards of different nature. Tools C and D utilized more systematic approaches than tools A and B by supporting risk reduction based on the description of the risk factors. Tool D is distinctive because of 1. its comprehensive structure with respect to the steps suggested in risk management, 2. its dynamic approach to hazard identification, and 3. its use of data resulting from the risk analysis.
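
Tool D's final stage, a risk matrix, can be sketched as a likelihood-by-severity lookup. The scales, cut-offs, and labels below are invented for illustration and are not taken from any of the four tools studied:

```python
# Toy risk matrix: likelihood and severity each rated 1..4, combined
# multiplicatively and binned into action levels.

def risk_level(likelihood: int, severity: int) -> str:
    score = likelihood * severity
    if score >= 12:
        return "intolerable"
    if score >= 6:
        return "undesirable"
    if score >= 3:
        return "tolerable with controls"
    return "acceptable"

# Hypothetical confined-space hazard: H2S accumulation judged
# likely (3) with potentially fatal outcome (4).
print(risk_level(3, 4))  # intolerable
```

The comparison in the paper turns on what surrounds such a matrix: whether hazards are identified systematically beforehand and whether the risk factors behind each cell are documented well enough to drive risk reduction.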

  12. Risk Analysis of the Supply-Handling Conveyor System.

    DTIC Science & Technology

    The report documents the risk analysis that was performed on a supply-handling conveyor system. The risk analysis was done to quantify the risks involved for project development, in addition to compliance with the draft AMC regulation on risk analysis. The conveyor system is in the final phase of

  13. A generic computerized method for estimate of familial risks.

    PubMed Central

    Colombet, Isabelle; Xu, Yigang; Jaulent, Marie-Christine; Desages, Daniel; Degoulet, Patrice; Chatellier, Gilles

    2002-01-01

    Most guidelines developed for cancer screening and for cardiovascular risk management use rules to estimate familial risk. These rules are complex, difficult to memorize, and require collecting a complete pedigree. This paper describes a generic computerized method for estimating familial risks and its implementation in an internet-based application. The program is based on 3 generic models: a model of the family, a model of familial risk, and a display model for the pedigree. The family model represents each member of the family and supports constructing and displaying a family tree. The familial risk model is generic and allows easy updating of the program with new diseases or new rules. It was possible to implement guidelines dealing with breast and colorectal cancer and cardiovascular disease prevention. A first evaluation with general practitioners showed that the program was usable. Its impact on the quality of familial risk estimates should be further documented. PMID:12463810
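
The rule-based familial risk estimate described above can be sketched as a pedigree model plus declarative rules evaluated against it. The thresholds and categories below are invented examples, not actual guideline criteria:

```python
from dataclasses import dataclass

@dataclass
class Relative:
    relation: str          # "mother", "sister", "aunt", ...
    condition: str         # diagnosis, e.g. "breast cancer"
    age_at_onset: int

FIRST_DEGREE = {"mother", "father", "sister", "brother", "daughter", "son"}

def familial_risk(pedigree, condition, early_age):
    """Toy generic rule: early-onset or multiple affected first-degree
    relatives -> 'high'; one affected -> 'moderate'; none -> 'average'."""
    hits = [r for r in pedigree
            if r.condition == condition and r.relation in FIRST_DEGREE]
    if any(r.age_at_onset < early_age for r in hits) or len(hits) >= 2:
        return "high"
    if hits:
        return "moderate"
    return "average"

family = [Relative("mother", "breast cancer", 42),
          Relative("aunt", "breast cancer", 67)]
print(familial_risk(family, "breast cancer", early_age=50))  # high
```

Keeping the rules as data separate from the pedigree model is what makes the approach generic: new diseases or updated guideline thresholds change the rule table, not the program.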

  14. Analysis of interactions among barriers in project risk management

    NASA Astrophysics Data System (ADS)

    Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita

    2017-06-01

    In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects have virtually no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the manifestation of barriers to project risk management is a topic less often addressed. The success of project management is oftentimes based on the understanding of barriers to effective risk management, application of appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitude, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of addressing cultural differences are the high priority barriers, among many others.
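
The MICMAC step used above classifies barriers by driving power (row sums of the transitively closed reachability matrix) and dependence (column sums). A sketch with an invented four-barrier adjacency, not the paper's actual data:

```python
# MICMAC sketch: close the influence relation transitively (Warshall),
# then read off driving power and dependence.

def transitive_closure(m):
    n = len(m)
    r = [row[:] for row in m]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

barriers = ["lack of top mgmt support", "lack of formal training",
            "cultural differences", "worker attitude"]
# adj[i][j] = 1 if barrier i influences barrier j (self-loops included).
adj = [[1, 1, 0, 1],
       [0, 1, 0, 1],
       [0, 0, 1, 1],
       [0, 0, 0, 1]]

r = transitive_closure(adj)
driving = [sum(row) for row in r]
dependence = [sum(r[i][j] for i in range(len(r))) for j in range(len(r))]
top = barriers[max(range(len(driving)), key=driving.__getitem__)]
print(top, driving, dependence)  # lack of top mgmt support [3, 2, 2, 1] [1, 2, 1, 4]
```

High driving power with low dependence marks a root-cause barrier (prioritised for action); the mirror pattern marks a symptom that other barriers produce.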

  15. Integrated Reliability and Risk Analysis System (IRRAS)

    SciTech Connect

    Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T.; Rasmuson, D. M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool for addressing key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences on a microcomputer. The program provides functions ranging from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February 1987. Since then, many user comments and enhancements have been incorporated, producing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 provides the same capabilities as Version 1.0 and adds a relational database facility for managing the data, improved functionality, and improved algorithm performance.
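
    As a rough illustration of the cut set generation that tools like IRRAS perform, here is a minimal fault-tree-to-minimal-cut-sets sketch. The gate structure and event names are hypothetical, and the algorithm is a textbook bottom-up expansion, not IRRAS's actual implementation.

```python
from itertools import product

def cut_sets(gate):
    """Cut sets of a fault tree given as nested ('AND'|'OR', children)
    tuples; leaves are basic-event names."""
    if isinstance(gate, str):
        return [frozenset([gate])]
    op, children = gate
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':
        # any child's cut set fails the gate
        return [s for sets in child_sets for s in sets]
    # AND: union one cut set from each child (cross product)
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Drop any cut set that strictly contains another (minimality)."""
    return sorted({s for s in sets if not any(t < s for t in sets)},
                  key=sorted)

# Hypothetical top event: loss of cooling
tree = ('OR', ['PUMP_FAILS',
               ('AND', ['VALVE_A_STUCK', 'VALVE_B_STUCK'])])
print(minimal(cut_sets(tree)))
```

    The single-event cut set identifies the dominant vulnerability; quantification would then combine basic-event probabilities over these sets.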

  16. A Handbook of Cost Risk Analysis Methods

    DTIC Science & Technology

    1993-04-01

    considered products MIA publishes. They normally embody results of major projects which (a) have a direct bearing on decisions affecting major programs...April 1981. Budescu, David V., and Thomas S. Wallsten. "Encoding Subjective Probabilities: A Psychological and Psychometric Review." Management

  17. Risk analysis for environmental health triage.

    PubMed

    Bogen, Kenneth T

    2005-10-01

    The Homeland Security Act mandates the development of a national, risk-based system to support planning for, response to, and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on the scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically-related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  18. Risk Analysis for Environmental Health Triage

    SciTech Connect

    Bogen, K T

    2005-11-18

    The Homeland Security Act mandates development of a national, risk-based system to support planning for, response to and recovery from emergency situations involving large-scale toxic exposures. To prepare for and manage consequences effectively, planners and responders need not only to identify zones of potentially elevated individual risk, but also to predict expected casualties. Emergency response support systems now define "consequences" by mapping areas in which toxic chemical concentrations do or may exceed Acute Exposure Guideline Levels (AEGLs) or similar guidelines. However, because AEGLs do not estimate expected risks, current unqualified claims that such maps support consequence management are misleading. Intentionally protective, AEGLs incorporate various safety/uncertainty factors depending on scope and quality of chemical-specific toxicity data. Some of these factors are irrelevant, and others need to be modified, whenever resource constraints or exposure-scenario complexities require responders to make critical trade-off (triage) decisions in order to minimize expected casualties. AEGL-exceedance zones cannot consistently be aggregated, compared, or used to calculate expected casualties, and so may seriously misguide emergency response triage decisions. Methods and tools well established and readily available to support environmental health protection are not yet developed for chemically related environmental health triage. Effective triage decisions involving chemical risks require a new assessment approach that focuses on best estimates of likely casualties, rather than on upper plausible bounds of individual risk. If risk-based consequence management is to become a reality, federal agencies tasked with supporting emergency response must actively coordinate to foster new methods that can support effective environmental health triage.

  19. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  20. Risk and value analysis of SETI.

    PubMed

    Billingham, J

    1990-01-01

    This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.

  2. An integrated risk analysis methodology in a multidisciplinary design environment

    NASA Astrophysics Data System (ADS)

    Hampton, Katrina Renee

    Design of complex, one-of-a-kind systems, such as space transportation systems, is characterized by high uncertainty and, consequently, high risk. These uncertainties must be accounted for in the design process to produce more reliable systems. Systems designed with uncertainties included and managed are more robust and less prone to poor operation as a result of parameter variability. The quantification, analysis, and mitigation of uncertainties are challenging tasks because many such systems lack historical data. In this environment, risk or uncertainty quantification becomes subjective because the input data are based on professional judgment. Additionally, there are uncertainties associated with the analysis tools and models. Both the input-data and model uncertainties must be considered in a multidisciplinary, systems-level risk analysis. This research synthesizes an integrated approach to developing a method for risk analysis. Expert-judgment methodology is employed to quantify external risk. This methodology is then combined with Latin Hypercube Sampling - Monte Carlo simulation to propagate uncertainties across a multidisciplinary environment for the overall system. Finally, a robust design strategy is employed to mitigate risk during the optimization process. This approach to risk analysis is conducive to the examination of quantitative risk factors. The core of this research methodology is the theoretical framework for uncertainty propagation. The research is divided into three stages or modules. The first two modules cover the identification/quantification and propagation of uncertainties. The third module involves the management of uncertainties, or response optimization, and also incorporates the integration of risk into program decision-making. The risk analysis methodology is applied to a launch vehicle conceptual design study at NASA Langley Research Center. The launch vehicle multidisciplinary
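
    The Latin Hypercube Sampling step of such an uncertainty-propagation framework can be sketched as follows. The two-parameter toy model is an invented stand-in for illustration, not the study's launch vehicle model.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
    """Stratified sampling on the unit hypercube: one sample per
    equal-probability stratum in each dimension, with strata randomly
    permuted across dimensions."""
    columns = []
    for _ in range(n_dims):
        # one point inside each of the n_samples strata of [0, 1)
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        columns.append(strata)
    return [[columns[d][i] for d in range(n_dims)] for i in range(n_samples)]

def propagate(model, n_samples, n_dims):
    """Push the LHS samples through a model to obtain an output distribution."""
    return [model(x) for x in latin_hypercube(n_samples, n_dims)]

# Toy 'system model': product of two uncertain unit-interval parameters
outputs = propagate(lambda x: x[0] * x[1], 1000, 2)
print(min(outputs), max(outputs))
```

    Compared with plain Monte Carlo, the stratification guarantees the full range of each input is exercised even with modest sample counts, which is why LHS is popular for expensive multidisciplinary analyses.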

  3. Sensitivity analysis of a two-dimensional probabilistic risk assessment model using analysis of variance.

    PubMed

    Mokhtari, Amirhossein; Frey, H Christopher

    2005-12-01

    This article demonstrates application of sensitivity analysis to risk assessment models with two-dimensional probabilistic frameworks that distinguish between variability and uncertainty. A microbial food safety process risk (MFSPR) model is used as a test bed. The process of identifying key controllable inputs and key sources of uncertainty using sensitivity analysis is challenged by typical characteristics of MFSPR models such as nonlinearity, thresholds, interactions, and categorical inputs. Among many available sensitivity analysis methods, analysis of variance (ANOVA) is evaluated in comparison to commonly used methods based on correlation coefficients. In a two-dimensional risk model, the identification of key controllable inputs that can be prioritized with respect to risk management is confounded by uncertainty. However, as shown here, ANOVA provided robust insights regarding controllable inputs most likely to lead to effective risk reduction despite uncertainty. ANOVA appropriately selected the top six important inputs, while correlation-based methods provided misleading insights. Bootstrap simulation is used to quantify uncertainty in ranks of inputs due to sampling error. For the selected sample size, differences in F values of 60% or more were associated with clear differences in rank order between inputs. Sensitivity analysis results identified inputs related to the storage of ground beef servings at home as the most important. Risk management recommendations are suggested in the form of a consumer advisory for better handling and storage practices.
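
    The one-way ANOVA F statistic used to rank inputs can be computed directly. In the sketch below, the two binned "inputs" and their risk values are invented for illustration and are not the MFSPR model's data.

```python
def anova_f(groups):
    """One-way ANOVA F statistic: ratio of between-group to within-group
    mean squares, for an output binned by levels of one input."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Rank two hypothetical inputs by F: storage temperature matters, pH does not
risk_by_temp = [[0.10, 0.20, 0.15], [0.80, 0.90, 0.85]]  # low / high temperature bins
risk_by_ph = [[0.40, 0.50, 0.45], [0.45, 0.50, 0.40]]    # low / high pH bins
print(anova_f(risk_by_temp) > anova_f(risk_by_ph))  # True
```

    Larger F values flag inputs whose bins separate the output most strongly; as the abstract notes, the uncertainty in such rankings can then be quantified by bootstrapping.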

  4. Fire Risk Analysis for Armenian NPP Confinement

    SciTech Connect

    Poghosyan, Shahen; Malkhasyan, Albert; Bznuni, Surik; Amirjanyan, Armen

    2006-07-01

    A major fire that occurred at the Armenian NPP (ANPP) in October 1982 showed that fire-induced initiating events (IE) can make a dominant contribution to the overall risk of core damage. A Probabilistic Safety Assessment study of fire-induced initiating events for ANPP was initiated in 2002. The analysis covered fires in compartments where fire could result in the failure of components necessary for reactor cold shutdown. The analysis shows that the main fire risk at ANPP arises from fires in cable tunnels 61-64, whereas fires in confinement compartments do not contribute significantly to the overall risk of core damage. The exception is a fire in the so-called 'confinement valves compartment' (room no. A-013/2), contributing more than 7.5% of CDF, where a fire could result in a loss-of-coolant accident with unavailability of the primary makeup system, leading directly to core damage. A detailed analysis of this problem, which is common to typical WWER-440/230 reactors with non-hermetic MCPs, and recommendations for its solution are presented in this paper. (authors)

  5. Flow analysis system and method

    NASA Technical Reports Server (NTRS)

    Hill, Wayne S. (Inventor); Barck, Bruce N. (Inventor)

    1998-01-01

    A non-invasive flow analysis system and method wherein a sensor, such as an acoustic sensor, is coupled to a conduit for transmitting a signal which varies depending on the characteristics of the flow in the conduit. The signal is amplified and there is a filter, responsive to the sensor signal, and tuned to pass a narrow band of frequencies proximate the resonant frequency of the sensor. A demodulator generates an amplitude envelope of the filtered signal and a number of flow indicator quantities are calculated based on variations in amplitude of the amplitude envelope. A neural network, or its equivalent, is then used to determine the flow rate of the flow in the conduit based on the flow indicator quantities.
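
    The rectify-and-smooth envelope demodulation described above can be sketched as follows. This is a minimal stand-in for the patent's demodulator; the tone frequency, sample rate, and window length are all illustrative assumptions.

```python
import math

def amplitude_envelope(signal, window):
    """Approximate the amplitude envelope of a band-passed sensor signal:
    full-wave rectify, then smooth with a centered moving average."""
    rectified = [abs(x) for x in signal]
    half = window // 2
    env = []
    for i in range(len(rectified)):
        lo, hi = max(0, i - half), min(len(rectified), i + half + 1)
        env.append(sum(rectified[lo:hi]) / (hi - lo))
    return env

# Toy sensor signal: a 100 Hz tone at 10 kHz sampling whose amplitude
# ramps up, as if flow-induced excitation were increasing
sig = [(0.5 + 0.5 * t / 1000) * math.sin(2 * math.pi * 100 * t / 10000)
       for t in range(1000)]
env = amplitude_envelope(sig, 101)  # window ~ one tone period
print(env[100] < env[900])  # the envelope tracks the growing amplitude
```

    Flow indicator quantities (means, variances, modulation depths of this envelope) would then feed the classifier stage, for which the patent uses a neural network.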

  6. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents was prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents, organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  7. Preliminary Technical Risk Analysis for the Geothermal Technologies Program

    SciTech Connect

    McVeigh, J.; Cohen, J.; Vorum, M.; Porro, G.; Nix, G.

    2007-03-01

    This report explains the goals, methods, and results of a probabilistic analysis of technical risk for a portfolio of R&D projects in the DOE Geothermal Technologies Program ('the Program'). The analysis is a task by Princeton Energy Resources International, LLC (PERI), in support of the National Renewable Energy Laboratory (NREL) on behalf of the Program. The main challenge in the analysis lies in translating R&D results to a quantitative reflection of technical risk for a key Program metric: levelized cost of energy (LCOE). This requires both computational development (i.e., creating a spreadsheet-based analysis tool) and a synthesis of judgments by a panel of researchers and experts of the expected results of the Program's R&D.

  8. Hybrid methods for rotordynamic analysis

    NASA Technical Reports Server (NTRS)

    Noah, Sherif T.

    1986-01-01

    Effective procedures are presented for the response analysis of the Space Shuttle Main Engine turbopumps under transient loading conditions. Of particular concern is the determination of the nonlinear response of the systems to rotor imbalance in the presence of bearing clearances. The proposed procedures take advantage of the fact that the nonlinearities involved are localized at only a few rotor/housing coupling joints. The methods include those based on integral formulations for the incremental solutions involving the transition matrices of the rotor and housing. Alternatively, a convolutional representation of the housing displacements at the coupling points is proposed, which would allow performing the transient analysis on a reduced model of the housing. The integral approach is applied to small dynamical models to demonstrate the efficiency of the approach. For purposes of assessing the numerical integration results for the nonlinear rotor/housing systems, a numerical harmonic balance procedure is developed to enable determining all possible harmonic, subharmonic, and nonperiodic solutions of the systems. A brief account of the Fourier approach is presented as applied to a two-degree-of-freedom rotor-support system.

  9. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417... OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A flight safety analysis must demonstrate that the risk to the public potentially exposed to inert and...

  10. Analysis Methods of Magnesium Chips

    NASA Astrophysics Data System (ADS)

    Ohmann, Sven; Ditze, André; Scharf, Christiane

    2015-11-01

    The quality of magnesium recycled from chips depends strongly on the chips' exposure to inorganic and organic impurities added during the production processes. Different kinds of magnesium chips from these processes were analyzed by several methods, whose accuracy and effectiveness are also discussed. The results show that the chips belong to the AZ91, AZ31, AM50/60, or AJ62 alloys, although some kinds of chips deviate from these specifications. Impurities result mainly from transition metals and lime. The water and oil content does not exceed 25%, and the chip size is no more than 4 mm in diameter. Sieve analysis gives good results for oily and wet chips. For determining oil and water content, a Soxhlet apparatus gives better results than the addition of lime or vacuum distillation. The most accurate values for water and oil are obtained by drying at 110°C (for water) and washing with acetone by hand (for oil).

  11. Reliability and cost analysis methods

    NASA Technical Reports Server (NTRS)

    Suich, Ronald C.

    1991-01-01

    In the design phase of a system, how does a design engineer or manager choose between a subsystem with .990 reliability and a more costly subsystem with .995 reliability? When is the increased cost justified? High reliability is not necessarily an end in itself but may be desirable in order to reduce the expected cost due to subsystem failure. However, this may not be the wisest use of funds, since the expected cost due to subsystem failure is not the only cost involved; the subsystem itself may be very costly. Neither the cost of the subsystem nor the expected cost due to subsystem failure should be considered separately; rather, their total should be minimized: the cost of the subsystem plus the expected cost due to subsystem failure. This final report discusses the Combined Analysis of Reliability, Redundancy, and Cost (CARRAC) methods, which were developed under Grant Number NAG 3-1100 from the NASA Lewis Research Center. The CARRAC methods and computer program employ five models that cover a wide range of problems. The models include an option for repair of failed modules.
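
    The trade-off described above reduces to minimizing subsystem cost plus expected failure cost. A minimal sketch with invented numbers (not from the CARRAC models):

```python
def total_cost(subsystem_cost, reliability, failure_cost):
    """Cost of the subsystem plus the expected cost due to its failure."""
    return subsystem_cost + (1.0 - reliability) * failure_cost

# Invented numbers: a $5M consequence of failure, and a $30k premium
# for raising reliability from .990 to .995
cheap = total_cost(100_000, 0.990, 5_000_000)   # 100k + 0.010 * 5M = 150k
better = total_cost(130_000, 0.995, 5_000_000)  # 130k + 0.005 * 5M = 155k
print('cheap' if cheap < better else 'better')  # here the premium is not justified
```

    With these numbers, halving the failure probability saves only $25k in expected failure cost but adds $30k in subsystem cost, so the cheaper subsystem minimizes the total.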

  12. Methods for diagnosing the risk factors of stone formation

    PubMed Central

    Robertson, William G.

    2012-01-01

    Objective To compare various systems for assessing the risk of recurrent stones, based on the composition of urine. Methods The relative supersaturation (RSS) of urine, the Tiselius Indices, the Robertson Risk Factor Algorithms (RRFA) and the BONN-Risk Index were compared in terms of the numbers of variables required to be measured, the ease of use of the system and the value of the information obtained. Results The RSS methods require up to 14 analyses in every urine sample but measure the RSS of all the main constituents of kidney stones. The Tiselius Indices and the RRFA require only seven analyses. The Tiselius Indices yield information on the crystallisation potentials (CP) of calcium oxalate and calcium phosphate; the RRFA also provide information on the CP of uric acid. Both methods provide details on the particular urinary abnormalities that lead to the abnormal CP of that urine. The BONN-Risk Index requires two measurements in each urine sample but only provides information on the CP of calcium oxalate. Additional measurements in urine have to be made to identify the cause of any abnormality. Conclusions The methods that are based on measuring RSS are work-intensive and unsuitable for the routine screening of patients. The Tiselius Indices and the RRFA are equally good at predicting the risk of a patient forming further stones. The BONN-Risk Index provides no additional information about the causative factors for any abnormality detected. PMID:26558033

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Semicompeting risks in aging research: methods, issues and needs

    PubMed Central

    Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen

    2015-01-01

    A semicompeting risks problem involves two types of events: a nonterminal event and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude its occurrence. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O; it is a special case of multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (class L models). These advances notwithstanding, semicompeting risks methodology has not penetrated biomedical research in general, or gerontological research in particular. Possible reasons for this lack of uptake include: the methods are relatively new and sophisticated, the conceptual problems associated with potential failure time models are difficult to overcome, expository articles aimed at educating practitioners are scarce, and readily usable software is not available. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of semicompeting risks methodology by the broader biomedical research community. PMID:24729136

  15. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), covering technical risks as well as risks related to human failure. An FMEA team broke the NIR analytical method down into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices of up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
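
    The RPN calculation is simple to mechanize. In the sketch below, the two failure modes and their O/D/S rankings are invented examples, not the study's actual FMEA entries.

```python
def rpn(occurrence, detection, severity):
    """Risk Priority Number: O x D x S, each factor ranked on a 1-10 scale."""
    for v in (occurrence, detection, severity):
        if not 1 <= v <= 10:
            raise ValueError("FMEA rankings must be on a 1-10 scale")
    return occurrence * detection * severity

# Hypothetical failure modes for an NIR screening procedure
failure_modes = {
    "operator selects wrong reference spectrum": rpn(6, 7, 8),  # human error
    "instrument wavelength drift": rpn(3, 2, 6),
}
worst = max(failure_modes, key=failure_modes.get)
print(worst)  # the mode to target with corrective actions first
```

    Re-running the calculation after corrective actions, as the study did, gives the improvement index as the ratio of old to new RPN.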

  16. Bleeding after endoscopic submucosal dissection: Risk factors and preventive methods

    PubMed Central

    Kataoka, Yosuke; Tsuji, Yosuke; Sakaguchi, Yoshiki; Minatsuki, Chihiro; Asada-Hirayama, Itsuko; Niimi, Keiko; Ono, Satoshi; Kodashima, Shinya; Yamamichi, Nobutake; Fujishiro, Mitsuhiro; Koike, Kazuhiko

    2016-01-01

    Endoscopic submucosal dissection (ESD) has become widely accepted as a standard method of treatment for superficial gastrointestinal neoplasms because it enables en bloc resection even for large or fibrotic lesions with minimal invasiveness, and decreases the local recurrence rate. Moreover, specimens resected in an en bloc fashion enable accurate histological assessment. Taking these factors into consideration, ESD seems to be more advantageous than conventional endoscopic mucosal resection (EMR), but the associated risks of perioperative adverse events are higher than in EMR. Bleeding after ESD is the most frequent of these adverse events. Although post-ESD bleeding can be controlled by endoscopic hemostasis in most cases, it may lead to serious conditions, including hemorrhagic shock. Even with preventive methods, including administration of acid secretion inhibitors and preventive hemostasis, post-ESD bleeding cannot be completely prevented. In addition, cases at high risk of post-ESD bleeding, such as those involving antithrombotic agents or requiring large resections, are increasing. Although there have been many reports on the associated risk factors and on methods of preventing post-ESD bleeding, many issues remain unsolved. Therefore, in this review, we give an overview of the risk factors and preventive methods for post-ESD bleeding reported in previous studies. Endoscopists should have sufficient knowledge of these risk factors and preventive methods when performing ESD. PMID:27468187

  17. A Method for Accounting for Risk in Lending.

    DTIC Science & Technology

    1997-06-01

    interest rate is profitable or whether the interest rate should be raised to increase profitability and compensate for risk, or decreased to increase competitiveness? Many lending institutions, specifically furniture retailers, do not use scientific methods for determining their risk of payment defaults on loans to... interest rate to charge. Many of these techniques will be cited in chapter II. None, however, seem to involve calculating a rate based directly on

  18. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.

  19. Best self visualization method with high-risk youth.

    PubMed

    Schussel, Lorne; Miller, Lisa

    2013-08-01

    The healing process of the Best Self Visualization Method (BSM) is described within the framework of meditation, neuroscience, and psychodynamic theory. Cases are drawn from the treatment of high-risk youth, who have histories of poverty, survival of sexual and physical abuse, and/or current risk for perpetrating abuse. Clinical use of BSM is demonstrated in two case illustrations, one of group psychotherapy and another of individual therapy.

  20. Assessment of Methods for Estimating Risk to Birds from ...

    EPA Pesticide Factsheets

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16

  1. Approach to uncertainty in risk analysis

    SciTech Connect

    Rish, W.R.

    1988-08-01

    In the Fall of 1985, EPA's Office of Radiation Programs (ORP) initiated a project to develop a formal approach to dealing with uncertainties encountered when estimating and evaluating risks to human health and the environment. Based on a literature review of modeling uncertainty, interviews with ORP technical and management staff, and input from experts on uncertainty analysis, a comprehensive approach was developed. This approach recognizes by design the constraints on budget, time, manpower, expertise, and availability of information often encountered in "real world" modeling. It is based on the observation that in practice risk modeling is usually done to support a decision process. As such, the approach focuses on how to frame a given risk modeling problem, how to use that framing to select an appropriate mixture of uncertainty analysis techniques, and how to integrate the techniques into an uncertainty assessment that effectively communicates important information and insight to decision-makers. The approach is presented in this report. Practical guidance on characterizing and analyzing uncertainties about model form and quantities, and on effectively communicating uncertainty analysis results, is included. Examples from actual applications are presented.

  2. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
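
    A toy version of such a Monte Carlo strike-probability point estimate might look like the sketch below. Every number here (strike radius, velocity distribution, separation model, flight time) is an invented placeholder, not the NASA model's data; it only shows the shape of the calculation.

```python
import math
import random

def strike_probability(n_trials, abort_delay_s, rng=random.Random(1)):
    """Fraction of sampled debris pieces that land within a strike radius
    of the crew module; all parameters are illustrative placeholders."""
    strike_radius_m = 50.0
    flight_time_s = 2.0
    hits = 0
    for _ in range(n_trials):
        speed = rng.gauss(200.0, 60.0)            # imparted debris velocity, m/s
        bearing = rng.uniform(-math.pi, math.pi)  # direction in the abort plane
        # crew module standoff grows with the abort-to-destruct delay
        separation = 300.0 + 120.0 * abort_delay_s
        # debris position after flight_time; module sits on the bearing=0 axis
        dx = speed * flight_time_s * math.cos(bearing) - separation
        dy = speed * flight_time_s * math.sin(bearing)
        if math.hypot(dx, dy) < strike_radius_m:
            hits += 1
    return hits / n_trials

print(strike_probability(5000, 0.0), strike_probability(5000, 5.0))
```

    Evaluating this estimate over a grid of abort times and delay times is what a response surface model then approximates for use inside the overall risk model.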

  3. Expanding the Scope of Risk Assessment: Methods of Studying Differential Vulnerability and Susceptibility

    PubMed Central

    Bellinger, David; Glass, Thomas

    2011-01-01

    Several methodological issues have been identified in analysis of epidemiological data to better assess the distributional effects of exposures and hypotheses about effect modification. We discuss the hierarchical mixed model and some more complex methods. Methods of capturing inequality are a second dimension of risk assessment, and simulation studies are important because plausible choices for air pollution effects and effect modifiers could result in extremely high risks in a small subset of the population. Future epidemiological studies should explore contextual and individual-level factors that might modify these relationships. The Environmental Protection Agency should make this a standard part of their risk assessments whenever the necessary information is available. PMID:22021313

  4. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  6. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
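    The weighted attack graph described above lends itself to a shortest-path search: with edge weights modeling attacker effort, the least-effort path is the highest-risk one. A minimal sketch (the patent's "epsilon optimal paths" generalize this to all paths within epsilon of the optimum; the graph shape and weights are hypothetical):

```python
import heapq

def cheapest_attack_path(graph, start, goal):
    """Dijkstra over an attack graph: nodes are attack states, edge
    weights model attacker effort. The minimum-weight path is the
    least-effort (highest-risk) attack path."""
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]                      # (effort so far, state)
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:                      # reconstruct the path
            path = [u]
            while u in prev:
                u = prev[u]
                path.append(u)
            return d, path[::-1]
        if d > dist.get(u, float("inf")):  # stale queue entry
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return float("inf"), []                # goal unreachable
```

    For example, with `{'recon': [('foothold', 1), ('admin', 4)], 'foothold': [('admin', 1)]}`, the least-effort route to 'admin' costs 2 via 'foothold' rather than 4 directly.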

  7. A method for scenario-based risk assessment for robust aerospace systems

    NASA Astrophysics Data System (ADS)

    Thomas, Victoria Katherine

    In years past, aircraft conceptual design centered around creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, allowing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time-period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. 

  8. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents.
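    The posterior model probabilities mentioned at the end of the abstract follow from a standard Bayes update over the set of rationality models. A minimal sketch (model names and likelihoods are hypothetical):

```python
def update_model_posteriors(priors, likelihoods, observed_actions):
    """Bayesian model averaging over opponent rationality models.

    priors: {model: prior probability}
    likelihoods: {model: {action: P(action | model)}}
    Returns the posterior model probabilities after observing actions.
    """
    post = dict(priors)
    for action in observed_actions:
        # multiply in the likelihood of the observed action...
        post = {m: p * likelihoods[m].get(action, 0.0) for m, p in post.items()}
        # ...and renormalize so the probabilities sum to one
        total = sum(post.values())
        post = {m: p / total for m, p in post.items()}
    return post
```

    Repeatedly observing an action that, say, a Nash-equilibrium model predicts strongly raises that model's posterior, which the authors interpret as a measure of its validity.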

  9. Defining Human Failure Events for Petroleum Risk Analysis

    SciTech Connect

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  10. Expert opinion in risk analysis: The NUREG-1150 methodology

    SciTech Connect

    Hora, S.C.; Iman, R.L.

    1988-01-01

    The Reactor Risk Reference Document (US Nuclear Regulatory Commission, 1987) is the most comprehensive study and application of probabilistic risk analysis and uncertainty analysis methods for nuclear power generation safety since the Reactor Safety Study (US Nuclear Regulatory Commission, 1975). Many of the issues addressed in PRA work such as NUREG-1150 involve phenomena that have not been studied through experiment or observation to an extent that makes possible a definitive analysis. In many instances, the rarity or severity of the phenomena make resolution impossible at this time. In these instances, the best available information resides with experts who have studied the phenomena in question. This paper is about a reasoned approach to the acquisition of expert opinion for use in PRA work and other public policy areas.

  11. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
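    A highly simplified sketch of the kind of computation such a program performs (not McGuire's actual algorithm, which integrates over magnitude and distance distributions): sum the annual rates of sources whose attenuated ground motion exceeds a threshold, then convert the rate to an exceedance probability under a Poisson assumption.

```python
import math

def annual_exceedance_rate(events, threshold, attenuation):
    """Sum annual occurrence rates over (magnitude, distance, rate)
    sources whose predicted ground motion, per the attenuation
    function, meets or exceeds the threshold."""
    return sum(rate for mag, dist, rate in events
               if attenuation(mag, dist) >= threshold)

def exceedance_probability(rate, years):
    """Poisson assumption: P(at least one exceedance in `years`)."""
    return 1.0 - math.exp(-rate * years)
```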

  12. Risk Analysis Related to Quality Management Principles

    NASA Astrophysics Data System (ADS)

    Vykydal, David; Halfarová, Petra; Nenadál, Jaroslav; Plura, Jiří; Hekelová, Edita

    2012-12-01

    Efficient and effective implementation of quality management principles demands a responsible approach from top managers. A study of the current state of affairs in Czech organizations reveals a number of shortcomings in this field that translate into varied managerial risks. The article identifies and analyses some of them and gives short guidance for appropriate treatment. The text reflects the authors' experience as well as knowledge obtained from systematic analysis of industrial companies' environments.

  13. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer-readable medium extend the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm and supports all common PRA analysis functions, including cut sets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  14. Application of Risk Analysis: Response from a Systems Division,

    DTIC Science & Technology

    A review of theoretical literature reveals that most technical aspects of risk analysis have become a reasonably well-defined process with many... risk analysis in order to enhance its application. Also needed are better tools to enhance use of both subjective judgment and group decision processes...hope that it would lead to increased application of risk analysis in the acquisition process.

  15. New challenges on uncertainty propagation assessment of flood risk analysis

    NASA Astrophysics Data System (ADS)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards such as floods cause considerable damage to human life and to material and functional assets every year, around the world. Risk assessment procedures carry a set of uncertainties of two main types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with flawed procedures employed in the study of these processes. There is abundant scientific and technical literature on estimating uncertainties at each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with propagating uncertainties through the full flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate throughout the process, from inundation studies through to risk analysis, and how much they alter a proper analysis of flood risk. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, in both numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method show better robustness than traditional analysis.
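    The abstract lists Monte Carlo among the propagation methods; a toy sketch of propagating two uncertain inputs through a damage model (the distributions and the model itself are hypothetical, not taken from the study):

```python
import random
import statistics

def propagate_uncertainty(n=100_000, seed=42):
    """Monte Carlo uncertainty propagation: sample uncertain rainfall
    and an uncertain damage factor, push each draw through the (toy)
    damage model, and summarize the spread of the output."""
    rng = random.Random(seed)
    damages = []
    for _ in range(n):
        rainfall = rng.lognormvariate(4.0, 0.3)           # design rainfall, mm
        damage_per_mm = rng.normalvariate(1000.0, 150.0)  # EUR per mm
        damages.append(rainfall * damage_per_mm)
    return statistics.mean(damages), statistics.stdev(damages)
```

    The output standard deviation is the propagated uncertainty; PCT reaches a comparable result with far fewer model evaluations by expanding the response in orthogonal polynomials of the random inputs.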

  16. Meta-analysis of osteoporosis: fracture risks, medication and treatment.

    PubMed

    Liu, W; Yang, L-H; Kong, X-C; An, L-K; Wang, R

    2015-08-01

    Osteoporosis is a brittle-bone disease that causes fractures, mostly in older men and women. Meta-analysis is the statistical method applied in the framework for assessing results obtained from various research studies conducted over several years. A meta-analysis of osteoporotic fracture risk under medication non-adherence has been described by many researchers to assess bone fracture risk among patients non-adherent versus adherent to therapy for osteoporosis. Osteoporosis therapy reduces the risk of fracture in clinical trials, but real-world adherence to therapy is suboptimal and can reduce the effectiveness of the intervention. Medline, Embase, and CINAHL were searched for observational studies from 1998 to 2009, and up to 2015. The results of meta-analyses of osteoporosis research on fractures of postmenopausal women and men are presented. The use of bisphosphonate therapy for osteoporosis has been described along with other drugs. The authors, design, studies (% women), years (data), follow-up (weeks), fracture types, and compliance or persistence results from 2004 to 2009 are shown in a brief table. Meta-analysis studies from other researchers on osteoporosis and fractures, medications and treatments are also reviewed.

  17. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support the evidence synthesis as well evidence generation taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398
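    MCDA as discussed in this review is often operationalized as a linear-additive value model; a toy sketch (criteria, value scores and weights are hypothetical, with unfavourable effects entering as negative values):

```python
def mcda_score(effects, weights):
    """Linear-additive multi-criteria decision analysis: each
    criterion's value score (favourable positive, unfavourable
    negative) is weighted by its importance; the weights are
    assumed to sum to one."""
    return sum(weights[k] * value for k, value in effects.items())
```

    A model-informed approach, as the authors propose, would derive the value scores from PKPD model predictions rather than from expert opinion alone.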

  18. Screening-Level Ecological Risk Assessment Methods, Revision 3

    SciTech Connect

    Mirenda, Richard J.

    2012-08-16

    This document provides guidance for screening-level assessments of potential adverse impacts to ecological resources from release of environmental contaminants at the Los Alamos National Laboratory (LANL or the Laboratory). The methods presented are based on two objectives, namely: to provide a basis for reaching consensus with regulators, managers, and other interested parties on how to conduct screening-level ecological risk investigations at the Laboratory; and to provide guidance for ecological risk assessors under the Environmental Programs (EP) Directorate. This guidance promotes consistency, rigor, and defensibility in ecological screening investigations and in reporting those investigation results. The purpose of the screening assessment is to provide information to the risk managers so that informed risk-management decisions can be made. This document provides examples of recommendations and possible risk-management strategies.

  19. Topographic Avalanche Risk: DEM Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Nazarkulova, Ainura; Strobl, Josef

    2015-04-01

    GIS-based models are frequently used to assess the risk and trigger probabilities of (snow) avalanche releases, based on parameters and geomorphometric derivatives like elevation, exposure, slope, proximity to ridges and local relief energy. Numerous models, and model-based specific applications and project results, have been published based on a variety of approaches and parametrizations as well as calibrations. Digital Elevation Models (DEM) come with many different resolution (scale) and quality (accuracy) properties, some of these resulting from sensor characteristics and DEM generation algorithms, others from different DEM processing workflows and analysis strategies. This paper explores the impact of using different types and characteristics of DEMs for avalanche risk modeling approaches, and aims at establishing a framework for assessing the uncertainty of results. The research question is derived from simply demonstrating the differences in release risk areas and intensities by applying identical models to DEMs with different properties, and then extending this into a broader sensitivity analysis. For the quantification and calibration of uncertainty parameters different metrics are established, based on simple value ranges, probabilities, as well as fuzzy expressions and fractal metrics. As a specific approach the work on DEM resolution-dependent 'slope spectra' is being considered and linked with the specific application of geomorphometry-based risk assessment. For the purpose of this study focusing on DEM characteristics, factors like land cover, meteorological recordings and snowpack structure and transformation are kept constant, i.e. not considered explicitly. Key aims of the research presented here are the development of a multi-resolution and multi-scale framework supporting the consistent combination of large area basic risk assessment with local mitigation-oriented studies, and the transferability of the latter into areas without availability of

  20. Cleanup standards and pathways analysis methods

    SciTech Connect

    Devgun, J.S.

    1993-09-01

    Remediation of a radioactively contaminated site requires that certain regulatory criteria be met before the site can be released for unrestricted future use. Since the ultimate objective of remediation is to protect the public health and safety, residual radioactivity levels remaining at a site after cleanup must be below certain preset limits or meet acceptable dose or risk criteria. Release of a decontaminated site requires proof that the radiological data obtained from the site meet the regulatory criteria for such a release. Typically, release criteria consist of a composite of acceptance limits that depend on the radionuclides, the media in which they are present, and federal and local regulations. In recent years, the US Department of Energy (DOE) has developed a pathways analysis model to determine site-specific soil activity concentration guidelines for radionuclides that do not have established generic acceptance limits. The DOE pathways analysis computer code (developed by Argonne National Laboratory for the DOE) is called RESRAD (Gilbert et al. 1989). Similar efforts have been initiated by the US Nuclear Regulatory Commission (NRC) to develop and use dose-related criteria based on generic pathways analyses rather than simplistic numerical limits on residual radioactivity. The focus of this paper is radionuclide-contaminated soil. Cleanup standards are reviewed, pathways analysis methods are described, and an example is presented in which RESRAD was used to derive cleanup guidelines.

  1. Using Qualitative Disease Risk Analysis for Herpetofauna Conservation Translocations Transgressing Ecological and Geographical Barriers.

    PubMed

    Bobadilla Suarez, Mariana; Ewen, John G; Groombridge, Jim J; Beckmann, K; Shotton, J; Masters, N; Hopkins, T; Sainsbury, Anthony W

    2017-03-01

    Through the exploration of disease risk analysis methods employed for four different UK herpetofauna translocations, we illustrate how disease hazards can be identified, and how the risk of disease can be analysed. Where ecological or geographical barriers between source and destination sites exist, parasite populations are likely to differ in identity or strain between the two sites, elevating the risk from disease and increasing the number and category of hazards requiring analysis. Simplification of the translocation pathway through the avoidance of these barriers reduces the risk from disease. The disease risk analysis tool is intended to aid conservation practitioners in decision making relating to disease hazards prior to implementation of a translocation.

  2. Computerized methods for trafficability analysis

    NASA Technical Reports Server (NTRS)

    Lewandowski, G. M.; Mc Adams, H. T.; Reese, P. A.

    1971-01-01

    Computer program produces trafficability maps displaying terrain characteristics in digital form for computer analysis. Maps serve as aid to vehicular operation and highway planning based on maneuverability parameters.

  3. Comparison of five methods used to determine low back disorder risk in a manufacturing environment.

    PubMed

    Lavender, S A; Oleske, D M; Nicholson, L; Andersson, G B; Hahn, J

    1999-07-15

    Five methods for quantifying work-related low back disorder (LBD) risk were used to assess 178 autoworkers from 93 randomly selected production jobs. The objective was to determine whether the five occupational LBD risk evaluation methods yielded similar assessments of manual material handling tasks. Several techniques are available for quantifying LBD risk in the workplace and are used in industry for job evaluation and redesign; it is unknown whether the methods yield similar results. The five job evaluation methods were the 1993 National Institute for Occupational Safety and Health model, the Static Strength Prediction Program, the Lumbar Motion Monitor model, and two variations of the United Auto Workers (UAW)-General Motors Ergonomic Risk Factor Checklist. These methods were selected because they represent common practice within the automotive industry, the result of governmental efforts to protect the workforce, or the models thought to be the most scientifically advanced. Intercorrelations between methods ranged between 0.21 and 0.80. Pairwise analysis of risk group classifications identified biases on the part of the National Institute for Occupational Safety and Health equation, which considered jobs to be of higher risk relative to other methods, and on the part of the Static Strength Prediction Program, which considered nearly all the jobs sampled to be low risk. There is little agreement among the five quantitative ergonomic analysis methods used. In part, this may be because of their differential focus on acute versus cumulative trauma, suggesting that greater consideration needs to be given to the underlying causes of LBD within a facility before selecting an ergonomic evaluation method.
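    The reported intercorrelations (0.21 to 0.80) are presumably plain correlation coefficients between the risk scores two methods assign to the same set of jobs; a minimal Pearson sketch (the study may have used a different correlation variant):

```python
def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5
```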

  4. Pedophilia: an evaluation of diagnostic and risk prediction methods.

    PubMed

    Wilson, Robin J; Abracen, Jeffrey; Looman, Jan; Picheca, Janice E; Ferguson, Meaghan

    2011-06-01

    One hundred thirty child sexual abusers were diagnosed using each of following four methods: (a) phallometric testing, (b) strict application of Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision [DSM-IV-TR]) criteria, (c) Rapid Risk Assessment of Sex Offender Recidivism (RRASOR) scores, and (d) "expert" diagnoses rendered by a seasoned clinician. Comparative utility and intermethod consistency of these methods are reported, along with recidivism data indicating predictive validity for risk management. Results suggest that inconsistency exists in diagnosing pedophilia, leading to diminished accuracy in risk assessment. Although the RRASOR and DSM-IV-TR methods were significantly correlated with expert ratings, RRASOR and DSM-IV-TR were unrelated to each other. Deviant arousal was not associated with any of the other methods. Only the expert ratings and RRASOR scores were predictive of sexual recidivism. Logistic regression analyses showed that expert diagnosis did not add to prediction of sexual offence recidivism over and above RRASOR alone. Findings are discussed within a context of encouragement of clinical consistency and evidence-based practice regarding treatment and risk management of those who sexually abuse children.

  5. Risk prediction with machine learning and regression methods.

    PubMed

    Steyerberg, Ewout W; van der Ploeg, Tjeerd; Van Calster, Ben

    2014-07-01

    This is a discussion of issues in risk prediction based on the following papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler.

  6. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, is comprised of geographic exposure and natural disasters. After weighing all of the above indicators by applying the Analytic Hierarchy Process (AHP) and the Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in Esri ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
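    The paper's exact aggregation is not given in the abstract; a common risk-matrix composition multiplies a weighted probability-phase score by a weighted severity-phase score (the indicator names and weights below are hypothetical, with AHP/Delphi assumed as the weight source):

```python
def risk_index(indicators, weights):
    """Risk Matrix Approach sketch: probability phase (economy,
    society, living standards) times severity phase (exposure,
    disasters), each a weighted sum of normalized indicators in
    [0, 1]."""
    probability = sum(indicators[k] * weights[k]
                      for k in ("economy", "society", "living"))
    severity = sum(indicators[k] * weights[k]
                   for k in ("exposure", "disaster"))
    return probability * severity
```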

  7. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) binding target of 20% renewable energy in final energy consumption by 2020, this article illustrates the identification of risks for the implementation of wind energy projects in Romania, which could have complex technical, social and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, together with the reasonable time periods that may arise. Renewable energy technologies face a number of constraints that delay scaling up their production process, their transport process, the equipment reliability, etc., so implementing these types of projects requires a complex specialized team, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify major risks encountered within a wind farm project developed in Romania in isolated regions with different particularities, configured for different geographical areas (hill and mountain locations in Romania). Identification of major risks was based on the conceptual model set up for the entire project implementation process, through which specific constraints of the process were identified. Integration risks were examined by an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyses our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply to any entity from renewable sources.

  8. Risk analysis of landslide disaster in Ponorogo, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Koesuma, S.; Saido, A. P.; Fukuda, Y.

    2016-11-01

    Ponorogo is a regency in the southwest of East Java Province, Indonesia, located in the subduction zone between the Eurasian and Australian tectonic plates. Much of its mountainous area is prone to landslides. We collected landslide data for 305 villages in Ponorogo and converted them into a hazard index. We also calculated a vulnerability index, an economic loss index, an environmental damage index, and a capacity index. The risk analysis map is composed of three components: H (hazard), V (vulnerability, economic loss, and environmental damage), and C (capacity). The method is based on regulations number 02/2012 and number 03/2012 of the National Disaster Management Authority (BNPB). It uses three risk index classes: Low, Medium, and High. Ponorogo city has a medium landslide risk index.
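    The H/V/C composition described here can be sketched as a composite index. The multiplicative form R = H x V / C and the class boundaries below are common conventions in BNPB-style assessments, assumed for illustration rather than taken from the regulations cited.

```python
def landslide_risk_index(hazard, vulnerability, capacity):
    """Composite risk index R = H * V / C on 0-1 normalized components;
    higher capacity reduces risk. The exact scaling is an assumption."""
    return hazard * vulnerability / capacity

def classify(r, low=0.33, high=0.66):
    """Map the index to the three classes used in the study.
    The boundary values are illustrative, not from the BNPB regulations."""
    if r < low:
        return "Low"
    return "Medium" if r < high else "High"

# A village with moderate hazard and vulnerability but good capacity:
label = classify(landslide_risk_index(0.7, 0.6, 0.8))  # R = 0.525 -> "Medium"
```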

  9. [Risk analysis in radiation therapy: state of the art].

    PubMed

    Mazeron, R; Aguini, N; Deutsch, É

    2013-01-01

    Five radiotherapy accidents, of which two were serial, occurred in France from 2003 to 2007 and led the authorities to establish a roadmap for securing radiotherapy. By analogy with industrial processes, a 2008 technical decision from the French Nuclear Safety Authority requires radiotherapy professionals to conduct analyses of risks to patients. The process of risk analysis had been tested in three pilot centers, before the accidents occurred, with the creation of feedback cells. The regulation now requires all radiotherapy services to have similar structures to collect precursor events, incidents, and accidents, to perform analyses following rigorous methods, and to initiate corrective actions. At the same time, services are also required to conduct a priori analyses, which are less intuitive and usually require the help of a quality engineer, with the aim of reducing risk. The progressive implementation of these devices is part of an overall policy to improve the quality of radiotherapy. Since 2007, no radiotherapy accident has been reported.

  10. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  11. Movement recognition technology as a method of assessing spontaneous general movements in high risk infants.

    PubMed

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D; Trenell, Michael; Plötz, Thomas

    2014-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment.

  12. Movement Recognition Technology as a Method of Assessing Spontaneous General Movements in High Risk Infants

    PubMed Central

    Marcroft, Claire; Khan, Aftab; Embleton, Nicholas D.; Trenell, Michael; Plötz, Thomas

    2015-01-01

    Preterm birth is associated with increased risks of neurological and motor impairments such as cerebral palsy. The risks are highest in those born at the lowest gestations. Early identification of those most at risk is challenging, meaning that a critical window of opportunity to improve outcomes through therapy-based interventions may be missed. Clinically, the assessment of spontaneous general movements is an important tool, which can be used for the prediction of movement impairments in high risk infants. Movement recognition aims to capture and analyze relevant limb movements through computerized approaches focusing on continuous, objective, and quantitative assessment. Different methods of recording and analyzing infant movements have recently been explored in high risk infants. These range from camera-based solutions to body-worn miniaturized movement sensors used to record continuous time-series data that represent the dynamics of limb movements. Various machine learning methods have been developed and applied to the analysis of the recorded movement data. This analysis has focused on the detection and classification of atypical spontaneous general movements. This article aims to identify recent translational studies using movement recognition technology as a method of assessing movement in high risk infants. The application of this technology within pediatric practice represents a growing area of inter-disciplinary collaboration, which may lead to a greater understanding of the development of the nervous system in infants at high risk of motor impairment. PMID:25620954

  13. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be influenced by failures or operational mistakes of the DMCS. To avoid such situations, sufficient risk assessment must be conducted on the DMCS and precautions taken. We propose an operational RM method using FMEA for DMCS. To develop the method, we gathered and compared FMEA results for DMCS and developed a list of failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that had not been found before.

  14. Approaches to uncertainty analysis in probabilistic risk assessment

    SciTech Connect

    Bohn, M.P.; Wheeler, T.A.; Parry, G.W.

    1988-01-01

    An integral part of any probabilistic risk assessment (PRA) is the performance of an uncertainty analysis to quantify the uncertainty in the point estimates of the risk measures considered. While a variety of classical methods of uncertainty analysis exist, applying these methods and developing new techniques consistent with existing PRA databases and the need for expert (subjective) input has been an area of considerable interest since the pioneering Reactor Safety Study (WASH-1400) in 1975. This report presents the results of a critical review of existing methods for performing uncertainty analyses for PRAs, with special emphasis on identifying database limitations of the various methods. Both classical and Bayesian approaches have been examined. This work was funded by the US Nuclear Regulatory Commission in support of its ongoing full-scope PRA of the LaSalle nuclear power station. Thus, in addition to the review, this report contains recommendations for a suitable uncertainty analysis methodology for the LaSalle PRA.

  15. Estimate capital for operational risk using peak over threshold method

    NASA Astrophysics Data System (ADS)

    Saputri, Azizah Anugrahwati; Noviyanti, Lienda; Soleh, Achmad Zanbar

    2015-12-01

    Operational risk is inherent in bank activities. To cover this risk, a bank reserves a fund called capital. A bank often uses the Basic Indicator Approach (BIA), the Standardized Approach (SA), or the Advanced Measurement Approach (AMA) to estimate the capital amount. BIA and SA are less objective than AMA, since BIA and SA use non-actual loss data while AMA uses actual data. In this research, we define the capital as an OpVaR (i.e., the worst loss at a given confidence level), which will be estimated by the peak-over-threshold method.
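    A minimal sketch of the peak-over-threshold estimate of OpVaR: fit a generalized Pareto distribution (GPD) to the excesses over a high threshold and invert the fitted tail at the desired confidence level. Method-of-moments estimation is used here as a simple stand-in for the maximum-likelihood fit typically employed; the threshold choice and data are illustrative.

```python
import numpy as np

def opvar_pot(losses, threshold, q=0.999):
    """OpVaR at confidence q via peak-over-threshold.

    Fits a GPD to excesses over `threshold` by method of moments
    (xi = (1 - m^2/s^2)/2, sigma = m*(1 - xi)) and inverts the tail:
    VaR_q = u + (sigma/xi) * [((n/n_u) * (1 - q))^(-xi) - 1].
    """
    losses = np.asarray(losses, dtype=float)
    exc = losses[losses > threshold] - threshold
    n, n_u = len(losses), len(exc)
    m, s2 = exc.mean(), exc.var(ddof=1)
    xi = 0.5 * (1.0 - m * m / s2)        # shape
    sigma = m * (1.0 - xi)               # scale
    tail = (n / n_u) * (1.0 - q)
    if abs(xi) < 1e-9:                   # exponential limit as xi -> 0
        return threshold - sigma * np.log(tail)
    return threshold + sigma / xi * (tail ** (-xi) - 1.0)
```

In practice the threshold is chosen by mean-excess or stability plots, and the capital charge is read off at a regulatory confidence level such as 99.9%.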

  16. Advances in validation, risk and uncertainty assessment of bioanalytical methods.

    PubMed

    Rozet, E; Marini, R D; Ziemons, E; Boulanger, B; Hubert, Ph

    2011-06-25

    Bioanalytical method validation is a mandatory step to evaluate the ability of developed methods to provide accurate results for their routine application, in order to trust the critical decisions that will be made with them. Even though several guidelines exist to help perform bioanalytical method validations, there is still a need to clarify the meaning and interpretation of bioanalytical method validation criteria and methodology. Different interpretations can be made of the validation guidelines, as well as of the definitions of the validation criteria, leading to diverse experimental designs implemented to try to fulfill these criteria. Finally, different decision methodologies can also be read into these guidelines. Therefore, the risk that a validated bioanalytical method may be unfit for its future purpose depends on each analyst's personal interpretation of these guidelines. The objective of this review is thus to discuss and highlight several essential aspects of method validation, not restricted to chromatographic methods but also covering ligand-binding assays, owing to their increasing role in the biopharmaceutical industry. The points reviewed are the common validation criteria: selectivity, standard curve, trueness, precision, accuracy, limits of quantification and range, dilutional integrity, and analyte stability. Definitions, methodology, experimental design, and decision criteria are reviewed. Two other points closely connected to method validation are also examined, incurred sample reproducibility testing and measurement uncertainty, as they are highly linked to the reliability of bioanalytical results. Their additional implementation is foreseen to strongly reduce the risk of having validated a bioanalytical method that is unfit for its purpose.

  17. Risk-Informed Safety Margin Characterization Methods Development Work

    SciTech Connect

    Smith, Curtis L; Ma, Zhegang; Riley, Tom; Mandelli, Diego; Nielsen, Joseph W; Alfonsi, Andrea; Rabiti, Cristian

    2014-09-01

    This report summarizes the research activity conducted during fiscal year 2014 within the Risk-Informed Safety Margin Characterization (RISMC) pathway of the Light Water Reactor Sustainability (LWRS) campaign. This research activity is complementary to the one presented in the INL/EXT-??? report, which presents advances in probabilistic risk assessment analysis using RAVEN and RELAP-7 in conjunction with novel flooding simulation tools. Here we present several analyses that demonstrate the value of the RISMC approach for assessing risk associated with nuclear power plants (NPPs). We focus on simulation-based PRA which, in contrast to classical PRA, heavily employs system simulator codes. First, we compare these two types of analyses, classical and RISMC, for a boiling water reactor (BWR) station blackout (SBO) initiating event. Second, we present an extended BWR SBO analysis using RAVEN and RELAP-5 which addresses the comments and suggestions received about the original analysis presented in INL/EXT-???. This time we focus more on stochastic analysis, such as the probability of core damage, and on determining the most risk-relevant factors. We also show some preliminary results comparing RELAP5-3D and the new code RELAP-7 for a simplified pressurized water reactor system. Lastly, we present some conceptual ideas on extending the RISMC capabilities from an off-line tool (i.e., a PRA analysis tool) to an online tool. In this new configuration, RISMC capabilities can be used to assist and inform reactor operators during real accident scenarios.

  18. Suicide risk and suicide method in patients with personality disorders.

    PubMed

    Björkenstam, Charlotte; Ekselius, Lisa; Berlin, Marie; Gerdin, Bengt; Björkenstam, Emma

    2016-12-01

    The influence of psychopathology on suicide method has revealed different distributions among different psychiatric disorders. However, evidence is still scarce. We hypothesized that having a diagnosis of personality disorder (PD) affects the suicide method, and that different PD clusters would influence the suicide method in different ways. In addition, we hypothesized that the presence of psychiatric and somatic co-morbidity also affects the suicide method. We examined 25,217 individuals aged 15-64 who had been hospitalized in Sweden with a main diagnosis of PD during the years 1987-2013. The patients were followed from the date of first discharge until death or until the end of the follow-up period, i.e., December 31, 2013, for a total of 323,508.8 person-years, with a mean follow-up time of 11.7 years. The SMR, i.e., the ratio between the observed number of suicides and the expected number of suicides, was used as a measure of risk. Overall PD, the different PD clusters, and comorbidity all influenced the suicide method. Hanging showed the highest SMR among female PD patients (SMR 34.2 (95% CI: 29.3-39.8)) and jumping among male PD patients (SMR 24.8 (95% CI: 18.3-33.6)), in both cases compared with non-PD patients. Furthermore, the elevated suicide risk was related to both psychiatric and somatic comorbidity. The increased suicide risk was unevenly distributed with respect to suicide method and type of PD. However, these differences were only moderate and greatly overshadowed by the overall excess suicide risk in having PD. Any attempt from society to decrease the suicide rate in persons with PD must take these characteristics into account. Copyright © 2016 Elsevier Ltd. All rights reserved.
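    The SMR used in this record is the ratio of observed to expected suicides, usually reported with approximate Poisson confidence limits. The sketch below uses Byar's approximation to the exact Poisson interval, one standard choice but not necessarily the authors'; the example counts are hypothetical.

```python
import math

def smr(observed, expected, z=1.96):
    """Standardized mortality ratio with an approximate 95% CI
    (Byar's approximation to the exact Poisson limits on the
    observed count, divided by the expected count)."""
    ratio = observed / expected
    lo = observed * (1 - 1 / (9 * observed)
                     - z / (3 * math.sqrt(observed))) ** 3 / expected
    o1 = observed + 1
    hi = o1 * (1 - 1 / (9 * o1) + z / (3 * math.sqrt(o1))) ** 3 / expected
    return ratio, lo, hi

# Hypothetical cohort: 50 observed suicides where 10 were expected.
ratio, lo, hi = smr(50, 10.0)
```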

  19. Mosquito habitat and dengue risk potential in Kenya: alternative methods to traditional risk mapping techniques.

    PubMed

    Attaway, David F; Jacobsen, Kathryn H; Falconer, Allan; Manca, Germana; Rosenshein Bennett, Lauren; Waters, Nigel M

    2014-11-01

    Outbreaks, epidemics and endemic conditions have made dengue a disease that has emerged as a major threat in tropical and sub-tropical countries over the past 30 years. Dengue fever creates a growing burden for public health systems and has the potential to affect over 40% of the world population. The problem being investigated is to identify the areas of highest and lowest dengue risk. This paper presents "Similarity Search", a geospatial analysis aimed at identifying these locations within Kenya. Similarity Search develops a risk map by combining environmental susceptibility analysis with geographical information systems, and then compares areas with known dengue prevalence to all other locations. Kenya has had outbreaks of dengue during the past 3 years, and we identified the areas most susceptible to dengue infection using bioclimatic variables, elevation, and mosquito habitat as inputs to the model. Comparing the modelled risk map with reported dengue epidemic cases from 1982-2013, obtained from open-source ProMED reporting and government news reports, confirmed the high-risk locations that were used as the Similarity Search presence cells. Basing the risk model on bioclimatic variables, elevation, and mosquito habitat increased the efficiency and effectiveness of the dengue fever risk mapping process.

  20. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injury compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimation of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in northern Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful for understanding the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
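    A minimal frequency-severity Monte Carlo sketch of the loss-distribution step: annual claim counts drawn from a Poisson with the empirical rate, severities from a lognormal fitted to the observed claims, and risk indexes read off as the mean and percentiles of the simulated aggregate loss. The distributional choices and example figures are assumptions, not the authors' exact model.

```python
import numpy as np

def aggregate_loss_distribution(claims, n_years=4.0, n_sim=20_000, seed=1):
    """Simulate the annual aggregate loss from observed claim amounts.

    Frequency: Poisson with rate = (number of claims) / n_years.
    Severity: lognormal fitted to the log of the claim amounts.
    Returns (expected annual loss, [50th, 95th, 99th] percentiles).
    """
    claims = np.asarray(claims, dtype=float)
    rng = np.random.default_rng(seed)
    lam = len(claims) / n_years                       # claims per year
    mu, sigma = np.log(claims).mean(), np.log(claims).std(ddof=1)
    counts = rng.poisson(lam, n_sim)                  # claims in each simulated year
    totals = np.array([rng.lognormal(mu, sigma, k).sum() for k in counts])
    return totals.mean(), np.percentile(totals, [50, 95, 99])

# Hypothetical compensation amounts (currency units) over four years:
mean_loss, pct = aggregate_loss_distribution([10_000, 25_000, 5_000, 40_000, 15_000])
```

The gap between the 99th percentile and the mean corresponds to the "unexpected loss" the record mentions; stratifying by department would repeat the fit per stratum.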

  1. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating reduces the cable insulation level, speeds up insulation aging, and can even cause short-circuit faults. Cable overheating risk identification and warning are therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. First, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, to improve the impedance parameter estimation accuracy. Second, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance from future forecast data in the distribution SCADA system. Third, a library of cable overheating risk warning rules is established; the cable impedance forecast value is calculated and the rate of change of impedance analyzed, and warnings for the overheating risk of the cable line are then issued according to the rules library, based on the relationship between impedance variation and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The resulting overheating risk warnings can provide a decision basis for operation, maintenance, and repair.
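    The least-squares estimation step can be sketched as a linear regression on SCADA snapshots. The approximate feeder voltage-drop model dV ≈ I·(R·cosφ + X·sinφ) used below is a textbook simplification assumed for illustration, not the paper's full model.

```python
import numpy as np

def estimate_cable_impedance(v_drop, current, cos_phi):
    """Least-squares estimate of cable resistance R and reactance X.

    Each measurement gives one row of the linear system
        dV = I*cos(phi) * R + I*sin(phi) * X,
    solved for (R, X) with numpy's least-squares routine."""
    cos_phi = np.asarray(cos_phi, dtype=float)
    sin_phi = np.sqrt(1.0 - cos_phi ** 2)
    current = np.asarray(current, dtype=float)
    A = np.column_stack([current * cos_phi, current * sin_phi])
    (r, x), *_ = np.linalg.lstsq(A, np.asarray(v_drop, dtype=float), rcond=None)
    return r, x
```

An estimated R drifting above its historical threshold (or rising quickly) would then trigger the overheating warning rules described in the abstract.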

  2. Convergence analysis of combinations of different methods

    SciTech Connect

    Kang, Y.

    1994-12-31

    This paper provides a convergence analysis for combinations of different numerical methods for solving systems of differential equations. The author proves that a combination of two convergent linear multistep methods or Runge-Kutta methods produces a new convergent method whose order equals the smaller of the orders of the two original methods.
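    A quick numerical illustration of the min-order result: alternating explicit Euler (order 1) with classical RK4 (order 4) on y' = y gives an observed convergence order near 1, the smaller of the two. The alternation scheme is my own illustrative choice of "combination".

```python
import math

def solve_alternating(f, y0, t0, t1, n):
    """Integrate y' = f(t, y) over [t0, t1] with n steps, alternating an
    explicit Euler step (order 1) with a classical RK4 step (order 4)."""
    h = (t1 - t0) / n
    t, y = t0, y0
    for i in range(n):
        if i % 2 == 0:                       # Euler step
            y += h * f(t, y)
        else:                                # RK4 step
            k1 = f(t, y)
            k2 = f(t + h / 2, y + h / 2 * k1)
            k3 = f(t + h / 2, y + h / 2 * k2)
            k4 = f(t + h, y + h * k3)
            y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Observed order from the error ratio under step halving on y' = y, y(0) = 1:
f = lambda t, y: y
e1 = abs(solve_alternating(f, 1.0, 0.0, 1.0, 64) - math.e)
e2 = abs(solve_alternating(f, 1.0, 0.0, 1.0, 128) - math.e)
order = math.log(e1 / e2, 2)   # expected to be close to 1
```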

  3. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  4. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not...; (D) Capital expenditures; and (E) Operating efficiency. (ii) Financial risk, based on Applicant's... 49 Transportation 4 2011-10-01 2011-10-01 false Credit risk premium analysis. 260.17 Section 260...

  5. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines, or with other sectors such as the military, nuclear, and aeronautics industries, were collected. These documents took the form of published books or papers, standards, technical guides, and company procedures gathered throughout industry. From the collected documents, 112 were selected for analysis; 108 methods applied, or potentially applicable, to assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of these methods and tools.

  6. Trial Sequential Methods for Meta-Analysis

    ERIC Educational Resources Information Center

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis also have applications for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed-effects meta-analysis, but conceptual…

  8. Concentration of Risk Model (CORM) Verification and Analysis

    DTIC Science & Technology

    2014-06-15

    Mental Health and using data from a repository at the University of Michigan, had attempted to identify soldiers at higher-than-average risk of suicide… TRAC-M-TR-14-023, 15 June 2014. Concentration of Risk Model (CORM) Verification and Analysis. Edward M. Masotti; Sam Buttrey. TRADOC Analysis Center - Monterey, 700 Dyer Road, Monterey…

  9. A novel risk assessment method for landfill slope failure: Case study application for Bhalswa Dumpsite, India.

    PubMed

    Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh

    2017-03-01

    Rapid population growth in major urban centres of many developing countries has created massive landfills of extraordinary height and steep side slopes, frequently surrounded by illegal low-income residential settlements built too close to the landfill. These landfills face a high risk of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario are then used to analyse the probability of slope failure and the resulting run-out length, from which the potential risk of fatalities is calculated. In comparison with existing methods, which are based solely on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study of a landfill located within a heavily populated area of New Delhi, India.
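    The "statistical distributions for shear strength → probability of slope failure" step can be sketched with a Monte Carlo infinite-slope model: sample cohesion and friction angle, compute a factor of safety, and count the fraction of realizations below 1. The infinite-slope formula and all parameter values below are illustrative assumptions, not the case-study inputs.

```python
import numpy as np

def slope_failure_probability(beta_deg=35.0, h=30.0, gamma=10.0,
                              n_sim=50_000, seed=7):
    """Monte Carlo probability of failure for an infinite-slope model.

    FoS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta))
    with MSW cohesion c (kPa) and friction angle phi (deg) drawn from
    assumed normal distributions; beta is slope angle, h slope depth (m),
    gamma unit weight (kN/m^3)."""
    rng = np.random.default_rng(seed)
    c = rng.normal(15.0, 5.0, n_sim).clip(min=0.1)          # cohesion, kPa
    phi = np.radians(rng.normal(25.0, 5.0, n_sim).clip(5, 45))
    beta = np.radians(beta_deg)
    driving = gamma * h * np.sin(beta) * np.cos(beta)        # shear stress
    resisting = c + gamma * h * np.cos(beta) ** 2 * np.tan(phi)
    fos = resisting / driving
    return float(np.mean(fos < 1.0))
```

The full method would pair each failed realization with a run-out estimate from the rheological properties to convert the failure probability into a fatality risk.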

  10. Small theories and large risks--is risk analysis relevant for epistemology?

    PubMed

    Cirković, Milan M

    2012-11-01

    Ought we to take seriously large risks predicted by "exotic" or improbable theories? We routinely assess risks on the basis of either common sense or some developed theoretical framework based on the best available scientific explanations. Recently, there has been a substantial increase in interest in the low-probability "failure modes" of well-established theories, which can involve global catastrophic risks. However, here I wish to discuss a partially antithetical situation: alternative, low-probability ("small") scientific theories predicting catastrophic outcomes with large probability. I argue that there is an important methodological issue (determining what counts as the best available explanation in cases where the theories involved describe possibilities of extremely destructive global catastrophes), which has been neglected thus far. There is no simple answer to the correct method for dealing with high-probability, high-stakes risks following from low-probability theories that still cannot be rejected outright, and much further work is required in this area. I further argue that cases like these are more numerous than usually assumed, for reasons including cognitive biases, sociological issues in science, and the media image of science. If that is indeed so, it might lead to a greater weight of these cases in areas such as moral deliberation and policy making. © 2012 Society for Risk Analysis.

  11. Development of a preliminary framework for informing the risk analysis and risk management of nanoparticles.

    PubMed

    Morgan, Kara

    2005-12-01

    Decisions are often made even when there is uncertainty about the possible outcomes. However, methods for making decisions with uncertainty in the problem framework are scarce. Presently, safety assessment for a product containing engineered nano-scale particles is a very poorly structured problem. Many fields of study may inform the safety assessment of such particles (e.g., ultrafines, aerosols, debris from medical devices), but engineered nano-scale particles may present such unique properties that extrapolating from other types of studies may introduce, and not resolve, uncertainty. Some screening-level health effects studies conducted specifically on engineered nano-scale materials have been published and many more are underway. However, it is clear that the extent of research needed to fully and confidently understand the potential for health or environmental risk from engineered nano-scale particles may take years or even decades to complete. In spite of the great uncertainty, there is existing research and experience among researchers that can help to provide a taxonomy of particle properties, perhaps indicating a relative likelihood of risk, in order to prioritize nanoparticle risk research. To help structure this problem, a framework was developed from expert interviews of nanotechnology researchers. The analysis organizes the information as a system based on the risk assessment framework, in order to support the decision about safety. In the long term, this framework is designed to incorporate research results as they are generated, and therefore serve as a tool for estimating the potential for human health and environmental risk.

  12. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D.A. Betancourt (1), T.R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several…

  14. Global Human Settlement Analysis for Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Pesaresi, M.; Ehrlich, D.; Ferri, S.; Florczyk, A.; Freire, S.; Haag, F.; Halkia, M.; Julea, A. M.; Kemper, T.; Soille, P.

    2015-04-01

    The Global Human Settlement Layer (GHSL) is supported by the European Commission Joint Research Center (JRC) in the frame of its institutional research activities. The scope of the GHSL is to develop, test, and apply the technologies and analysis methods integrated in the JRC Global Human Settlement analysis platform, for applications in support of global disaster risk reduction (DRR) initiatives and regional analysis in the frame of the European Cohesion policy. The GHSL analysis platform uses geospatial data, primarily remotely sensed imagery and population data. The GHSL also cooperates with the Group on Earth Observations on SB-04 (Global Urban Observation and Information) and with various international partners, including the World Bank and United Nations agencies. Some preliminary results integrating global human settlement information extracted from Landsat data records of the last 40 years with population data are presented.

  15. Methods of DNA methylation analysis.

    USDA-ARS?s Scientific Manuscript database

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  16. Pressure Systems Stored-Energy Threshold Risk Analysis

    SciTech Connect

    Paulsen, Samuel S.

    2009-08-25

    Federal Regulation 10 CFR 851, which became effective February 2007, brought to light potential weaknesses in the Pressure Safety Program at the Pacific Northwest National Laboratory (PNNL). The definition of a pressure system in 10 CFR 851 does not contain a limit based on pressure or any other criterion. Therefore, the need for a method to determine an appropriate risk-based hazard level for pressure safety was identified. The Laboratory has historically used a stored energy of 1000 lbf-ft to define a pressure hazard; however, an analytical basis for this value had not been documented. This document establishes the technical basis by evaluating the use of stored energy as a criterion for establishing a pressure hazard, exploring a suitable risk threshold for pressure hazards, and reviewing the methods used to determine stored energy. The literature review and technical analysis conclude that the use of stored energy as a method for determining potential risk, the 1000 lbf-ft threshold, and the methods used by PNNL to calculate stored energy are all appropriate. Recommendations for further program improvements are also discussed.
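    A sketch of the stored-energy screening described above, using the isentropic-expansion estimate for a compressed-gas volume; this is one common convention (others, such as the Brode equation, differ slightly), and the report does not specify which formula PNNL uses, so the example pressures and volume are hypothetical.

```python
def stored_energy_lbf_ft(p_gauge_psi, volume_in3, p_atm=14.7, gamma=1.4):
    """Isentropic-expansion estimate of stored energy in a gas volume:

        E = P1*V/(gamma - 1) * [1 - (Pa/P1)^((gamma - 1)/gamma)]

    with P1 absolute pressure (psia), V volume (in^3), Pa atmospheric
    pressure, gamma the heat-capacity ratio (1.4 for air).
    Returns lbf-ft (the psi * in^3 product is in-lbf; divide by 12)."""
    p1 = p_gauge_psi + p_atm
    e_in_lbf = (p1 * volume_in3) / (gamma - 1.0) * \
               (1.0 - (p_atm / p1) ** ((gamma - 1.0) / gamma))
    return e_in_lbf / 12.0

# Hypothetical screening check against the 1000 lbf-ft hazard threshold:
hazard = stored_energy_lbf_ft(100.0, 500.0) > 1000.0
```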

  17. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the...Table 1 of the paper lists time scaling factors by outage duration (e.g., up to an hour, 16-60 min, factor 1.5; brief interrupt, 0-15 min, factor 1). In the FMEA formulation, RPN is a product of the three categories
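
    The RPN construction referenced in the excerpt can be sketched as follows; the 1-10 rating scales and the downtime-based scaling factors are assumptions adapted from the fragmentary text, not the paper's exact tables.

```python
# Illustrative FMEA-style Risk Priority Number. The base RPN is the
# product of three category ratings (severity, occurrence, detection),
# here assumed to be on 1-10 scales, then scaled by a downtime factor
# loosely modeled on the excerpt's Table 1 (hypothetical values).
TIME_SCALE = [(15, 1.0), (60, 1.5)]  # (max downtime in minutes, factor)

def rpn(severity, occurrence, detection, downtime_min=0):
    base = severity * occurrence * detection
    for limit, factor in TIME_SCALE:
        if downtime_min <= limit:
            break
    else:
        factor = 2.0  # assumed factor for outages longer than an hour
    return base * factor
```

    A defect rated severity 7, occurrence 5, detection 4 would get a base RPN of 140, scaled upward if it caused a longer interruption.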

  18. Overcoming barriers to integrating economic analysis into risk assessment.

    PubMed

    Hoffmann, Sandra

    2011-09-01

    Regulatory risk analysis is designed to provide decisionmakers with a clearer understanding of how policies are likely to affect risk. The systems that produce risk are biological, physical, and social and economic. As a result, risk analysis is an inherently interdisciplinary task. Yet in practice, risk analysis has been interdisciplinary in only limited ways. Risk analysis could provide more accurate assessments of risk if there were better integration of economics and other social sciences into risk assessment itself. This essay examines how discussions about risk analysis policy have influenced the roles of various disciplines in risk analysis. It explores ways in which integrated bio/physical-economic modeling could contribute to more accurate assessments of risk. It reviews examples of the kind of integrated economics-bio/physical modeling that could be used to enhance risk assessment. The essay ends with a discussion of institutional barriers to greater integration of economic modeling into risk assessment and provides suggestions on how these might be overcome.

  19. Osteoporosis risk prediction using machine learning and conventional methods.

    PubMed

    Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won

    2013-01-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared them with a conventional clinical decision tool, the osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had a significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to perform comparisons of the performance of osteoporosis prediction between the machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.
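
    The ROC AUC metric used to compare the classifiers above can be computed directly from the Mann-Whitney formulation, as in this sketch; the labels and scores in the examples are hypothetical toy data, not the KNHANES records.

```python
def roc_auc(labels, scores):
    """ROC AUC as P(score of a positive > score of a negative), with
    ties counted as 1/2 (the Mann-Whitney U formulation)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

    An AUC of 0.827, as reported for the SVM, means a randomly chosen woman with osteoporosis receives a higher risk score than a randomly chosen woman without it about 83% of the time.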

  20. Cost-effectiveness of various risk stratification methods for asymptomatic ventricular pre-excitation.

    PubMed

    Czosek, Richard J; Anderson, Jeffrey; Cassedy, Amy; Spar, David S; Knilans, Timothy K

    2013-07-15

    Accessory pathways with "high-risk" properties confer a small but potential risk of sudden cardiac death. Pediatric guidelines advocate for either risk stratification or ablation in patients with ventricular pre-excitation but do not advocate specific methodology. We sought to compare the cost of differing risk-stratification methodologies in pediatric patients with ventricular pre-excitation in this single institutional, retrospective cohort study of asymptomatic pediatric patients who underwent risk stratification for ventricular pre-excitation. Institutional methodology consisted of stratification using graded exercise testing (GXT) followed by esophageal testing in patients without loss of pre-excitation and ultimately ablation in high-risk patients or patients who became clinically symptomatic during follow-up. A decision analysis model was used to compare this methodology with hypothetical methodologies using different components of the stratification technique and an "ablate all" method. One hundred and two pediatric patients with asymptomatic ventricular pre-excitation underwent staged risk stratification; 73% of patients were deemed low risk and avoided ablation, and the remaining 27% ultimately were successfully ablated. The use of esophageal testing was associated with a 23% (p ≤ 0.0001) reduction in cost compared with GXT stratification alone and a 48% (p ≤ 0.0001) reduction compared with the "ablate all" model. GXT as a lone stratification method was also associated with a 15% cost reduction (p ≤ 0.0001) compared with the "ablate all" method. In conclusion, risk stratification of pediatric patients with asymptomatic ventricular pre-excitation is associated with reduced cost. These cost-effectiveness outcomes need to be combined with the risks and benefits associated with ablation and risk stratification.

  1. Working session 5: Operational aspects and risk analysis

    SciTech Connect

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  2. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    SciTech Connect

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, ''Nuclear Safety Management,'' Subpart B, ''Safety Basis Requirements.'' Consistent with DOE-STD-3009-94, Change Notice 2, ''Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses'' (STD-3009), and DOE-STD-3011-2002, ''Guidance for Preparation of Basis for Interim Operation (BIO) Documents'' (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities, where source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, ''Integration of Environment, Safety, and Health into Facility Disposition Activities'' (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  3. Methods development to evaluate the risk of upgrading to DCS: The human factor

    SciTech Connect

    Ostrom, L.T.; Wilhelmsen, C.A.

    1995-04-01

    The NRC recognizes that a more complete technical basis for understanding and regulating advanced digital technologies in commercial nuclear power plants is needed. A concern is that the introduction of digital safety systems may have an impact on risk. There is currently no standard methodology for measuring digital system reliability. A tool currently used to evaluate NPP risk in analog systems is the probabilistic risk assessment (PRA). The use of this tool to evaluate the digital system risk was considered to be a potential methodology for determining the risk. To test this hypothesis, it was decided to perform a limited PRA on a single dominant accident sequence. However, a review of existing human reliability analysis (HRA) methods showed that they were inadequate to analyze systems utilizing digital technology. A four-step process was used to adapt existing HRA methodologies to digital environments and to develop new techniques. The HRA methods were then used to analyze an NPP that had undergone a backfit to digital technology in order to determine, as a first step, whether the methods were effective. The very small-break loss of coolant accident sequence was analyzed to determine whether the upgrade to the Eagle-21 process protection system had an effect on risk. The analysis of the very small-break LOCA documented in the Sequoyah PRA was used as the basis of the analysis. The analysis of the results of the HRA showed that the mean human error probabilities for the Eagle-21 PPS were slightly less than those for the analog system it replaced. One important observation from the analysis is that the operators have increased confidence stemming from the better level of control provided by the digital system. The analysis of the PRA results, which included the human error component and the Eagle-21 PPS, disclosed that the reactor protection system had a higher failure rate than the analog system, although the difference was not statistically significant.

  4. Methodology for risk analysis based on atmospheric dispersion modelling from nuclear risk sites

    NASA Astrophysics Data System (ADS)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.; Rigina, O.

    2003-04-01

    The main purpose of this multidisciplinary study is to develop a methodology for complex nuclear risk and vulnerability assessment, and to test it on the example of estimating the nuclear risk to the population in the Nordic countries in case of a severe accident at a nuclear risk site (NRS). The main focus of the paper is the methodology for the evaluation of the atmospheric transport and deposition of radioactive pollutants from NRSs. The method developed for this evaluation takes a probabilistic point of view. The main question we are trying to answer is: What is the probability of radionuclide atmospheric transport and impact on different neighbouring regions and countries in case of an accident at an NPP? To answer this question we applied a number of different tools: (i) Trajectory Modelling - to calculate multiyear forward trajectories originating over the locations of selected risk sites; (ii) Dispersion Modelling - for long-term simulation and case studies of radionuclide transport from hypothetical accidental releases at NRSs; (iii) Cluster Analysis - to identify atmospheric transport pathways from NRSs; (iv) Probability Fields Analysis - to construct annual, monthly, and seasonal NRS impact indicators to identify the most impacted geographical regions; (v) Specific Case Studies - to estimate consequences for the environment and the populations after a hypothetical accident; (vi) Vulnerability Evaluation to Radioactive Deposition - to describe its persistence in the ecosystems with a focus on the transfer of certain radionuclides into the food chains of key importance for the intake and exposure of the whole population and of certain population groups; (vii) Risk Evaluation and Mapping - to analyse socio-economical consequences for different geographical areas and various population groups, taking into account socio-geophysical factors and probabilities, and using demographic databases based on GIS analysis.

  5. Hybrid methods for cybersecurity analysis :

    SciTech Connect

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  6. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  7. Application of advanced reliability methods to local strain fatigue analysis

    NASA Technical Reports Server (NTRS)

    Wu, T. T.; Wirsching, P. H.

    1983-01-01

    When design factors are considered as random variables and the failure condition cannot be expressed by a closed form algebraic inequality, computations of risk (or probability of failure) might become extremely difficult or very inefficient. This study suggests using a simple, and easily constructed, second degree polynomial to approximate the complicated limit state in the neighborhood of the design point; a computer analysis relates the design variables at selected points. Then a fast probability integration technique (i.e., the Rackwitz-Fiessler algorithm) can be used to estimate risk. The capability of the proposed method is demonstrated in an example of a low cycle fatigue problem for which a computer analysis is required to perform local strain analysis to relate the design variables. A comparison of the performance of this method is made with a far more costly Monte Carlo solution. Agreement of the proposed method with Monte Carlo is considered to be good.
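
    A minimal version of the Monte Carlo benchmark mentioned above, for the simple limit state g = R - S with independent normal strength R and load S (a case where the exact answer, the normal CDF evaluated at minus the reliability index, is known in closed form), might look like this; the distribution parameters are illustrative, not taken from the fatigue example.

```python
import math
import random

def mc_failure_prob(mu_r, sd_r, mu_s, sd_s, n=100_000, seed=1):
    """Monte Carlo estimate of P(R - S < 0) for normal R and S."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if rng.gauss(mu_r, sd_r) - rng.gauss(mu_s, sd_s) < 0.0)
    return fails / n

def exact_failure_prob(mu_r, sd_r, mu_s, sd_s):
    """Closed-form P(R - S < 0): Phi(-beta) with reliability index beta."""
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)
```

    For realistic limit states that require a computer analysis per sample, each Monte Carlo trial becomes expensive, which is exactly why the paper's fast probability integration approach is attractive.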

  8. Technical Overview of Ecological Risk Assessment - Analysis Phase: Exposure Characterization

    EPA Pesticide Factsheets

    Exposure Characterization is the second major component of the analysis phase of a risk assessment. For a pesticide risk assessment, the exposure characterization describes the potential or actual contact of a pesticide with a plant, animal, or media.

  9. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    DTIC Science & Technology

    2005-05-01

    aging gate structures at dam spillways, there is an increasing risk of potential dam failures due to gate inoperability, malfunction, or under-design...method uses probabilities for more events defined more precisely than in standard practice, and adds criticality analysis to rank each of the potential ...a combination of the two. One method defined by Boeing Systems (1998) classifies failure modes according to the three levels defined below in

  10. Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.

    PubMed

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A

    2015-11-01

    Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. Heterogeneity was explored using stratified meta-analyses and
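
    The overestimation discussed above is easy to reproduce on toy data: when deaths are treated as censoring, the Kaplan-Meier complement exceeds the competing-risks (Aalen-Johansen) cumulative incidence of revision. A sketch, with hypothetical event codes and patients:

```python
def km_vs_cif(times, events):
    """events: 1 = revision, 2 = death (competing risk), 0 = censored.
    Returns (1 - KM survival with deaths treated as censoring,
             Aalen-Johansen cumulative incidence of revision)."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0     # probability of being event-free just before t
    km_surv = 1.0  # KM "survival" when deaths are censored
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_rev = d_death = removed = 0
        while i < len(data) and data[i][0] == t:
            e = data[i][1]
            d_rev += e == 1
            d_death += e == 2
            removed += 1
            i += 1
        cif += surv * d_rev / at_risk          # Aalen-Johansen increment
        surv *= 1.0 - (d_rev + d_death) / at_risk
        km_surv *= 1.0 - d_rev / at_risk       # deaths only shrink risk set
        at_risk -= removed
    return 1.0 - km_surv, cif
```

    On a four-patient example where two patients die before revision, the KM method reports a 100% cumulative revision risk while the cumulative incidence is 50%, illustrating the direction of the bias the meta-analysis quantifies.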

  11. New Method For Classification of Avalanche Paths With Risks

    NASA Astrophysics Data System (ADS)

    Rapin, François

    After the Chamonix-Montroc avalanche event in February 1999, the French Ministry of the Environment commissioned a new examination of the "sensitive avalanche paths", i.e. sites with stakes (in particular habitat) whose behaviour cannot be apprehended in a simple way. The objective was to establish a tool, a method, making it possible to identify these sites and to rank them according to the risk they generate, in order to best allocate public-policy efforts later on. The proposed tool is based only on objective and quantifiable criteria that are, a priori, relatively quick to obtain. These criteria are gathered in 4 groups: vulnerability concerned, morphology of the site, known avalanche history, and snow climatology. Each selected criterion is assigned a "weight", according to the group to which it belongs and relative to the other criteria. The tool thus makes it possible to classify sites subject to avalanche risk into a grid of three dangerousness levels: - low sensitivity: a priori the site does not deserve a particular avalanche study; - doubtful sensitivity: the site may deserve a study specifying the avalanche risk; - strong sensitivity: the site deserves a thorough study of the avalanche risk. Depending on the conclusions of these studies, existing prevention and risk-management measures (zoning, protection, alert, rescue) will be reviewed and supplemented as needed. Applying the method in no way requires redoing a thorough avalanche risk study that already exists. A priori, fewer than ten percent of the paths will fall in the strong-sensitivity class. The present method is thus a new decision-support tool for the first phase of identifying and classifying avalanche sites according to the risk they generate.
    To be accepted and used under good conditions, this tool was worked out by the search for

  12. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. 
The engineering
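
    One common way to construct such a cost risk curve (not necessarily the parametric method of this paper, whose description is truncated above) is Monte Carlo sampling over per-element cost ranges. In the sketch below each cost element carries a hypothetical triangular (low, mode, high) estimate, and the sampled totals yield cost-versus-probability percentiles:

```python
import random

def cost_risk_curve(elements, n=50_000, seed=7):
    """elements: list of (low, mode, high) triangular cost estimates.
    Returns a function mapping a probability level p to the cost value
    that the total project cost stays below with probability ~p."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in elements)
        for _ in range(n))
    def percentile(p):
        return totals[min(n - 1, int(p * n))]
    return percentile
```

    percentile(0.5) gives the median predicted cost, and the spread between, say, percentile(0.2) and percentile(0.8) expresses the confidence band that a single point estimate cannot convey.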

  13. Environmental risk analysis for indirect coal liquefaction

    SciTech Connect

    Barnthouse, L.W.; Suter, G.W. II; Baes, C.F. III; Bartell, S.M.; Cavendish, M.G.; Gardner, R.H.; O'Neill, R.V.; Rosen, A.E.

    1985-01-01

    This report presents an analysis of the risks to fish, water quality (due to noxious algal blooms), crops, forests, and wildlife of two technologies for the indirect liquefaction of coal: Lurgi and Koppers-Totzek gasification of coal for Fischer-Tropsch synthesis. A variety of analytical techniques were used to make maximum use of the available data to consider effects of effluents on different levels of ecological organization. The most significant toxicants to fish were found to be ammonia, cadmium, and acid gases. An analysis of whole-effluent toxicity indicated that the Lurgi effluent is more acutely toxic than the Koppers-Totzek effluent. Six effluent components appear to pose a potential threat of blue-green algal blooms, primarily because of their effects on higher trophic levels. The most important atmospheric emissions with respect to crops, forests, and wildlife were found to be the conventional combustion products SO/sub 2/ and NO/sub 2/. Of the materials deposited on the soil, arsenic, cadmium, and nickel appear of greatest concern for phytotoxicity. 147 references, 5 figures, 41 tables.

  14. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  15. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  16. Comparison of risk assessment methods: multiple perspectives of flood and avalanche hazards in North East France

    NASA Astrophysics Data System (ADS)

    Giacona, Florie; Eleuterio, Julian

    2010-05-01

    Mountainous areas are exposed to several natural hazards such as snow avalanches, debris flows or floods. Such processes may be more frequent and intense in high mountains, but they occur in medium-high mountains as well, causing loss of life and material damage. The Vosges range, a medium-high mountain range located in the north-east of France, is thus concerned by two kinds of natural hazards, namely avalanches and floods. While avalanches constitute the deadliest natural risk in Alsace, their management is paradoxically not a priority. Because it causes more material damage and affects larger areas, with multiple and complex consequences, flood risk is more worrying for administrators, who have not taken the same approach toward the two kinds of risk. Two different risk-assessment approaches and two case studies are therefore presented: flood risk on the river Bruche (located in the north of the Vosges range, Alsace) and avalanche risk in the Vosges range. The first is mainly focused on economic aspects of risk; flood risk analyses are discussed from a hydro-economical perspective. The second focuses the analysis on human, material and environmental vulnerabilities; avalanche risk is discussed from a geo-historical point of view. About 300 avalanche events have been reported since the end of the 18th century. The two approaches illustrate the complementarity of the human and physical sciences for improving the understanding and assessment of hazardous processes in a medium-high mountain range. On the one hand, the geo-historical method developed for avalanche risk could be extended to flood hazard; indeed, contrary to high mountains, no service is in charge of a systematic inventory of floods and avalanches in the Vosges mountains, and the geo-historical approach could address this lack of data. On the other hand, the methods of damage assessment and vulnerability characterization could be a good tool for the human sciences.

  17. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

    The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

  18. Studies of cancer risk among Chernobyl liquidators: materials and methods.

    PubMed

    Kesminiene, A; Cardis, E; Tenet, V; Ivanov, V K; Kurtinaitis, J; Malakhova, I; Stengrevics, A; Tekkel, M

    2002-09-01

    The current paper presents the methods and design of two case-control studies among Chernobyl liquidators, one of leukaemia and non-Hodgkin lymphoma, the other of thyroid cancer risk, carried out in Belarus, Estonia, Latvia, Lithuania and Russia. The specific objective of these studies is to estimate the radiation-induced risk of these diseases among liquidators of the Chernobyl accident and, in particular, to study the effect of exposure protraction and radiation type on the risk of radiation-induced cancer in the low-to-medium (0-500 mSv) radiation dose range. The study population consists of the approximately 10,000 Baltic, 40,000 Belarusian and 51,000 Russian liquidators who worked in the 30 km zone in 1986-1987 and who were registered in the Chernobyl registry of these countries. The studies included cases diagnosed in 1993-1998 for all countries but Belarus, where the study period was extended until 2000. Four controls were selected in each country from the national cohort for each case, matched on age, gender and region of residence. Information on study subjects was obtained through face-to-face interviews using a standardised questionnaire with questions on demographic factors; time, place and conditions of work as a liquidator; and potential risk and confounding factors for the tumours of interest. Overall, after consent was obtained, 136 cases and 595 controls were included in the studies. A method of analytical dose reconstruction has been developed, validated and applied to the estimation of doses and related uncertainties for all subjects in the study. Dose-response analyses are under way, and the results are likely to have important implications for assessing the adequacy of existing protection standards, which are based on risk estimates derived from analyses of the mortality of atomic bomb survivors and other high-dose studies.

  19. The utility assessment method order influences measurement of parents' risk attitude.

    PubMed

    Finnell, S Maria E; Carroll, Aaron E; Downs, Stephen M

    2012-01-01

Standard gamble (SG) and time trade-off (TTO) are two methods used for obtaining health utility values (utilities). Whether the order in which the methods are applied alters the relative utilities obtained by each method is unknown. We sought to determine whether the order in which SG and TTO utilities were obtained affects the relative values of the utilities obtained by each technique. Utilities were assessed for 29 health states from 4016 parents by using SG and TTO. The assessment order was randomized by respondent. For analysis by health state, we calculated (SG - TTO) for each assessment and tested whether the SG - TTO difference was significantly different between the two groups (SG first and TTO first). For analysis by individual, we calculated a risk-posture coefficient, γ, defined by the utility curve SG = TTO^γ. We predicted γ through regression analysis with the covariates: child age, child sex, birth order, respondent age, respondent education level, and assessment method order. In 19 of 29 health states, the SG - TTO difference was significantly greater (more risk averse) when TTO was assessed first. In the regression analysis, "child age" and "assessment method order" were significant predictors of risk attitude. The risk posture coefficient γ was higher (more risk-seeking) with increasing child age and in the SG-first respondents. The order in which the SG versus TTO method is used strongly influences the relative values of the utilities obtained. Copyright © 2012 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
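As a minimal sketch of the risk-posture idea, γ can be solved pointwise from a single SG/TTO pair by taking logarithms of SG = TTO^γ (the function name and example utilities below are illustrative; the study itself estimated γ by regression over many respondents):

```python
import math

def risk_posture_gamma(sg: float, tto: float) -> float:
    """Solve SG = TTO**gamma for gamma; both utilities must lie strictly in (0, 1)."""
    if not (0.0 < sg < 1.0 and 0.0 < tto < 1.0):
        raise ValueError("utilities must be strictly between 0 and 1")
    return math.log(sg) / math.log(tto)

# gamma < 1 when SG exceeds TTO (risk-averse); gamma > 1 when SG falls below TTO (risk-seeking)
print(round(risk_posture_gamma(0.90, 0.80), 3))  # 0.472
print(round(risk_posture_gamma(0.70, 0.80), 3))  # 1.598
```

When SG and TTO agree exactly, γ = 1, the risk-neutral case.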

  20. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  1. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  2. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  3. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  4. The Use and Abuse of Risk Analysis in Policy Debate.

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    The best check on the preposterous claims of crisis rhetoric is an appreciation of the nature of risk analysis and how it functions in argumentation. The use of risk analysis is common in policy debate. While the stock issues paradigm focused the debate exclusively on the affirmative case, the advent of policy systems analysis has transformed…

  5. Risk Analysis from a Top-Down Perspective

    DTIC Science & Technology

    1983-07-15

and focused studies in critical areas. A variety of analyses, such as a localized version of the bottom-up risk analysis approach and sensitivity...analysis, focus on these open-ended cases to resolve them. Unresolvable decision conflicts include value judgments which risk analysis cannot solve

  6. Methods to evaluate the nutrition risk in hospitalized patients

    PubMed Central

    Erkan, Tülay

    2014-01-01

The rate of malnutrition is substantially high both in the general population and among patients hospitalized for various reasons. The rate of patients with no marked malnutrition at the time of hospitalization who develop malnutrition during their stay is also substantially high. Different screening methods with different targets therefore currently exist to prevent malnutrition from being overlooked. These methods should be simple and reliable, and should not be time-consuming, in order to be usable in daily practice. Seven nutrition risk screening methods for use in children have been established to date. However, in contrast to adults, no consensus has been reached on any one method. Assessment of nutrition should be accepted as part of the normal examination in order to increase awareness of, and draw attention to, this issue. PMID:26078678

  7. Comparison of risk assessment procedures used in OCRA and ULRA methods

    PubMed Central

    Roman-Liu, Danuta; Groborz, Anna; Tokarski, Tomasz

    2013-01-01

The aim of this study was to analyse the convergence of two methods by comparing exposure and the assessed risk of developing musculoskeletal disorders at 18 repetitive-task workstations. The already established occupational repetitive actions (OCRA) method and the recently developed upper limb risk assessment (ULRA) method produce correlated results (R = 0.84, p = 0.0001). A discussion of the factors that influence the values of the OCRA index and ULRA's repetitive task indicator shows that both similarities and differences in the results produced by the two methods can arise from the concepts that underlie them. The assessment procedure and the mathematical calculations applied to the basic parameters are crucial to the results of risk assessment; the way the basic parameters are defined influences the assessment of exposure and risk to a lesser degree. The analysis also showed that large differences in load indicator values do not always translate into different risk zones. Practitioner Summary: We focused on comparing methods that, even though based on different concepts, serve the same purpose. The results proved that different methods with different assumptions can produce similar assessments of upper limb load; sharp criteria in risk assessment are not the best solution. PMID:24041375
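The convergence claim rests on the correlation between the two indices; a self-contained Pearson correlation over paired workstation scores can be sketched as follows (the data are invented for illustration, not the study's 18 workstations):

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists of scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical OCRA index vs. ULRA repetitive-task indicator pairs
ocra = [2.1, 3.5, 5.0, 7.2, 9.8, 12.4]
ulra = [1.0, 1.8, 2.9, 3.5, 5.2, 6.0]
print(round(pearson_r(ocra, ulra), 3))
```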

  8. Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study

    PubMed Central

    Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.

    2012-01-01

    Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501

  9. Multiobjective Risk Partitioning: An Application to Dam Safety Risk Analysis

    DTIC Science & Technology

    1988-04-01

expectation distorts, and almost eliminates, the distinctive features of many viable alternative policy options that could lead to the reduction of the risk...height of the dam) from 20 to 30 million dollars would contribute to a negligible reduction of 0.1 units of conventional (unconditional) expected social...results could be easily influenced by either a change in the return period of the PMH or by the choice of the distribution. Therefore, it is

  10. Methods and Techniques for Risk Prediction of Space Shuttle Upgrades

    NASA Technical Reports Server (NTRS)

    Hoffman, Chad R.; Pugh, Rich; Safie, Fayssal

    1998-01-01

    Since the Space Shuttle Accident in 1986, NASA has been trying to incorporate probabilistic risk assessment (PRA) in decisions concerning the Space Shuttle and other NASA projects. One major study NASA is currently conducting is in the PRA area in establishing an overall risk model for the Space Shuttle System. The model is intended to provide a tool to predict the Shuttle risk and to perform sensitivity analyses and trade studies including evaluation of upgrades. Marshall Space Flight Center (MSFC) and its prime contractors including Pratt and Whitney (P&W) are part of the NASA team conducting the PRA study. MSFC responsibility involves modeling the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). A major challenge that faced the PRA team is modeling the shuttle upgrades. This mainly includes the P&W High Pressure Fuel Turbopump (HPFTP) and the High Pressure Oxidizer Turbopump (HPOTP). The purpose of this paper is to discuss the various methods and techniques used for predicting the risk of the P&W redesigned HPFTP and HPOTP.

  11. Technical Risk Analysis - Exploiting the Power of MBSE

    DTIC Science & Technology

    2012-11-01

UNCLASSIFIED DSTO-GD-0734 18. Technical Risk Analysis – Exploiting the Power of MBSE – Despina Tramoundanis1, Wayne Power1 and Daniel Spencer2...Functional Risk Analysis (FRA) conducted within a Model Based Systems Engineering (MBSE) environment. FRA is a rigorous technique used to explore potential...TITLE AND SUBTITLE Technical Risk Analysis – Exploiting the Power of MBSE – 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6

  12. Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection

    DTIC Science & Technology

    2007-01-01

    COVERED 00-00-2007 to 00-00-2007 4. TITLE AND SUBTITLE Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection 5a...meet high standards for re- search quality and objectivity. Terrorism Risk Modeling for Intelligence Analysis and Infrastructure Protection Henry H...across differ- ent urban areas, to assess terrorism risks within a metropolitan area, and to target intelligence analysis and collection efforts. The

  13. Component outage data analysis methods. Volume 2: Basic statistical methods

    NASA Astrophysics Data System (ADS)

    Marshall, J. A.; Mazumdar, M.; McCutchan, D. A.

    1981-08-01

    Statistical methods for analyzing outage data on major power system components such as generating units, transmission lines, and transformers are identified. The analysis methods produce outage statistics from component failure and repair data that help in understanding the failure causes and failure modes of various types of components. Methods for forecasting outage statistics for those components used in the evaluation of system reliability are emphasized.
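A minimal sketch of the kind of outage statistics such an analysis starts from (the data are illustrative, and the report itself goes much further, including forecasting) could look like:

```python
def outage_statistics(repair_hours, exposure_hours):
    """Basic outage statistics for one component class.
    repair_hours: duration of each outage (h); exposure_hours: total in-service exposure (h)."""
    n = len(repair_hours)
    downtime = sum(repair_hours)
    failure_rate = n / exposure_hours         # failures per hour of exposure
    mean_repair_time = downtime / n           # mean time to repair (h)
    unavailability = downtime / exposure_hours  # fraction of time out of service
    return failure_rate, mean_repair_time, unavailability

# Hypothetical record: three outages over one year of exposure
rate, mttr, u = outage_statistics([10.0, 20.0, 30.0], 8760.0)
print(mttr)         # 20.0
print(round(u, 5))  # 0.00685
```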

  14. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  15. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  16. Geotechnical risk analysis by flat dilatometer (DMT)

    NASA Astrophysics Data System (ADS)

    Amoroso, Sara; Monaco, Paola

    2015-04-01

    In the last decades we have assisted at a massive migration from laboratory testing to in situ testing, to the point that, today, in situ testing is often the major part of a geotechnical investigation. The State of the Art indicates that direct-push in situ tests, such as the Cone Penetration Test (CPT) and the Flat Dilatometer Test (DMT), are fast and convenient in situ tests for routine site investigation. In most cases the DMT estimated parameters, in particular the undrained shear strength su and the constrained modulus M, are used with the common design methods of Geotechnical Engineering for evaluating bearing capacity, settlements etc. The paper focuses on the prediction of settlements of shallow foundations, that is probably the No. 1 application of the DMT, especially in sands, where undisturbed samples cannot be retrieved, and on the risk associated with their design. A compilation of documented case histories that compare DMT-predicted vs observed settlements, was collected by Monaco et al. (2006), indicating that, in general, the constrained modulus M can be considered a reasonable "operative modulus" (relevant to foundations in "working conditions") for settlement predictions based on the traditional linear elastic approach. Indeed, the use of a site investigation method, such as DMT, that improve the accuracy of design parameters, reduces risk, and the design can then center on the site's true soil variability without parasitic test variability. In this respect, Failmezger et al. (1999, 2015) suggested to introduce Beta probability distribution, that provides a realistic and useful description of variability for geotechnical design problems. The paper estimates Beta probability distribution in research sites where DMT tests and observed settlements are available. References Failmezger, R.A., Rom, D., Ziegler, S.R. (1999). "SPT? 
A better approach of characterizing residual soils using other in-situ tests", Behavioral Characteristics of Residual Soils, B
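The Beta distribution mentioned above is attractive for design parameters precisely because it is bounded; a method-of-moments fit on an assumed interval can be sketched as follows (this is an illustration of the general idea, not Failmezger's actual procedure, and the modulus bounds are invented):

```python
def beta_from_moments(mean, var, lo=0.0, hi=1.0):
    """Method-of-moments estimates of the Beta shape parameters (alpha, beta)
    for a variable bounded on [lo, hi] with the given mean and variance."""
    m = (mean - lo) / (hi - lo)      # standardized mean in (0, 1)
    v = var / (hi - lo) ** 2         # standardized variance
    if not (0.0 < m < 1.0) or not (0.0 < v < m * (1.0 - m)):
        raise ValueError("moments incompatible with a Beta distribution")
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

# e.g. a constrained modulus M assumed bounded between 10 and 50 MPa
a, b = beta_from_moments(30.0, 20.0, lo=10.0, hi=50.0)
print(round(a, 6), round(b, 6))  # approximately 9.5 9.5
```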

  17. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    SciTech Connect

    Tortorelli, J.P.

    1995-08-01

A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  18. Quantitative risk analysis for landslides -- Examples from Bíldudalur, NW-Iceland

    NASA Astrophysics Data System (ADS)

    Bell, R.; Glade, T.

    2004-03-01

    Although various methods to carry out quantitative landslide risk analyses are available, applications are still rare and mostly dependent on the occurrence of disasters. In Iceland, two catastrophic snow avalanches killed 34 people in 1995. As a consequence the Ministry of the Environment issued a new regulation on hazard zoning due to snow avalanches and landslides in 2000, which aims to prevent people living or working within the areas most at risk until 2010. The regulation requires to carry out landslide and snow avalanche risk analyses, however, a method to calculate landslide risk adopted to Icelandic conditions is still missing. Therefore, the ultimate goal of this study is to develop such a method for landslides, focussing on debris flows and rock falls and to test it in Bíldudalur, NW-Iceland. Risk analysis, beside risk evaluation and risk management, is part of the holistic concept of risk assessment. Within this study, risk analysis is considered only, focussing on the risks to life. To calculate landslide risk, the spatial and temporal probability of occurrence of potential damaging events, as well as the distribution of the elements at risk in space and time, considering also changing vulnerabilities, must be determined. Within this study, a new raster-based approach is developed. Thus, all existent vector data are transferred into raster data using a resolution of 1m x 1m. The specific attribute data are attributed to the grid cells, resulting in specific raster data layers for each input parameter. The calculation of the landslide risk follows a function of the input parameters hazard, damage potential of the elements at risk, vulnerability, probability of the spatial impact, probability of the temporal impact and probability of the seasonal occurrence. Finally, results are upscaled to a resolution of 20m x 20m and are presented as individual risk to life and object risk to life for each process. 
Within the quantitative landslide risk analysis the

  19. a Meteorological Risk Assessment Method for Power Lines Based on GIS and Multi-Sensor Integration

    NASA Astrophysics Data System (ADS)

    Lin, Zhiyong; Xu, Zhimin

    2016-06-01

Power lines, exposed to the natural environment, are vulnerable to many kinds of meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line and lacks a comprehensive evaluation of the combined effects of multiple meteorological factors. In this paper, we use meteorological monitoring data obtained from multiple sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate a meteorological raster map from discrete meteorological monitoring data using spatial interpolation. Secondly, an expert-scoring-based analytic hierarchy process is used to compute the power line risk index for each kind of meteorological condition and to establish a mathematical model of meteorological risk. By applying this model in the raster calculator of ArcGIS, we obtain a raster map showing the overall meteorological risk for power lines. Finally, by overlaying the power line buffer layer on that raster map, we can determine the risk index around any given section of a power line, which provides significant guidance for power line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province of China (wind, lightning, rain, ice and temperature), we carried out the meteorological risk analysis for real power lines, and the experimental results demonstrate the feasibility and validity of the proposed method.
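The per-cell risk index described above reduces to a weighted sum of normalized factor scores; a sketch with hypothetical AHP weights (the real weights come from the expert-scoring process, and the cell values are invented) is:

```python
def cell_risk_index(factor_scores, weights):
    """Weighted combination of normalized per-factor risk scores for one raster cell.
    Both arguments are dicts keyed by factor name; the weights should sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[name] * factor_scores[name] for name in weights)

# Hypothetical AHP weights for the five factors named in the abstract
weights = {"wind": 0.30, "lightning": 0.25, "rain": 0.20, "ice": 0.15, "temperature": 0.10}
cell = {"wind": 0.8, "lightning": 0.4, "rain": 0.6, "ice": 0.2, "temperature": 0.1}
print(round(cell_risk_index(cell, weights), 3))  # 0.5
```

In a GIS this computation runs once per raster cell; the map-algebra expression in the ArcGIS raster calculator plays exactly this role.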

  20. Automating Risk Analysis of Software Design Models

    PubMed Central

    Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  1. Risk analysis for Arctic offshore operations

    SciTech Connect

    Slomski, S.; Vivatrat, V.

    1986-04-01

Offshore exploration for hydrocarbons is being conducted in the near-shore regions of the Beaufort Sea. This activity is expected to be intensified and expanded into the deeper portions of the Beaufort, as well as into the Chukchi Sea. The ice conditions in the Beaufort Sea are very variable, particularly in the deeper water regions. This variability greatly influences the probability of success or failure of an offshore operation. For example, a summer exploratory program conducted from a floating drilling unit may require a period of 60 to 100 days on station. The success of such a program depends on: (a) the time when the winter ice conditions deteriorate sufficiently for the drilling unit to move on station; (b) the number of summer invasions by the arctic ice pack, forcing the drilling unit to abandon station; (c) the rate at which first-year ice grows to the ice thickness limit of the supporting icebreakers; and (d) the extent of arctic pack expansion during the fall and early winter. In general, the ice conditions are so variable that, even with good planning, the chance of failure of an offshore operation will not be negligible. Contingency planning for such events is therefore necessary. This paper presents a risk analysis procedure which can greatly benefit the planning of an offshore operation. A floating drilling program and a towing and installation operation for a fixed structure are considered to illustrate the procedure.

  2. Automating risk analysis of software design models.

    PubMed

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  3. Probabilistic risk analysis of groundwater remediation strategies

    NASA Astrophysics Data System (ADS)

    Bolster, D.; Barahona, M.; Dentz, M.; Fernandez-Garcia, D.; Sanchez-Vila, X.; Trinchero, P.; Valhondo, C.; Tartakovsky, D. M.

    2009-06-01

    Heterogeneity of subsurface environments and insufficient site characterization are some of the reasons why decisions about groundwater exploitation and remediation have to be made under uncertainty. A typical decision maker chooses between several alternative remediation strategies by balancing their respective costs with the probability of their success or failure. We conduct a probabilistic risk assessment (PRA) to determine the likelihood of the success of a permeable reactive barrier, one of the leading approaches to groundwater remediation. While PRA is used extensively in many engineering fields, its applications in hydrogeology are scarce. This is because rigorous PRA requires one to quantify structural and parametric uncertainties inherent in predictions of subsurface flow and transport. We demonstrate how PRA can facilitate a comprehensive uncertainty quantification for complex subsurface phenomena by identifying key transport processes contributing to a barrier's failure, each of which is amenable to uncertainty analysis. Probability of failure of a remediation strategy is computed by combining independent and conditional probabilities of failure of each process. Individual probabilities can be evaluated either analytically or numerically or, barring both, can be inferred from expert opinion.
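For the independent part of such a combination, the strategy fails if any one contributing process fails; a minimal series-system sketch (the paper also treats conditional probabilities, which this deliberately ignores, and the per-process values are hypothetical) is:

```python
def strategy_failure_probability(process_failure_probs):
    """Probability that at least one of several independent processes fails
    (a series-system view of a remediation strategy)."""
    p_all_survive = 1.0
    for p in process_failure_probs:
        p_all_survive *= (1.0 - p)
    return 1.0 - p_all_survive

# Hypothetical per-process failure probabilities for a permeable reactive barrier
print(round(strategy_failure_probability([0.10, 0.20, 0.05]), 3))  # 0.316
```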

  4. [Quantitative methods of cancer risk assessment in exposure to chemicals].

    PubMed

    Szymczak, Wiesław

    2009-01-01

This is a methodology paper: it contains a review and comparison of different quantitative risk assessment methods. Two aspects of cancer risk modelling are discussed here. 1. When there is only one effective dose. Two models were compared in this evaluation: one proposed by the Dutch Expert Committee on Occupational Standards, and a classical two-stage model. Both models take into account that the animals were exposed for less than two years; the exposure period and the study period of the animals are considered in the Dutch methodology. If the exposure measure used is the average lifespan dose estimated with different coefficients of exposure time in an experiment, we get two different dose-response models, and each of them will create a different human risk model. There is no criterion that would let us assess which of them is better. 2. Many models are used in the BenchMark Dose (BMD) method, but there is no criterion that allows the best model to be chosen objectively. In this paper a classical two-stage model and three BMD models (two-stage, Weibull and linear) were fitted to particular data. Very small differences between the models were observed; the differences were insignificant relative to the uncertainties in the risk modelling. The possibility of choosing one model from a larger set of models is the greatest benefit of this comparison. If the examined chemical is a genotoxic carcinogen, nothing more is needed than to estimate the threshold value.
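As one hedged illustration of the BMD idea, the dose at which the extra risk of a two-stage (multistage) model reaches a chosen benchmark response can be found by bisection; the coefficients below are invented, not fitted to the paper's data:

```python
import math

def two_stage_p(d, q0, q1, q2):
    """Two-stage (multistage) dose-response model: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2))."""
    return 1.0 - math.exp(-(q0 + q1 * d + q2 * d * d))

def extra_risk(d, q0, q1, q2):
    """Extra risk over background: (P(d) - P(0)) / (1 - P(0))."""
    p0 = two_stage_p(0.0, q0, q1, q2)
    return (two_stage_p(d, q0, q1, q2) - p0) / (1.0 - p0)

def benchmark_dose(bmr, q0, q1, q2, lo=0.0, hi=1000.0, tol=1e-9):
    """Dose where the extra risk equals bmr, found by bisection on the monotone curve."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid, q0, q1, q2) < bmr:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# With q2 = 0 the BMD for a 10% extra risk is -ln(0.9)/q1 in closed form
print(round(benchmark_dose(0.10, 0.01, 0.1, 0.0), 4))  # 1.0536
```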

  5. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
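A minimal sketch of such a scoring step follows. Note the assumptions: the abstract states only that R is calculated from the P and C scores, so R = P × C is an assumed combination rule; the thresholds are illustrative, and D is scored in the usual FMEA convention (higher = harder to detect):

```python
def assess_step(p, c, d, r_threshold=25, d_threshold=7):
    """Score one process step: P (probability), C (consequence) and D (detection)
    on 10-point scales. Returns the risk score R and whether the predetermined
    criteria call for a more detailed analysis of this failure mode."""
    for score in (p, c, d):
        if not 1 <= score <= 10:
            raise ValueError("scores must be on a 1-10 scale")
    r = p * c                                  # assumed combination rule, not from the paper
    needs_detailed_analysis = r >= r_threshold or d >= d_threshold
    return r, needs_detailed_analysis

print(assess_step(5, 6, 3))  # (30, True)
print(assess_step(2, 3, 2))  # (6, False)
```

Recording one such tuple per (process step, failure type) pair reproduces the matrix-table output the paper describes.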

  6. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009.
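The nonparametric survival estimate the authors adopt can be sketched with a bare-bones Kaplan-Meier implementation over mission distances (the data are invented; the real analysis also derives the variance and confidence limits):

```python
def kaplan_meier(missions):
    """Kaplan-Meier survival curve. missions: list of (distance_km, lost) pairs,
    where lost=True marks an AUV loss and lost=False a censored (completed) mission.
    Returns [(distance, survival_probability)] at each loss distance."""
    missions = sorted(missions)
    n_at_risk = len(missions)
    survival = 1.0
    curve = []
    i = 0
    while i < len(missions):
        d = missions[i][0]
        at_d = [lost for dist, lost in missions if dist == d]
        losses = sum(at_d)
        if losses:
            survival *= (n_at_risk - losses) / n_at_risk
            curve.append((d, survival))
        n_at_risk -= len(at_d)
        i += len(at_d)
    return curve

print(kaplan_meier([(10, True), (20, False), (30, True), (40, False)]))
# [(10, 0.75), (30, 0.375)]
```

Censored missions drop out of the risk set without forcing a step in the curve, which is exactly why the estimator suits fault histories in which most missions end successfully.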

  7. Pesticide residues in cashew apple, guava, kaki and peach: GC-μECD, GC-FPD and LC-MS/MS multiresidue method validation, analysis and cumulative acute risk assessment.

    PubMed

    Jardim, Andréia Nunes Oliveira; Mello, Denise Carvalho; Goes, Fernanda Caroline Silva; Frota Junior, Elcio Ferreira; Caldas, Eloisa Dutra

    2014-12-01

    A multiresidue method for the determination of 46 pesticides in fruits was validated. Samples were extracted with acidified ethyl acetate, MgSO4 and CH3COONa and cleaned up by dispersive SPE with PSA. The compounds were analysed by GC-FPD, GC-μECD or LC-MS/MS, with LOQs from 1 to 8 μg/kg. The method was used to analyse 238 kaki, cashew apple, guava, and peach fruit and pulp samples, which were also analysed for dithiocarbamates (DTCs) using a spectrophotometric method. Over 70% of the samples were positive, with DTC present in 46.5%, λ-cyhalothrin in 37.1%, and omethoate in 21.8% of the positive samples. GC-MS/MS confirmed the identities of the compounds detected by GC. None of the pesticides found in kaki, cashew apple and guava was authorised for these crops in Brazil. The risk assessment has shown that the cumulative acute intake of organophosphorus or pyrethroid compounds from the consumption of these fruits is unlikely to pose a health risk to consumers.
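One common way to express a cumulative acute assessment for compounds sharing a mechanism (e.g. organophosphates), as in the study above, is a hazard index. The residue levels, consumption figure and acute reference doses (ARfDs) below are invented, not the paper's data.

```python
# Hazard index sketch: sum of (acute intake / acute reference dose) over
# compounds with a common mechanism. HI < 1 suggests the combined acute
# intake is unlikely to pose a health risk. All values are hypothetical.

def hazard_index(residues_mg_per_kg, consumption_kg, body_weight_kg,
                 arfd_mg_per_kg_bw):
    hi = 0.0
    for compound, residue in residues_mg_per_kg.items():
        intake = residue * consumption_kg / body_weight_kg  # mg/kg body weight
        hi += intake / arfd_mg_per_kg_bw[compound]
    return hi

hi = hazard_index(
    {"omethoate": 0.02, "acephate": 0.05},   # residues in the fruit, mg/kg
    consumption_kg=0.2, body_weight_kg=60,
    arfd_mg_per_kg_bw={"omethoate": 0.002, "acephate": 0.1},
)
```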

  8. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach to risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically, the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility, which defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = p(D >= d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), and 2) follow a performance approach based on given loss or damage thresholds. The proposed method is semi-empirical: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Only two landslide properties are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions), assessed via damage surveys and card compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). 
The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
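The hazard-fragility convolution described above can be sketched numerically: the damage probability is the sum, over intensity levels, of the hazard probability mass times the fragility at that intensity. The intensity classes and probabilities below are invented for illustration.

```python
# Damage probability as a discrete convolution of hazard and fragility:
# P(D >= d) = sum_s P(S = s) * P(D >= d | S = s). All values hypothetical.

def damage_probability(hazard_pmf, fragility):
    return sum(hazard_pmf[s] * fragility[s] for s in hazard_pmf)

# Landslide kinetic-energy classes: annual probability of each class (hazard)
# and the chance a structure reaches the 'functional' limit state (fragility).
hazard_pmf = {"low": 0.08, "medium": 0.015, "high": 0.003}
fragility = {"low": 0.05, "medium": 0.40, "high": 0.95}

p_damage = damage_probability(hazard_pmf, fragility)
```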

  9. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory is introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were represented in the framework as nodes using a Bayesian network analysis approach. This transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a mitigation plan can be determined accordingly to reduce their influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a case study, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
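A minimal two-layer Bayesian-network style calculation in the spirit of the approach above: cause nodes feed an effect node via a conditional probability table, and a node's importance is scored by the drop in overall risk when that cause is eliminated. All node names and probabilities below are invented, not the plant's data.

```python
# Marginalize a single effect node over independent binary cause nodes,
# then score cause importance by risk reduction. Hypothetical values.
from itertools import product

p_cause = {"raw_material_fault": 0.05, "sterilization_fault": 0.02}

def p_effect(p_cause, cpt):
    """Marginal probability of the effect, summing over cause combinations."""
    names = list(p_cause)
    total = 0.0
    for states in product([True, False], repeat=len(names)):
        weight = 1.0
        for name, on in zip(names, states):
            weight *= p_cause[name] if on else 1 - p_cause[name]
        total += weight * cpt[states]
    return total

# P(quality failure | raw_material_fault, sterilization_fault)
cpt = {(True, True): 0.95, (True, False): 0.60,
       (False, True): 0.70, (False, False): 0.01}

baseline = p_effect(p_cause, cpt)
# Importance of a cause: risk reduction when that cause is fully controlled.
no_raw = p_effect({**p_cause, "raw_material_fault": 0.0}, cpt)
```

Tools like GeNie and AgenaRisk automate this kind of marginalization and sensitivity ranking for much larger networks.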

  10. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  11. Spatial risk assessment for critical network infrastructure using sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Möderl, Michael; Rauch, Wolfgang

    2011-12-01

    The presented spatial risk assessment method allows for managing critical network infrastructure in urban areas under abnormal and future conditions caused, e.g., by terrorist attacks, infrastructure deterioration or climate change. For the spatial risk assessment, vulnerability maps for critical network infrastructure are merged with hazard maps for an interfering process. Vulnerability maps are generated using a spatial sensitivity analysis of network transport models to evaluate performance decrease under the investigated threat scenarios. Parameters are thereby varied according to the specific impact of a particular threat scenario. Hazard maps are generated with a geographical information system using raster data for the same threat scenario, derived from structured interviews and cluster analysis of past events. The application of the spatial risk assessment is exemplified by means of a case study for a water supply system, but the principal concept is applicable likewise to other critical network infrastructure. The aim of the approach is to help decision makers in choosing zones for preventive measures.
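The map merge described above can be sketched on a raster grid: per cell, risk is taken here as vulnerability (performance decrease under the threat) times hazard (likelihood of the threat at that cell). The 3x3 grids are invented, and the multiplicative combination is one plausible merge rule, not necessarily the paper's.

```python
# Merge a vulnerability raster with a hazard raster into a risk map and
# locate the hotspot cell. Grid values are hypothetical.

def risk_map(vulnerability, hazard):
    return [[v * h for v, h in zip(vrow, hrow)]
            for vrow, hrow in zip(vulnerability, hazard)]

vulnerability = [[0.9, 0.4, 0.1],
                 [0.5, 0.8, 0.2],
                 [0.1, 0.3, 0.7]]
hazard        = [[0.2, 0.2, 0.1],
                 [0.6, 0.5, 0.1],
                 [0.1, 0.4, 0.9]]

risk = risk_map(vulnerability, hazard)
# The highest-risk cell is a candidate zone for preventive measures.
hotspot = max((risk[i][j], (i, j)) for i in range(3) for j in range(3))
```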

  12. N Reactor external events probabilistic risk assessment using NUREG-1150 methods

    SciTech Connect

    Wang, O.S.; Baxter, J.T.; Coles, G.A.; Zentner, M.D.; Powers, T.B.; Collard, L.B.; Rainey, T.E.

    1990-01-01

    This is the first full-scope Level-III PRA completed for the DOE Category A reactor using the updated NUREG-1150 methods. The comparisons to the quantitative NRC safety objectives and DOE nuclear safety guidelines also set analytical precedent for DOE production reactors. Generally speaking, the risks of operating N Reactor are low because of a combination of factors such as low power density, large confinement volume, effective redundant scram systems and core cooling systems, remote location, etc. This work has been a major effort to evaluate the N Reactor risk using state-of-the-art PRA technology. It is believed that this PRA has resulted in realistic, or slightly conservative, results (as opposed to unduly conservative or nonconservative results). The study concluded that the risk to the public and to nearby DOE workers from the operation of N Reactor is very low. This analysis also found that N Reactor meets all the quantitative NRC safety objectives and DOE nuclear safety guidelines, and is generally as safe as, or safer than most commercial reactors in terms of societal and individual risks. The calculated risk to Hanford onsite workers is comparable to public risk from commercial reactors in the NUREG-1150 study. As a result of these low-risk estimates, only a small effort has been devoted to identifying significant risk reduction alternatives. 22 refs., 2 figs., 10 tabs.

  13. Proposal of a management method of rockfall risk induced on a road

    NASA Astrophysics Data System (ADS)

    Mignelli, C.; Peila, D.; Lo Russo, S.

    2012-04-01

    Many kilometers of roads have adjacent rock slopes that are prone to rockfall. The analysis of risks associated with these types of instabilities is a complex operation requiring the precise assessment of the hazard, the vulnerability and therefore the risk to vehicles on roads along the foothills. Engineering design of protection devices should aim to minimize risk while taking advantage of the most advanced technologies. Decision makers should be equipped with technical tools permitting them to choose the best solution within the context of local maximum acceptable risk levels. The fulfilment of safety requirements for mountainside routes in many cases involves the implementation of protective measures and devices to control and manage rockfall, and evaluating the positive effects of such measures in terms of risk reduction is of key importance. A risk analysis management procedure for roads subject to rockfall phenomena, using a specifically developed method named Rockfall risk Management (RO.MA.), is presented and discussed. The method is based on statistical tools, using as input data coming both from in situ surveys and from historical records. It is important to highlight that historical databases are often unavailable, and useful information is frequently missing because not all parameters were recorded, so an analysis based only on historical data can be difficult to develop. For this purpose a specific database collection system has been developed to provide a geotechnical and geomechanical description of the studied rockside. These parameters, together with the data collected from historical databases, define the input parameters of the RO.MA. method. Moreover, to allow the quantification of harm, monitoring data on the road collected by the road manager are required. The value of harm is proportional to the number of persons on the road (i.e. people in a vehicle) and the following traffic characteristics: type of vehicles (i.e. bicycles

  14. Global cardiovascular risk stratification: comparison of the Framingham method with the SCORE method in the Mexican population.

    PubMed

    Alcocer, Luis Antonio; Lozada, Osvaldo; Fanghänel, Guillermo; Sánchez-Reyes, Leticia; Campos-Franco, Enrique

    2011-01-01

    It is unknown whether, in the Mexican population, the Framingham model is a better system than the SCORE system for stratifying cardiovascular risk. The present study was conducted to compare risk stratification by the Framingham tables with that by SCORE in the same subjects, with the aim of recommending the most appropriate method. We analyzed a database of apparently healthy workers from the Mexico City General Hospital included in the study group "PRIT" (Prevalencia de Factores de Riesgo de Infarto del Miocardio en Trabajadores del Hospital General de México) and calculated each subject's risk simultaneously with the Framingham method and the SCORE method. It was possible to perform the risk calculation with both methods in 1990 subjects from a total of 5803 PRITHGM study participants. The SCORE method stratified 1853 patients as low risk, 133 as medium risk and 4 as high risk. The Framingham method classified 1586 subjects as low risk, 268 as medium risk and 130 as high risk. Concordance between the scales in assigning the same risk class was 98% for those classified as low risk, 19.4% for those classified as intermediate risk and only 3% for those classified as high risk. According to our results, it seems more appropriate in our country to recommend the Framingham model for calculating cardiovascular risk, because the SCORE model underestimated risk.
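The per-class concordance reported above can be sketched as follows: for each risk class, the share of subjects one method places in that class who receive the same class from the other method. The example assignments are invented, not the PRIT cohort data.

```python
# Per-class agreement between two risk classifications. Hypothetical data.

def concordance_by_class(reference, other, classes=("low", "medium", "high")):
    result = {}
    for cls in classes:
        idx = [i for i, r in enumerate(reference) if r == cls]
        agree = sum(1 for i in idx if other[i] == cls)
        result[cls] = agree / len(idx) if idx else None
    return result

score      = ["low", "low", "low", "medium", "medium", "high"]
framingham = ["low", "low", "medium", "medium", "high", "high"]
conc = concordance_by_class(score, framingham)
```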

  15. Application of fuzzy expert systems for EOR project risk analysis

    SciTech Connect

    Ting-Horng, Chung; Carroll, H.B.; Lindsey, R.

    1995-12-31

    This work describes a new method for enhanced oil recovery (EOR) project preassessment. Instead of using the conventional costly simulation approach, a fuzzy expert system was developed. A database of EOR project costs and oil prices of the past decades was incorporated into the expert system. The EOR project risk analysis includes three stages: (1) preliminary screening of EOR methods, (2) field performance estimation, and (3) economic analysis. Since this fuzzy expert system incorporates both implemented EOR technology and experts' experience, it reduces the amount of laboratory and field data required as input. Estimates of displacement efficiency (E{sub d}) and sweep efficiency (E{sub v}) were formulated for each EOR process. E{sub d} and E{sub v} were treated as fuzzy variables. The overall recovery efficiency is evaluated from the product of E{sub d} and E{sub v} using fuzzy set arithmetic. Economic analysis is based on the estimated recovery efficiency, residual oil in place, oil price, and operating costs. An example of the application of the developed method to a CO{sub 2}-flooding project analysis is presented.
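Treating the efficiencies as fuzzy variables and multiplying them can be sketched with triangular fuzzy numbers (min, most likely, max); the vertex-wise product used here is a common approximation, not necessarily the paper's exact fuzzy arithmetic, and the numbers are invented.

```python
# Overall recovery efficiency as the fuzzy product of displacement efficiency
# Ed and sweep efficiency Ev, both given as triangular fuzzy numbers.
# Hypothetical values; vertex-wise product is an approximation valid for
# positive supports.

def fuzzy_mul(a, b):
    """Approximate product of two positive triangular fuzzy numbers."""
    return tuple(x * y for x, y in zip(a, b))

ed = (0.50, 0.65, 0.80)   # displacement efficiency (min, mode, max)
ev = (0.40, 0.55, 0.70)   # sweep efficiency (min, mode, max)
recovery = fuzzy_mul(ed, ev)
```

The resulting fuzzy recovery efficiency then feeds the economic stage alongside residual oil in place, oil price, and operating costs.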

  16. Radar Hardware Second Buy Decision Risk Analysis,

    DTIC Science & Technology

    Operations research, *Radar equipment, *Army procurement, *Decision making , *Risk, Symposia, Army research, Contracts, Cost estimates, Scheduling, Time, Requirements, Logistics planning, Army planning

  17. Petri net modeling of fault analysis for probabilistic risk assessment

    NASA Astrophysics Data System (ADS)

    Lee, Andrew

    Fault trees and event trees have been widely accepted as the modeling strategy to perform Probabilistic Risk Assessment (PRA). However, there are several limitations associated with fault tree/event tree modeling: (1) it considers only binary events; (2) it assumes independence among basic events; and (3) it does not consider the timing sequence of basic events. This thesis investigates Petri net modeling as a potential alternative for PRA modeling. Petri nets have mainly been used as a simulation tool for queuing and network systems. However, it has been suggested that they could also model failure scenarios, and thus could be a potential modeling strategy for PRA. In this thesis, the transformations required to model logic gates in a fault tree by Petri nets are explored. The gap between fault tree analysis and Petri net analysis is bridged through gate equivalency analysis. Methods for qualitative and quantitative analysis of Petri nets are presented. Techniques are developed and implemented to revise and tailor traditional Petri net modeling for system failure analysis. The airlock system and the maintenance cooling system of a CANada Deuterium Uranium (CANDU) reactor are used as case studies to demonstrate Petri nets' ability to model system failure and provide a structured approach for qualitative and quantitative analysis. The minimal cutsets and the probability of the airlock system failing to maintain the pressure boundary are obtained. Furthermore, the case study is extended to non-coherent system analysis due to system maintenance.
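The gate logic that the thesis maps onto Petri nets can be sketched directly: under the independence assumption it criticizes, AND gates multiply basic-event probabilities and OR gates combine them as 1 - prod(1 - p). The tree structure and probabilities below are invented, not the CANDU case-study values.

```python
# Evaluate a small fault tree's top-event probability from independent
# basic events. Hypothetical events and probabilities.

def gate(kind, children):
    if kind == "AND":
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "OR":
        q = 1.0
        for c in children:
            q *= 1 - c
        return 1 - q
    raise ValueError(kind)

# Top event: the system fails if both door seals fail, or the controller fails.
seal_a, seal_b, controller = 0.01, 0.02, 0.005
top = gate("OR", [gate("AND", [seal_a, seal_b]), controller])
```

A Petri net representation replaces these static gates with places and transitions, which is what lets it capture timing and dependence that this formula cannot.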

  18. What patient characteristics guide nurses' clinical judgement on pressure ulcer risk? A mixed methods study.

    PubMed

    Balzer, K; Kremer, L; Junghans, A; Halfens, R J G; Dassen, T; Kottner, J

    2014-05-01

    Nurses' clinical judgement plays a vital role in pressure ulcer risk assessment, but evidence is lacking as to which patient characteristics are important for nurses' perception of patients' risk exposure. To explore which patient characteristics nurses employ when assessing pressure ulcer risk without use of a risk assessment scale. Mixed methods design triangulating observational data from the control group of a quasi-experimental trial and data from semi-structured interviews with nurses. Two traumatological wards at a university hospital. Quantitative data: A consecutive sample of 106 patients matching the eligibility criteria (age ≥ 18 years, no pressure ulcers category ≥ 2 at admission and ≥ 5 days expected length of stay). Qualitative data: A purposive sample of 16 nurses. Quantitative data: Predictor variables for pressure ulcer risk were measured by study assistants at the bedside every second day. Concurrently, nurses documented their clinical judgement on patients' pressure ulcer risk by means of a 4-step global judgement scale. Bivariate correlations between predictor variables and nurses' risk estimates were established. Qualitative data: In interviews, nurses were asked to assess fictitious patients' pressure ulcer risk and to justify their risk estimates. Patient characteristics perceived as relevant for nurses' judgements were thematically clustered. Triangulation: Firstly, predictors of nurses' risk estimates identified in bivariate analysis were cross-mapped with interview findings. Secondly, three models to predict nurses' risk estimates underwent multiple linear regression analysis. Nurses consider multiple patient characteristics for pressure ulcer risk assessment, but regard some conditions as more important than others. Triangulation showed that these are measures reflecting patients' exposure to pressure or overall care dependency. Qualitative data furthermore indicate that nurses are likely to trade off risk-enhancing conditions against

  19. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  20. On methods for assessing water-resource risks and vulnerabilities

    NASA Astrophysics Data System (ADS)

    Gleick, Peter H.

    2015-11-01

    Because of the critical role that freshwater plays in maintaining ecosystem health and supporting human development through agricultural and industrial production, there have been numerous efforts over the past few decades to develop indicators and indices of water vulnerability. Each of these efforts has tried to identify key factors that both offer insights into water-related risks and strategies that might be useful for reducing those risks. These kinds of assessments have serious limitations associated with data, the complexity of water challenges, and the changing nature of climatic and hydrologic variables. This new letter by Padowski et al (2015 Environ. Res. Lett. 10 104014) adds to the field by broadening the kinds of measures that should be integrated into such tools, especially in the area of institutional characteristics, and analyzing them in a way that provides new insights into the similarities and differences in water risks facing different countries, but much more can and should be done with new data and methods to improve our understanding of water challenges.

  1. The effectiveness of risk management: an analysis of project risk planning across industries and countries.

    PubMed

    Zwikael, Ofer; Ahn, Mark

    2011-01-01

    This article examines the effectiveness of current risk management practices to reduce project risk using a multinational, multi-industry study across different scenarios and cultures. A survey was administered to 701 project managers, and their supervisors, in seven industries and three diverse countries (New Zealand, Israel, and Japan), in multiple languages during the 2002-2007 period. Results of this study show that project context--industry and country where a project is executed--significantly impacts perceived levels of project risk, and the intensity of risk management processes. Our findings also suggest that risk management moderates the relationship between risk level and project success. Specifically, we found that even moderate levels of risk management planning are sufficient to reduce the negative effect risk levels have on project success. © 2010 Society for Risk Analysis.

  2. Risk analysis for plant-made vaccines.

    PubMed

    Kirk, Dwayne D; McIntosh, Kim; Walmsley, Amanda M; Peterson, Robert K D

    2005-08-01

    The production of vaccines in transgenic plants was first proposed in 1990; however, no product has yet reached commercialization. There are several risks during the production and delivery stages of this technology, with potential impact on the environment and on human health. Risks to the environment include gene transfer and exposure to antigens or selectable marker proteins. Risks to human health include oral tolerance, allergenicity, inconsistent dosage, worker exposure and unintended exposure to antigens or selectable marker proteins in the food chain. These risks are controllable through appropriate regulatory measures at all stages of production and distribution of a potential plant-made vaccine. Successful use of this technology is highly dependent on stewardship and active risk management by the developers of this technology, and through quality standards for production, which will be set by regulatory agencies. Regulatory agencies can also negatively affect the future viability of this technology by requiring that all risks must be controlled, or by applying conventional regulations which are overly cumbersome for a plant production and oral delivery system. The value of new or replacement vaccines produced in plant cells and delivered orally must be considered alongside the probability and severity of potential risks in their production and use, and the cost of not deploying this technology--the risk of continuing with the status quo alternative.

  3. Virtues and Limitations of Risk Analysis

    ERIC Educational Resources Information Center

    Weatherwax, Robert K.

    1975-01-01

    After summarizing the Rasmussen Report, the author reviews the probabilistic portion of the report from the perspectives of engineering utility and risk assessment uncertainty. The author shows that the report may represent a significant step forward in the assurance of reactor safety and an imperfect measure of actual reactor risk. (BT)

  5. An evaluation of fracture analysis methods

    NASA Technical Reports Server (NTRS)

    Newman, J. C., Jr.

    1985-01-01

    The results of an experimental and predictive round robin on the applications of fracture analysis methods are presented. The objective of the round robin was to verify whether fracture analysis methods currently in use can or cannot predict failure loads on complex structural components containing cracks. Fracture results from tests on a number of compact specimens were used to make the predictions. The accuracy of the prediction methods was evaluated in terms of the variation in the ratio of predicted to experimental failure loads, and the prediction methods were ranked in order of minimum standard error. A range of applicability of the different methods was also considered in assessing their usefulness. For 7075-T651 aluminum alloy, the best methods were: the effective K sub R curve; the critical crack-tip opening displacement (CTOD) criterion using a finite element analysis; and the K sub R curve with the Dugdale model. For the 2024-T351 aluminum alloy, the best methods included: the two-parameter fracture criterion (TPFC); the CTOD parameter using finite element analysis; the K-curve with the Dugdale model; the deformation plasticity failure assessment diagram (DPFAD); and the effective K sub R curve with a limit load condition. For 304 stainless steel, the best methods were: the limit load analysis; the CTOD criterion using finite-element analysis; TPFC; and DPFAD. Some sample experimental results are given in an appendix.
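The ranking criterion described above can be sketched as follows: for each method, form the ratios of predicted to experimental failure loads and rank methods by a standard-error-like spread of those ratios about the ideal value 1. The method names echo the abstract, but the ratio values are invented.

```python
# Rank fracture analysis methods by RMS deviation of predicted/experimental
# failure-load ratios from 1. Ratio data are hypothetical.
from math import sqrt

def std_error_about_unity(ratios):
    """RMS deviation of predicted/experimental ratios from the ideal value 1."""
    return sqrt(sum((r - 1) ** 2 for r in ratios) / len(ratios))

methods = {
    "effective_KR_curve": [0.98, 1.03, 1.01, 0.97],
    "limit_load":         [0.80, 1.25, 1.10, 0.90],
    "TPFC":               [0.95, 1.05, 1.02, 1.04],
}
ranking = sorted(methods, key=lambda m: std_error_about_unity(methods[m]))
```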

  6. Human-centered risk management for medical devices - new methods and tools.

    PubMed

    Janß, Armin; Plogmann, Simon; Radermacher, Klaus

    2016-04-01

    Studies regarding adverse events with technical devices in the medical context showed that, in most cases, non-usable interfaces are the cause of use deficiencies and therefore a potential harm for the patient and third parties. This is partially due to the lack of suitable methods for interlinking usability engineering and human-centered risk management. Especially regarding the early identification of human-induced errors and the systematic control of these failures, medical device manufacturers and in particular the developers have to be supported in order to guarantee reliable design and error-tolerant human-machine interfaces (HMI). In this context, we developed the HiFEM methodology and a corresponding software tool (mAIXuse) for model-based human risk analysis. Based on a two-fold approach, HiFEM provides a task-type-sensitive modeling structure with integrated temporal relations in order to represent and analyze the use process in a detailed way. The approach can be used from early developmental stages up to the validation process. Results of a comparative study with the HiFEM method and a classical process failure mode and effect analysis (FMEA) show that the new modeling and analysis technique clearly outperforms the FMEA. Besides, we implemented a new method for systematic human risk control (mAIXcontrol). Accessing information from the method's knowledge base enables the operator to detect the most suitable countermeasures for a respective risk. Forty-one approved generic countermeasure principles have been indexed in a matrix as combinations of root causes and failures. The methodology has been tested in comparison to a conventional approach as well. Evaluation of the matrix and the reassessment of the risk priority numbers by a blind expert demonstrate a substantial benefit of the new mAIXcontrol method.

  7. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. © 2015 Society for Risk Analysis.

  8. Toward a risk assessment of the spent fuel and high-level nuclear waste disposal system. Risk assessment requirements, literature review, methods evaluation: an interim report

    SciTech Connect

    Hamilton, L.D.; Hill, D.; Rowe, M.D.; Stern, E.

    1986-04-01

    This report provides background information for a risk assessment of the disposal system for spent nuclear fuel and high-level radioactive waste (HLW). It contains a literature review, a survey of the statutory requirements for risk assessment, and a preliminary evaluation of methods. The literature review outlines the state of knowledge of risk assessment and accident consequence analysis in the nuclear fuel cycle and its applicability to spent fuel and HLW disposal. The survey of statutory requirements determines the extent to which risk assessment may be needed in development of the waste-disposal system. The evaluation of methods reviews and evaluates merits and applicabilities of alternative methods for assessing risks and relates them to the problems of spent fuel and HLW disposal. 99 refs.

  9. Risk analysis for worker exposure to benzene

    NASA Astrophysics Data System (ADS)

    Hallenbeck, William H.; Flowers, Roxanne E.

    1992-05-01

    Cancer risk factors (characterized by route, dose, dose rate per kilogram, fraction of lifetime exposed, species, and sex) were derived for workers exposed to benzene via inhalation or ingestion. Exposure at the current Occupational Safety and Health Administration (OSHA) permissible exposure limit (PEL) and at leaking underground storage tank (LUST) sites were evaluated. At the current PEL of 1 ppm, the theoretical lifetime excess risk of cancer from benzene inhalation is ten per 1000. The theoretical lifetime excess risk for worker inhalation exposure at LUST sites ranged from 10 to 40 per 1000. These results indicate that personal protection should be required. The theoretical lifetime excess risk due to soil ingestion is five to seven orders of magnitude less than the inhalation risks.
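The kind of inhalation calculation behind the figures above can be sketched as: theoretical lifetime excess cancer risk = exposure concentration x unit-risk factor x fraction of lifetime exposed. The unit-risk value below is an assumption chosen only so the example lands near the abstract's 10-per-1000 figure at 1 ppm; it is not the study's factor.

```python
# Lifetime excess risk sketch for occupational inhalation exposure.
# The unit risk and lifetime fraction are illustrative assumptions.

def lifetime_excess_risk(concentration_ppm, unit_risk_per_ppm,
                         fraction_of_lifetime_exposed):
    return concentration_ppm * unit_risk_per_ppm * fraction_of_lifetime_exposed

# E.g. 1 ppm at work for ~25% of a lifetime with an assumed unit risk of 4e-2:
risk = lifetime_excess_risk(1.0, 4e-2, 0.25)
per_1000 = risk * 1000
```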

  10. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data illustrate the new methodology in real data analysis.

  11. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  12. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. © 2014
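As a hedged illustration of the quantity the abstract centers on: for an equilibrium passive sampler, Cfree is commonly derived from the concentration accumulated in the sampler polymer and a polymer-water partition coefficient, Cfree = C_polymer / K_pw. The function and all numbers below are hypothetical, not taken from the paper.

```python
# Illustrative sketch (not from the paper): estimating the freely dissolved
# concentration (Cfree) from an equilibrium passive sampler measurement.
# At equilibrium, Cfree = C_polymer / K_pw, where C_polymer is the analyte
# concentration in the sampler polymer and K_pw is the polymer-water
# partition coefficient. All values below are hypothetical.

def cfree_from_sampler(c_polymer_ug_per_kg: float, log_kpw: float) -> float:
    """Return Cfree in ug/L from polymer concentration (ug/kg) and log K_pw (L/kg)."""
    kpw = 10 ** log_kpw
    return c_polymer_ug_per_kg / kpw

# Example: a contaminant measured at 5000 ug/kg in a polymer sampler,
# assuming log K_pw = 5.0 (L/kg)
cfree = cfree_from_sampler(5000.0, 5.0)
print(f"Cfree = {cfree:.3f} ug/L")  # 5000 / 1e5 = 0.050 ug/L
```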

  13. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing if a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome. 
For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or

  14. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    USGS Publications Warehouse

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against
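The N-mixture model in the abstract marginalizes a latent site abundance out of repeated count data. A minimal single-site likelihood can be sketched as follows; this is a generic textbook formulation (Poisson abundance, binomial detection), not the authors' hierarchical implementation, and the parameter values are hypothetical.

```python
from math import comb, exp, factorial

def nmixture_site_likelihood(counts, lam, p, n_max=100):
    """Marginal likelihood of repeated counts at one site under an N-mixture
    model: latent abundance N ~ Poisson(lam), and each visit's count is
    Binomial(N, p). The latent N is summed out up to n_max."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        pois = exp(-lam) * lam ** n / factorial(n)  # P(N = n)
        binom = 1.0
        for y in counts:                            # detection at each visit
            binom *= comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += pois * binom
    return total

# Sanity check: for a single visit with count 0, the sum has the closed form
# exp(-lam * p), since sum_N Pois(N; lam) * (1-p)^N = exp(-lam * p).
print(nmixture_site_likelihood([0], lam=2.0, p=0.3))  # ~ exp(-0.6) ~ 0.5488
```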

  15. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Credit risk premium analysis. 260.17 Section 260.17 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not...

  16. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Credit risk premium analysis. 260.17 Section 260.17 Transportation Other Regulations Relating to Transportation (Continued) FEDERAL RAILROAD... Financial Assistance § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not...

  17. Food adulteration: Sources, health risks, and detection methods.

    PubMed

    Bansal, Sangita; Singh, Apoorva; Mangal, Manisha; Mangal, Anupam K; Kumar, Sanjiv

    2017-04-13

    Adulteration in food has been a concern since the beginning of civilization, as it not only decreases the quality of food products but also results in a number of ill effects on health. Authentic testing of food and adulterant detection of various food products is required for value assessment and to assure consumer protection against fraudulent activities. Through this review we intend to compile different types of adulterations made in different food items, the health risks imposed by these adulterants and detection methods available for them. Concerns about food safety and regulation have ensured the development of various techniques like physical, biochemical/immunological and molecular techniques, for adulterant detection in food. Molecular methods are more preferable when it comes to detection of biological adulterants in food, although physical and biochemical techniques are preferable for detection of other adulterants in food.

  18. A Micro-Method of Protein Analysis

    DTIC Science & Technology

    The determination of protein by means of Weichselbaum’s (1) biuret method is too inexact when dealing with small quantities of protein (less than 200...microgram/ml initial reactant), owing to the low sensitivity of the color reaction. Although we have used this method for protein analysis of...have searched for a more sensitive colorimetric method. Nielsen (3) recently reported on a method in which the Cu bound by protein in the biuret

  19. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    SciTech Connect

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester; Tuan Q. Tran; Erasmia Lois

    2010-06-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  20. Issues in benchmarking human reliability analysis methods : a literature review.

    SciTech Connect

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.; Hendrickson, Stacey M. Langfitt; Boring, Ronald L.

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  1. DMAICR in an ergonomic risks analysis.

    PubMed

    Santos, E F; Lima, C R C

    2012-01-01

    The DMAICR problem-solving methodology is used throughout this paper to show how to implement ergonomics recommendations. The adapted DMAICR method consists of the following six steps for solving ergonomic design problems: In step D, the project or situation to be assessed and its guiding objectives, known as the demand, are defined. Step M relates to the work, tasks, and organizational protocols, and also includes the need for measurement. Step A covers the analysis itself. Step I is the moment of improving or implementing. In step C, control, prevention of prospective troublesome situations, and implementation of management are the activities that keep the situation under control. R stands for Report. Some relevant technical and conceptual aspects for the comparison of these methodologies are illustrated in this paper. The steps of DMAICR were carried out by a multifunctional (multi-professional and multi-disciplinary) team termed a focus group, composed of selected members of the company and supported by experts in ergonomics.

  2. [Assessment of non-ionic contrast agents in reducing the risk of side effects: analysis on the basis of voluntary willingness-to-pay measured by the contingent valuation method].

    PubMed

    Sugimura, K

    2000-01-01

    The benefit of replacing ionic contrast agents with non-ionic ones was assessed by employing cost-benefit analysis, a method of medical economic analysis. The benefit derived from replacing ionic with non-ionic contrast agents was assessed by a questionnaire survey of patients using the willingness-to-pay method based on the contingent valuation method. This questionnaire survey was conducted on 204 patients in Shimane Medical University Hospital during the period from October to December 1998. The result of analysis showed that when ionic contrast agents are replaced with non-ionic ones, patients' willingness-to-pay stands at a median value of 12,500 yen and a mean value of 17,082 +/- 1,049 yen. These figures are identical with or larger than the NHI-price differences (12,266-14,234 yen; average 13,287 yen), suggesting that patients think the benefit of reduced side effects from non-ionic contrast agents has a value that is equal to or higher than the actual NHI-prices of these agents. Further, analysis of patient profiles indicated that patients' willingness-to-pay went up with age and income but decreased when age exceeded 60 years, a finding which also suggests that the willingness-to-pay amount is closely related to the economic strength of patients.
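The cost-benefit comparison described above reduces to summarizing the WTP responses and comparing them with the price difference. The sketch below uses hypothetical survey responses (the abstract reports the summary figures for 204 patients but not the raw data); only the comparison logic is illustrated.

```python
import statistics

# Hypothetical willingness-to-pay (WTP) survey responses in yen. The abstract
# reports median 12,500 and mean 17,082 yen from 204 patients, but the raw
# responses are not given; these values only illustrate the comparison.
wtp_responses = [6000, 11000, 13500, 13500, 16000, 21000, 31000]

median_wtp = statistics.median(wtp_responses)
mean_wtp = statistics.mean(wtp_responses)
avg_price_difference = 13287  # average NHI price difference (yen), from the abstract

# Benefit is judged to equal or exceed cost if WTP is at least the extra price paid
print("median WTP:", median_wtp)
print("benefit >= cost (by median):", median_wtp >= avg_price_difference)
```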

  3. EC Transmission Line Risk Identification and Analysis

    SciTech Connect

    Bigelow, Tim S

    2012-04-01

    The purpose of this document is to assist in evaluating and planning for the cost, schedule, and technical project risks associated with the delivery and operation of the EC (Electron cyclotron) transmission line system. In general, the major risks that are anticipated to be encountered during the project delivery phase associated with the implementation of the Procurement Arrangement for the EC transmission line system are associated with: (1) Undefined or changing requirements (e.g., functional or regulatory requirements) (2) Underperformance of prototype, first unit, or production components during testing (3) Unavailability of qualified vendors for critical components Technical risks associated with the design and operation of the system are also identified.

  4. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by the market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are of common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
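Two of the valuation methods named in the abstract, the Black-Scholes formula and Monte Carlo simulation, can be sketched side by side; this is a standard textbook formulation for a European call on a non-dividend-paying stock, with hypothetical parameters, not code from the talk.

```python
import math
import random

def black_scholes_call(s, k, r, sigma, t):
    """Black-Scholes price of a European call (no dividends)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    norm_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def monte_carlo_call(s, k, r, sigma, t, n_paths=100_000, seed=0):
    """Monte Carlo estimate of the same price under risk-neutral GBM."""
    rng = random.Random(seed)
    disc = math.exp(-r * t)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s_t = s * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff_sum += max(s_t - k, 0.0)
    return disc * payoff_sum / n_paths

# Hypothetical at-the-money call: spot 100, strike 100, r = 5%, vol = 20%, 1 year
analytic = black_scholes_call(100, 100, 0.05, 0.2, 1.0)
estimate = monte_carlo_call(100, 100, 0.05, 0.2, 1.0)
print(round(analytic, 4))  # ~10.4506; the Monte Carlo estimate should be close
```

The Monte Carlo route generalizes to payoffs with no closed form, which is why both methods coexist in practice.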

  5. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements

    PubMed Central

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-01-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix. PMID:28157149

  6. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements.

    PubMed

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-02-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix.

  7. Fracture risk in patients with type 2 diabetes mellitus and possible risk factors: a systematic review and meta-analysis

    PubMed Central

    Moayeri, Ardeshir; Mohamadpour, Mahmoud; Mousavi, Seyedeh Fatemeh; Shirzadpour, Ehsan; Mohamadpour, Safoura; Amraei, Mansour

    2017-01-01

    Aim Patients with type 2 diabetes mellitus (T2DM) have an increased risk of bone fractures. A variable increase in fracture risk has been reported depending on skeletal site, diabetes duration, study design, insulin use, and so on. The present meta-analysis aimed to investigate the association between T2DM with fracture risk and possible risk factors. Methods Different databases including PubMed, Institute for Scientific Information, and Scopus were searched up to May 2016. All epidemiologic studies on the association between T2DM and fracture risk were included. The relevant data obtained from these papers were analyzed by a random effects model and publication bias was assessed by funnel plot. All analyses were done by R software (version 3.2.1) and STATA (version 11.1). Results Thirty eligible studies were selected for the meta-analysis. We found a statistically significant positive association between T2DM and hip, vertebral, or foot fractures and no association between T2DM and wrist, proximal humerus, or ankle fractures. Overall, T2DM was associated with an increased risk of any fracture (summary relative risk =1.05, 95% confidence interval: 1.04, 1.06) and increased with age, duration of diabetes, and insulin therapy. Conclusion Our findings strongly support an association between T2DM and increased risk of overall fracture. These findings emphasize the need for fracture prevention strategies in patients with diabetes. PMID:28442913
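The random-effects pooling behind a summary relative risk like the one above can be sketched with the DerSimonian-Laird estimator, a standard choice (the abstract does not state which estimator was used). The per-study data below are hypothetical.

```python
import math

def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling of per-study effect sizes
    (e.g. log relative risks) with known within-study variances."""
    k = len(effects)
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # heterogeneity
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical log relative risks and within-study variances from five studies
log_rr = [0.10, 0.05, 0.08, 0.02, 0.12]
var = [0.001, 0.002, 0.0015, 0.003, 0.002]
pooled, se, tau2 = dersimonian_laird(log_rr, var)
print(f"pooled RR = {math.exp(pooled):.3f}, 95% CI = "
      f"({math.exp(pooled - 1.96 * se):.3f}, {math.exp(pooled + 1.96 * se):.3f})")
```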

  8. Fire behavior and risk analysis in spacecraft

    NASA Technical Reports Server (NTRS)

    Friedman, Robert; Sacksteder, Kurt R.

    1988-01-01

    Practical risk management for present and future spacecraft, including space stations, involves the optimization of residual risks balanced by the spacecraft operational, technological, and economic limitations. Spacecraft fire safety is approached through three strategies, in order of risk: (1) control of fire-causing elements, through exclusion of flammable materials for example; (2) response to incipient fires through detection and alarm; and (3) recovery of normal conditions through extinguishment and cleanup. Present understanding of combustion in low gravity is that, compared to normal gravity behavior, fire hazards may be reduced by the absence of buoyant gas flows yet at the same time increased by ventilation flows and hot particle expulsion. This paper discusses the application of low-gravity combustion knowledge and appropriate aircraft analogies to fire detection, fire fighting, and fire-safety decisions for eventual fire-risk management and optimization in spacecraft.

  9. A statistical method for studying correlated rare events and their risk factors

    PubMed Central

    Xue, Xiaonan; Kim, Mimi Y; Wang, Tao; Kuniholm, Mark H; Strickler, Howard D

    2016-01-01

    Longitudinal studies of rare events such as cervical high-grade lesions or colorectal polyps that can recur often involve correlated binary data. Risk factors for these events cannot be reliably examined using conventional statistical methods. For example, logistic regression models that incorporate generalized estimating equations often fail to converge or provide inaccurate results when analyzing data of this type. Although exact methods have been reported, they are complex and computationally difficult. The current paper proposes a mathematically straightforward and easy-to-use two-step approach involving (i) an additive model to measure associations between a rare or uncommon correlated binary event and potential risk factors and (ii) a permutation test to estimate the statistical significance of these associations. Simulation studies showed that the proposed method reliably tests and accurately estimates the associations of exposure with correlated binary rare events. This method was then applied to a longitudinal study of human leukocyte antigen (HLA) genotype and risk of cervical high grade squamous intraepithelial lesions (HSIL) among HIV-infected and HIV-uninfected women. Results showed statistically significant associations of two HLA alleles among HIV-negative but not HIV-positive women, suggesting that immune status may modify the HLA and cervical HSIL association. Overall, the proposed method avoids model nonconvergence problems and provides a computationally simple, accurate, and powerful approach for the analysis of risk factor associations with rare/uncommon correlated binary events. PMID:25854937

  10. A statistical method for studying correlated rare events and their risk factors.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Wang, Tao; Kuniholm, Mark H; Strickler, Howard D

    2017-06-01

    Longitudinal studies of rare events such as cervical high-grade lesions or colorectal polyps that can recur often involve correlated binary data. Risk factors for these events cannot be reliably examined using conventional statistical methods. For example, logistic regression models that incorporate generalized estimating equations often fail to converge or provide inaccurate results when analyzing data of this type. Although exact methods have been reported, they are complex and computationally difficult. The current paper proposes a mathematically straightforward and easy-to-use two-step approach involving (i) an additive model to measure associations between a rare or uncommon correlated binary event and potential risk factors and (ii) a permutation test to estimate the statistical significance of these associations. Simulation studies showed that the proposed method reliably tests and accurately estimates the associations of exposure with correlated binary rare events. This method was then applied to a longitudinal study of human leukocyte antigen (HLA) genotype and risk of cervical high grade squamous intraepithelial lesions (HSIL) among HIV-infected and HIV-uninfected women. Results showed statistically significant associations of two HLA alleles among HIV-negative but not HIV-positive women, suggesting that immune status may modify the HLA and cervical HSIL association. Overall, the proposed method avoids model nonconvergence problems and provides a computationally simple, accurate, and powerful approach for the analysis of risk factor associations with rare/uncommon correlated binary events.
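The permutation-test half of the two-step approach can be sketched generically: aggregate each subject's repeated binary outcomes to one event rate (which keeps the within-subject correlation intact) and permute only the exposure labels. This is an illustration of the general technique, not the paper's additive-model implementation, and the data are synthetic.

```python
import random

def permutation_test(rates_exposed, rates_unexposed, n_perm=5000, seed=1):
    """Permutation p-value for the difference in mean per-subject event rates.
    Aggregating correlated repeated outcomes to one rate per subject respects
    the clustering; only exposure labels are permuted across subjects."""
    observed = (sum(rates_exposed) / len(rates_exposed)
                - sum(rates_unexposed) / len(rates_unexposed))
    pooled = list(rates_exposed) + list(rates_unexposed)
    n_exp = len(rates_exposed)
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = (sum(pooled[:n_exp]) / n_exp
                - sum(pooled[n_exp:]) / (len(pooled) - n_exp))
        if abs(diff) >= abs(observed):
            extreme += 1
    return (extreme + 1) / (n_perm + 1)  # add-one correction

# Synthetic example: rare events recur more often among exposed subjects
exposed = [2/6, 1/6, 3/6, 2/6, 1/6, 2/6, 3/6, 1/6]    # per-subject event rates
unexposed = [0/6, 0/6, 1/6, 0/6, 0/6, 1/6, 0/6, 0/6]
p = permutation_test(exposed, unexposed)
print("p-value:", p)
```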

  11. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  12. [Framework analysis method in qualitative research].

    PubMed

    Liao, Xing; Liu, Jian-ping; Robison, Nicola; Xie, Ya-ming

    2014-05-01

    In recent years a number of qualitative research methods have gained popularity within the health care arena. Despite this popularity, different qualitative analysis methods pose many challenges to most researchers. The present paper responds to needs expressed in recent Chinese medicine research. It focuses mainly on the concepts, nature, and application of framework analysis, and especially on how to use it, so as to help newcomers among Chinese medicine researchers engage with the methodology.

  13. N reactor level III probabilistic risk assessment using NUREG-1150 methods

    SciTech Connect

    Wang, O.S.; Coles, G.A.; Kelly, J.E.; Powers, T.B.; Rainey, T.E.; Zentner, M.D. ); Wyss, G.D.; Kunsman, D.M.; Miller, L.A.; Wheeler, T.A.; Sprung, J.L.; Camp, A.L. )

    1991-11-01

    This paper reports that in the late 1980s, a level III probabilistic risk assessment (PRA) was performed for the N Reactor, a U.S. Department of Energy (DOE) production reactor located on the Hanford site in Washington State. The PRA objectives were to assess the risks to the public and to the Hanford on-site workers posed by the operation of the N Reactor, to compare those risks to proposed DOE nuclear safety guidelines, and to identify risk-reduction changes to the plant. State-of-the-art methodology was employed based largely on the methods developed by Sandia National Laboratories for the U.S. Nuclear Regulatory Commission in support of the NUREG-1150 study of five commercial nuclear power plants. The structure of the probabilistic models allowed complex interactions and dependencies between systems to be explicitly considered. Latin hypercube sampling techniques were used to develop uncertainty distributions for the risks associated with postulated core damage events initiated by fire, seismic, and internal events as well as the overall combined risk. The risk results show that the N Reactor meets the proposed DOE nuclear safety guidelines and compares favorably to the commercial nuclear power plants considered in the NUREG-1150 analysis.
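Latin hypercube sampling, used above to propagate parameter uncertainty, can be sketched in a few lines: each dimension of the unit cube is split into n equal strata, exactly one point is drawn per stratum, and the strata are randomly paired across dimensions. This is a generic illustration, not the NUREG-1150 implementation.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """Latin hypercube sample on [0,1)^d: each dimension is split into
    n_samples equal strata with exactly one point per stratum, and the
    strata are shuffled independently per dimension."""
    rng = random.Random(seed)
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)                 # random pairing across dimensions
        for i in range(n_samples):
            # one uniform draw inside stratum strata[i]
            samples[i][d] = (strata[i] + rng.random()) / n_samples
    return samples

pts = latin_hypercube(10, 3)
# Each dimension contains exactly one point per decile
for d in range(3):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
```

Compared with plain Monte Carlo, this stratification covers each marginal range evenly, which is why far fewer samples suffice for stable uncertainty distributions.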

  14. Developing a Methodology for Risk-Informed Trade-Space Analysis in Acquisition

    DTIC Science & Technology

    2015-01-01

    Craig A. Bond, Lauren A. Mayer, Michael E. McMahon, James G. Kallimani, and Ricardo Sanchez. Developing a Methodology for Risk-Informed Trade-Space Analysis in Acquisition. ...project “Developing a Methodology Framework for Conducting Risk-Informed Trade Space Analyses.” The primary objective of this study was to

  15. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    This research aims to analyze dengue fever risk using a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a binomial distribution. The effects of temperature, rainfall, elevation, and larvae abundance are investigated through the geostatistics model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.

  16. Extended risk-analysis model for activities of the project.

    PubMed

    Kušar, Janez; Rihar, Lidija; Zargi, Urban; Starbek, Marko

    2013-12-01

    Project management of product/service orders has become a mode of operation in many companies. Although these are mostly cyclically recurring projects, risk management is very important for them. An extended risk-analysis model for new product/service projects is presented in this paper. Emphasis is on a solution developed in the Faculty of Mechanical Engineering in Ljubljana, Slovenia. The usual risk analysis of project activities is based on evaluating the probability that risk events occur and on evaluating their consequences. A third parameter has been added in our model: an estimate of the incidence of risk events. On the basis of the calculated activity risk level, a project team prepares preventive and corrective measures that should be taken according to the status indicators. An important advantage of the proposed solution is that the project manager and the team members are warned of risk events in a timely manner and can thus activate the envisaged preventive and corrective measures as necessary.
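The three-parameter idea above (probability, consequence, and incidence of risk events) can be sketched as a simple scoring scheme. The 1-5 scales, the multiplicative combination, and the status thresholds below are all assumptions for illustration; the paper's actual scales and rules may differ.

```python
# Illustrative sketch of an extended activity risk level combining three
# ordinal scores (probability, consequence severity, incidence of the risk
# event). Scales and thresholds are hypothetical, not the paper's.

def activity_risk_level(probability: int, consequence: int, incidence: int) -> int:
    """Combine three 1-5 scores multiplicatively into a 1-125 risk level."""
    for score in (probability, consequence, incidence):
        if not 1 <= score <= 5:
            raise ValueError("scores must be on the 1-5 scale")
    return probability * consequence * incidence

def status_indicator(risk_level: int) -> str:
    """Map the risk level to a traffic-light status (thresholds assumed)."""
    if risk_level >= 60:
        return "red: corrective measures required"
    if risk_level >= 20:
        return "yellow: preventive measures advised"
    return "green: monitor"

print(status_indicator(activity_risk_level(4, 5, 4)))  # 80 -> red
```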

  17. Laboratory theory and methods for sediment analysis

    USGS Publications Warehouse

    Guy, Harold P.

    1969-01-01

    The diverse character of fluvial sediments makes the choice of laboratory analysis somewhat arbitrary and the processing of sediment samples difficult. This report presents some theories and methods used by the Water Resources Division for analysis of fluvial sediments to determine the concentration of suspended-sediment samples and the particle-size distribution of both suspended-sediment and bed-material samples. Other analyses related to these determinations may include particle shape, mineral content, and specific gravity, the organic matter and dissolved solids of samples, and the specific weight of soils. The merits and techniques of both the evaporation and filtration methods for concentration analysis are discussed. Methods used for particle-size analysis of suspended-sediment samples may include the sieve pipet, the VA tube-pipet, or the BW tube-VA tube, depending on the equipment available, the concentration and approximate size of sediment in the sample, and the settling medium used. The choice of method for most bed-material samples is usually limited to procedures suitable for sand or to some type of visual analysis for large sizes. Several tested forms are presented to help ensure a well-ordered system in the laboratory to handle the samples, to help determine the kind of analysis required for each, to conduct the required processes, and to assist in the required computations. Use of the manual should further 'standardize' methods of fluvial sediment analysis among the many laboratories and thereby help to achieve uniformity and precision of the data.

  18. An analysis of the new EPA risk management rule

    SciTech Connect

    Loran, B.; Nand, K.; Male, M.

    1997-08-01

    Due to increasing public concern about risks from handling highly hazardous chemicals at various facilities, a number of state and federal regulatory agencies, such as the Occupational Safety and Health Administration (OSHA) and, recently, the US Environmental Protection Agency (EPA), have enacted regulations requiring these facilities to perform accidental-release risk analysis and develop process safety and risk management programs. The regulatory requirements to be fulfilled are described; the major components involved are a Process Hazard Analysis, a Consequence Analysis, and a Management Program. The performance of these analyses, and the development of a management program, for 21 facilities operated by the City of Los Angeles Department of Water and Power that treat drinking water supplies with chlorine is discussed. The effectiveness of the EPA risk management rule in achieving risk reduction is critically analyzed; it is found that, while the rule increases worker and public awareness of the inherent risks present, some of the analytical results obtained may have limited practical application.

  19. Pharmaceutical supply chain risk assessment in Iran using analytic hierarchy process (AHP) and simple additive weighting (SAW) methods.

    PubMed

    Jaberidoost, Mona; Olfat, Laya; Hosseini, Alireza; Kebriaeezadeh, Abbas; Abdollahi, Mohammad; Alaeddini, Mahdi; Dinarvand, Rassoul

    2015-01-01

    Pharmaceutical supply chain is a significant component of the health system in supplying medicines, particularly in countries where the main drugs are provided by local pharmaceutical companies. No previous studies exist that assess risks and disruptions in pharmaceutical companies as part of assessing the pharmaceutical supply chain. Any risks affecting the pharmaceutical companies could disrupt the supply of medicines and health system efficiency. The goal of this study was risk assessment in the pharmaceutical industry in Iran, considering process priority, hazard, and probability of risks. The study was carried out in four phases: risk identification through literature review; risk identification in Iranian pharmaceutical companies through interviews with experts; risk analysis through a questionnaire and consultation with experts using the group analytic hierarchy process (AHP) method and a rating scale (RS); and risk evaluation using the simple additive weighting (SAW) method. In total, 86 main risks were identified in the pharmaceutical supply chain from the perspective of pharmaceutical companies, classified into 11 classes. The majority of risks described in this study were related to the financial and economic category, and financial management was found to be the most important factor for consideration. Although the pharmaceutical industry and supply chain were affected by the political conditions in Iran during the study period, half of the total risks in the pharmaceutical supply chain were found to be internal risks that companies could fix internally. Likewise, the political situation and related risks forced companies to focus more on financial and supply management, resulting in less attention to quality management.
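
    The AHP-then-SAW pipeline named in the abstract can be sketched as follows: AHP derives criterion weights from a pairwise-comparison matrix (here via the common row geometric-mean approximation), and SAW ranks alternatives by a weighted sum of normalized ratings. The comparison matrix, criteria, and risk ratings below are hypothetical, not the study's data.

```python
import math

# AHP weights via the row geometric-mean approximation, then SAW scoring.
# Pairwise comparisons and risk ratings are hypothetical examples.

def ahp_weights(pairwise):
    """Priority vector from a square pairwise-comparison matrix."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

def saw_score(ratings, weights):
    """Simple additive weighting over max-normalized benefit-type ratings."""
    peak = [max(col) for col in zip(*ratings.values())]
    return {
        risk: sum(w * r / p for w, r, p in zip(weights, row, peak))
        for risk, row in ratings.items()
    }

# Three assumed criteria, e.g. hazard, probability, process priority.
pairwise = [[1, 3, 5], [1 / 3, 1, 2], [1 / 5, 1 / 2, 1]]
weights = ahp_weights(pairwise)
ranked = saw_score({"financial": [9, 7, 8], "political": [8, 9, 6]}, weights)
```

    The geometric-mean method is one standard way to extract AHP priorities; the study may have used the principal-eigenvector method instead, which gives very similar weights for consistent matrices.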

  20. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  2. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P [Richland, WA; Cowell, Andrew J [Kennewick, WA; Gregory, Michelle L [Richland, WA; Baddeley, Robert L [Richland, WA; Paulson, Patrick R [Pasco, WA; Tratz, Stephen C [Richland, WA; Hohimer, Ryan E [West Richland, WA

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.
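
    The patent abstract's weighted evidence aggregation can be sketched minimally: each indicator supports (+1) or refutes (-1) the hypothesis and carries a weight for the strength of its association. The scoring rule and the evidence list below are hypothetical illustrations, not the patented method itself.

```python
# Weighted net support for a hypothesis in [-1, 1]; positive favors it.
# The aggregation rule and evidence values are assumed for illustration.

def hypothesis_score(evidence):
    """evidence: list of (direction, weight), direction in {+1, -1}."""
    total = sum(weight for _, weight in evidence)
    if total == 0:
        return 0.0
    return sum(direction * weight for direction, weight in evidence) / total

evidence = [(+1, 0.9), (+1, 0.4), (-1, 0.3)]  # (supports/refutes, weight)
score = hypothesis_score(evidence)
```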

  3. Meta-analysis: Circulating vitamin D and ovarian cancer risk.

    PubMed

    Yin, Lu; Grandi, Norma; Raum, Elke; Haug, Ulrike; Arndt, Volker; Brenner, Hermann

    2011-05-01

    To review and summarize evidence from longitudinal studies on the association between circulating 25-hydroxyvitamin D (25(OH)D) and the risk of ovarian cancer (OC). Relevant prospective cohort studies and nested case-control studies were identified by systematically searching the Ovid Medline, EMBASE, and ISI Web of Knowledge databases and by cross-referencing. The following data were extracted in a standardized manner from eligible studies: first author, publication year, country, study design, characteristics of the study population, duration of follow-up, OC incidence according to circulating vitamin D status and the respective relative risks, and covariates adjusted for in the analysis. Due to the heterogeneity of studies in categorizing circulating vitamin D levels, all results were recalculated for an increase of circulating 25(OH)D by 20 ng/ml. Summary relative risks (RRs) were calculated using meta-analysis methods. Overall, ten individual-level studies were included that reported on the association between circulating vitamin D levels and OC incidence. Meta-analysis of studies on OC incidence resulted in a summary RR (95% confidence interval, CI) of 0.83 (0.63-1.08) for an increase of 25(OH)D by 20 ng/ml (P=0.160). No indication of heterogeneity or publication bias was found. A tentative inverse association of circulating 25(OH)D with OC incidence was found, which did not reach statistical significance but which requires clarification by additional studies due to its potentially high clinical and public health impact. Copyright © 2011 Elsevier Inc. All rights reserved.
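
    A summary RR like the one reported above is conventionally obtained by inverse-variance pooling of the study-level relative risks on the log scale. The sketch below shows that standard machinery with hypothetical per-study RRs and confidence intervals, not the paper's extracted data.

```python
import math

# Fixed-effect inverse-variance pooling of relative risks on the log scale.
# The study inputs (RR, lower CI, upper CI) are hypothetical examples.

def pooled_rr(studies):
    """studies: list of (rr, ci_low, ci_high); returns (RR, lo, hi) at 95%."""
    z = 1.959964
    weights, logs = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE from the 95% CI
        weights.append(1 / se ** 2)
        logs.append(math.log(rr))
    mean = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return (math.exp(mean),
            math.exp(mean - z * se_pooled),
            math.exp(mean + z * se_pooled))

summary = pooled_rr([(0.80, 0.55, 1.16), (0.90, 0.70, 1.16), (0.75, 0.50, 1.13)])
```

    A random-effects variant would widen the interval by adding a between-study variance component to each study's weight; with no detected heterogeneity, as reported above, the two approaches coincide.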

  4. Dietary Fat Intake and Lung Cancer Risk: A Pooled Analysis.

    PubMed

    Yang, Jae Jeong; Yu, Danxia; Takata, Yumie; Smith-Warner, Stephanie A; Blot, William; White, Emily; Robien, Kim; Park, Yikyung; Xiang, Yong-Bing; Sinha, Rashmi; Lazovich, DeAnn; Stampfer, Meir; Tumino, Rosario; Aune, Dagfinn; Overvad, Kim; Liao, Linda; Zhang, Xuehong; Gao, Yu-Tang; Johansson, Mattias; Willett, Walter; Zheng, Wei; Shu, Xiao-Ou

    2017-09-10

    Purpose Dietary fat may play a role in lung carcinogenesis. Findings from epidemiologic studies, however, remain inconsistent. In this pooled analysis of 10 prospective cohort studies from the United States, Europe, and Asia, we evaluated the associations of total and specific types of dietary fat with lung cancer risk. Methods Cox regression was used to estimate hazard ratios (HRs) and 95% CIs in each cohort. Study-specific risk estimates were pooled by random- or fixed-effects meta-analysis. The first 2 years of follow-up were excluded to address the potential influence of preclinical dietary changes. Results Among 1,445,850 participants, 18,822 incident cases were identified (mean follow-up, 9.4 years). High intakes of total and saturated fat were associated with an increased risk of lung cancer (for highest v lowest quintile: HR, 1.07 and 1.14, respectively; 95% CI, 1.00 to 1.15 and 1.07 to 1.22, respectively; P for trend for both < .001). The positive association of saturated fat was more evident among current smokers (HR, 1.23; 95% CI, 1.13 to 1.35; P for trend < .001) than former/never smokers (P for interaction = .004), and for squamous cell and small cell carcinoma (HR, 1.61 and 1.40, respectively; 95% CI, 1.38 to 1.88 and 1.17 to 1.67, respectively; P for trend for both < .001) than other histologic types (P for heterogeneity < .001). In contrast, a high intake of polyunsaturated fat was associated with a decreased risk of lung cancer (HR, 0.92; 95% CI, 0.87 to 0.98 for highest v lowest quintile; P for trend = .02). A 5% energy substitution of saturated fat with polyunsaturated fat was associated with a 16% to 17% lower risk of small cell and squamous cell carcinoma. No associations were found for monounsaturated fat. Conclusion Findings from this large, international cohort consortium suggest that modifying dietary fat intake (ie, replacing saturated fat with polyunsaturated fat) may reduce lung cancer risk, particularly among smokers and for squamous cell

  5. Probabilistic risk assessment of N Reactor using NUREG-1150 methods

    SciTech Connect

    Wang, O.S.; Baxter, J.T.; Coles, G.A.; Powers, T.B.; Zentner, M.D.

    1989-11-01

    A Level III probabilistic risk assessment (PRA) has been performed for N Reactor, a US Department of Energy (DOE) Category A production reactor. The main contractor is Westinghouse Hanford Company (Westinghouse Hanford). The PRA methodology developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL) in support of the NUREG-1150 (Reference 1) effort was used for this analysis. N Reactor is a graphite-moderated pressurized water reactor designed by General Electric. The dual-purpose 4000-MWt nuclear plant is located within the Hanford Site in the south-central part of the State of Washington. In addition to producing special materials for the DOE, N Reactor generates 860 MWe for the Washington Public Power Supply System. The reactor was operated successfully and safely from 1963 until it was put into standby status in 1988, owing to the changing need for special nuclear material. 3 refs., 4 tabs.

  6. Metabolic Disease Risk in Children by Salivary Biomarker Analysis

    PubMed Central

    Goodson, J. Max; Kantarci, Alpdogan; Hartman, Mor-Li; Denis, Gerald V.; Stephens, Danielle; Hasturk, Hatice; Yaskell, Tina; Vargas, Jorel; Wang, Xiaoshan; Cugini, Maryann; Barake, Roula; Alsmadi, Osama; Al-Mutawa, Sabiha; Ariga, Jitendra; Soparkar, Pramod; Behbehani, Jawad; Behbehani, Kazem; Welty, Francine

    2014-01-01

    Objective The study of obesity-related metabolic syndrome or Type 2 diabetes (T2D) in children is particularly difficult because of fear of needles. We tested a non-invasive approach to study inflammatory parameters in an at-risk population of children to provide proof-of-principle for future investigations of vulnerable subjects. Design and Methods We evaluated metabolic differences in 744 11-year-old children selected from underweight, normal healthy-weight, overweight, and obese categories by analyzing fasting saliva samples for 20 biomarkers. Saliva supernatants were obtained following centrifugation and used for analyses. Results Salivary C-reactive protein (CRP) was 6 times higher, salivary insulin and leptin were 3 times higher, and adiponectin was 30% lower in obese children compared with healthy normal-weight children (all P<0.0001). Categorical analysis suggested that there might be three types of obesity in children. Distinctly inflammatory characteristics appeared in 76% of obese children, while in 13%, salivary insulin was high but not associated with inflammatory mediators. The remaining 11% of obese children had high insulin and reduced adiponectin. Forty percent of the non-obese children were found in groups which, based on biomarker characteristics, may be at risk for becoming obese. Conclusions Significantly altered levels of salivary biomarkers in obese children from a high-risk population suggest the potential for developing non-invasive screening procedures to identify T2D-vulnerable individuals and a means to test preventative strategies. PMID:24915044

  7. Risk assessment and cost-effectiveness/utility analysis.

    PubMed

    Busch, Michael; Walderhaug, Mark; Custer, Brian; Allain, Jean-Pierre; Reddy, Ravi; McDonough, Brian

    2009-04-01

    Decision-makers at all levels of public health and transfusion medicine have always assessed the risks and benefits of their decisions. Decisions are usually guided by immediately available information and a significant amount of experience and judgment. For decisions concerning familiar situations and common problems, judgment and experience may work quite well, but this type of decision process can lack clarity and accountability. Public health challenges are changing as emerging diseases and expensive technologies complicate the decision-makers' task, confronting the decision-maker with new problems that include multiple potential solutions. Decisions regarding policies and adoption of technologies are particularly complex in transfusion medicine due to the scope of the field, implications for public health, and legal, regulatory and public expectations regarding blood safety. To assist decision-makers, quantitative risk assessment and cost-effectiveness analysis are now being more widely applied. This set of articles will introduce risk assessment and cost-effectiveness methodologies and discuss recent applications of these methods in transfusion medicine.

  8. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  9. Enhancing local action planning through quantitative flood risk analysis: a case study in Spain

    NASA Astrophysics Data System (ADS)

    Castillo-Rodríguez, Jesica Tamara; Escuder-Bueno, Ignacio; Perales-Momparler, Sara; Ramón Porta-Sancho, Juan

    2016-07-01

    This article presents a method to incorporate and promote quantitative risk analysis to support local action planning against flooding. The proposed approach aims to provide a framework for local flood risk analysis, combining hazard mapping with vulnerability data to quantify risk in terms of expected annual affected population, potential injuries, number of fatalities, and economic damages. Flood risk is estimated combining GIS data of loads, system response, and consequences and using event tree modelling for risk calculation. The study area is the city of Oliva, located on the eastern coast of Spain. Results from risk modelling have been used to inform local action planning and to assess the benefits of structural and non-structural risk reduction measures. Results show the potential impact on risk reduction of flood defences and improved warning communication schemes through local action planning: societal flood risk (in terms of annual expected affected population) would be reduced up to 51 % by combining both structural and non-structural measures. In addition, the effect of seasonal population variability is analysed (annual expected affected population ranges from 82 to 107 %, compared with the current situation, depending on occupancy rates in hotels and campsites). Results highlight the need for robust and standardized methods for urban flood risk analysis replicability at regional and national scale.
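
    The event-tree risk calculation described above combines, for each branch, the annual probability of a flood load, the conditional probability of the system response, and the resulting consequence. A minimal sketch, with illustrative branch values rather than the Oliva study's inputs:

```python
# Expected annual consequence from an event tree:
# sum over branches of P(load) * P(response | load) * consequence.
# Branch probabilities and damages below are hypothetical.

def expected_annual_consequence(branches):
    """branches: iterable of (p_load, p_response_given_load, consequence)."""
    return sum(p_load * p_resp * consequence
               for p_load, p_resp, consequence in branches)

branches = [
    (0.01, 0.30, 50000),    # 100-year flood, defence overtopped
    (0.01, 0.70, 5000),     # 100-year flood, defence holds
    (0.001, 0.90, 400000),  # 1000-year flood, defence overtopped
]
ead = expected_annual_consequence(branches)
```

    The same structure applies whether the consequence is economic damage or affected population; a risk-reduction measure enters by lowering either a conditional response probability (structural) or a consequence (non-structural, e.g. warning schemes).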

  10. Risk analysis of an RTG on the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Frank, Michael V.

    1991-01-01

    As part of the effort to review the Ulysses Final Safety Analysis Report and to understand the risk of plutonium release from the Ulysses spacecraft General Purpose Heat Source-Radioisotope Thermoelectric Generator (GPHS-RTG), the Interagency Nuclear Safety Review Panel (INSRP) and the author performed an integrated, quantitative analysis of the uncertainties of the calculated risk of plutonium release from Ulysses. Using state-of-the-art probabilistic risk assessment technology, the uncertainty analysis accounted for both variability and uncertainty in the key parameters of the risk analysis. The results show that INSRP had high confidence that the risk of fatal cancers from potential plutonium release associated with the calculated launch and deployment accident scenarios is low.

  11. Creating a spatially-explicit index: a method for assessing the global wildfire-water risk

    NASA Astrophysics Data System (ADS)

    Robinne, François-Nicolas; Parisien, Marc-André; Flannigan, Mike; Miller, Carol; Bladon, Kevin D.

    2017-04-01

    The wildfire-water risk (WWR) has been defined as the potential for wildfires to adversely affect water resources that are important for downstream ecosystems and human water needs for adequate water quantity and quality, therefore compromising the security of their water supply. While tools and methods are numerous for watershed-scale risk analysis, the development of a toolbox for the large-scale evaluation of the wildfire risk to water security has only started recently. In order to provide managers and policy-makers with an adequate tool, we implemented a method for the spatial analysis of the global WWR based on the Driving forces-Pressures-States-Impacts-Responses (DPSIR) framework. This framework relies on the cause-and-effect relationships existing between the five categories of the DPSIR chain. As this approach heavily relies on data, we gathered an extensive set of spatial indicators relevant to fire-induced hydrological hazards and water consumption patterns by human and natural communities. When appropriate, we applied a hydrological routing function to our indicators in order to simulate downstream accumulation of potentially harmful material. Each indicator was then assigned a DPSIR category. We collapsed the information in each category using a principal component analysis in order to extract the most relevant pixel-based information provided by each spatial indicator. Finally, we compiled our five categories using an additive indexation process to produce a spatially-explicit index of the WWR. A thorough sensitivity analysis has been performed in order to understand the relationship between the final risk values and the spatial pattern of each category used during the indexation. For comparison purposes, we aggregated index scores by global hydrological regions, or hydrobelts, to get a sense of regional DPSIR specificities. This rather simple method does not necessitate the use of complex physical models and provides a scalable and efficient tool
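
    The final indexation step described above can be sketched simply: normalize each spatial indicator, collapse the indicators within each DPSIR category, and sum the category scores into a per-pixel index. This sketch replaces the study's principal component analysis with a plain average within each category, and the indicator values are hypothetical.

```python
# Additive DPSIR-style indexation: min-max normalize indicators, average
# within each category (a simplification of the study's PCA step), then
# sum category scores per pixel. Indicator data are hypothetical.

def normalize(values):
    """Min-max normalization to [0, 1]; constant series map to 0."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def wwr_index(categories):
    """categories: {name: list of per-pixel indicator series}."""
    n_pixels = len(next(iter(categories.values()))[0])
    index = [0.0] * n_pixels
    for indicators in categories.values():
        norm = [normalize(series) for series in indicators]
        for i in range(n_pixels):
            index[i] += sum(series[i] for series in norm) / len(norm)
    return index
```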

  12. Space Weather Influence on Power Systems: Prediction, Risk Analysis, and Modeling

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy

    2016-04-01

    This report concentrates on dynamic probabilistic risk analysis of optical elements for complex characterization of damages, using a physical model of solid-state lasers and the predictable level of ionizing radiation and space weather. The following main subjects are covered by our report: (a) a solid-state laser model; (b) mathematical models for dynamic probabilistic risk assessment; and (c) software for modeling and prediction of ionizing radiation. A probabilistic risk assessment method for solid-state lasers is presented with consideration of some deterministic and stochastic factors. Probabilistic risk assessment is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in solid-state lasers for the purpose of cost-effectively improving their safety and performance. This method is based on the Conditional Value-at-Risk (CVaR) measure, the expected loss exceeding the Value-at-Risk (VaR). We propose to use a new dynamical-information approach for radiation damage risk assessment of laser elements by cosmic radiation. Our approach includes the following steps: laser modeling, modeling of ionizing radiation influences on laser elements, probabilistic risk assessment methods, and risk minimization. For computer simulation of damage processes at microscopic and macroscopic levels the following methods are used: (a) statistical; (b) dynamical; (c) optimization; (d) acceleration modeling; and (e) mathematical modeling of laser functioning. Mathematical models of space ionizing radiation influence on laser elements were developed for risk assessment in laser safety analysis. This is a so-called "black box" or "input-output" model, which seeks only to reproduce the behaviour of the system's output in response to changes in its inputs. The model inputs are radiation influences on laser systems and the output parameters are dynamical characteristics of the solid-state laser. Algorithms and software for optimal structure and parameters of
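
    The VaR and CVaR measures named in the abstract have a direct empirical form: VaR at level alpha is the alpha-quantile of the loss distribution, and CVaR is the mean of the losses at or beyond it. A minimal sketch on a sampled loss distribution (the sample itself is illustrative, e.g. simulated damage levels):

```python
import math

# Empirical Value-at-Risk and Conditional Value-at-Risk from sampled losses.
# The loss sample is hypothetical.

def var_cvar(losses, alpha=0.95):
    """Return (VaR, CVaR) at confidence level alpha from a loss sample."""
    ordered = sorted(losses)
    idx = max(0, math.ceil(alpha * len(ordered)) - 1)  # alpha-quantile index
    var = ordered[idx]
    tail = ordered[idx:]                               # losses >= VaR
    return var, sum(tail) / len(tail)

losses = list(range(1, 101))  # hypothetical damage sample
var, cvar = var_cvar(losses, alpha=0.95)
```

    CVaR is always at least as large as VaR, which is why it is preferred for risk minimization: it penalizes the severity of the tail, not just its threshold.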

  13. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e., risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux, or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of the events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate (the ratio of thrombosis to bleeding). Results The analysis showed that, with placebo as the reference, ximelagatran was superior to the other options, but the final results were influenced by type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows comparison of multiple competing interventions in the absence of randomized trials with multiple arms, by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
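
    The simulation machinery described in the Methods can be sketched as follows: harm (major bleeding) and benefit (thrombosis averted) probabilities for two options are drawn from beta distributions, and net clinical benefit is computed for a chosen risk-acceptance threshold. All beta parameters below are hypothetical, not the meta-analysis estimates.

```python
import random

# Monte Carlo net clinical benefit: NCB = d_benefit - lam * d_harm, where
# lam is the acceptability threshold (harms accepted per benefit gained).
# Beta parameters are hypothetical illustrations.

def net_clinical_benefit(benefit_ab, harm_ab, ref_benefit_ab, ref_harm_ab,
                         lam=1.0, n=20000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        d_benefit = (rng.betavariate(*benefit_ab)
                     - rng.betavariate(*ref_benefit_ab))
        d_harm = rng.betavariate(*harm_ab) - rng.betavariate(*ref_harm_ab)
        total += d_benefit - lam * d_harm
    return total / n

# Hypothetical: new option averts more thromboses (Beta(30, 70) vs
# Beta(20, 80)) at similar bleeding risk (Beta(5, 95) for both arms).
ncb = net_clinical_benefit((30, 70), (5, 95), (20, 80), (5, 95))
```

    Sweeping `lam` over a plausible range and recording the fraction of replications with positive NCB yields the acceptability curves the paper refers to.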

  14. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (a C-arm X-ray machine) is described.
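
    The classic FMEA risk priority number (RPN) is the baseline the paper's fuzzy/grey refinement improves on: each failure mode is scored by the product of severity, occurrence, and detection ratings. The failure modes and ratings below are hypothetical, and this sketch shows the crisp baseline only, not the fuzzy extension.

```python
# Classic FMEA risk priority number: severity x occurrence x detection,
# each rated on a 1-10 scale. Failure modes and ratings are hypothetical.

def rpn(severity, occurrence, detection):
    """Risk priority number from three 1-10 ratings."""
    return severity * occurrence * detection

failure_modes = {
    "foot-switch misactivation": (7, 4, 3),
    "display mislabeling":       (6, 2, 5),
}
ranked = sorted(failure_modes.items(),
                key=lambda kv: rpn(*kv[1]), reverse=True)
```

    The fuzzy/grey variant addresses a known weakness of this product: different rating triples can yield identical RPNs despite very different risk profiles.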

  15. Stratospheric Aerosol and Gas Experiment, SAGE III on ISS, An Earth Science Mission on the International Space Station, Schedule Risk Analysis, A Project Perspective

    NASA Technical Reports Server (NTRS)

    Bonine, Lauren

    2015-01-01

    The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.

  16. Maternal migration and autism risk: systematic analysis.

    PubMed

    Crafa, Daina; Warfa, Nasir

    2015-02-01

    Autism (AUT) is one of the most prevalent developmental disorders emerging during childhood, and can be amongst the most incapacitating mental disorders. Some individuals with AUT require a lifetime of supervised care. Autism Speaks reported estimated costs for 2012 at £34 billion in the UK; and $3.2 million-$126 billion in the US, Australia and Canada. Ethnicity and migration experiences appear to increase risks of AUT and relate to underlying biological risk factors. Sociobiological stress factors can affect the uterine environment, or relate to stress-induced epigenetic changes during pregnancy and delivery. Epigenetic risk factors associated with AUT also include poor pregnancy conditions, low birth weight, and congenital malformation. Recent studies report that children from migrant communities are at higher risk of AUT than children born to non-migrant mothers, with the exception of Hispanic children. This paper provides the first systematic review into prevalence and predictors of AUT with a particular focus on maternal migration stressors and epigenetic risk factors. AUT rates appear higher in certain migrant communities, potentially relating to epigenetic changes after stressful experiences. Although AUT remains a rare disorder, failures to recognize its public health urgency and local community needs continue to leave certain cultural groups at a disadvantage.

  17. Risk factors for retained surgical items: a meta-analysis and proposed risk stratification system.

    PubMed

    Moffatt-Bruce, Susan D; Cook, Charles H; Steinberg, Steven M; Stawicki, Stanislaw P

    2014-08-01

    Retained surgical items (RSI) are designated as completely preventable "never events". Despite numerous case reports, clinical series, and expert opinions few studies provide quantitative insight into RSI risk factors and their relative contributions to the overall RSI risk profile. Existing case-control studies lack the ability to reliably detect clinically important differences within the long list of proposed risks. This meta-analysis examines the best available data for RSI risk factors, seeking to provide a clinically relevant risk stratification system. Nineteen candidate studies were considered for this meta-analysis. Three retrospective, case-control studies of RSI-related risk factors contained suitable group comparisons between patients with and without RSI, thus qualifying for further analysis. Comprehensive Meta-Analysis 2.0 (BioStat, Inc, Englewood, NJ) software was used to analyze the following "common factor" variables compiled from the above studies: body-mass index, emergency procedure, estimated operative blood loss >500 mL, incorrect surgical count, lack of surgical count, >1 subprocedure, >1 surgical team, nursing staff shift change, operation "afterhours" (i.e., between 5 PM and 7 AM), operative time, trainee presence, and unexpected intraoperative factors. We further stratified resulting RSI risk factors into low, intermediate, and high risk. Despite the fact that only between three and six risk factors were associated with increased RSI risk across the three studies, our analysis of pooled data demonstrates that seven risk factors are significantly associated with increased RSI risk. Variables found to elevate the RSI risk include intraoperative blood loss >500 mL (odds ratio [OR] 1.6); duration of operation (OR 1.7); >1 subprocedure (OR 2.1); lack of surgical counts (OR 2.5); >1 surgical team (OR 3.0); unexpected intraoperative factors (OR 3.4); and incorrect surgical count (OR 6.1). Changes in nursing staff, emergency surgery, body
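
    The pooled odds ratios quoted in the abstract lend themselves to a simple three-tier stratification. The OR values below are taken from the abstract itself; the cut points (<2, 2-3, >3) are illustrative assumptions, not the paper's published scheme.

```python
# Three-tier stratification of pooled RSI odds ratios (values from the
# abstract above); the low/high cut points are assumed for illustration.

RSI_ODDS_RATIOS = {
    "blood loss > 500 mL": 1.6,
    "long operative duration": 1.7,
    "> 1 subprocedure": 2.1,
    "no surgical count": 2.5,
    "> 1 surgical team": 3.0,
    "unexpected intraoperative factors": 3.4,
    "incorrect surgical count": 6.1,
}

def stratify(or_value, low=2.0, high=3.0):
    """Assign an odds ratio to a low / intermediate / high risk tier."""
    if or_value < low:
        return "low"
    return "intermediate" if or_value <= high else "high"

tiers = {factor: stratify(orv) for factor, orv in RSI_ODDS_RATIOS.items()}
```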

  18. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
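
    The classical least squares (CLS) step the patent builds on solves the normal equations (K^T K) c = K^T s, where the columns of K are pure-component spectra and s is the measured mixture spectrum. A two-component sketch keeps the system hand-solvable; the spectra below are synthetic, and the hybrid inverse-calibration stage is not shown.

```python
# Classical least squares for a two-component mixture via the 2x2 normal
# equations. Pure-component spectra k1, k2 and the mixture are synthetic.

def cls_two_component(k1, k2, s):
    """Solve (K^T K) c = K^T s for two component amounts."""
    a = sum(x * x for x in k1)
    b = sum(x * y for x, y in zip(k1, k2))
    d = sum(y * y for y in k2)
    p = sum(x * v for x, v in zip(k1, s))
    q = sum(y * v for y, v in zip(k2, s))
    det = a * d - b * b
    return (d * p - b * q) / det, (a * q - b * p) / det

k1 = [1.0, 0.5, 0.0, 0.2]                     # synthetic pure spectrum 1
k2 = [0.1, 0.4, 1.0, 0.3]                     # synthetic pure spectrum 2
s = [2 * x + 3 * y for x, y in zip(k1, k2)]   # noiseless 2:3 mixture
c1, c2 = cls_two_component(k1, k2, s)
```

    In the hybrid scheme, spectral shapes absent from this calibration (temperature drift, instrument shifts) would be appended as extra columns of K in a subsequent inverse-calibration step.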

  19. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  20. Why Map Issues? On Controversy Analysis as a Digital Method.

    PubMed

    Marres, Noortje

    2015-09-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital "move beyond impartiality." I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter.

  1. Risk analysis: divergent models and convergent interpretations

    NASA Technical Reports Server (NTRS)

    Carnes, B. A.; Gavrilova, N.

    2001-01-01

    Material presented at a NASA-sponsored workshop on risk models for exposure conditions relevant to prolonged space flight is described in this paper. Analyses used mortality data from experiments conducted at Argonne National Laboratory on the long-term effects of external whole-body irradiation on B6CF1 mice by 60Co gamma rays and fission neutrons delivered as a single exposure or protracted over either 24 or 60 once-weekly exposures. The maximum dose considered was restricted to 1 Gy for neutrons and 10 Gy for gamma rays. Proportional hazard models were used to investigate the shape of the dose response at these lower doses for deaths caused by solid-tissue tumors and tumors of either connective or epithelial tissue origin. For protracted exposures, a significant mortality effect was detected at a neutron dose of 14 cGy and a gamma-ray dose of 3 Gy. For single exposures, radiation-induced mortality for neutrons also occurred within the range of 10-20 cGy, but dropped to 86 cGy for gamma rays. Plots of risk relative to control estimated for each observed dose gave a visual impression of nonlinearity for both neutrons and gamma rays. At least for solid-tissue tumors, male and female mortality was nearly identical for gamma-ray exposures, but mortality risks for females were higher than for males for neutron exposures. As expected, protracting the gamma-ray dose reduced mortality risks. Although curvature consistent with that observed visually could be detected by a model parameterized to detect curvature, a relative risk term containing only a simple term for total dose was usually sufficient to describe the dose response. Although detectable mortality for the three pathology end points considered typically occurred at the same level of dose, the highest risks were almost always associated with deaths caused by tumors of epithelial tissue origin.

  3. Obesity and Risk of Thyroid Cancer: Evidence from a Meta-Analysis of 21 Observational Studies

    PubMed Central

    Ma, Jie; Huang, Min; Wang, Li; Ye, Wei; Tong, Yan; Wang, Hanmin

    2015-01-01

    Background: Several studies have evaluated the association between obesity and thyroid cancer risk. However, the results remain uncertain. In this study, we conducted a meta-analysis to assess the association between obesity and thyroid cancer risk. Material/Methods: Published literature from PubMed, EMBASE, Springer Link, Ovid, Chinese Wanfang Data Knowledge Service Platform, Chinese National Knowledge Infrastructure (CNKI), and Chinese Biology Medicine (CBM) were retrieved before 10 August 2014. We included all studies that reported adjusted risk ratios (RRs), hazard ratios (HRs) or odds ratios (ORs), and 95% confidence intervals (CIs) of thyroid cancer risk. Results: Thirty-two studies (n=12 620 676) were included in this meta-analysis. Obesity was associated with a significantly increased risk of thyroid cancer (adjusted RR=1.33; 95% CI, 1.24–1.42; I2=25%). In the subgroup analysis by study type, increased risk of thyroid cancer was found in cohort studies and case-control studies. In subgroup analysis by sex, both obese men and women were at significantly greater risk of thyroid cancer than non-obese subjects. When stratified by ethnicity, significantly elevated risk was observed in Caucasians and in Asians. In the age subgroup analysis, both young and old populations showed increased thyroid cancer risk. Subgroup analysis on smoking status showed that increased thyroid cancer risks were found in smokers and in non-smokers. In the histology subgroup analyses, increased risks of papillary thyroid cancer, follicular thyroid cancer, and anaplastic thyroid cancer were observed. However, obesity was associated with decreased risk of medullary thyroid cancer. Conclusions: Our results indicate that obesity is associated with an increased thyroid cancer risk, except medullary thyroid cancer. PMID:25612155
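
    Pooled relative risks such as those reported above come from standard meta-analytic pooling of per-study estimates. A minimal sketch of fixed-effect (inverse-variance) pooling of relative risks on the log scale follows; the study values are hypothetical, not the actual estimates from the studies in the record.

```python
import math

# Hypothetical per-study relative risks with 95% CI bounds (rr, lower, upper).
studies = [(1.25, 1.05, 1.49), (1.40, 1.10, 1.78), (1.30, 1.08, 1.56)]

# Convert each RR and CI to a log-RR and its standard error:
# se = (ln(upper) - ln(lower)) / (2 * 1.96)
log_rr = [math.log(rr) for rr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]

# Fixed-effect (inverse-variance) pooling on the log scale.
w = [1.0 / s ** 2 for s in se]
pooled_log = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
pooled_se = math.sqrt(1.0 / sum(w))

pooled_rr = math.exp(pooled_log)
ci = (math.exp(pooled_log - 1.96 * pooled_se),
      math.exp(pooled_log + 1.96 * pooled_se))
print(f"pooled RR = {pooled_rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

    A random-effects model, which such meta-analyses usually report when heterogeneity is present, adds a between-study variance term to each study's weight but otherwise follows the same pooling logic.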

  4. Integrated Hybrid System Architecture for Risk Analysis

    NASA Technical Reports Server (NTRS)

    Moynihan, Gary P.; Fonseca, Daniel J.; Ray, Paul S.

    2010-01-01

    A conceptual design of an expert-system computer program, intended for use as a project-management tool, has been announced, along with the development of a prototype of the program. The program integrates schedule and risk data for the purpose of determining the schedule implications of safety risks and, somewhat conversely, the effects of schedule changes on safety. It is noted that the design has been delivered to a NASA client and that it is planned to disclose the design in a conference presentation.

  5. Handbook of methods for risk-based analyses of technical specifications

    SciTech Connect

    Samanta, P.K.; Kim, I.S.; Mankamo, T.; Vesely, W.E.

    1994-12-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements are based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while a few may not be conducive to safety. The US Nuclear Regulatory Commission (USNRC) Office of Research has sponsored research to develop systematic risk-based methods to improve various aspects of TS requirements. This handbook summarizes these risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), scheduled or preventive maintenance, action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), and management of plant configurations resulting from outages of systems or components. For each topic, the handbook summarizes analytic methods with data needs, outlines the insights to be gained, lists additional references, and gives examples of evaluations.

  6. Comparison of 3 methods for identifying dietary patterns associated with risk of disease.

    PubMed

    DiBello, Julia R; Kraft, Peter; McGarvey, Stephen T; Goldberg, Robert; Campos, Hannia; Baylin, Ana

    2008-12-15

    Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994-2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%-46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals.
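
    The PCA route to dietary patterns described above can be sketched in a few lines: the first principal component's loadings define a "pattern", and each subject's score on it can then be related to disease risk, e.g. in a logistic regression. The food groups, loadings, and simulated intake data below are illustrative, not the Costa Rican data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated food-frequency data: rows are subjects, columns are food groups.
foods = ["chicken", "vegetables", "fruit", "coffee", "palm_oil"]
latent = rng.normal(size=(200, 1))                 # one underlying eating habit
loadings = np.array([[0.8, 0.9, 0.7, -0.2, -0.3]])
X = latent @ loadings + 0.3 * rng.normal(size=(200, 5))

# PCA via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

pc1 = Vt[0]            # factor loadings of the first dietary pattern
scores = Xc @ pc1      # each subject's score on that pattern
print(dict(zip(foods, np.round(pc1, 2))))
```

    Reduced rank regression and PLS differ in that they derive the components using the response (or intermediate biomarkers) rather than the food-intake variance alone, which is why the methods can surface different patterns.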

  7. Flood Risk Assessment Based On Security Deficit Analysis

    NASA Astrophysics Data System (ADS)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the

  8. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
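
    The pairing of a quantified risk reduction with cost-effectiveness analysis can be sketched as a toy comparison: each alternative's cost-effectiveness ratio (CER) is its annual cost divided by the risk reduction it buys. The measures, costs, and risk levels below are hypothetical, not the paper's case study.

```python
# Hypothetical risk-reduction alternatives for a drinking water system.
baseline_risk = 0.12  # yearly probability of failing the water safety target

alternatives = {
    "UV disinfection":      {"annual_cost": 150_000, "risk_after": 0.04},
    "extra chlorination":   {"annual_cost": 60_000,  "risk_after": 0.08},
    "new raw-water intake": {"annual_cost": 400_000, "risk_after": 0.02},
}

# Cost-effectiveness ratio: annual cost per unit of risk reduced.
cers = {}
for name, spec in alternatives.items():
    risk_reduction = baseline_risk - spec["risk_after"]
    cers[name] = spec["annual_cost"] / risk_reduction
    print(f"{name:22s} reduction = {risk_reduction:.2f}  CER = {cers[name]:,.0f}")

best = min(cers, key=cers.get)
print("most cost-effective:", best)
```

    Note that the lowest CER does not by itself decide the choice: as the abstract stresses, whether the residual risk still violates the water safety target must be checked alongside cost-effectiveness.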

  9. Fire risk analysis for a chemical agent disposal facility

    SciTech Connect

    Chang, J.; Ho, V.; Douthat, D.

    1992-01-01

    The US Department of Defense (DOD) was directed by the Congress in the DOD Authorization Act of 1986 to destroy the nation's stockpile of lethal unitary chemical warfare agents and munitions. The stockpile consists of nerve agents and a blister agent in bulk storage containers, bombs, rockets, mines, projectiles, and mortar rounds stored at eight locations in the continental US and at Johnston Atoll in the Pacific Ocean. The chemical agent disposal facility is designed to destroy the agents safely. Serious fires in the facility can cause munition explosions, major equipment damages, and the damage of safety control systems whose functions are crucial in preventing agent release. A fire risk assessment is conducted to investigate frequencies, consequences, and mitigation methods of fires to enhance the design safety features of the agent disposal facility. This paper describes the fire risk analysis (FRA) performed in the system hazard analysis task for the facility and also presents highlights of the FRA results. Application can be made to the nuclear industry.

  10. Performance of risk prediction for inflammatory bowel disease based on genotyping platform and genomic risk score method.

    PubMed

    Chen, Guo-Bo; Lee, Sang Hong; Montgomery, Grant W; Wray, Naomi R; Visscher, Peter M; Gearry, Richard B; Lawrance, Ian C; Andrews, Jane M; Bampton, Peter; Mahy, Gillian; Bell, Sally; Walsh, Alissa; Connor, Susan; Sparrow, Miles; Bowdler, Lisa M; Simms, Lisa A; Krishnaprasad, Krupa; Radford-Smith, Graham L; Moser, Gerhard

    2017-08-29

    Predicting risk of disease from genotypes is being increasingly proposed for a variety of diagnostic and prognostic purposes. Genome-wide association studies (GWAS) have identified a large number of genome-wide significant susceptibility loci for Crohn's disease (CD) and ulcerative colitis (UC), two subtypes of inflammatory bowel disease (IBD). Recent studies have demonstrated that including only loci that are significantly associated with disease in the prediction model has low predictive power and that power can substantially be improved using a polygenic approach. We performed a comprehensive analysis of risk prediction models using large case-control cohorts genotyped for 909,763 GWAS SNPs or 123,437 SNPs on the custom designed Immunochip using four prediction methods (polygenic score, best linear genomic prediction, elastic-net regularization and a Bayesian mixture model). We used the area under the curve (AUC) to assess prediction performance for discovery populations with different sample sizes and number of SNPs within cross-validation. On average, the Bayesian mixture approach had the best prediction performance. Using cross-validation we found little differences in prediction performance between GWAS and Immunochip, despite the GWAS array providing a 10 times larger effective genome-wide coverage. The prediction performance using Immunochip is largely due to the power of the initial GWAS for its marker selection and its low cost that enabled larger sample sizes. The predictive ability of the genomic risk score based on Immunochip was replicated in external data, with AUC of 0.75 for CD and 0.70 for UC. CD patients with higher risk scores demonstrated clinical characteristics typically associated with a more severe disease course including ileal location and earlier age at diagnosis. Our analyses demonstrate that the power of genomic risk prediction for IBD is mainly due to strongly associated SNPs with considerable effect sizes. 
Additional SNPs that are
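
    A polygenic (genomic) risk score of the kind evaluated above is, at its simplest, a weighted sum of risk-allele counts with GWAS effect sizes as weights, and prediction performance is summarized by the AUC. A simulated sketch follows; the effect sizes, genotypes, and case status are all synthetic, and this is the plain polygenic-score method, not the Bayesian mixture model the paper found best.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated per-SNP log odds ratios and 0/1/2 risk-allele counts.
n_snps, n_people = 1_000, 500
beta = rng.normal(0.0, 0.05, size=n_snps)
genotypes = rng.integers(0, 3, size=(n_people, n_snps))

# Polygenic score: weighted sum of risk-allele counts per person.
scores = genotypes @ beta

# Simulate case/control status from a noisy liability, then compute the AUC
# via the rank (Mann-Whitney) formulation.
liability = scores + rng.normal(0.0, np.std(scores), size=n_people)
cases = liability > np.median(liability)

ranks = scores.argsort().argsort() + 1            # 1-based ranks of the scores
n1, n0 = cases.sum(), (~cases).sum()
auc = (ranks[cases].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
print(f"AUC = {auc:.2f}")
```

    In real applications the weights come from an independent discovery GWAS and the AUC is assessed in an external cohort, as the paper does with its Immunochip replication.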

  11. Risk and value analysis of SETI

    NASA Technical Reports Server (NTRS)

    Billingham, J.

    1986-01-01

    The risks, values, and costs of the SETI project are evaluated and compared with those of the Viking project. Examination of the scientific values, side benefits, and costs of the two projects reveals that both projects provide equal benefits at equal costs. The probability of scientific and technical success is analyzed.

  12. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J.; Schilling, Chris; Small, Gerald J.; Tomasik, Piotr

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides for analyzing the density of a ceramic by exciting a component on a surface/subsurface of the ceramic through exposure to excitation energy. The method may further include the step of obtaining a measurement of the energy emitted from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  13. Bioanalytical methods for food contaminant analysis.

    PubMed

    Van Emon, Jeanette M

    2010-01-01

    Foods are complex mixtures of lipids, carbohydrates, proteins, vitamins, organic compounds, and other naturally occurring substances. Sometimes added to this mixture are residues of pesticides, veterinary and human drugs, microbial toxins, preservatives, contaminants from food processing and packaging, and other residues. This milieu of compounds can pose difficulties in the analysis of food contaminants. There is an expanding need for rapid and cost-effective residue methods for difficult food matrixes to safeguard our food supply. Bioanalytical methods are established for many food contaminants such as mycotoxins and are the method of choice for many food allergens. Bioanalytical methods are often more cost-effective and sensitive than instrumental procedures. Recent developments in bioanalytical methods may provide more applications for their use in food analysis.

  14. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collecting. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.
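
    The frequency and consequence estimates described above combine into societal risk measures in the usual way: an expected fatality rate, and an F-N style cumulative frequency of events with N or more fatalities. A toy sketch follows; the scenario list, frequencies, and fatality counts are hypothetical, not the Port of Barcelona figures.

```python
# Hypothetical accident scenarios: (name, frequency per year, fatalities).
scenarios = [
    ("loading-arm minor spill",    1.2e-2,  0),
    ("loading-arm massive spill",  3.0e-4,  2),
    ("tank rupture at berth",      5.0e-6, 15),
    ("collision while navigating", 8.0e-5,  6),
]

# Societal risk: expected number of fatalities per year, summed over scenarios.
expected_fatalities = sum(freq * fat for _, freq, fat in scenarios)
print(f"expected fatalities per year = {expected_fatalities:.2e}")

# F-N style measure: cumulative frequency of events with N or more fatalities.
def freq_of_n_or_more(n):
    return sum(freq for _, freq, fat in scenarios if fat >= n)

print(f"F(N >= 1) = {freq_of_n_or_more(1):.2e} per year")
```

    Plotting freq_of_n_or_more against N on log-log axes gives the F-N curve commonly compared against regulatory acceptability lines in port QRA studies.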

  15. Analysis of Affordance, Time, and Adaptation in the Assessment of Industrial Control System Cybersecurity Risk.

    PubMed

    Busby, J S; Green, B; Hutchison, D

    2017-01-17

    Industrial control systems increasingly use standard communication protocols and are increasingly connected to public networks-creating substantial cybersecurity risks, especially when used in critical infrastructures such as electricity and water distribution systems. Methods of assessing risk in such systems have recognized for some time the way in which the strategies of potential adversaries and risk managers interact in defining the risk to which such systems are exposed. But it is also important to consider the adaptations of the systems' operators and other legitimate users to risk controls, adaptations that often appear to undermine these controls, or shift the risk from one part of a system to another. Unlike the case with adversarial risk analysis, the adaptations of system users are typically orthogonal to the objective of minimizing or maximizing risk in the system. We argue that this need to analyze potential adaptations to risk controls is true for risk problems more generally, and we develop a framework for incorporating such adaptations into an assessment process. The method is based on the principle of affordances, and we show how this can be incorporated in an iterative procedure based on raising the minimum period of risk materialization above some threshold. We apply the method in a case study of a small European utility provider and discuss the observations arising from this.

  16. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    SciTech Connect

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.; Helms, Jovana; Imbro, Dennis Raymond; Sumner, Matthew C.

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  17. Analysis to support allergen risk management: Which way to go?

    PubMed

    Cucu, Tatiana; Jacxsens, Liesbeth; De Meulenaer, Bruno

    2013-06-19

    Food allergy represents an important food safety issue because of the potential lethal effects; the only effective treatment is the complete removal of the allergen involved from the diet. However, due to the growing complexity of food formulations and food processing, foods may be unintentionally contaminated via allergen-containing ingredients or cross-contamination. This affects not only consumers' well-being but also food producers and competent authorities involved in inspecting and auditing food companies. To address these issues, the food industry and control agencies rely on available analytical methods to quantify the amount of a particular allergenic commodity in a food and thus to decide upon its safety. However, no "gold standard" methods exist for the quantitative detection of food allergens. Nowadays mostly receptor-based methods, and in particular commercial kits, are used in routine analysis. However, upon evaluation of their performance, commercial assays often proved to be unreliable in processed foods, which is attributed to chemical changes in proteins that affect molecular recognition by the receptor used. Unfortunately, the analytical outcome of other methods, among which are chromatographic techniques combined with mass spectrometry as well as DNA-based methods, seems to be affected in a comparable way by food processing. Several strategies can be employed to improve the quantitative analysis of allergens in foods. Nevertheless, issues related to extractability and matrix effects remain a permanent challenge. In view of the presented results, it is clear that the food industry needs to continue to make extra efforts to provide accurate labeling and to reduce contamination with allergens to an acceptable level through allergen risk management at the company level, which inevitably needs to be supported by a tailored, validated extraction and detection method.

  18. Network analysis of wildfire transmission and implications for risk governance

    PubMed Central

    Ager, Alan A.; Evers, Cody R.; Day, Michelle A.; Preisler, Haiganoush K.; Barros, Ana M. G.; Nielsen-Pincus, Max

    2017-01-01

    We characterized wildfire transmission and exposure within a matrix of large land tenures (federal, state, and private) surrounding 56 communities within a 3.3 million ha fire prone region of central Oregon US. Wildfire simulation and network analysis were used to quantify the exchange of fire among land tenures and communities and analyze the relative contributions of human versus natural ignitions to wildfire exposure. Among the land tenures examined, the area burned by incoming fires averaged 57% of the total burned area. Community exposure from incoming fires ignited on surrounding land tenures accounted for 67% of the total area burned. The number of land tenures contributing wildfire to individual communities and surrounding wildland urban interface (WUI) varied from 3 to 20. Community firesheds, i.e. the area where ignitions can spawn fires that can burn into the WUI, covered 4