Sample records for risk analysis method

  1. The development of a 3D risk analysis method.

    PubMed

    I, Yet-Pole; Cheng, Te-Lung

    2008-05-01

    Much attention has been paid to the quantitative risk analysis (QRA) research in recent years due to more and more severe disasters that have happened in the process industries. Owing to its calculation complexity, very few software, such as SAFETI, can really make the risk presentation meet the practice requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions and concentration fluctuations, which is quite different from the real situations of a chemical process plant. All these models usually over-predict the hazardous regions in order to maintain their conservativeness, which also increases the uncertainty of the simulation results. On the other hand, a more rigorous model such as the computational fluid dynamics (CFD) model can resolve the previous limitations; however, it cannot resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed via the combination of results of a series of CFD simulations with some post-processing procedures to obtain the 3D individual risk iso-surfaces. It is believed that such technique will not only be limited to risk analysis at ground level, but also be extended into aerial, submarine, or space risk analyses in the near future.

  2. Risk Analysis Methods for Deepwater Port Oil Transfer Systems

    DOT National Transportation Integrated Search

    1976-06-01

    This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...

  3. A comparative critical study between FMEA and FTA risk analysis methods

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Constantinescu, DM

    2017-10-01

    Today an overwhelming number of different risk analysis techniques is in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the analysis techniques most used in the mechanical and electrical industries are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done; as a consequence, possible failure modes may not be identified. To address these shortcomings, a combination of FTA and FMEA is proposed.
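    As a minimal illustration of the RPN prioritization described in this record, the Python sketch below ranks a few invented failure modes using the usual 1-10 scales for severity (S), occurrence (O), and detection (D); the failure modes and scores are hypothetical, not from the paper.

      # Minimal RPN ranking sketch; S, O, D on 1-10 scales, RPN = S * O * D in [1, 1000].
      failure_modes = {
          "seal leak":    {"S": 7, "O": 4, "D": 5},
          "sensor drift": {"S": 5, "O": 6, "D": 7},
          "weld crack":   {"S": 9, "O": 2, "D": 6},
      }

      ranked = sorted(failure_modes.items(),
                      key=lambda kv: kv[1]["S"] * kv[1]["O"] * kv[1]["D"],
                      reverse=True)

      for name, f in ranked:
          print(f"{name:12s} RPN = {f['S'] * f['O'] * f['D']}")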

  4. Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method

    PubMed Central

    Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan

    2018-01-01

    Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of a dam failure and the resulting life loss. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The life loss associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to one reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. The risk analysis of dam failure is of essential significance for reducing the dam failure probability and improving the level of dam risk management. PMID:29710824

  5. Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions

    NASA Astrophysics Data System (ADS)

    Xie, Jigang; Song, Wenyun

    The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example of the risk analysis of Chinese listed firms' mergers and acquisitions is provided to demonstrate the feasibility and practicality of the method.
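    The record gives no computational detail; purely as an illustration of the maximum-entropy principle it invokes, the sketch below recovers the least-biased discrete distribution over a set of invented acquisition-loss levels when only the mean loss is assumed known (all values are hypothetical).

      # Maximum-entropy sketch: least-biased distribution consistent with one known moment.
      import numpy as np
      from scipy.optimize import brentq

      outcomes = np.array([0.0, 1.0, 2.0, 5.0, 10.0])   # hypothetical loss levels
      target_mean = 3.0                                  # the only information assumed known

      def mean_given_lambda(lam):
          # the ME solution has exponential-family form p_i proportional to exp(lam * x_i)
          w = np.exp(lam * outcomes)
          p = w / w.sum()
          return (p * outcomes).sum()

      lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)
      w = np.exp(lam * outcomes)
      p = w / w.sum()
      print("max-entropy probabilities:", np.round(p, 4))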

  6. Failure mode and effects analysis: a comparison of two common risk prioritisation methods.

    PubMed

    McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L

    2016-05-01

    Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
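    The congruence measure used in this study (the share of failures rated critical by the traditional RPN >= 300 cut-off that are also rated 'high' by the simplified method) can be expressed in a few lines; the scores below are invented for illustration, not the study's data.

      # Congruence between traditional (RPN >= 300) and simplified ("high") risk ratings.
      failures = [
          {"id": 1, "rpn": 504, "simplified": "high"},
          {"id": 2, "rpn": 320, "simplified": "high"},
          {"id": 3, "rpn": 315, "simplified": "medium"},
          {"id": 4, "rpn": 120, "simplified": "low"},
      ]

      critical = [f for f in failures if f["rpn"] >= 300]
      agree = [f for f in critical if f["simplified"] == "high"]
      print(f"congruence: {100 * len(agree) / len(critical):.0f}% "
            f"({len(agree)} of {len(critical)} critical failures rated high)")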

  7. The integration methods of fuzzy fault mode and effect analysis and fault tree analysis for risk analysis of yogurt production

    NASA Astrophysics Data System (ADS)

    Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita

    2017-05-01

    Yogurt is a milk-based product with beneficial effects for health. The production process for yogurt is very susceptible to failure because it involves bacteria and fermentation. For an industry, the risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify the risks in detail, prevent them, and determine their handling, so that the risks can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results showed that there are 6 risks arising from equipment variables, raw material variables, and process variables. The critical risk among them is a lack of an aseptic process, more specifically the starter yogurt being damaged by contamination with fungus or other bacteria, and a lack of sanitation equipment. The quantitative FTA showed that the highest probability is that of a lack of an aseptic process, with a risk of 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment, controlling the yogurt starter, and improving production planning and sanitation equipment using hot water immersion.
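    The record does not reproduce the fault tree itself, but the quantitative FTA step it describes (combining basic-event probabilities through logic gates into a top-event probability) can be sketched as follows; the gate structure and probabilities are invented for illustration.

      # Toy fault tree: top event "non-aseptic process" = (contamination source AND barrier
      # failure) OR sanitation failure. Basic events assumed independent; numbers invented.
      def and_gate(*probs):
          # P(all independent basic events occur)
          p = 1.0
          for q in probs:
              p *= q
          return p

      def or_gate(*probs):
          # P(at least one independent basic event occurs) = 1 - prod(1 - q)
          p = 1.0
          for q in probs:
              p *= (1.0 - q)
          return 1.0 - p

      p_contamination = and_gate(0.08, 0.30)      # source present AND barrier fails
      p_top = or_gate(p_contamination, 0.02)      # contamination OR sanitation failure
      print(f"P(non-aseptic process) = {p_top:.4%}")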

  8. Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.

    PubMed

    Price, Bertram; MacNicoll, Michael

    2015-05-01

    A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. © 2015 Society for Risk Analysis.
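    The game-theoretic allocation referred to here is the Shapley value of a cooperative game whose characteristic function gives the excess risk attributable to each subset of risk factors; with two factors it splits the interaction term equally, as the abstract notes. A small sketch with invented numbers (the factor names and risk values are hypothetical):

      # Shapley-value apportionment of total excess risk among interacting risk factors.
      from itertools import combinations
      from math import factorial

      factors = ["smoking", "asbestos"]
      v = {                                          # hypothetical excess-risk function v(S)
          frozenset(): 0.0,
          frozenset({"smoking"}): 10.0,
          frozenset({"asbestos"}): 5.0,
          frozenset({"smoking", "asbestos"}): 50.0,  # super-additive: strong interaction
      }

      def shapley(factor):
          n = len(factors)
          others = [f for f in factors if f != factor]
          total = 0.0
          for r in range(len(others) + 1):
              for s in combinations(others, r):
                  weight = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                  total += weight * (v[frozenset(s) | {factor}] - v[frozenset(s)])
          return total

      for f in factors:
          print(f"{f}: {shapley(f):.1f}")   # smoking 27.5, asbestos 22.5: interaction of 35 split equally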

  9. A comparison of two prospective risk analysis methods: Traditional FMEA and a modified healthcare FMEA.

    PubMed

    Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya

    2016-12-01

    To examine the abilities of traditional failure mode and effects analysis (FMEA) and modified healthcare FMEA (m-HFMEA) scoring methods by comparing the degree of congruence in identifying high-risk failures. The authors applied the two prospective quality management methods to surface image-guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine whether the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, when the risk score is determined, the basis of the established HFMEA Decision Tree™ or the failure mode should be more thoroughly investigated. m-HFMEA is inductive because it requires the identification of the consequences from causes, and semi-quantitative since it allows the prioritization of high risks and mitigation measures. It is therefore a useful tool for prospective risk analysis in radiotherapy.

  10. Probabilistic Scenario-based Seismic Risk Analysis for Critical Infrastructures Method and Application for a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2006-12-01

    Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures like dams, transport infrastructures, chemical plants and nuclear power plants. For many applications besides the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for a probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods like the MCE methodology. The input data required for the method are entirely based on the information that is necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantage of the method in comparison with traditional PSHA consists in (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion to formulate different risk goals. The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its
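    Steps (2) and (4) of this record combine into a simple risk aggregation: the annual rate of exceeding a critical design parameter is the sum over magnitude classes of the class frequency times the conditional exceedance probability, and integrating over the structure's lifetime under a Poisson occurrence assumption gives a lifetime risk. The sketch below uses invented frequencies and probabilities purely for illustration.

      # Class-wise seismic risk aggregation over a plant lifetime (all numbers invented).
      import math

      classes = [
          # (magnitude class, annual frequency of events, P(exceed design | event))
          ("M5.0-5.9", 1.0e-1, 0.001),
          ("M6.0-6.9", 1.0e-2, 0.02),
          ("M7.0+",    1.0e-3, 0.20),
      ]

      lifetime_years = 40
      annual_rate = sum(freq * p_exceed for _, freq, p_exceed in classes)
      p_lifetime = 1.0 - math.exp(-annual_rate * lifetime_years)   # Poisson occurrence model

      print(f"annual rate of exceeding the design parameter: {annual_rate:.2e} / yr")
      print(f"P(exceedance within {lifetime_years}-year lifetime): {p_lifetime:.2%}")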

  11. RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carson, Susan D.; Hunter, Regina L.; Link, Madison D.

    RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.

  12. Probabilistic risk analysis and terrorism risk.

    PubMed

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  13. Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins

    DOEpatents

    Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.

    2010-12-14

    A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.

  14. Stochastic Drought Risk Analysis and Projection Methods For Thermoelectric Power Systems

    NASA Astrophysics Data System (ADS)

    Bekera, Behailu Belamo

    Combined effects of socio-economic, environmental, technological and political factors impact fresh cooling water availability, which is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. This study models and analyzes drought characteristics from a thermoelectric system operation and regulation perspective. A systematic approach to characterize a stream environment in relation to extreme drought occurrence, duration and deficit-volume is proposed and demonstrated. More specifically, the objective of this research is to propose stochastic water-supply risk analysis and projection methods from a thermoelectric power system operation and management perspective. The study defines thermoelectric drought as a shortage of cooling water, due to stressed supply or water temperatures beyond operable limits, for an extended period of time, requiring power plants to reduce production or completely shut down. It presents a thermoelectric drought risk characterization framework that considers the heat content and water quantity facets of adequate water availability for uninterrupted operation of such plants and the safety of their surroundings. In addition, it outlines mechanisms to identify the rate of occurrence of such droughts and to stochastically quantify the subsequent potential losses to the sector. This mechanism is enabled through a model based on a compound Nonhomogeneous Poisson Process. This study also demonstrates how
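    The compound nonhomogeneous Poisson model named in the record can be sketched by simulating drought events with a time-varying intensity (here via thinning) and attaching a random loss to each event; the intensity shape and loss distribution below are invented for illustration, not the study's fitted model.

      # Compound NHPP sketch: seasonal event intensity (thinning) + random loss per event.
      import numpy as np

      rng = np.random.default_rng(0)

      def intensity(t_days):
          # hypothetical seasonal intensity (events/day), peaking in summer
          return 0.01 * (1.0 + np.sin(2 * np.pi * (t_days - 200) / 365.0))

      lam_max = 0.02          # upper bound on the intensity, required for thinning

      def simulate_annual_loss():
          t, loss = 0.0, 0.0
          while True:
              t += rng.exponential(1.0 / lam_max)
              if t > 365:
                  break
              if rng.uniform() < intensity(t) / lam_max:      # accept with prob lambda(t)/lam_max
                  loss += rng.lognormal(mean=1.0, sigma=0.8)  # invented loss per drought event
          return loss

      losses = np.array([simulate_annual_loss() for _ in range(10_000)])
      print(f"mean annual loss: {losses.mean():.1f}, 99th percentile: {np.percentile(losses, 99):.1f}")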

  15. Is risk analysis scientific?

    PubMed

    Hansson, Sven Ove; Aven, Terje

    2014-07-01

    This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers five elements: evidence, the knowledge base, broad risk evaluation, managerial review and judgment, and the decision; the model relates these elements to the domains of experts and decisionmakers, and to the fact-based and value-based domains. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.

  16. An improved method for risk evaluation in failure modes and effects analysis of CNC lathe

    NASA Astrophysics Data System (ADS)

    Rachieru, N.; Belu, N.; Anghel, D. C.

    2015-11-01

    Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule-base theory is applied to assess and rank risks associated with failure modes that could appear in the functioning of a Turn 55 CNC lathe. Two case studies are presented to demonstrate the methodology thus developed, and a parallel is drawn between the results obtained by the traditional method and by fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN values and produce a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
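    As a much-reduced illustration of the fuzzified ratings described here (the paper's full fuzzy rule base is not reproduced), the sketch below represents S, O and D as triangular fuzzy numbers, multiplies them component-wise (the usual triangular approximation), and defuzzifies the result by its centroid; the ratings are invented.

      # Fuzzy RPN sketch with triangular fuzzy numbers (low, mode, high); ratings invented.
      def fuzzy_mul(a, b):
          # component-wise product: standard approximation for triangular fuzzy numbers
          return tuple(x * y for x, y in zip(a, b))

      def centroid(tri):
          # centroid defuzzification of a triangular fuzzy number
          return sum(tri) / 3.0

      severity   = (6, 7, 8)    # linguistic rating "high"
      occurrence = (3, 4, 5)    # "moderate"
      detection  = (4, 5, 6)    # "moderate"

      fuzzy_rpn = fuzzy_mul(fuzzy_mul(severity, occurrence), detection)
      print("fuzzy RPN (l, m, u):", fuzzy_rpn, " defuzzified:", round(centroid(fuzzy_rpn), 1))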

  17. Augmenting the Deliberative Method for Ranking Risks.

    PubMed

    Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel

    2016-01-01

    The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.

  18. Safety analysis, risk assessment, and risk acceptance criteria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamali, K.; Stack, D.W.; Sullivan, L.H.

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, "ensuring" plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is "safe." Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  19. A Comparison of Rule-based Analysis with Regression Methods in Understanding the Risk Factors for Study Withdrawal in a Pediatric Study.

    PubMed

    Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai

    2016-08-26

    Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.

  20. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
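    A toy version of the probabilistic calculation this record describes (uncertain failure probabilities propagated by Monte Carlo and expressed as Customer Minutes Lost) might look as follows; the distributions, failure types and downtimes are invented and are not the studied system's data.

      # Monte Carlo sketch: uncertain quantity/quality failure probabilities -> CML per year.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000

      # expert-judgement uncertainty on annual failure probabilities (Beta distributions)
      p_quantity = rng.beta(2, 200, n)       # e.g. source or distribution outage
      p_quality  = rng.beta(3, 150, n)       # e.g. treatment barrier failure

      downtime_quantity = rng.lognormal(np.log(600), 0.5, n)    # minutes per failure
      downtime_quality  = rng.lognormal(np.log(2000), 0.5, n)

      # expected Customer Minutes Lost per customer-year (rare events, so expectation is
      # approximately probability x downtime, summed over the two failure types)
      cml = p_quantity * downtime_quantity + p_quality * downtime_quality
      print(f"median CML: {np.median(cml):.1f}  95th percentile: {np.percentile(cml, 95):.1f}")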

  1. An Emerging New Risk Analysis Science: Foundations and Implications.

    PubMed

    Aven, Terje

    2018-05-01

    To solve real-life problems-such as those related to technology, health, security, or climate change-and make suitable decisions, risk is nearly always a main issue. Different types of sciences are often supporting the work, for example, statistics, natural sciences, and social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and large uncertainties when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking, from the search for accurate predictions and risk estimates, to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct/separate risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  2. Evaluating the risks of clinical research: direct comparative analysis.

    PubMed

    Rid, Annette; Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S; Wendler, David

    2014-09-01

    Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. This study employed a conceptual and normative analysis, and use of an illustrative example. Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks.

  3. Risk-Stratified Imputation in Survival Analysis

    PubMed Central

    Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George

    2013-01-01

    Background Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. Risk-stratified imputation is intended for
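    A minimal sketch of the risk-stratified imputation idea as summarized here: each withdrawn subject's missing time-to-event is drawn from observed subjects in the same treatment-by-covariate stratum. The data frame, column names and distributions are hypothetical, and the published algorithm may differ in detail.

      # Risk-stratified imputation sketch: donors come from the same treatment x covariate stratum.
      import numpy as np
      import pandas as pd

      def risk_stratified_impute(df, strata_cols, outcome_col, withdrawn_col, rng):
          df = df.copy()
          withdrawn = df[df[withdrawn_col]]
          for key, grp in withdrawn.groupby(strata_cols):
              key = key if isinstance(key, tuple) else (key,)
              mask = ~df[withdrawn_col]                      # donor pool: observed subjects
              for col, val in zip(strata_cols, key):
                  mask &= df[col] == val
              donors = df.loc[mask, outcome_col]
              if len(donors) > 0:
                  df.loc[grp.index, outcome_col] = rng.choice(donors.values, size=len(grp))
          return df

      rng = np.random.default_rng(2)
      df = pd.DataFrame({
          "treatment": rng.integers(0, 2, 200),
          "high_risk": rng.integers(0, 2, 200),          # covariate related to withdrawal
          "time":      rng.exponential(24, 200),         # months to event
          "withdrawn": rng.uniform(size=200) < 0.15,
      })
      df.loc[df["withdrawn"], "time"] = np.nan            # withdrawn subjects have missing outcomes
      imputed = risk_stratified_impute(df, ["treatment", "high_risk"], "time", "withdrawn", rng)
      print(imputed["time"].isna().sum(), "missing outcomes remain after imputation")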

  4. Evaluating the Risks of Clinical Research: Direct Comparative Analysis

    PubMed Central

    Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David

    2014-01-01

    Abstract Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: Type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about

  5. Recruitment Methods and Show Rates to a Prostate Cancer Early Detection Program for High-Risk Men: A Comprehensive Analysis

    PubMed Central

    Giri, Veda N.; Coups, Elliot J.; Ruth, Karen; Goplerud, Julia; Raysor, Susan; Kim, Taylor Y.; Bagden, Loretta; Mastalski, Kathleen; Zakrzewski, Debra; Leimkuhler, Suzanne; Watkins-Bruner, Deborah

    2009-01-01

    Purpose: Men with a family history (FH) of prostate cancer (PCA) and African American (AA) men are at higher risk for PCA. Recruitment and retention of these high-risk men into early detection programs has been challenging. We report a comprehensive analysis of recruitment methods, show rates, and participant factors from the Prostate Cancer Risk Assessment Program (PRAP), which is a prospective, longitudinal PCA screening study. Materials and Methods: Men 35–69 years are eligible if they have a FH of PCA, are AA, or have a BRCA1/2 mutation. Recruitment methods were analyzed with respect to participant demographics and show to the first PRAP appointment using standard statistical methods. Results: Out of 707 men recruited, 64.9% showed to the initial PRAP appointment. More individuals were recruited via radio than from referral or other methods (χ2 = 298.13, p < .0001). Men recruited via radio were more likely to be AA (p<0.001), less educated (p=0.003), not married or partnered (p=0.007), and have no FH of PCA (p<0.001). Men recruited via referrals had higher incomes (p=0.007). Men recruited via referral were more likely to attend their initial PRAP visit than those recruited by radio or other methods (χ2 = 27.08, p < .0001). Conclusions: This comprehensive analysis finds that radio leads to higher recruitment of AA men with lower socioeconomic status. However, these are the high-risk men that have lower show rates for PCA screening. Targeted motivational measures need to be studied to improve show rates for PCA risk assessment for these high-risk men. PMID:19758657

  6. An integrated optimization method for river water quality management and risk analysis in a rural system.

    PubMed

    Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S

    2016-01-01

    In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of the overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.

  7. Animal Disease Import Risk Analysis--a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2015-10-01

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. © 2013 Crown copyright. This article is published with the permission of the

  8. Risk analysis for veterinary biologicals released into the environment.

    PubMed

    Silva, S V; Samagh, B S; Morley, R S

    1995-12-01

    All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach, which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.

  9. Risk analysis of a biomass combustion process using MOSAR and FMEA methods.

    PubMed

    Thivel, P-X; Bultel, Y; Delpech, F

    2008-02-28

    Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot scheme comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage where hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity × Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various tools, such as HAZOP, FMEA, etc.; our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many scenarios. Our subsequent microscopic analysis therefore focused on the burners, looking at the ways they failed, and at the effects and criticality of those failures (FMEA). We were thus able to identify a number of critical factors such as the incoming gas lines and the ignition electrode.

  10. A method for scenario-based risk assessment for robust aerospace systems

    NASA Astrophysics Data System (ADS)

    Thomas, Victoria Katherine

    In years past, aircraft conceptual design centered around creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered into conceptual design, allowing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspective of performance, schedule, and cost. Recently, safety and reliability analysis have been brought forward in the design process to also be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, during the early part of concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time-period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. There are four steps in this area, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration between steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created. The four steps

  11. Dynamic Blowout Risk Analysis Using Loss Functions.

    PubMed

    Abimbola, Majeed; Khan, Faisal

    2018-02-01

    Most risk analysis approaches are static; failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. The traditional risk analysis methods are useful in the early design stage of drilling operation while falling short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models are demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.

  12. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
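    As a small, self-contained example of the kind of Bayesian data analysis such a handbook covers (not taken from the document itself), the sketch below performs a conjugate Beta-Binomial update of a component failure-on-demand probability; the prior parameters and counts are invented.

      # Conjugate Beta-Binomial update of a failure-on-demand probability (numbers invented).
      from scipy import stats

      prior_alpha, prior_beta = 0.5, 50.0      # weakly informative prior for a rare failure
      failures, demands = 2, 450               # hypothetical observed data

      post = stats.beta(prior_alpha + failures, prior_beta + demands - failures)
      print(f"posterior mean failure probability: {post.mean():.4f}")
      print(f"90% credible interval: ({post.ppf(0.05):.4f}, {post.ppf(0.95):.4f})")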

  13. Missing in space: an evaluation of imputation methods for missing data in spatial analysis of risk factors for type II diabetes.

    PubMed

    Baker, Jannah; White, Nicole; Mengersen, Kerrie

    2014-11-20

    Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables, to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study, type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. Choice of imputation method depends upon the application and is not necessarily the most complex method. Mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease with more confidence in the results to inform public policy decision-making.

  14. [Survival analysis with competing risks: estimating failure probability].

    PubMed

    Llorca, Javier; Delgado-Rodríguez, Miguel

    2004-01-01

    To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
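    The overestimation described here can be reproduced in a few lines: with death acting as a competing risk, the complement of the Kaplan-Meier estimator (treating deaths as censored) exceeds the cumulative incidence of rejection. The rates and horizon below are invented for illustration.

      # Competing-risks sketch: 1 - KM (death censored) vs. cumulative incidence of rejection.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 50_000
      t_reject = rng.exponential(1 / 0.10, n)    # latent time to chronic rejection (rate 0.10/yr)
      t_death  = rng.exponential(1 / 0.05, n)    # latent time to death before rejection (0.05/yr)

      time = np.minimum(t_reject, t_death)
      rejected = t_reject < t_death              # event indicator (True = rejection, False = death)
      horizon = 5.0                              # evaluate both estimators at 5 years

      # Kaplan-Meier complement, treating death as censoring (product-limit estimator, no ties)
      order = np.argsort(time)
      t_sorted, rej_sorted = time[order], rejected[order]
      at_risk = np.arange(n, 0, -1)
      surv = np.cumprod(1.0 - rej_sorted / at_risk)
      km_complement = 1.0 - surv[t_sorted <= horizon][-1]

      # cumulative incidence of rejection (simple proportion, since there is no other censoring)
      cum_incidence = np.mean((time <= horizon) & rejected)

      print(f"1 - KM (death censored): {km_complement:.3f}")
      print(f"cumulative incidence:    {cum_incidence:.3f}")   # smaller: the correct quantity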

  15. A two-stage method of quantitative flood risk analysis for reservoir real-time operation using ensemble-based hydrologic forecasts

    NASA Astrophysics Data System (ADS)

    Liu, P.

    2013-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts directly depict not only the marginal distributions of the inflows but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed by counting the number of forecast scenarios that fail, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk by defining the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference, via Markov Chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and the scenario optimization, are evaluated for the flood risk and hydropower profit analysis. With the 2010 flood, it is found that improving the hydrologic forecast accuracy is not necessary to decrease the reservoir real-time operation risk, and most risks come from the forecast lead-time. It is therefore valuable to decrease the variance of ensemble-based hydrologic forecasts with less bias for reservoir operational purposes.
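    The lead-time part of the risk measure (the fraction of ensemble scenarios whose routed reservoir level exceeds the critical value) can be sketched with a toy mass-balance routing; the inflow ensemble, release, and storage-to-level conversion below are invented and bear no relation to the TGR figures.

      # Lead-time flood risk = fraction of ensemble scenarios exceeding the critical level.
      import numpy as np

      rng = np.random.default_rng(4)
      n_scenarios, lead_days = 1000, 7

      # ensemble inflow forecasts over the lead-time (m3/s), invented gamma-distributed values
      inflow = rng.gamma(shape=6.0, scale=2500.0, size=(n_scenarios, lead_days))
      release = 12_000.0                    # planned constant release, m3/s
      level0, critical = 160.0, 162.0       # initial and critical reservoir levels, m
      storage_per_metre = 2.0e9             # toy storage-to-level coefficient, m3 per m

      # toy mass-balance routing: level follows the cumulative inflow-release imbalance
      level = level0 + np.cumsum(inflow - release, axis=1) * 86_400 / storage_per_metre

      risk_lead_time = np.mean(level.max(axis=1) > critical)
      print(f"lead-time flood risk: {risk_lead_time:.1%}")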

  16. A concept analysis of forensic risk.

    PubMed

    Kettles, A M

    2004-08-01

    Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.

  17. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    NASA Astrophysics Data System (ADS)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode and Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as risk management methods. However, most studies choose only one of these two methods in their risk management methodology. Combining the two methods, on the other hand, reduces the drawbacks each has when implemented separately. This paper aims to combine FMEA and FTA into a single risk assessment methodology. A case study in a metal company illustrates how the combined methodology can be implemented; it is used to assess the internal risks that occur in the production process. Those internal risks are then mitigated according to their risk levels.
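
    A minimal sketch of the two computational building blocks follows: the FMEA step ranks failure modes by their risk priority number (RPN = occurrence x severity x detectability), and the FTA step evaluates a small fault tree with independent basic events. The failure modes, ratings, and probabilities are hypothetical.

```python
# Failure modes with occurrence (O), severity (S), detectability (D) ratings on 1-10 scales.
failure_modes = {
    "raw material off-spec": (4, 7, 5),
    "furnace temperature drift": (3, 8, 4),
    "operator mislabels batch": (5, 6, 7),
}

# FMEA step: risk priority number RPN = O * S * D, ranked from highest to lowest.
rpn = {name: o * s * d for name, (o, s, d) in failure_modes.items()}
for name, value in sorted(rpn.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:30s} RPN = {value}")

# FTA step: top-event probability for a small tree,
# top = (A OR B) AND C, with independent basic-event probabilities.
p = {"A": 0.02, "B": 0.05, "C": 0.10}
p_or = 1 - (1 - p["A"]) * (1 - p["B"])   # OR gate
p_top = p_or * p["C"]                    # AND gate
print(f"fault-tree top-event probability = {p_top:.4f}")
```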

  18. Advanced uncertainty modelling for container port risk analysis.

    PubMed

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis, lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions

    PubMed Central

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R

    2005-01-01

    Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process
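
    The headline figures quoted above can be checked with simple arithmetic, assuming the common product form of the criticality index (occurrence x severity x detection); the rating values shown are hypothetical illustrations, not those of the study.

```python
def criticality_index(occurrence, severity, detection):
    """Criticality index assumed here as the product of the three ratings."""
    return occurrence * severity * detection

# Hypothetical ratings giving the CI reported for labelling mistakes (315 of a possible 810).
print(criticality_index(occurrence=5, severity=9, detection=7))   # 315

# Reproducing the overall reduction reported for the re-engineered process.
ci_old, ci_new = 3415, 1397
print(f"overall criticality reduction: {1 - ci_new / ci_old:.0%}")   # about 59%
```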

  20. Study Of The Risks Arising From Natural Disasters And Hazards On Urban And Intercity Motorways By Using Failure Mode Effect Analysis (FMEA) Methods

    NASA Astrophysics Data System (ADS)

    DELİCE, Yavuz

    2015-04-01

    Highways in urban and intercity locations are generally prone to many kinds of natural disaster risk. Natural hazards and disasters that may occur from the project design stage through construction and operation, and later during maintenance and repair, have to be taken into consideration, and assessing the risks posed by such adverse situations is very important for project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways; however, the assets at risk and the impacts of the events must also be examined and rated in their own right. These activities are carried out using the Failure Mode and Effects Analysis (FMEA) method, and the intended improvements against natural hazards and disasters and their effects are analysed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritising them according to failure rates and effects, and finding the most economical and effective solution. Besides supporting measures for the risks it identifies, the analysis can also provide public institutions with information about the nature of these risks when required, so that the necessary measures can be taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments; the most important of these can be listed as follows: • Natural disasters: 1. meteorologically based natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geologically based natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.). • Human-originated disasters: 1. transport accidents (traffic accidents), originating from road surface defects (icing

  1. Critical review of methods for risk ranking of food-related hazards, based on risks for human health.

    PubMed

    Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J

    2018-01-22

    This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and their risk ranking methods were characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.

  2. Risk analysis based on hazards interactions

    NASA Astrophysics Data System (ADS)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  3. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.

  4. FOOD RISK ANALYSIS

    USDA-ARS?s Scientific Manuscript database

    Food risk analysis is a holistic approach to food safety because it considers all aspects of the problem. Risk assessment modeling is the foundation of food risk analysis. Proper design and simulation of the risk assessment model is important to properly predict and control risk. Because of knowl...

  5. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method.

    PubMed

    Deng, Xinyang; Jiang, Wen

    2017-09-12

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model.

  6. Fuzzy Risk Evaluation in Failure Mode and Effects Analysis Using a D Numbers Based Multi-Sensor Information Fusion Method

    PubMed Central

    Deng, Xinyang

    2017-01-01

    Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing methods to show the effectiveness of the proposed model. PMID:28895905

  7. Applying quantitative benefit-risk analysis to aid regulatory decision making in diagnostic imaging: methods, challenges, and opportunities.

    PubMed

    Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P

    2014-09-01

    Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods (multi-criteria decision analysis, health outcomes modeling, and stated-choice survey) are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights

  8. The Use of Object-Oriented Analysis Methods in Surety Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  9. New Methods for the Analysis of Heartbeat Behavior in Risk Stratification

    PubMed Central

    Glass, Leon; Lerma, Claudia; Shrier, Alvin

    2011-01-01

    Developing better methods of risk stratification for tachyarrhythmic sudden cardiac death remains a major challenge for physicians and scientists. Since the transition from sinus rhythm to ventricular tachycardia/fibrillation happens by different mechanisms in different people, it is unrealistic to think that a single measure will be adequate to provide a good index for risk stratification. We analyze the dynamical properties of ventricular premature complexes over 24 h in an effort to understand the underlying mechanisms of ventricular arrhythmias and to better understand the arrhythmias that occur in individual patients. Two-dimensional density plots, called heartprints, correlate characteristic features of the dynamics of premature ventricular complexes and the sinus rate. Heartprints show distinctive characteristics in individual patients. Based on a better understanding of the nature of transitions from sinus rhythm to sudden cardiac death and the mechanisms of arrhythmia prior to cardiac arrest, it should be possible to develop better methods for risk stratification. PMID:22144963

  10. Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research

    NASA Astrophysics Data System (ADS)

    Zhang, Minli; Yang, Wenpo

    Real estate investment is a high-risk, high-return economic activity; the key to real estate risk analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry also faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application; it provides an effective means of real estate investment risk assessment and gives investors guidance on risk factors and forecasts.
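
    The evaluation step of a fuzzy comprehensive evaluation can be sketched as a weight vector composed with a factor-grade membership matrix. The factors, weights, membership degrees, and the weighted-average operator below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical risk factors for a real estate project and their weights (sum to 1).
factors = ["market risk", "financial risk", "policy risk", "construction risk"]
weights = np.array([0.35, 0.30, 0.20, 0.15])

# Membership matrix R: each row gives the degree to which a factor belongs to the
# evaluation grades (low, medium, high, very high), e.g. obtained from expert scoring.
R = np.array([
    [0.1, 0.3, 0.4, 0.2],
    [0.2, 0.4, 0.3, 0.1],
    [0.3, 0.4, 0.2, 0.1],
    [0.4, 0.3, 0.2, 0.1],
])

# Weighted-average composition B = w . R (one common choice of fuzzy operator).
B = weights @ R
grades = ["low", "medium", "high", "very high"]
print(dict(zip(grades, B.round(3))))
print("overall risk grade:", grades[int(np.argmax(B))])
```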

  11. Instability risk analysis and risk assessment system establishment of underground storage caverns in bedded salt rock

    NASA Astrophysics Data System (ADS)

    Jing, Wenjun; Zhao, Yan

    2018-02-01

    Stability is an important part of geotechnical engineering research. The operating experience of underground storage caverns in salt rock around the world shows that cavern stability is the key problem for safe operation. Currently, the combination of theoretical analysis and numerical simulation is the main method adopted for cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, the definition and classification of cavern instability risk are proposed, and the damage mechanism is analyzed from a mechanical point of view. Then the main evaluation indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of cavern instability risk after 30 years of operation for a proposed storage cavern group in the Huai’an salt mine. This research can provide a useful theoretical basis for the safe operation and management of underground storage caverns in salt rock.

  12. Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.

    PubMed

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A

    2015-11-01

    Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. Heterogeneity was explored using stratified meta-analyses and

  13. How to Perform an Ethical Risk Analysis (eRA).

    PubMed

    Hansson, Sven Ove

    2018-02-26

    Ethical analysis is often needed in the preparation of policy decisions on risk. A three-step method is proposed for performing an ethical risk analysis (eRA). In the first step, the people concerned are identified and categorized in terms of the distinct but compatible roles of being risk-exposed, a beneficiary, or a decisionmaker. In the second step, a more detailed classification of roles and role combinations is performed, and ethically problematic role combinations are identified. In the third step, further ethical deliberation takes place, with an emphasis on individual risk-benefit weighing, distributional analysis, rights analysis, and power analysis. Ethical issues pertaining to subsidiary risk roles, such as those of experts and journalists, are also treated in this phase. An eRA should supplement, not replace, a traditional risk analysis that puts emphasis on the probabilities and severities of undesirable events but does not cover ethical issues such as agency, interpersonal relationships, and justice. © 2018 Society for Risk Analysis.

  14. Risk-benefit analysis and public policy: a bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, E.M.; Van Horn, A.J.

    1976-11-01

    Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.

  15. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  16. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.

  17. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    PubMed

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper uses quality risk management to evaluate an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, applying the hazard analysis and critical control points (HACCP) tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  18. TU-FG-201-11: Evaluating the Validity of Prospective Risk Analysis Methods: A Comparison of Traditional FMEA and Modified Healthcare FMEA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lah, J; Manger, R; Kim, G

    Purpose: To examine the ability of traditional failure mode and effects analysis (FMEA) and a light version of Healthcare FMEA (HFMEA), called scenario analysis of FMEA (SAFER), by comparing their outputs in terms of the risks identified and their severity rankings. Methods: We applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation are based on the risk priority number (RPN), the product of three indices: occurrence, severity, and detectability. The SAFER approach utilized two indices, frequency and severity, which were defined by a multidisciplinary team. A criticality matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed; based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. Results: The two methods were compared independently to determine whether the results and rated risks matched. Our results showed 67% agreement between the FMEA and SAFER approaches for the 15 riskiest SIG-specific failure modes. The main differences between the two approaches were the distribution of the values, and failure modes (No. 52, 54, 154) with high SAFER scores did not necessarily have high FMEA RPN scores. There were also additional risks identified by each method, with little correspondence between them. In SAFER, once the risk score is determined, the underlying decision tree or the failure mode itself should be investigated further. Conclusion: The FMEA method takes into account the probability that an error passes without being detected. SAFER is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of risks and mitigation measures, and thus is perfectly applicable to clinical parts of
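
    The two scoring schemes compared in this record can be sketched side by side: the traditional three-factor RPN and a two-factor frequency-severity score mapped to the four criticality bands. The cut-off values and ratings below are hypothetical, which is one reason a failure mode can rank high under one scheme and lower under the other.

```python
def rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number."""
    return occurrence * severity * detectability

def safer_category(frequency, severity):
    """Map a frequency/severity pair to one of four criticality bands (hypothetical cut-offs)."""
    score = frequency * severity
    if score >= 16:
        return "very high"
    if score >= 9:
        return "high"
    if score >= 4:
        return "low"
    return "very low"

# The same failure mode scored under both schemes; a poor detectability rating
# inflates the RPN but does not enter the SAFER band at all.
print(rpn(occurrence=3, severity=5, detectability=9))   # 135
print(safer_category(frequency=3, severity=5))          # "high"
```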

  19. Probabilistic Exposure Analysis for Chemical Risk Characterization

    PubMed Central

    Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.

    2009-01-01

    This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660

  20. Sensitivity Analysis of Launch Vehicle Debris Risk Model

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott L.

    2010-01-01

    As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.

  1. WE-B-BRC-02: Risk Analysis and Incident Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fraass, B.

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry Discussion of radiation

  2. Use of a systematic risk analysis method to improve safety in the production of paediatric parenteral nutrition solutions.

    PubMed

    Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R E

    2005-04-01

    shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities.

  3. Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.

    PubMed

    Reyes Santos, Joost; Haimes, Yacov Y

    2004-06-01

    The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1992 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001 that led to a four-day suspension of trading on the New York Stock Exchange (NYSE) are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of an extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
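
    The extreme-risk measure f(4) described above is a conditional expectation over the lower tail of the return distribution, so it can be estimated directly from samples. The sketch below uses simulated heavy-tailed returns and a 5% partition point; the distribution and threshold are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical daily portfolio returns with a heavy lower tail.
returns = rng.standard_t(df=4, size=10_000) * 0.01

alpha = 0.05                                  # partition at the 5th percentile
threshold = np.quantile(returns, alpha)

volatility = returns.std()                    # conventional risk measure
f4 = returns[returns <= threshold].mean()     # conditional expectation of the lower-tail region

print(f"partition point (5th percentile): {threshold:.4f}")
print(f"volatility:                       {volatility:.4f}")
print(f"extreme-risk measure f(4):        {f4:.4f}")
```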

  4. Multiattribute risk analysis in nuclear emergency management.

    PubMed

    Hämäläinen, R P; Lindstedt, M R; Sinkko, K

    2000-08-01

    Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful.

  5. Comparison of two occurrence risk assessment methods for collapse gully erosion ——A case study in Guangdong province

    NASA Astrophysics Data System (ADS)

    Sun, K.; Cheng, D. B.; He, J. J.; Zhao, Y. L.

    2018-02-01

    Collapse gully erosion is a specific type of soil erosion in the red soil region of southern China, and early warning and prevention of its occurrence are very important. Based on the idea of risk assessment, this research, taking Guangdong province as an example, adopts information acquisition analysis and logistic regression analysis to discuss the feasibility of collapse gully erosion risk assessment at the regional scale and to compare the applicability of the different risk assessment methods. The results show that in Guangdong province the risk of collapse gully erosion occurrence is high in the northeastern and western areas and relatively low in the southwestern and central parts. The comparative analysis of the different risk assessment methods also indicated that the risk distribution patterns from the different methods were basically consistent; however, the accuracy of the risk map from the information acquisition analysis method was slightly better than that from the logistic regression analysis method.

  6. Risk-Based Prioritization Method for the Classification of Groundwater Pollution from Hazardous Waste Landfills.

    PubMed

    Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da

    2016-12-01

    Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases: risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving the hazardous waste landfills and migration through the vadose zone and aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk of groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2% of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method is feasible and valid and can provide reference data for the risk management of groundwater contamination at hazardous waste landfill sites.
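
    Two of the statistical steps mentioned (K-means for indicator class boundaries, principal component analysis for indicator weights) can be sketched as follows. Using the first principal component's loadings as weights is a simplified reading of the weighting step, and the indicator data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)

# Hypothetical indicator matrix: 37 landfill sites x 5 of the 14 indicators (standardised).
X = rng.normal(size=(37, 5))

# Class boundaries for one indicator from K-means with three clusters (low/medium/high risk).
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X[:, [0]])
centres = np.sort(km.cluster_centers_.ravel())
boundaries = (centres[:-1] + centres[1:]) / 2
print("class boundaries for indicator 1:", boundaries.round(2))

# Indicator weights from the first principal component's loadings (normalised to sum to 1).
pca = PCA(n_components=1).fit(X)
loadings = np.abs(pca.components_[0])
weights = loadings / loadings.sum()
print("indicator weights:", weights.round(3))

# Composite risk score and ranking of the sites.
scores = X @ weights
ranking = np.argsort(scores)[::-1]
print("three highest-risk sites (row indices):", ranking[:3])
```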

  7. [Study on the risk assessment method of regional groundwater pollution].

    PubMed

    Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei

    2013-02-01

    Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, which included regional groundwater specific vulnerability assessment, assessment of regional pollution source characteristics, and health risk assessment of regional characteristic pollutants. The three sub-evaluation systems were coupled with a multi-index comprehensive method, the risk was characterized with the spatial analysis tools of ArcMap, and a new method for evaluating regional groundwater pollution risk, suitable for different natural conditions and different types of pollution, was established. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. It was found that the groundwater vulnerability index in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risk; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high, unevenly distributed, and concentrated north of the Anjia-Xuejia-Zhenglu line, in the city centre, and in the southeast, where human activities are more intense and pollution sources are dense.

  8. Comprehensive risk assessment method of catastrophic accident based on complex network properties

    NASA Astrophysics Data System (ADS)

    Cui, Zhen; Pang, Jun; Shen, Xiaohong

    2017-09-01

    On the macro level, the structural properties of the network and the electrical characteristics of the micro-level components determine the risk of cascading failures. Because cascading failures develop dynamically, not only the direct risk but also the potential risk should be considered. In this paper, the direct and potential risks of failures are considered comprehensively on the basis of uncertainty risk analysis theory and connection number theory, the uncertain correlation is quantified by node degree and node clustering coefficient, and a comprehensive risk indicator of failure is then established. The proposed method is validated by simulation on an actual power grid: a network is modelled according to the real grid and the rationality of the proposed method is verified.
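
    The two structural quantities used in this record, node degree and node clustering coefficient, are straightforward to compute on a graph; the sketch below combines them into a composite per-node indicator with an illustrative 50/50 weighting, which is not the paper's connection-number formulation.

```python
import networkx as nx

# Toy grid-like network standing in for a power grid.
G = nx.Graph([
    ("bus1", "bus2"), ("bus2", "bus3"), ("bus3", "bus1"),
    ("bus3", "bus4"), ("bus4", "bus5"), ("bus5", "bus3"),
    ("bus5", "bus6"),
])

degree = dict(G.degree())
clustering = nx.clustering(G)

# Illustrative composite structural-risk indicator per node:
# normalised degree combined with the clustering coefficient.
max_deg = max(degree.values())
indicator = {n: 0.5 * degree[n] / max_deg + 0.5 * clustering[n] for n in G.nodes}

for node, value in sorted(indicator.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{node}: degree={degree[node]}, clustering={clustering[node]:.2f}, indicator={value:.2f}")
```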

  9. Integrating Household Risk Mitigation Behavior in Flood Risk Analysis: An Agent-Based Model Approach.

    PubMed

    Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H

    2017-10-01

    Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model, which accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which in our case study is an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  10. Program risk analysis handbook

    NASA Technical Reports Server (NTRS)

    Batson, R. G.

    1987-01-01

    NASA regulations specify that formal risk analysis be performed on a program at each of several milestones. Program risk analysis is discussed as a systems analysis approach, an iterative process (identification, assessment, management), and a collection of techniques. These techniques, which range from extremely simple to complex network-based simulation, are described in this handbook in order to provide both analyst and manager with a guide for selection of the most appropriate technique. All program risk assessment techniques are shown to be based on elicitation and encoding of subjective probability estimates from the various area experts on a program. Techniques to encode the five most common distribution types are given. Then, a total of twelve distinct approaches to risk assessment are given. Steps involved, good and bad points, time involved, and degree of computer support needed are listed. Why risk analysis should be used by all NASA program managers is discussed. Tools available at NASA-MSFC are identified, along with commercially available software. Bibliography (150 entries) and a program risk analysis check-list are provided.

  11. Tsunamis: Global Exposure and Local Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.

    2014-12-01

    The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed people live in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.

  12. 78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-01

    ... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... requesting information and citations on approaches and methods for the planning, analysis, assessment, and... approaches to understanding risks to human health and the environment. For example, in Science & Decisions...

  13. Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R

    2011-01-01

    Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
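
    The cost-effectiveness step can be sketched as ranking measures by cost per unit of risk reduction, where the risk reductions would in practice come from the fault tree model of the source-to-tap system. The alternatives, costs, and risk reductions below are hypothetical.

```python
# Hypothetical risk-reduction alternatives for a drinking water system.
# risk_reduction = expected decrease in the probability of failing the water safety target per year.
alternatives = {
    "UV disinfection at treatment plant": {"annual_cost": 120_000, "risk_reduction": 0.030},
    "enhanced source water protection":   {"annual_cost": 40_000,  "risk_reduction": 0.008},
    "pressure monitoring in network":     {"annual_cost": 25_000,  "risk_reduction": 0.004},
}

# Cost-effectiveness ratio: cost per unit of risk reduction (lower is better).
for name, a in sorted(alternatives.items(),
                      key=lambda kv: kv[1]["annual_cost"] / kv[1]["risk_reduction"]):
    cer = a["annual_cost"] / a["risk_reduction"]
    print(f"{name:38s} CER = {cer:,.0f} per unit of risk reduced")
```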

  14. Problems With Risk Reclassification Methods for Evaluating Prediction Models

    PubMed Central

    Pepe, Margaret S.

    2011-01-01

    For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
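
    A minimal sketch of the Net Reclassification Index components the author recommends reporting, computed from hypothetical reclassification counts (the counts and the simplification to "up"/"down" movements are assumptions for illustration):

```python
# Hypothetical reclassification counts: how many subjects were moved to a higher
# or lower risk category by the new model, separately for events and non-events.
events     = {"up": 30, "down": 10, "total": 200}   # subjects who had the outcome
non_events = {"up": 50, "down": 80, "total": 800}   # subjects who did not

# Event component: net proportion of events correctly moved up.
nri_events = (events["up"] - events["down"]) / events["total"]
# Non-event component: net proportion of non-events correctly moved down.
nri_non_events = (non_events["down"] - non_events["up"]) / non_events["total"]

print(f"event NRI component:     {nri_events:+.3f}")
print(f"non-event NRI component: {nri_non_events:+.3f}")
print(f"overall NRI:             {nri_events + nri_non_events:+.3f}")
```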

  15. Research on the method of information system risk state estimation based on clustering particle filter

    NASA Astrophysics Data System (ADS)

    Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua

    2017-05-01

    With the purpose of reinforcing the correlation analysis of risk assessment threat factors, a dynamic assessment method of safety risks based on particle filtering is proposed, which takes threat analysis as the core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influencing weights of the threat indicators, and determines information system risk levels in combination with state estimation theory. In order to improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced into it: all particles are clustered and the cluster centroids are used as representatives in subsequent operations, so as to reduce the computational load. Empirical results indicate that the method can reasonably capture the relations of mutual dependence and influence among risk elements. Under the circumstance of limited information, it provides a scientific basis for formulating a risk management and control strategy.

  16. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase the coverage of hazard and risk analysis and can indicate risk control and protection strategies.

  17. Impact of model-based risk analysis for liver surgery planning.

    PubMed

    Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K

    2014-05-01

    A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To examine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, the time to complete the planning task and the confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.

  18. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
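
    The article's worked examples are in MATLAB and R; purely as an illustration of the embarrassingly parallel pattern it describes, the following Python sketch farms independent Monte Carlo replications out to worker processes (the toy loss model and replication count are arbitrary):

```python
import multiprocessing as mp
import random

def one_replication(seed):
    """One independent simulation replication (toy model: total loss over 100 events)."""
    rng = random.Random(seed)
    return sum(rng.expovariate(1.0) for _ in range(100))

if __name__ == "__main__":
    seeds = range(10_000)                    # one seed per independent replication
    with mp.Pool() as pool:                  # uses all available cores by default
        results = pool.map(one_replication, seeds)
    mean = sum(results) / len(results)
    print(f"mean simulated loss: {mean:.2f}")
```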

  19. Investment appraisal using quantitative risk analysis.

    PubMed

    Johansson, Henrik

    2002-07-01

    Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.

  20. Earthquake Hazard Analysis Methods: A Review

    NASA Astrophysics Data System (ADS)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impact in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active continental plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazards accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  1. Risk analysis within environmental impact assessment of proposed construction activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeleňáková, Martina; Zvijáková, Lenka

    Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions – flood protection structures – using risk analysis methods. The application of methodology designed for the process of environmental impact assessment will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in environmental impact assessment process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.

  2. The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism

    NASA Technical Reports Server (NTRS)

    Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.

    2006-01-01

    This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.

  3. Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance

    NASA Technical Reports Server (NTRS)

    Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.

    2016-01-01

    Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and better manage contamination risks. This research explores the evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given modules representative of optical elements in an optical system and the minimum detectable molecular contamination levels for a chosen inspection and analysis method, and the effect of contamination on the system is determined. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine whether a planned method is adequate to meet system requirements and manage contamination risk.

  4. Analysis and classification of the tools for assessing the risks associated with industrial machines.

    PubMed

    Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro

    2007-01-01

    To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.

  5. Comparison of risk assessment procedures used in OCRA and ULRA methods

    PubMed Central

    Roman-Liu, Danuta; Groborz, Anna; Tokarski, Tomasz

    2013-01-01

    The aim of this study was to analyse the convergence of two methods by comparing exposure and the assessed risk of developing musculoskeletal disorders at 18 repetitive-task workstations. The already established occupational repetitive actions (OCRA) method and the recently developed upper limb risk assessment (ULRA) method produce correlated results (R = 0.84, p = 0.0001). A discussion of the factors that influence the values of the OCRA index and ULRA's repetitive task indicator shows that both similarities and differences in the results produced by the two methods can arise from the concepts that underlie them. The assessment procedure and the mathematical calculations that the basic parameters are subjected to are crucial to the results of risk assessment. The way the basic parameters are defined influences the assessment of exposure and risk to a lesser degree. The analysis also showed that large differences in load indicator values do not always result in different risk zones. Practitioner Summary: We focused on comparing methods that, even though based on different concepts, serve the same purpose. The results proved that different methods with different assumptions can produce similar assessments of upper limb load; sharp criteria in risk assessment are not the best solution. PMID:24041375

  6. Use-related risk analysis for medical devices based on improved FMEA.

    PubMed

    Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping

    2012-01-01

    In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (C-arm X-ray machine) is described.
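
    As a simplified stand-in for the paper's fuzzy/grey approach (hypothetical ratings, equal criterion weights, and a crisp grey relational calculation rather than the full fuzzy treatment), failure modes can be ranked by their grey relational grade instead of the plain RPN product:

```python
# Hypothetical severity/occurrence/detection ratings (1-10) for four failure modes.
ratings = {
    "FM1_display_freeze":   [8, 3, 6],
    "FM2_wrong_dose_entry": [9, 4, 7],
    "FM3_cable_wear":       [5, 6, 4],
    "FM4_button_sticking":  [4, 5, 3],
}
reference = [10, 10, 10]   # worst-case reference series
rho = 0.5                  # distinguishing coefficient, conventional value

# Absolute differences from the reference for every mode and criterion.
deltas = {k: [abs(x - r) for x, r in zip(v, reference)] for k, v in ratings.items()}
all_d = [d for row in deltas.values() for d in row]
d_min, d_max = min(all_d), max(all_d)

def grey_grade(row):
    """Mean grey relational coefficient (equal criterion weights for simplicity)."""
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
    return sum(coeffs) / len(coeffs)

# Higher grade = closer to the worst case = higher priority for mitigation.
for name, row in sorted(deltas.items(), key=lambda kv: grey_grade(kv[1]), reverse=True):
    print(f"{name}: grey relational grade = {grey_grade(row):.3f}")
```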

  7. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    PubMed

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas are facing many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established aiming at evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. Then the proposed model was applied to assess water pollution risks in the region of Shiyan, in which China's key source water area, Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have a high risk value in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have a high risk value in terms of agricultural pollution. Overall, the risk values of the northern regions close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The results of risk level indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV) and three at the highest risk level (i.e., level V). Also, the risks of industrial discharge are higher than those of the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
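
    A minimal sketch of the entropy weight step used to weight the indicators, applied to a hypothetical decision matrix (the values and the benefit-type assumption are illustrative only):

```python
import numpy as np

# Hypothetical decision matrix: rows = pollution sources, columns = risk indicators
# (all indicators assumed benefit-type and already on comparable positive scales).
X = np.array([
    [120.0, 3.2, 0.8],
    [ 80.0, 5.1, 0.4],
    [200.0, 1.9, 0.9],
    [ 60.0, 4.4, 0.2],
])

n, m = X.shape
P = X / X.sum(axis=0)                      # proportion of each source per indicator
with np.errstate(divide="ignore", invalid="ignore"):
    plogp = np.where(P > 0, P * np.log(P), 0.0)
E = -plogp.sum(axis=0) / np.log(n)         # entropy of each indicator, in [0, 1]
w = (1 - E) / (1 - E).sum()                # indicators with more spread get more weight

print("entropy:", np.round(E, 3))
print("weights:", np.round(w, 3))
```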

  8. Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults

    ERIC Educational Resources Information Center

    Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.

    2007-01-01

    Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…

  9. The SOBANE risk management strategy and the Déparis method for the participatory screening of the risks.

    PubMed

    Malchaire, J B

    2004-08-01

    The first section of the document describes a risk-prevention strategy, called SOBANE, in four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost effective, and more effective in coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. These four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements to develop specific solutions; expertise, where, in very sophisticated and rare cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of the risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first level screening of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the easiness, the effectiveness and the satisfaction at work are discussed, in search of practical prevention measures. The points to be studied more in detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple, sparing in time and means and playing a significant role in the development of a dynamic plan of risk management and of a culture of dialogue in the company.

  10. Assessing the validity of prospective hazard analysis methods: a comparison of two techniques

    PubMed Central

    2014-01-01

    Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For both methods less than half the hazards were identified by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system.

  11. Designing a Software for Flood Risk Assessment Based on Multi Criteria Desicion Analysis and Information Diffusion Methods

    NASA Astrophysics Data System (ADS)

    Musaoglu, N.; Saral, A.; Seker, D. Z.

    2012-12-01

    Flooding is one of the major natural disasters not only in Turkey but all over the world, and it causes serious damage and harm. It is estimated that of the total economic loss caused by all kinds of disasters, 40% was due to floods. In July 1995, the Ayamama Creek in Istanbul flooded; the insurance sector received around 1,200 claim notices during that period, and insurance companies had to pay a total of $40 million for claims. In 2009, the same creek flooded again, killing 31 people over two days, and insurance firms paid around €150 million for claims. To address these kinds of problems, modern tools such as GIS and remote sensing should be utilized. In this study, a software package was designed for flood risk analysis with the Analytic Hierarchy Process (AHP) and Information Diffusion (InfoDif) methods. In the developed software, five evaluation criteria were taken into account: slope, aspect, elevation, geology and land use, which were extracted from satellite sensor data. The Digital Elevation Model (DEM) of the Ayamama River Basin was acquired from the SPOT 5 satellite image with 2.5 m spatial resolution. Slope and aspect values of the study basin were extracted from this DEM. The land use of the Ayamama Creek was obtained by performing object-oriented nearest neighbor classification by image segmentation on a SPOT 5 image dated 2010. All produced data were used as input for the Multi-Criteria Decision Analysis (MCDA) part of this software. Criteria and their sub-criteria were weighted, and flood vulnerability was determined with MCDA-AHP. Daily flood data were also collected from the Florya Meteorological Station for the years 1975 to 2009, and the daily flood peak discharge was calculated with the Soil Conservation Service Curve Number (SCS-CN) method and used as input in the software for the InfoDif part. Obtained results were verified using ground truth data and it has been clearly
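
    A minimal sketch of the AHP weighting step described above, using a hypothetical pairwise comparison matrix for the five criteria rather than the study's actual judgments:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for five criteria:
# slope, aspect, elevation, geology, land use (Saaty 1-9 scale, reciprocal matrix).
A = np.array([
    [1,   3,   2,   5,   4],
    [1/3, 1,   1/2, 2,   2],
    [1/2, 2,   1,   3,   3],
    [1/5, 1/2, 1/3, 1,   1],
    [1/4, 1/2, 1/3, 1,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # normalised priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)              # consistency index
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n] # Saaty's random index
CR = CI / RI                                 # consistency ratio, acceptable if < 0.1

print("criterion weights:", np.round(weights, 3), "CR:", round(CR, 3))
```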

  12. Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis

    NASA Astrophysics Data System (ADS)

    Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.

    There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological material may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material because of the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as promote the safe use/handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. The stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different risk categories based on our current knowledge of nanomaterial's physico-chemical characteristics, variation in produced material, and best professional judgement. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.1,2

  13. Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers

    DOE Data Explorer

    Ken Rhinefrank

    2016-07-25

    Analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail about the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincident with the design process and is an iterative process that allows design changes to overcome deficiencies identified in the analysis. Risk registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.

  14. Coffee consumption and risk of fractures: a meta-analysis

    PubMed Central

    Liu, Huifang; Yao, Ke; Zhang, Wenjie; Zhou, Jun; Wu, Taixiang

    2012-01-01

    Introduction Recent studies have indicated a higher risk of fractures among coffee drinkers. To quantitatively assess the association between coffee consumption and the risk of fractures, we conducted this meta-analysis. Material and methods We searched MEDLINE and EMBASE for prospective studies reporting the risk of fractures with coffee consumption. The quality of included studies was assessed with the Newcastle-Ottawa scale. We conducted a meta-analysis and a cumulative meta-analysis of relative risk (RR) for an increment of one cup of coffee per day, and explored the potential dose-response relationship. Sensitivity analysis was performed where statistical heterogeneity existed. Results We included 10 prospective studies covering 214,059 participants and 9,597 cases. There was an overall 3.5% higher fracture risk for an increment of one cup of coffee per day (RR = 1.035, 95% CI: 1.019-1.052). Pooled RRs were 1.049 (95% CI: 1.022-1.077) for women and 0.910 (95% CI: 0.873-0.949) for men. Among women, the RR was 1.055 (95% CI: 0.999-1.114) for younger participants and 1.047 (95% CI: 1.016-1.080) for older ones. Cumulative meta-analysis indicated that risk estimates reached a stable level (RR = 1.035, 95% CI: 1.019-1.052), and it revealed a positive dose-response relationship between coffee consumption and risk of fractures both for men and women combined and for women specifically. Conclusions This meta-analysis suggests that coffee intake increases the overall risk of fractures, especially for women. However, current data are insufficient to reach a convincing conclusion and further research needs to be conducted. PMID:23185185

  15. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions

    PubMed Central

    Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
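
    As a simplified illustration of the fuzzy gate arithmetic (not the paper's VB program; point-wise operations on trapezoidal numbers and independence of the basic events are assumed), a top-event fuzzy probability can be propagated as follows:

```python
# A trapezoidal fuzzy probability is represented by its four points (a, b, c, d),
# a <= b <= c <= d. AND/OR gates are applied point-wise, a common simplification
# in fuzzy fault tree analysis when basic events are treated as independent.

def fuzzy_and(p, q):
    """AND gate: point-wise product of two fuzzy probabilities."""
    return tuple(x * y for x, y in zip(p, q))

def fuzzy_or(p, q):
    """OR gate: 1 - (1 - p)(1 - q), applied point-wise."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(p, q))

# Hypothetical basic events: dust dispersion into a cloud, ventilation failure,
# and an ignition source being present (purely illustrative values and structure).
dust_dispersion = (0.10, 0.15, 0.20, 0.25)
vent_failure    = (0.01, 0.02, 0.03, 0.05)
ignition        = (0.05, 0.08, 0.10, 0.15)

# Top event: an explosive dust cloud forms (dispersion OR ventilation failure)
# AND an ignition source is present.
explosion = fuzzy_and(ignition, fuzzy_or(dust_dispersion, vent_failure))
print("top event fuzzy probability (a, b, c, d):",
      tuple(round(x, 4) for x in explosion))
```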

  16. A novel method of fuzzy fault tree analysis combined with VB program to identify and assess the risk of coal dust explosions.

    PubMed

    Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou

    2017-01-01

    Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.

  17. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both methods can be applied in practice, and the choice of method depends on the available basic data of the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  18. [Reliability theory based on quality risk network analysis for Chinese medicine injection].

    PubMed

    Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui

    2014-08-01

    A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. The risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach. It thus transforms the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to the system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a minimization plan can be determined accordingly to reduce their influences and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a user case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.

  19. Status of risk-benefit analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Horn, A.J.; Wilson, R.

    1976-12-01

    The benefits and deficiencies of cost benefit analysis are reviewed. It is pointed out that, if decision making involving risks and benefits is to improve, more attention must be paid to the clear presentation of the assumptions, values, and results. Reports need to present concise summaries which convey the uncertainties and limitations of the analysis in addition to the matrix of costs, risks, and benefits. As the field of risk-benefit analysis advances the estimation of risks and benefits will become more precise and implicit valuations will be made more explicit. Corresponding improvements must also be made to enhance communications between the risk-benefit analyst and the accountable decision maker.

  20. Method Analysis of Microbial-Resistant Gypsum Products

    EPA Science Inventory

    Method Analysis of Microbial-Resistant Gypsum Products. D. A. Betancourt (1), T. R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several...

  1. Alternative evaluation metrics for risk adjustment methods.

    PubMed

    Park, Sungchul; Basu, Anirban

    2018-06-01

    Risk adjustment is instituted to counter risk selection by accurately equating payments with expected expenditures. Traditional risk-adjustment methods are designed to estimate accurate payments at the group level. However, this generates residual risks at the individual level, especially for high-expenditure individuals, thereby inducing health plans to avoid those with high residual risks. To identify an optimal risk-adjustment method, we perform a comprehensive comparison of prediction accuracies at the group level, at the tail distributions, and at the individual level across 19 estimators: 9 parametric regression, 7 machine learning, and 3 distributional estimators. Using the 2013-2014 MarketScan database, we find that no one estimator performs best in all prediction accuracies. Generally, machine learning and distribution-based estimators achieve higher group-level prediction accuracy than parametric regression estimators. However, parametric regression estimators show higher tail distribution prediction accuracy and individual-level prediction accuracy, especially at the tails of the distribution. This suggests that there is a trade-off in selecting an appropriate risk-adjustment method between estimating accurate payments at the group level and lower residual risks at the individual level. Our results indicate that an optimal method cannot be determined solely on the basis of statistical metrics but rather needs to account for simulating plans' risk selective behaviors. Copyright © 2018 John Wiley & Sons, Ltd.

  2. Integrating Allergen Analysis Within a Risk Assessment Framework: Approaches to Development of Targeted Mass Spectrometry Methods for Allergen Detection and Quantification in the iFAAM Project.

    PubMed

    Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare

    2018-01-01

    Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.

  3. Research on the Risk Early Warning Method of Material Supplier Performance in Power Industry

    NASA Astrophysics Data System (ADS)

    Chen, Peng; Zhang, Xi

    2018-01-01

    The early warning of supplier performance risk is still at an initial stage domestically, and research on early warning mechanisms to identify, analyze and prevent performance risk is scarce. In this paper, a new method targeting material supplier performance risk in the power industry is proposed. Firstly, a set of risk early-warning indexes is established, and the ECM method is then used to classify the indexes into different risk grades. Next, the Crockford risk quantification model is improved by considering three indicators, namely the stability of the power system, economic losses and the successful bid ratio, to form the predicted risk grade; ultimately, the short-board (weakest-link) principle is used to form the ultimate risk grade, so as to truly reflect supplier performance risk. Finally, an empirical analysis of supplier performance is carried out, and countermeasures and prevention strategies for different risks are put forward.

  4. Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.

    PubMed

    Xin, Cao; Chongshi, Gu

    2016-01-01

    Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. The stability failure risk ratio described jointly by probability and possibility has deficiencies in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event considering both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is constructed as a novel way to represent the uncertainty of the influence factors of gravity dam stability. Combined with Monte Carlo simulation, the corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to the risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk assessment obtained can reflect the influence of both sorts of uncertainty and is suitable as an index value.
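
    A minimal Monte Carlo sketch of estimating a sliding-failure probability for a single surface (illustrative distributions and loads; it treats all inputs as purely random and does not reproduce the credibility-theory handling of fuzziness):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000                                        # Monte Carlo samples

# Hypothetical random inputs for a single sliding surface.
friction_coeff = rng.normal(0.65, 0.10, N)         # tan(phi) on the sliding plane
cohesion_kpa   = rng.lognormal(np.log(100), 0.3, N)
area_m2        = 500.0                             # contact area
weight_kn      = rng.normal(150_000, 20_000, N)    # effective vertical load
horizontal_kn  = rng.normal(100_000, 20_000, N)    # driving horizontal load

resisting = friction_coeff * weight_kn + cohesion_kpa * area_m2
safety_factor = resisting / horizontal_kn
failure_prob = np.mean(safety_factor < 1.0)        # failure when resistance < demand

print(f"estimated sliding failure probability: {failure_prob:.2e}")
```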

  5. Analysis of interactions among barriers in project risk management

    NASA Astrophysics Data System (ADS)

    Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita

    2018-03-01

    In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects have virtually no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the barriers to project risk management are a less explored topic. The success of project management is oftentimes based on the understanding of barriers to effective risk management, application of an appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitude, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse the interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of addressing cultural differences are the high-priority barriers, among many others.
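
    A minimal sketch of the ISM step that is usually automated in such studies: a binary direct-influence matrix among barriers is closed transitively to a reachability matrix, from which driving and dependence power are read off for MICMAC classification (the barriers and matrix below are hypothetical):

```python
import numpy as np

barriers = ["top_mgmt_support", "formal_training", "cultural_differences", "resources"]
# Hypothetical direct influence matrix: entry (i, j) = 1 if barrier i influences barrier j.
A = np.array([
    [1, 1, 0, 1],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
    [0, 0, 0, 1],
])

# Transitive closure (Warshall's algorithm): include indirect influence paths.
R = A.copy().astype(bool)
n = len(barriers)
for k in range(n):
    for i in range(n):
        for j in range(n):
            R[i, j] = R[i, j] or (R[i, k] and R[k, j])

driving    = R.sum(axis=1)   # how many barriers each one reaches (drives)
dependence = R.sum(axis=0)   # how many barriers reach it

for b, drv, dep in zip(barriers, driving, dependence):
    print(f"{b:22s} driving power = {drv}, dependence power = {dep}")
```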

  6. A semi-quantitative approach to GMO risk-benefit analysis.

    PubMed

    Morris, E Jane

    2011-10-01

    In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.

  7. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, the output of this calculation approach is an entire probability density function that represents the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.

  8. A novel risk assessment method for landfill slope failure: Case study application for Bhalswa Dumpsite, India.

    PubMed

    Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh

    2017-03-01

    Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to the landfills. These extraordinary landfills face high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario are then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.

  9. Semi-Competing Risks Data Analysis: Accounting for Death as a Competing Risk When the Outcome of Interest Is Nonterminal.

    PubMed

    Haneuse, Sebastien; Lee, Kyu Ha

    2016-05-01

    Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die before experiencing a readmission event within the time frame of interest. Toward resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some nonterminal event (eg, readmission), the occurrence of which is subject to a terminal event (eg, death). Although several statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail several existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite end point and competing risks analyses. Throughout we illustrate the ideas and methods using data on N=49 763 Medicare beneficiaries hospitalized between 2011 and 2013 with a principal discharge diagnosis of heart failure. © 2016 American Heart Association, Inc.

  10. Risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions.

    PubMed

    Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song

    2017-11-01

    Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and it showed that this region had a relatively high risk for groundwater contamination from pesticides, and that the pesticide application method was the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible, valid, and able to provide reference data related to the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;13:1052-1059. © 2017 SETAC.

  11. Comparison of concepts in easy-to-use methods for MSD risk assessment.

    PubMed

    Roman-Liu, Danuta

    2014-05-01

    This article presents a comparative analysis of easy-to-use methods for assessing musculoskeletal load and the risk for developing musculoskeletal disorders. In all such methods, assessment of load consists in defining input data, the procedure and the system of assessment. This article shows what assessment steps the methods have in common; it also shows how those methods differ in each step. In addition, the methods are grouped according to their characteristic features. The conclusion is that the concepts of assessing risk in different methods can be used to develop solutions leading to a comprehensive method appropriate for all work tasks and all parts of the body. However, studies are necessary to verify the accepted premises and to introduce some standardization that would make consolidation possible. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  12. Value-at-Risk analysis using ARMAX GARCHX approach for estimating risk of banking subsector stock return’s

    NASA Astrophysics Data System (ADS)

    Dewi Ratih, Iis; Sutijo Supri Ulama, Brodjol; Prastuti, Mike

    2018-03-01

    Value at Risk (VaR) is one of the statistical methods used to measure market risk by estimating the worst loss over a given time period at a given level of confidence. The accuracy of this measuring tool is very important in determining the amount of capital that a company must set aside to cope with possible losses, because greater risk implies greater potential losses at a given probability level. For this reason, VaR calculation is of particular concern to researchers and practitioners in the stock market, with the aim of obtaining more accurate estimates. In this research, a risk analysis of four banking-subsector stocks (Bank Rakyat Indonesia, Bank Mandiri, Bank Central Asia and Bank Negara Indonesia) is carried out. Stock returns are expected to be influenced by exogenous variables, namely the ICI and the exchange rate. Therefore, in this research, stock risk estimation is performed using the VaR ARMAX-GARCHX method. Calculating the VaR value with the ARMAX-GARCHX approach using a window of 500 observations gives more accurate results. Overall, Bank Central Asia is the only bank that had the estimated maximum loss in the 5% quantile.
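
    As a much simpler stand-in for the ARMAX-GARCHX model (synthetic returns, no exogenous variables, no volatility dynamics), the following sketch contrasts historical-simulation and normal parametric one-day 95% VaR over a 500-observation window:

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.0003, 0.012, 1500)      # synthetic daily stock returns

window, alpha = 500, 0.05                      # 500-day window, 95% confidence
recent = returns[-window:]

# Historical simulation: empirical 5% quantile of the window, reported as a loss.
var_hist = -np.quantile(recent, alpha)

# Normal parametric VaR: mu + z_alpha * sigma, reported as a positive loss.
mu, sigma = recent.mean(), recent.std(ddof=1)
z = -1.6449                                    # 5% quantile of the standard normal
var_param = -(mu + z * sigma)

print(f"1-day 95% VaR (historical simulation): {var_hist:.4f}")
print(f"1-day 95% VaR (normal parametric):     {var_param:.4f}")
```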

  13. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. The impact of corrosion damage can cause the HRSG power plant to stop operating. Furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presented a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk. In fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.

  14. Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581

    NASA Astrophysics Data System (ADS)

    Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin

    2016-04-01

    Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. The impact of corrosion damage can cause the HRSG power plant to stop operating. Furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for the risk analysis of HRSG 1. By using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presented a case study relating to the risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of the API 581 standard place the existing equipment at medium risk. In fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.

  15. Cable Overheating Risk Warning Method Based on Impedance Parameter Estimation in Distribution Network

    NASA Astrophysics Data System (ADS)

    Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao

    2017-05-01

    Cable overheating will reduce the cable insulation level, speed up cable insulation aging, and may even cause short-circuit faults. Cable overheating risk identification and warning are therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, to improve the impedance parameter estimation accuracy. Secondly, the threshold value of the cable impedance is calculated from historical data, and the forecast value of the cable impedance is calculated from future forecasting data from the distribution SCADA system. Thirdly, a library of cable overheating risk warning rules is established; the cable impedance forecast value is calculated and the rate of change of the impedance is analysed, and warnings of cable line overheating risk are then issued, based on the overheating risk warning rules library, according to the relationship between impedance variation and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can identify the impedance and forecast the temperature rise of cable lines in the distribution network accurately. The overheating risk warning result can provide a decision basis for operation, maintenance and repair.
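
    A minimal sketch of the first step, estimating cable resistance and reactance from SCADA measurements by least squares, is shown below. It uses the common distribution-feeder voltage-drop approximation; the variable names and synthetic data are assumptions, not the paper's exact formulation.

```python
import numpy as np

def estimate_impedance(v_send, v_recv, p_recv, q_recv):
    """Least-squares estimate of cable resistance R and reactance X.

    Uses the usual distribution-feeder approximation
        V_send - V_recv ~ (P * R + Q * X) / V_recv
    with P, Q and the voltages taken from SCADA measurements.
    """
    v_send, v_recv = np.asarray(v_send), np.asarray(v_recv)
    p_recv, q_recv = np.asarray(p_recv), np.asarray(q_recv)
    dv = v_send - v_recv
    A = np.column_stack([p_recv / v_recv, q_recv / v_recv])
    (r, x), *_ = np.linalg.lstsq(A, dv, rcond=None)
    return r, x

# Synthetic example: true R = 0.20 ohm, X = 0.08 ohm.
rng = np.random.default_rng(1)
p, q = rng.uniform(0.5e6, 2e6, 200), rng.uniform(0.1e6, 0.6e6, 200)
v_recv = rng.uniform(10.2e3, 10.6e3, 200)
v_send = v_recv + (0.20 * p + 0.08 * q) / v_recv + rng.normal(0, 1.0, 200)
print(estimate_impedance(v_send, v_recv, p, q))
```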

  16. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti

    PubMed Central

    2013-01-01

    Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street and building scale information, including standing water, trash accumulation, presence of dogs, cohort specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, will show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location specific risks within these “hotspots”. Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study

  17. Indoor Soiling Method and Outdoor Statistical Risk Analysis of Photovoltaic Power Plants

    NASA Astrophysics Data System (ADS)

    Rajasekar, Vidyashree

    This is a two-part thesis. Part 1 presents an approach for working towards the development of a standardized artificial soiling method for laminated photovoltaic (PV) cells or mini-modules. The construction of an artificial chamber to maintain controlled environmental conditions and the components/chemicals used in artificial soil formulation are briefly explained. Both poly-Si mini-modules and single-cell mono-Si coupons were soiled, and characterization tests such as I-V, reflectance and quantum efficiency (QE) were carried out on both soiled and cleaned coupons. From the results obtained, poly-Si mini-modules proved to be a good measure of soil uniformity, as any non-uniformity present would not result in a smooth curve during I-V measurements. The challenges faced while executing reflectance and QE characterization tests on poly-Si, due to its smaller cells, were eliminated on the mono-Si coupons with large cells, yielding highly repeatable measurements. This study indicates that reflectance measurements between 600-700 nm wavelengths can be used as a direct measure of soil density on the modules. Part 2 determines the most dominant failure modes of field-aged PV modules using experimental data obtained in the field and statistical analysis, FMECA (Failure Mode, Effect, and Criticality Analysis). The failure and degradation modes of about 744 poly-Si glass/polymer frameless modules fielded for 18 years under the cold-dry climate of New York were evaluated. A defect chart, degradation rates (at both string and module levels) and a safety map were generated using the field-measured data. A statistical reliability tool, FMECA, which uses the Risk Priority Number (RPN), is used to determine the dominant failure or degradation modes in the strings and modules by ranking and prioritizing the modes. This study on PV power plants considers all the failure and degradation modes from both safety and performance perspectives. The indoor and outdoor soiling studies were jointly

  18. Analysis of risk factors for persistent infection of asymptomatic women with high-risk human papilloma virus.

    PubMed

    Shi, Nianmin; Lu, Qiang; Zhang, Jiao; Li, Li; Zhang, Junnan; Zhang, Fanglei; Dong, Yanhong; Zhang, Xinyue; Zhang, Zheng; Gao, Wenhui

    2017-06-03

    This study aims to prevent persistent infection, reduce the incidence of cervical cancer, and improve women's health by understanding the theoretical basis of the risk factors for continuous infection of asymptomatic women with high-risk human papilloma virus (HPV) strains via the information collected, which includes the persistent infection rate and the most prevalent high-risk HPV strain types among asymptomatic women in the high-risk area of cervical cancer in Linfen, Shanxi Province. Based on the method of cluster sampling, locations were chosen from an industrial county and an agricultural county of Linfen, Shanxi Province, namely the Xiangfen and Quwo counties. Use of the convenience sampling (CS) method enabled the identification of women who are sexually active but without symptoms of cervical abnormality for analyzing risk factors of HPV-DNA detection and performing a retrospective questionnaire survey in these 2 counties. Firstly, cervical exfoliated cell samples were collected for the thin-layer liquid-based cytology test (TCT), while simultaneously testing for high-risk type HPV DNA; samples with positive testing results were then retested to identify the infecting HPV types. The 6-month period of testing was done to derive the 6-month persistent infection rate. The retrospective survey included the following concepts addressed in the questionnaire: basic situation of the research subjects, menstrual history, marital status, pregnancy history, sexual habits and other aspects. The questionnaire was divided into a case group and a comparison group, based on the high-risk HPV-DNA testing result, to ascertain whether or not there was persistent infection. Statistical analysis employed EpiData 3.1 software for data entry and SPSS 17.0 for statistical analysis. Statistical charts, chi-square analysis, single-factor analysis and multivariate logistic regression analysis were selected to analyze the protective factors and risk factors of high-risk HPV infection. Risk factors are predicted by using the

  19. Risk analysis with a fuzzy-logic approach of a complex installation

    NASA Astrophysics Data System (ADS)

    Peikert, Tim; Garbe, Heyno; Potthast, Stefan

    2016-09-01

    This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends these techniques with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle the uncertainty with probability functions and linguistic terms. The linguistic terms add the knowledge of experts of the investigated system or environment to the risk analysis.
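
    A small sketch of the fuzzy machinery mentioned above is given below: triangular membership functions for linguistic probability terms and a min-operator combination of two basic events, as would be used through an AND gate of a fault tree. The term definitions and scales are assumptions for illustration, not the paper's expert-elicited ones.

```python
import numpy as np

def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    x = np.asarray(x, dtype=float)
    left = np.clip((x - a) / (b - a), 0.0, 1.0)
    right = np.clip((c - x) / (c - b), 0.0, 1.0)
    return np.minimum(left, right)

# Assumed linguistic terms for a (normalised) event probability; the exact
# shapes and scales would come from system experts.
x = np.linspace(0.0, 1.0, 101)
terms = {
    "low":    triangular(x, 0.0, 0.1, 0.4),
    "medium": triangular(x, 0.2, 0.5, 0.8),
    "high":   triangular(x, 0.6, 0.9, 1.0),
}

# Fuzzy AND of two basic events (min operator), as used when propagating
# membership grades through an AND gate of a fault tree.
combined = np.minimum(terms["medium"], terms["high"])
print("peak membership of the combined event:", combined.max())
```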

  20. Risk analysis by FMEA as an element of analytical validation.

    PubMed

    van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M

    2009-12-05

    We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on the estimated frequency of occurrence (O), the probability that the failure would remain undetected later in the process (D) and the severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPNs) = O x D x S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
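
    A minimal sketch of the RPN bookkeeping described above follows. The failure modes and scores are invented for illustration; only the O x D x S scoring and the improvement index (old RPN divided by new RPN) follow the abstract.

```python
# Minimal RPN bookkeeping for an FMEA, following the O x D x S scoring
# described in the abstract; the failure modes and scores are made up.
failure_modes = [
    # (description, occurrence O, detectability D, severity S)
    ("wrong reference spectrum loaded", 4, 7, 8),
    ("sample holder misaligned",        6, 5, 4),
    ("operator skips background scan",  5, 6, 7),
]

def rpn(o, d, s):
    return o * d * s

ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for desc, o, d, s in ranked:
    print(f"{desc:35s} RPN = {rpn(o, d, s)}")

# After a corrective action, the improvement index is the ratio of the old
# RPN to the new RPN (a value of 5.0 would mean a five-fold risk reduction).
old, new = rpn(5, 6, 7), rpn(3, 4, 4)
print("improvement index:", round(old / new, 1))
```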

  1. Analysis of Risk Factors for Postoperative Morbidity in Perforated Peptic Ulcer

    PubMed Central

    Kim, Jae-Myung; Jeong, Sang-Ho; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song

    2012-01-01

    Purpose Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. Materials and Methods In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. Results The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. Conclusions A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer. PMID:22500261

  2. A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels

    NASA Astrophysics Data System (ADS)

    Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian

    2016-08-01

    Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research attempts to present a methodology for dynamic risk assessment and management of rockbursts in D&B tunnels. The basic idea of dynamic risk assessment and management of rockbursts is determined, and methods associated with each step in the rockburst risk assessment and management process are given. Among them, the main parts include a microseismic method for early warning of the occurrence probability of rockburst risk, an estimation method that aims to assess the potential consequences of rockburst risk, an evaluation method that utilizes a new quantitative index considering both occurrence probability and consequences for determining the level of rockburst risk, and dynamic updating. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management of rockbursts. Using the proposed method of rockburst risk assessment and management, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified by cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a length of 11.6 km in total for the D&B tunnels).

  3. Semi-Competing Risks Data Analysis: Accounting for Death as a Competing Risk When the Outcome of Interest is Non-Terminal

    PubMed Central

    Haneuse, Sebastien; Lee, Kyu Ha

    2016-01-01

    Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die prior to experiencing a readmission event within the timeframe of interest. Towards resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some non-terminal event (e.g. readmission), the occurrence of which is subject to a terminal event (e.g. death). Although a number of statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models, primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail a number of existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite endpoint and competing risks analyses. Throughout, we illustrate the ideas and methods using data on N=49,763 Medicare beneficiaries hospitalized between 2011–2013 with a principal discharge diagnosis of heart failure. PMID:27072677

  4. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  5. Analysis of labour risks in the Spanish industrial aerospace sector.

    PubMed

    Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael

    2016-01-01

    Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were created and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern, accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAE analysis, which can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.

  6. Development of innovative methods for risk assessment in high-rise construction based on clustering of risk factors

    NASA Astrophysics Data System (ADS)

    Okolelova, Ella; Shibaeva, Marina; Shalnev, Oleg

    2018-03-01

    The article analyses risks in high-rise construction in terms of investment value, taking into account the maximum probable loss in case of a risk event. The authors scrutinized the risks of high-rise construction in regions with various geographic, climatic and socio-economic conditions that may influence the project environment. Risk classification is presented in general terms and includes aggregated characteristics of risks common to many regions. Cluster analysis tools, which allow considering generalized groups of risks depending on their qualitative and quantitative features, were used in order to model the influence of the risk factors on the implementation of the investment project. For convenience of further calculations, each type of risk is assigned a separate code with the number of the cluster and the subtype of risk. This approach and the coding of risk factors make it possible to build a risk matrix, which greatly facilitates the task of determining the degree of impact of risks. The authors clarified and expanded the concept of price risk, which is defined as the expected value of the event; this extends the capabilities of the model and allows estimating an interval for the probability of occurrence and also using other probabilistic methods of calculation.

  7. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions from the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
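
    The snippet below sketches the last step only: building a complementary cumulative distribution function (CCDF) from Monte Carlo consequence samples. The two-factor toy accident model and all numbers are assumptions; the actual analysis combines many more factors and accident scenarios.

```python
import numpy as np

def ccdf(samples):
    """Return (x, P[X >= x]) evaluated at the sorted Monte Carlo samples."""
    x = np.sort(np.asarray(samples))
    p_exceed = 1.0 - np.arange(len(x)) / len(x)
    return x, p_exceed

# Toy accident model: consequence = source term released x dose factor, both
# uncertain; the real analysis combines many more factors and mission phases.
rng = np.random.default_rng(42)
consequences = (rng.lognormal(mean=0.0, sigma=1.0, size=100_000)
                * rng.lognormal(mean=-2.0, sigma=0.5, size=100_000))

x, p = ccdf(consequences)
for level in (0.1, 1.0, 10.0):
    idx = min(np.searchsorted(x, level), len(x) - 1)
    print(f"P(consequence >= {level:>4}): {p[idx]:.5f}")
```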

  8. Predicting high risk births with contraceptive prevalence and contraceptive method-mix in an ecologic analysis.

    PubMed

    Perin, Jamie; Amouzou, Agbessi; Walker, Neff

    2017-11-07

    Increased contraceptive use has been associated with a decrease in high parity births, births that occur close together in time, and births to very young or to older women. These types of births are also associated with high risk of under-five mortality. Previous studies have looked at the change in the level of contraception use and the average change in these types of high-risk births. We aim to predict the distribution of births in a specific country when there is a change in the level and method of modern contraception. We used data from full birth histories and modern contraceptive use from 207 nationally representative Demographic and Health Surveys covering 71 countries to describe the distribution of births in each survey based on birth order, preceding birth space, and mother's age at birth. We estimated the ecologic associations between the prevalence and method-mix of modern contraceptives and the proportion of births in each category. Hierarchical modelling was applied to these aggregated cross sectional proportions, so that random effects were estimated for countries with multiple surveys. We use these results to predict the change in type of births associated with scaling up modern contraception in three different scenarios. We observed marked differences between regions, in the absolute rates of contraception, the types of contraceptives in use, and in the distribution of type of birth. Contraceptive method-mix was a significant determinant of proportion of high-risk births, especially for birth spacing, but also for mother's age and parity. Increased use of modern contraceptives is especially predictive of reduced parity and more births with longer preceding space. However, increased contraception alone is not associated with fewer births to women younger than 18 years or a decrease in short-spaced births. Both the level and the type of contraception are important factors in determining the effects of family planning on changes in distribution of

  9. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  10. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  11. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  12. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant...) INFORMATION SECURITY MATTERS Data Breaches § 75.115 Risk analysis. If a data breach involving sensitive... possible after the data breach, a non-VA entity with relevant expertise in data breach assessment and risk...

  13. Creating a spatially-explicit index: a method for assessing the global wildfire-water risk

    NASA Astrophysics Data System (ADS)

    Robinne, François-Nicolas; Parisien, Marc-André; Flannigan, Mike; Miller, Carol; Bladon, Kevin D.

    2017-04-01

    The wildfire-water risk (WWR) has been defined as the potential for wildfires to adversely affect water resources that are important for downstream ecosystems and human water needs for adequate water quantity and quality, therefore compromising the security of their water supply. While tools and methods are numerous for watershed-scale risk analysis, the development of a toolbox for the large-scale evaluation of the wildfire risk to water security has only started recently. In order to provide managers and policy-makers with an adequate tool, we implemented a method for the spatial analysis of the global WWR based on the Driving forces-Pressures-States-Impacts-Responses (DPSIR) framework. This framework relies on the cause-and-effect relationships existing between the five categories of the DPSIR chain. As this approach heavily relies on data, we gathered an extensive set of spatial indicators relevant to fire-induced hydrological hazards and water consumption patterns by human and natural communities. When appropriate, we applied a hydrological routing function to our indicators in order to simulate downstream accumulation of potentially harmful material. Each indicator was then assigned a DPSIR category. We collapsed the information in each category using a principal component analysis in order to extract the most relevant pixel-based information provided by each spatial indicator. Finally, we compiled our five categories using an additive indexation process to produce a spatially-explicit index of the WWR. A thorough sensitivity analysis has been performed in order to understand the relationship between the final risk values and the spatial pattern of each category used during the indexation. For comparison purposes, we aggregated index scores by global hydrological regions, or hydrobelts, to get a sense of regional DPSIR specificities. This rather simple method does not necessitate the use of complex physical models and provides a scalable and efficient tool
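
    A rough sketch of the indexation chain described above (per-category collapse with a principal component analysis, followed by additive indexation across the five DPSIR categories) is given below. The indicator data, category sizes and equal weighting are assumptions for illustration; the hydrological routing and sensitivity analysis steps are omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def category_score(indicators):
    """Collapse a (pixels x indicators) block to one score per pixel using
    the first principal component, rescaled to [0, 1]."""
    z = StandardScaler().fit_transform(indicators)
    pc1 = PCA(n_components=1).fit_transform(z).ravel()
    return (pc1 - pc1.min()) / (pc1.max() - pc1.min())

# Assumed toy data: 1000 pixels, a few indicators per DPSIR category.
rng = np.random.default_rng(7)
categories = {name: rng.random((1000, k))
              for name, k in [("drivers", 3), ("pressures", 4), ("states", 3),
                              ("impacts", 2), ("responses", 3)]}

# Additive indexation: the wildfire-water risk index is the sum of the five
# category scores (equal weights assumed here).
wwr_index = sum(category_score(block) for block in categories.values())
print(wwr_index.shape, float(wwr_index.min()), float(wwr_index.max()))
```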

  14. Contribution of European research to risk analysis.

    PubMed

    Boenke, A

    2001-12-01

    The European Commission's Quality of Life Research Programme, Key Action 1-Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumer risks; development and validation of new modelling approaches; harmonization of risk assessment principles, methodologies and terminology; standardization of methods and systems used for the safety evaluation of transgenic food; provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential for unintended effects of genetically modified (GM) foods; and development of a risk assessment model for Cryptosporidium parvum related to the food and water industries, to the development of a communication platform for genetically modified organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; development and validation of new methods for safety testing of transgenic food; evaluation of the safety and efficacy of iron supplementation in pregnant women; and evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.

  15. Analysis of dengue fever risk using geostatistics model in bone regency

    NASA Astrophysics Data System (ADS)

    Amran, Stang, Mallongi, Anwar

    2017-03-01

    The aim of this research is to analyze dengue fever risk based on a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a Binomial distribution. The effects of temperature, rainfall, elevation, and larva abundance are investigated through the geostatistics model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.
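
    The snippet below is a much-simplified stand-in for the Bayesian estimation described above: each location's Binomial risk parameter gets an independent Beta prior, giving a closed-form posterior. The real model is hierarchical and spatial with covariate effects (temperature, rainfall, elevation, larva abundance); the counts here are invented.

```python
import numpy as np
from scipy.stats import beta

# Assumed counts: dengue cases and population at risk for a few locations.
cases      = np.array([12, 5, 30, 8])
population = np.array([1000, 800, 1500, 600])

# Independent Beta(1, 1) priors on each location's Binomial risk parameter,
# so the posterior is Beta(1 + cases, 1 + population - cases).
a_post = 1 + cases
b_post = 1 + population - cases

post_mean = a_post / (a_post + b_post)
lower = beta.ppf(0.025, a_post, b_post)
upper = beta.ppf(0.975, a_post, b_post)
for i, (m, lo, hi) in enumerate(zip(post_mean, lower, upper)):
    print(f"location {i}: risk = {m:.4f} (95% interval {lo:.4f}-{hi:.4f})")
```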

  16. A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.

    PubMed

    Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen

    2014-01-01

    Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data are shown to illustrate the new methodology in real data analysis.
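
    To show where the weights enter such a procedure, the sketch below fits a weighted support vector machine using scikit-learn's `sample_weight` argument. The data, labels and weights are synthetic; in the paper the weights come from the survival/censoring structure, which is not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

# Toy data: two covariates, a binary "high risk by time t" label, and a
# per-subject weight (made up here; the paper derives them from censoring).
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)
weights = rng.uniform(0.5, 1.5, size=300)

clf = SVC(kernel="rbf", C=1.0, probability=True)
clf.fit(X, y, sample_weight=weights)          # weighted support vector machine
print("training accuracy:", clf.score(X, y))
print("risk-class probability for a new subject:",
      clf.predict_proba([[1.0, -0.2]])[0])
```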

  17. Fuzzy risk analysis of a modern γ-ray industrial irradiator.

    PubMed

    Castiglia, F; Giardina, M

    2011-06-01

    Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.

  18. Revealing the underlying drivers of disaster risk: a global analysis

    NASA Astrophysics Data System (ADS)

    Peduzzi, Pascal

    2017-04-01

    Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. New models ranging from global asset exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL

  19. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    PubMed

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
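
    A compact sketch of the parametric branch of such an approach appears below: a lognormal severity model fitted to individual claim losses, a Poisson claim frequency, and a Monte Carlo aggregate annual loss distribution from which the expected value and percentiles are read. The losses are simulated rather than real claim data, and the choice of distributions is an assumption for illustration.

```python
import numpy as np
from scipy import stats

# Assumed claim severities (monetary losses); in the paper these come from
# the HCO's compensation claim records.
rng = np.random.default_rng(11)
observed_losses = rng.lognormal(mean=9.5, sigma=1.2, size=206)

# Parametric severity model: fit a lognormal to the individual losses.
shape, loc, scale = stats.lognorm.fit(observed_losses, floc=0)

# Frequency model: Poisson with the observed mean number of claims per year
# (206 claims over roughly 8.5 years in the case study).
claims_per_year = 206 / 8.5

# Monte Carlo aggregate annual loss distribution.
n_sim = 20_000
annual_losses = np.array([
    stats.lognorm.rvs(shape, loc=loc, scale=scale,
                      size=rng.poisson(claims_per_year), random_state=rng).sum()
    for _ in range(n_sim)
])

print("expected annual loss:", annual_losses.mean())
print("95th / 99th percentile (unexpected loss):",
      np.percentile(annual_losses, [95, 99]))
```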

  20. Analysis and Assessment of Operation Risk for Hybrid AC/DC Power System based on the Monte Carlo Method

    NASA Astrophysics Data System (ADS)

    Hu, Xiaojing; Li, Qiang; Zhang, Hao; Guo, Ziming; Zhao, Kun; Li, Xinpeng

    2018-06-01

    Based on the Monte Carlo method, an improved risk assessment method for a hybrid AC/DC power system with VSC stations is proposed, considering the operation status of generators, converter stations, AC lines and DC lines. According to the sequential AC/DC power flow algorithm, node voltages and line active powers are solved, and then the operation risk indices for node voltage over-limit and line active power over-limit are calculated. Finally, an improved two-area IEEE RTS-96 system is taken as a case to analyze and assess its operation risk. The results show that the proposed model and method can intuitively and directly reflect the weak nodes and weak lines of the system, which can provide a reference for the dispatching department.

  1. Different type 2 diabetes risk assessments predict dissimilar numbers at ‘high risk’: a retrospective analysis of diabetes risk-assessment tools

    PubMed Central

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2015-01-01

    Background Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. Aim This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Design and setting Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Method Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes®, Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Results Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at ‘high risk’ followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). Conclusion The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. PMID:26541180

  2. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to the failure of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on the field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  3. Analysis of risk factors for postoperative morbidity in perforated peptic ulcer.

    PubMed

    Kim, Jae-Myung; Jeong, Sang-Ho; Lee, Young-Joon; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song

    2012-03-01

    Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer.

  4. Using Enterprise Architecture for Analysis of a Complex Adaptive Organization's Risk Inducing Characteristics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salguero, Laura Marie; Huff, Johnathon; Matta, Anthony R.

    Sandia National Laboratories is an organization with a wide range of research and development activities that include nuclear, explosives, and chemical hazards. In addition, Sandia has over 2000 labs and over 40 major test facilities, such as the Thermal Test Complex, the Lightning Test Facility, and the Rocket Sled Track. In order to support safe operations, Sandia has a diverse Environment, Safety, and Health (ES&H) organization that provides expertise to support engineers and scientists in performing work safely. With such a diverse organization to support, the ES&H program continuously seeks opportunities to improve the services provided for Sandia by using various methods as part of their risk management strategy. One of the methods being investigated is using enterprise architecture analysis to mitigate risk inducing characteristics such as normalization of deviance, organizational drift, and problems in information flow. This paper is a case study for how a Department of Defense Architecture Framework (DoDAF) model of the ES&H enterprise, including information technology applications, can be analyzed to understand the level of risk associated with the risk inducing characteristics discussed above. While the analysis is not complete, we provide proposed analysis methods that will be used for future research as the project progresses.

  5. Bridging the two cultures of risk analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jasanoff, S.

    1993-04-01

    During the past 15 years, risk analysis has come of age as an interdisciplinary field of remarkable breadth, nurturing connections among fields as diverse as mathematics, biostatistics, toxicology, and engineering on one hand, and law, psychology, sociology, and economics on the other hand. In this editorial, the author addresses the question: What has the presence of social scientists in the network meant to the substantive development of the field of risk analysis? The answers offered here discuss the substantial progress in bridging the two cultures of risk analysis. Emphasis is placed on the continual need for monitoring risk analysis. Topics include: the micro-worlds of risk assessment; constraining assumptions; and exchange programs. 14 refs.

  6. Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method

    NASA Astrophysics Data System (ADS)

    Sun, C. J.; Zhou, J. H.; Wu, W.

    2017-10-01

    During its lifetime, a ship may encounter collision or grounding and sustain permanent damage after these types of accidents. Crashworthiness assessment has been based on two main kinds of methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper. Numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate the method. The results show that, in terms of the accuracy of the calculation results, the simplified plastic analysis is in good agreement with the finite element simulation, which reveals that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.

  7. Use of a systematic risk analysis method (FMECA) to improve quality in a clinical laboratory procedure.

    PubMed

    Serafini, A; Troiano, G; Franceschini, E; Calzoni, P; Nante, N; Scapellato, C

    2016-01-01

    Risk management is a set of actions to recognize or identify risks, errors and their consequences and to take steps to counter them. The aim of our study was to apply FMECA (Failure Mode, Effects and Criticality Analysis) to the Activated Protein C Resistance (APCR) test in order to detect and avoid mistakes in this process. We created a team and the process was divided into phases and sub-phases. For each phase we calculated the probability of occurrence (O) of an error, the detectability score (D) and the severity (S). The product of these three indexes yields the RPN (Risk Priority Number). Phases with a higher RPN need corrective actions with a higher priority. The calculation of RPN showed that more than 20 activities have a score higher than 150 and need important preventive actions; 8 have a score between 100 and 150. Only 23 actions obtained an acceptable score lower than 100. This was one of the first experiences of applying FMECA analysis to a laboratory process, and the first to apply this technique to the identification of factor V Leiden; our results confirm that FMECA can be a simple, powerful and useful tool in risk management and helps to quickly identify the criticalities in a laboratory process.

  8. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

    The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  9. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    NASA Astrophysics Data System (ADS)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
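
    A toy version of the event-tree arithmetic described above is sketched below: conditional probabilities multiplied along a branch give a scenario probability, which is then mapped to a linguistic risk class. The branch structure, probabilities and class thresholds are illustrative assumptions, not the values estimated for Arenal.

```python
# Minimal event-tree sketch: each branch point carries a conditional
# probability, and a scenario's annual probability is the product along its
# path. The numbers and branch structure are illustrative only.
branches = {
    "partial crater-wall collapse occurs": 0.10,   # per year
    "collapse generates an ATPF":          0.60,
    "flow reaches the occupied sector":    0.25,
    "individual is present and exposed":   0.30,   # differs per resident/worker/tourist
}

def scenario_probability(probs):
    p = 1.0
    for value in probs.values():
        p *= value
    return p

def risk_class(p):
    """Map an annual probability to a linguistic class (assumed thresholds)."""
    for threshold, label in [(1e-2, "VERY HIGH"), (1e-3, "HIGH"),
                             (1e-4, "MODERATE"), (1e-5, "LOW")]:
        if p >= threshold:
            return label
    return "VERY LOW"

p = scenario_probability(branches)
print(f"annual scenario probability: {p:.2e} -> {risk_class(p)}")
```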

  10. Advances in Risk Analysis with Big Data.

    PubMed

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  11. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... nature of the rail system, each carrier must select and document the analysis method/model used and identify the routes to be analyzed. D. The safety and security risk analysis must consider current data and... curvature; 7. Presence or absence of signals and train control systems along the route (“dark” versus...

  12. A utility/cost analysis of breast cancer risk prediction algorithms

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.

    2016-03-01

    Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method, under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
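
    The following sketch shows the kind of expected-cost bookkeeping the abstract describes: a per-exam cost made up of the scan cost plus the expected costs of missed cancers and false alarms, compared between a standard and an enhanced screening method. The costs, operating points and prevalences are assumed values for illustration, not those of the published cost model.

```python
# Expected-cost comparison of standard vs. enhanced screening; all numbers
# are illustrative assumptions.
def expected_cost(prevalence, sensitivity, specificity,
                  scan_cost, cost_fn, cost_fp):
    """Expected per-exam cost = scan cost + expected cost of missed cancers
    (false negatives) and false alarms (false positives)."""
    p_fn = prevalence * (1.0 - sensitivity)
    p_fp = (1.0 - prevalence) * (1.0 - specificity)
    return scan_cost + p_fn * cost_fn + p_fp * cost_fp

standard = dict(sensitivity=0.80, specificity=0.92, scan_cost=150)
enhanced = dict(sensitivity=0.92, specificity=0.85, scan_cost=1000)
cost_fn, cost_fp = 60_000, 800   # assumed downstream costs

# Risk prediction sends a small high-prevalence subgroup to enhanced screening.
for label, params, prevalence in [("standard, general population", standard, 0.005),
                                  ("enhanced, high-risk subgroup", enhanced, 0.02)]:
    c = expected_cost(prevalence, cost_fn=cost_fn, cost_fp=cost_fp, **params)
    print(f"{label:32s} expected cost per exam: ${c:,.0f}")
```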

  13. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2007-01-01

    A formal method is described to quantify structural reliability and risk in the presence of a multitude of uncertainties. The method is based on the materials behavior level where primitive variables with their respective scatters are used to describe that behavior. Computational simulation is then used to propagate those uncertainties to the structural scale where reliability and risk are usually specified. A sample case is described to illustrate the effectiveness, versatility, and maturity of the method. Typical results from this method demonstrate that the method is mature and that it can be used for future strategic projections and planning to assure better, cheaper, faster products for competitive advantages in world markets. The results also indicate that the methods are suitable for predicting remaining life in aging or deteriorating structures.

  14. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search was performed using MESH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  15. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of those events and to ask for design revisions or contingency plans in a timely manner before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  16. Application of a risk analysis method to different technologies for producing a monoclonal antibody employed in hepatitis B vaccine manufacturing.

    PubMed

    Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams

    2012-03-01

    CB.Hep-1 monoclonal antibody (mAb) is used in the manufacturing of a recombinant Hepatitis B vaccine, which is included in a worldwide vaccination program against Hepatitis B disease. The use of this mAb as an immunoligand constitutes one of the most efficient steps of the active pharmaceutical ingredient purification process. In this regard, Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The severity and occurrence analysis showed that very high-severity risks accounted for 31.0-38.7% of all risks, and the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Number was ranked in descending order as follows: transgenic plants (2636), ascites (2577), transgenic animals (2046) and hollow fiber bioreactors (1654), which corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
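
    The additive RPN ranking used above can be reproduced in spirit with a few lines of code; the failure modes and their severity/occurrence/detection scores below are invented placeholders, not the study's actual FMEA worksheets.

        # Hedged sketch: ranking manufacturing technologies by additive RPN.
        # RPN per failure mode = Severity x Occurrence x Detection (1-10 scales);
        # the per-technology score is the sum of RPNs over its failure modes.
        # All scores below are placeholders, not the study's FMEA data.

        fmea = {
            "transgenic plants":        [(9, 3, 7), (7, 2, 8), (8, 4, 6)],
            "ascites":                  [(9, 4, 6), (6, 3, 7), (8, 3, 7)],
            "transgenic animals":       [(8, 2, 7), (7, 2, 6), (6, 3, 6)],
            "hollow fiber bioreactors": [(7, 2, 5), (6, 2, 6), (5, 3, 5)],
        }

        additive_rpn = {
            tech: sum(s * o * d for s, o, d in modes)
            for tech, modes in fmea.items()
        }

        for tech, rpn in sorted(additive_rpn.items(), key=lambda kv: -kv[1]):
            print(f"{tech:26s} additive RPN = {rpn}")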

  17. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  18. 31 CFR 223.11 - Limitation of risk: Protective methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 31 Money and Finance: Treasury 2 2014-07-01 2014-07-01 false Limitation of risk: Protective methods. 223.11 Section 223.11 Money and Finance: Treasury Regulations Relating to Money and Finance... BUSINESS WITH THE UNITED STATES § 223.11 Limitation of risk: Protective methods. The limitation of risk...

  19. SU-F-T-243: Major Risks in Radiotherapy. A Review Based On Risk Analysis Literature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    López-Tarjuelo, J; Guasp-Tortajada, M; Iglesias-Montenegro, N

    Purpose: We present a literature review of risk analyses in radiotherapy to highlight the most reported risks and facilitate the spread of this valuable information, so that professionals can be aware of these major threats before performing their own studies. Methods: We considered studies with at least an estimation of the probability of occurrence of an adverse event (O) and its associated severity (S). They cover external beam radiotherapy, brachytherapy, intraoperative radiotherapy, and stereotactic techniques. We selected only the works containing a detailed ranked series of elements or failure modes and focused on the first fully reported quartile as much as possible. Afterward, we sorted the risk elements according to a regular radiotherapy procedure so that the resulting groups were cited in several works and could be ranked in this way. Results: 29 references published between 2007 and February 2016 were studied. The publication trend has been generally rising. The most employed analysis has been the failure mode and effect analysis (FMEA). Among the references, we selected 20 works listing 258 ranked risk elements. They were sorted into 31 groups appearing in at least two different works. 11 groups appeared in at least 5 references and 5 groups appeared in 7 or more papers. These last sets of risks were: choosing another set of images or plan for planning or treating, errors related to contours, errors in patient positioning for treatment, human mistakes when programming treatments, and planning errors. Conclusion: There is a sufficient amount and variety of references for identifying which failure modes or elements should be addressed in a radiotherapy department before attempting a specific analysis. FMEA prevailed, but other studies such as “risk matrix” or “occurrence × severity” analyses can also guide professionals’ efforts. Risks associated with human actions rank very high; therefore, such actions should be automated or at least peer-reviewed.

  20. Method of Evaluating the Life Cycle Cost of Small Earth Dams Considering the Risk of Heavy Rainfall and Selection Method of the Optimum Countermeasure

    NASA Astrophysics Data System (ADS)

    Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru

    In recent years, the increase in the number of heavy rainfall events, such as unpredictable cloudbursts, has made it necessary to improve the safety of the embankments of small earth dams. However, the severe financial condition of the national government and local autonomous bodies requires the cost of improving them to be reduced. This study concerns the development of a method of evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of their downstream areas at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and hence the life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, use of the life cycle cost evaluation method will lead to the development of a technique for selecting the optimal improvement or countermeasures against heavy rainfall.
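
    The life-cycle-cost idea can be written down compactly: for each countermeasure option, the expected cost is the initial improvement cost plus the discounted sum, over the planning horizon, of annual breach probability times breach damage plus maintenance. The sketch below uses made-up probabilities, damages, and units; in the actual method the breach probability comes from the coupled runoff, seepage, and slope-stability analyses.

        # Hedged sketch: expected life cycle cost of a small earth dam under
        # heavy-rainfall risk, for alternative countermeasures. All figures
        # are illustrative placeholders.

        def life_cycle_cost(initial_cost, annual_breach_prob, breach_damage,
                            annual_maintenance, horizon_years, discount_rate=0.02):
            lcc = initial_cost
            for t in range(1, horizon_years + 1):
                expected_annual = annual_breach_prob * breach_damage + annual_maintenance
                lcc += expected_annual / (1.0 + discount_rate) ** t
            return lcc

        options = {
            "do nothing":          dict(initial_cost=0.0,   annual_breach_prob=2e-3, annual_maintenance=1.0),
            "partial reinforce":   dict(initial_cost=80.0,  annual_breach_prob=5e-4, annual_maintenance=1.5),
            "full embankment fix": dict(initial_cost=200.0, annual_breach_prob=5e-5, annual_maintenance=2.0),
        }

        for name, p in options.items():
            lcc = life_cycle_cost(breach_damage=5_000.0, horizon_years=50, **p)
            print(f"{name:20s} LCC = {lcc:8.1f} (assumed monetary units)")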

  1. White Paper: A Defect Prioritization Method Based on the Risk Priority Number

    DTIC Science & Technology

    2013-11-01

    ...adapted. The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called the Risk Priority Number (RPN) to quantify the... [Fragment of Table 1 – Time Scaling Factors: Up to an hour / 16-60 / 1.5; Brief Interrupt / 0-15 / 1.] In the FMEA formulation, RPN is a product of the three categories

  2. Comparison of 3 Methods for Identifying Dietary Patterns Associated With Risk of Disease

    PubMed Central

    DiBello, Julia R.; Kraft, Peter; McGarvey, Stephen T.; Goldberg, Robert; Campos, Hannia

    2008-01-01

    Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994–2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%–46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals. PMID:18945692

  3. Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study

    PubMed Central

    Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.

    2012-01-01

    Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501

  4. Comparing multiple competing interventions in the absence of randomized trials using clinical risk-benefit analysis

    PubMed Central

    2012-01-01

    Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e. risk benefit analysis) we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate - ratio of thrombosis to bleeding. Results The analysis showed that compared to placebo ximelagatran was superior to other options but final results were influenced by type of surgery, since ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
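
    A minimal sketch of the simulation logic described above: event proportions are drawn from Beta distributions (as if fitted to meta-analytic event counts), incremental benefit (thromboses averted) is traded off against incremental risk (extra major bleeds), and net clinical benefit is evaluated across acceptability thresholds. The drug names, counts, and thresholds are invented placeholders, not the study's data.

        # Hedged sketch: Monte Carlo risk-benefit comparison with Beta-distributed
        # event proportions. All counts are placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        n_sim = 100_000

        def draw(events, total):
            # Beta(events + 1, non-events + 1) posterior-style draw per arm
            return rng.beta(events + 1, total - events + 1, n_sim)

        thrombosis = {"placebo": draw(120, 1000), "drug A": draw(60, 1000), "drug B": draw(45, 1000)}
        bleeding   = {"placebo": draw(10, 1000),  "drug A": draw(18, 1000), "drug B": draw(30, 1000)}

        for drug in ("drug A", "drug B"):
            benefit = thrombosis["placebo"] - thrombosis[drug]   # thromboses averted
            risk    = bleeding[drug] - bleeding["placebo"]       # extra major bleeds
            print(f"{drug}: median incremental risk-benefit ratio = {np.median(risk / benefit):.2f}")
            for lam in (0.5, 1.0, 2.0):                          # acceptability threshold
                ncb = benefit - lam * risk                       # net clinical benefit per replication
                print(f"   lambda = {lam:3.1f}:  P(NCB > 0) = {(ncb > 0).mean():.2f}")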

  5. ITS risk analysis.

    DOT National Transportation Integrated Search

    1996-06-01

    Risk analysis plays a key role in the implementation of an architecture. Early definition of the situations, processes, or events that have the potential for impeding the implementation of key elements of the ITS National Architecture is a critical e...

  6. A classification scheme for risk assessment methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stamp, Jason Edwin; Campbell, Philip LaRoche

    2004-08-01

    This report presents a classification scheme for risk assessment methods. This scheme, like all classification schemes, provides meaning by imposing a structure that identifies relationships. Our scheme is based on two orthogonal aspects--level of detail, and approach. The resulting structure is shown in Table 1 and is explained in the body of the report. This report imposes structure on the set of risk assessment methods in order to reveal their relationships and thus optimize their usage. We present a two-dimensional structure in the form of a matrix, using three abstraction levels for the rows and three approaches for the columns. For each of the nine cells in the matrix we identify the method type by name and example. The matrix helps the user understand: (1) what to expect from a given method, (2) how it relates to other methods, and (3) how best to use it. Each cell in the matrix represents a different arrangement of strengths and weaknesses. Those arrangements shift gradually as one moves through the table, with each cell optimal for a particular situation. The intention of this report is to enable informed use of the methods, so that the method chosen is optimal for the situation given. The matrix, with type names in the cells, is introduced in Table 2 on page 13 below. Unless otherwise stated, we use the word 'method' in this report to refer to a 'risk assessment method', though oftentimes we use the full phrase. The terms 'risk assessment' and 'risk management' are close enough that we do not attempt to distinguish them in this report. The remainder of this report is organized as follows. In Section 2 we provide context for

  7. Multidimensional Risk Analysis: MRISK

    NASA Technical Reports Server (NTRS)

    McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme

    2015-01-01

    Multidimensional Risk (MRISK) calculates the combined multidimensional score using the Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, the Mahalanobis distance reduces to the Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
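
    A minimal sketch of the combination step: a vector of consequence scores is collapsed to one number with the Mahalanobis distance, which reduces to variance-normalized Euclidean distance when the dimensions are uncorrelated. The consequence dimensions, scores, and covariance matrix below are invented for illustration; MRISK's actual scales and scoring are defined by the NASA project.

        # Hedged sketch: combining consequence dimensions with Mahalanobis distance.
        import numpy as np

        # Consequence dimensions, e.g. (cost, schedule, technical) scores for one risk.
        x  = np.array([3.0, 4.0, 2.0])
        mu = np.zeros(3)                      # reference point (no consequence)

        # Covariance between dimensions, as if estimated from historical risk data (assumed).
        cov = np.array([[1.0, 0.6, 0.2],
                        [0.6, 1.0, 0.3],
                        [0.2, 0.3, 1.0]])

        d_mahalanobis = np.sqrt((x - mu) @ np.linalg.inv(cov) @ (x - mu))

        # With no correlation, the same formula reduces to variance-normalized Euclidean.
        d_uncorrelated = np.sqrt(((x - mu) ** 2 / np.diag(cov)).sum())

        print(f"Mahalanobis score: {d_mahalanobis:.2f}  (uncorrelated case: {d_uncorrelated:.2f})")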

  8. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    NASA Astrophysics Data System (ADS)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for a drug manufacturing computerized system (DMCS). The quality of a drug must not be influenced by failures and operational mistakes of the DMCS. To avoid such situations, the DMCS has to undergo a sufficient risk assessment and appropriate precautions have to be taken. We propose an operational RM method using FMEA for the DMCS. To develop the method, we gathered and compared the FMEA results of DMCS and developed a list that contains failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that had not been found before.

  9. Computational methods in the pricing and risk management of modern financial derivatives

    NASA Astrophysics Data System (ADS)

    Deutsch, Hans-Peter

    1999-09-01

    In the last 20 years modern finance has developed into a complex, mathematically challenging field. Very complicated risks exist in financial markets which need very advanced methods to measure and/or model them. The financial instruments invented by market participants to trade these risks, the so-called derivatives, are usually even more complicated than the risks themselves and sometimes generate new risks. Topics like random walks, stochastic differential equations, martingale measures, time series analysis, implied correlations, etc. are in common use in the field. This is why more and more people with a science background, such as physicists, mathematicians, or computer scientists, are entering the field of finance. The measurement and management of all these risks is the key to the continuing success of banks. This talk gives insight into today's common methods of modern market risk management such as variance-covariance, historical simulation, Monte Carlo, “Greek” ratios, etc., including the statistical concepts on which they are based. Derivatives are at the same time the main reason for and the most effective means of conducting risk management. As such, they stand at the beginning and end of risk management. The valuation of derivatives and structured financial instruments is therefore the prerequisite, the condition sine qua non, for all risk management. This talk introduces some of the important valuation methods used in modern derivatives pricing such as present value, Black-Scholes, binomial trees, Monte Carlo, etc. In summary this talk highlights an area outside physics where there is a lot of interesting work to do, especially for physicists. Or as one of our consultants said: The fascinating thing about this job is that Arthur Andersen hired me not ALTHOUGH I am a physicist but BECAUSE I am a physicist.
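
    Since the talk names Black-Scholes and Monte Carlo among the standard valuation methods, here is a short illustrative cross-check of the two on a European call option; the market parameters (spot, strike, rate, volatility, maturity) are assumed values chosen only for the example.

        # Hedged sketch: Black-Scholes price of a European call, cross-checked by
        # Monte Carlo simulation under the same (illustrative) parameters.
        import numpy as np
        from math import log, sqrt, exp
        from scipy.stats import norm

        S0, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0   # assumed market parameters

        # Closed-form Black-Scholes
        d1 = (log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
        d2 = d1 - sigma * sqrt(T)
        bs_call = S0 * norm.cdf(d1) - K * exp(-r * T) * norm.cdf(d2)

        # Monte Carlo under the risk-neutral measure
        rng = np.random.default_rng(1)
        z = rng.standard_normal(1_000_000)
        ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * sqrt(T) * z)
        mc_call = exp(-r * T) * np.maximum(ST - K, 0.0).mean()

        print(f"Black-Scholes: {bs_call:.3f}   Monte Carlo: {mc_call:.3f}")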

  10. Defining Human Failure Events for Petroleum Risk Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Knut Øien

    2014-06-01

    In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.

  11. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 4 2011-01-01 2011-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  12. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  13. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  14. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  15. 14 CFR 417.225 - Debris risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Debris risk analysis. 417.225 Section 417.225 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION LICENSING LAUNCH SAFETY Flight Safety Analysis § 417.225 Debris risk analysis. A...

  16. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    PubMed

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate-regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low-bias risk showed higher total PRISMA-A values than reviews with high-bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a

  17. New methods for fall risk prediction.

    PubMed

    Ejupi, Andreas; Lord, Stephen R; Delbaere, Kim

    2014-09-01

    Accidental falls are the leading cause of injury-related death and hospitalization in old age, with over one-third of the older adults experiencing at least one fall or more each year. Because of limited healthcare resources, regular objective fall risk assessments are not possible in the community on a large scale. New methods for fall prediction are necessary to identify and monitor those older people at high risk of falling who would benefit from participating in falls prevention programmes. Technological advances have enabled less expensive ways to quantify physical fall risk in clinical practice and in the homes of older people. Recently, several studies have demonstrated that sensor-based fall risk assessments of postural sway, functional mobility, stepping and walking can discriminate between fallers and nonfallers. Recent research has used low-cost, portable and objective measuring instruments to assess fall risk in older people. Future use of these technologies holds promise for assessing fall risk accurately in an unobtrusive manner in clinical and daily life settings.

  18. Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.

    PubMed

    Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M

    2010-01-01

    The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differs considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk for sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usually applied Poincaré plot analysis, the SPPA retains nonlinear features from the investigated beat-to-beat interval time series. The main features of SPPA are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p<0.005) between low and high risk in DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.

  19. Migraine Headache and Ischemic Stroke Risk: An Updated Meta-analysis

    PubMed Central

    Spector, June T.; Kahn, Susan R.; Jones, Miranda R.; Jayakumar, Monisha; Dalal, Deepan; Nazarian, Saman

    2010-01-01

    Background Observational studies, including recent large cohort studies which were unavailable for prior meta-analysis, have suggested an association between migraine headache and ischemic stroke. We performed an updated meta-analysis to quantitatively summarize the strength of association between migraine and ischemic stroke risk. Methods We systematically searched electronic databases, including MEDLINE and EMBASE, through February 2009 for studies of human subjects in the English language. Study selection using a priori selection criteria, data extraction, and assessment of study quality were conducted independently by reviewer pairs using standardized forms. Results Twenty-one (60%) of 35 studies met the selection criteria, for a total of 622,381 participants (13 case-control, 8 cohort studies) included in the meta-analysis. The pooled adjusted odds ratio of ischemic stroke comparing migraineurs to non-migraineurs using a random effects model was 2.30 (95% confidence interval [CI], 1.91-2.76). The pooled adjusted effect estimates for studies that reported relative risks and hazard ratios, respectively, were 2.41 (95% CI, 1.81-3.20) and 1.52 (95% CI, 0.99-2.35). The overall pooled effect estimate was 2.04 (95% CI, 1.72-2.43). Results were robust to sensitivity analyses excluding lower quality studies. Conclusions Migraine is associated with increased ischemic stroke risk. These findings underscore the importance of identifying high-risk migraineurs with other modifiable stroke risk factors. Future studies of the effect of migraine treatment and modifiable risk factor reduction on stroke risk in migraineurs are warranted. PMID:20493462
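
    Pooled odds ratios of the kind reported above are typically obtained by inverse-variance weighting of log odds ratios with an added between-study variance; the DerSimonian-Laird estimator is one common choice for that variance. The sketch below shows the mechanics on four invented studies; the ORs and confidence intervals are placeholders, not data from this meta-analysis.

        # Hedged sketch: DerSimonian-Laird random-effects pooling of log odds ratios.
        import numpy as np

        # (OR, lower 95% CI, upper 95% CI) per study -- placeholders
        studies = [(2.1, 1.5, 2.9), (2.8, 1.9, 4.1), (1.7, 1.1, 2.6), (2.5, 1.8, 3.5)]

        y = np.log([s[0] for s in studies])                         # log OR
        se = (np.log([s[2] for s in studies]) - np.log([s[1] for s in studies])) / (2 * 1.96)
        v = se**2

        # Fixed-effect weights and Cochran's Q heterogeneity statistic
        w = 1 / v
        q = np.sum(w * (y - np.sum(w * y) / np.sum(w))**2)
        df = len(studies) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                               # between-study variance

        # Random-effects pooling
        w_star = 1 / (v + tau2)
        pooled = np.sum(w_star * y) / np.sum(w_star)
        se_pooled = np.sqrt(1 / np.sum(w_star))
        lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
        print(f"pooled OR = {np.exp(pooled):.2f} "
              f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f}), tau^2 = {tau2:.3f}")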

  20. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements.

    PubMed

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-02-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix.

  1. Improved Methods for Fire Risk Assessment in Low-Income and Informal Settlements

    PubMed Central

    Twigg, John; Christie, Nicola; Haworth, James; Osuteye, Emmanuel; Skarlatidou, Artemis

    2017-01-01

    Fires cause over 300,000 deaths annually worldwide and leave millions more with permanent injuries: some 95% of these deaths are in low- and middle-income countries. Burn injury risk is strongly associated with low-income and informal (or slum) settlements, which are growing rapidly in an urbanising world. Fire policy and mitigation strategies in poorer countries are constrained by inadequate data on incidence, impacts, and causes, which is mainly due to a lack of capacity and resources for data collection, analysis, and modelling. As a first step towards overcoming such challenges, this project reviewed the literature on the subject to assess the potential of a range of methods and tools for identifying, assessing, and addressing fire risk in low-income and informal settlements; the process was supported by an expert workshop at University College London in May 2016. We suggest that community-based risk and vulnerability assessment methods, which are widely used in disaster risk reduction, could be adapted to urban fire risk assessment, and could be enhanced by advances in crowdsourcing and citizen science for geospatial data creation and collection. To assist urban planners, emergency managers, and community organisations who are working in resource-constrained settings to identify and assess relevant fire risk factors, we also suggest an improved analytical framework based on the Haddon Matrix. PMID:28157149

  2. Lack of association between NAT2 polymorphism and prostate cancer risk: a meta-analysis and trial sequential analysis

    PubMed Central

    Tang, Jingyuan; Xu, Lingyan; Xu, Haoxiang; Li, Ran; Han, Peng; Yang, Haiwei

    2017-01-01

    Previous studies have investigated the association between NAT2 polymorphism and the risk of prostate cancer (PCa). However, the findings from these studies remained inconsistent. Hence, we performed a meta-analysis to provide a more reliable conclusion about such associations. In the present meta-analysis, 13 independent case-control studies were included with a total of 14,469 PCa patients and 10,689 controls. All relevant studies published were searched in the databates PubMed, EMBASE, and Web of Science, till March 1st, 2017. We used the pooled odds ratios (ORs) with 95% confidence intervals (CIs) to evaluate the strength of the association between NAT2*4 allele and susceptibility to PCa. Subgroup analysis was carried out by ethnicity, source of controls and genotyping method. What's more, we also performed trial sequential analysis (TSA) to reduce the risk of type I error and evaluate whether the evidence of the results was firm. Firstly, our results indicated that NAT2*4 allele was not associated with PCa susceptibility (OR = 1.00, 95% CI= 0.95–1.05; P = 0.100). However, after excluding two studies for its heterogeneity and publication bias, no significant relationship was also detected between NAT2*4 allele and the increased risk of PCa, in fixed-effect model (OR = 0.99, 95% CI= 0.94–1.04; P = 0.451). Meanwhile, no significant increased risk of PCa was found in the subgroup analyses by ethnicity, source of controls and genotyping method. Moreover, TSA demonstrated that such association was confirmed in the present study. Therefore, this meta-analysis suggested that no significant association between NAT2 polymorphism and the risk of PCa was found. PMID:28915684

  3. Risk analysis for autonomous underwater vehicle operations in extreme environments.

    PubMed

    Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter

    2010-12-01

    Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
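
    A minimal sketch of the survival-estimation step only: the Kaplan-Meier estimator applied to mission distance, with Greenwood's variance accumulator. The distances and loss/censoring flags are invented; the paper additionally folds in expert-elicited probabilities of loss per fault, which is not reproduced here.

        # Hedged sketch: Kaplan-Meier estimate of AUV survival with mission distance.
        import numpy as np

        # (distance at which the mission ended [km], 1 = vehicle lost, 0 = censored/recovered)
        data = [(12.0, 0), (35.0, 1), (50.0, 0), (80.0, 1), (120.0, 0), (150.0, 0), (200.0, 1)]
        data.sort()

        n_at_risk = len(data)
        s, var_sum = 1.0, 0.0
        for dist, lost in data:
            if lost:
                d, n = 1, n_at_risk
                s *= (1 - d / n)                  # survival drops at each observed loss
                var_sum += d / (n * (n - d))      # Greenwood variance accumulator
                print(f"S({dist:6.1f} km) = {s:.3f}  +/- {s * np.sqrt(var_sum):.3f}")
            n_at_risk -= 1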

  4. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe

    2005-09-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.

  5. TNF -308 G/A Polymorphism and Risk of Acne Vulgaris: A Meta-Analysis

    PubMed Central

    Yang, Jian-Kang; Wu, Wen-Juan; Qi, Jue; He, Li; Zhang, Ya-Ping

    2014-01-01

    Background The -308 G/A polymorphism in the tumor necrosis factor (TNF) gene has been implicated in the risk of acne vulgaris, but the results are inconclusive. The present meta-analysis aimed to investigate the overall association between the -308 G/A polymorphism and acne vulgaris risk. Methods We searched in Pubmed, Embase, Web of Science and CNKI for studies evaluating the association between the -308 G/A gene polymorphism and acne vulgaris risk. Data were extracted and statistical analysis was performed using STATA 12.0 software. Results A total of five publications involving 1553 subjects (728 acne vulgaris cases and 825 controls) were included in this meta-analysis. Combined analysis revealed a significant association between this polymorphism and acne vulgaris risk under recessive model (OR = 2.73, 95% CI: 1.37–5.44, p = 0.004 for AA vs. AG + GG). Subgroup analysis by ethnicity showed that the acne vulgaris risk associated with the -308 G/A gene polymorphism was significantly elevated among Caucasians under recessive model (OR = 2.34, 95% CI: 1.13–4.86, p = 0.023). Conclusion This meta-analysis suggests that the -308 G/A polymorphism in the TNF gene contributes to acne vulgaris risk, especially in Caucasian populations. Further studies among different ethnicity populations are needed to validate these findings. PMID:24498378

  6. A ubiquitous method for street scale spatial data collection and analysis in challenging urban environments: mapping health risks using spatial video in Haiti.

    PubMed

    Curtis, Andrew; Blackburn, Jason K; Widmer, Jocelyn M; Morris, J Glenn

    2013-04-15

    Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and designing intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using a spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach with three health risks mapped at the street scale for a coastal community in Haiti. Spatial video was used to collect street and building scale information, including standing water, trash accumulation, presence of dogs, cohort specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices will show the utility of the method. In addition schools offer potential locations for cholera education interventions. Previously unavailable fine scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location specific risks within these "hotspots". Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. The process is rapid and can be repeated in study sites through time to track spatio

  7. New method for assessing risks of email

    NASA Astrophysics Data System (ADS)

    Raja, Seyyed H.; Afrooz, Farzad

    2013-03-01

    E-mail technology has become one of the requirements of human life for correspondence between individuals. Given this, the important point is that the messages, the e-mail servers and clients, and the correspondence exchanged between different people must have acceptable security, so that people can confidently use this technology. In the information age, many financial and non-financial transactions are done electronically and data exchange takes place via the internet, so theft and manipulation of data can incur exorbitant costs in terms of integrity and in financial, political, economic and cultural terms. E-mail correspondence is no exception, and it is very important. In the review that took place, no method focusing on the e-mail system for risk assessment was found. We examine assessment methods for other systems and their strengths and weaknesses, and then we use Convery's method, which was developed for assessing network risks, to assess e-mail risks. At the end of the paper we offer a dedicated table for e-mail risk assessment.

  8. The influence of the free space environment on the superlight-weight thermal protection system: conception, methods, and risk analysis

    NASA Astrophysics Data System (ADS)

    Yatsenko, Vitaliy; Falchenko, Iurii; Fedorchuk, Viktor; Petrushynets, Lidiia

    2016-07-01

    This report focuses on the results of the EU project "Superlight-weight thermal protection system for space application (LIGHT-TPS)". The bottom line is an analysis of the influence of the free space environment on the superlight-weight thermal protection system (TPS). The report focuses on new methods based on the following models: synergetic, physical, and computational, and concentrates on four approaches. The first concerns the synergetic approach. The synergetic approach to the solution of problems of self-controlled synthesis of structures and creation of self-organizing technologies is considered in connection with the super-problem of creating materials with new functional properties. Synergetics methods and mathematical design are considered with respect to actual problems of materials science. The second approach describes how optimization methods can be used to determine material microstructures with optimized or targeted properties. This technique enables one to find unexpected microstructures with exotic behavior (e.g., negative thermal expansion coefficients). The third approach concerns the dynamic probabilistic risk analysis of TPS elements with complex characterizations of damage, using a physical model of the TPS system and a predictable level of ionizing radiation and space weather. Focus is given mainly to the TPS model, mathematical models for dynamic probabilistic risk assessment, and software for the modeling and prediction of the influence of the free space environment. The probabilistic risk assessment method for the TPS is presented considering some deterministic and stochastic factors. The last approach concerns the results of experimental research on the temperature distribution on the surface of a honeycomb sandwich panel of size 150 x 150 x 20 mm during diffusion welding in vacuum. Equipment which provides alignment of temperature fields in a product for the formation of equal strength of welded joints is

  9. Developing and validating risk prediction models in an individual participant data meta-analysis

    PubMed Central

    2014-01-01

    Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
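
    A compact sketch of the 'internal-external cross-validation' idea highlighted above: develop the risk model on all studies but one, validate on the omitted study, and rotate. The data are synthetic, and scikit-learn's logistic regression simply stands in for whatever risk model a given IPD meta-analysis actually uses.

        # Hedged sketch: internal-external cross-validation across IPD studies.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(7)

        # Synthetic IPD: 5 studies with different baseline risks (intercepts)
        studies = {}
        for k, intercept in enumerate([-2.0, -1.5, -1.0, -2.5, -1.8]):
            X = rng.normal(size=(400, 3))
            p = 1 / (1 + np.exp(-(intercept + X @ np.array([0.8, -0.5, 0.3]))))
            studies[k] = (X, rng.binomial(1, p))

        for left_out in studies:
            X_train = np.vstack([studies[k][0] for k in studies if k != left_out])
            y_train = np.concatenate([studies[k][1] for k in studies if k != left_out])
            model = LogisticRegression().fit(X_train, y_train)
            X_test, y_test = studies[left_out]
            auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
            print(f"validated on study {left_out}: AUC = {auc:.3f}")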

  10. Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin

    EPA Science Inventory

    We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...

  11. Extractive waste management: A risk analysis approach.

    PubMed

    Mehta, Neha; Dino, Giovanna Antonella; Ajmone-Marsan, Franco; Lasagna, Manuela; Romè, Chiara; De Luca, Domenico Antonio

    2018-05-01

    Abandoned mine sites continue to present serious environmental hazards because the heavy metals associated with extractive waste are continuously released into the environment, where they threaten human life and the environment. Remediating and securing extractive waste are complex, lengthy and costly processes. Thus, in most European countries, a site is considered for intervention when it poses a risk to human health and the surrounding environment. As a consequence, risk analysis presents a viable decisional approach towards the management of extractive waste. To evaluate the effects posed by extractive waste to human health and groundwater, a risk analysis approach was used for an abandoned nickel extraction site in Campello Monti in North Italy. This site is located in the Southern Italian Alps. The area consists of large and voluminous mafic rocks intruded by mantle peridotite. The mining activities in this area have generated extractive waste. A risk analysis of the site was performed using Risk Based Corrective Action (RBCA) guidelines, considering the properties of extractive waste and water for the properties of environmental matrices. The results showed the presence of carcinogenic risk due to arsenic and risks to groundwater due to nickel. The results of the risk analysis form a basic understanding of the current situation at the site, which is affected by extractive waste. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Gis-Based Multi-Criteria Decision Analysis for Forest Fire Risk Mapping

    NASA Astrophysics Data System (ADS)

    Akay, A. E.; Erdoğan, A.

    2017-11-01

    The forested areas along the coastal zone of the Mediterranean region in Turkey are classified as first-degree fire sensitive areas. Forest fires are a major environmental disaster that affects the sustainability of forest ecosystems. Besides, forest fires result in important economic losses and even threaten human lives. Thus, it is critical to determine the forested areas with fire risks and thereby minimize the damage to forest resources by taking the necessary precautionary measures in these areas. The risk of forest fire can be assessed based on various factors such as forest vegetation structure (tree species, crown closure, tree stage), topographic features (slope and aspect), and climatic parameters (temperature, wind). In this study, a GIS-based Multi-Criteria Decision Analysis (MCDA) method was used to generate a forest fire risk map. The study was implemented in the forested areas within the Yayla Forest Enterprise Chiefs at Dursunbey Forest Enterprise Directorate, which is classified as a first degree fire sensitive area. In the solution process, the "extAhp 2.0" plug-in running the Analytic Hierarchy Process (AHP) method in ArcGIS 10.4.1 was used to categorize the study area under five fire risk classes: extreme risk, high risk, moderate risk, and low risk. The results indicated that 23.81% of the area was of extreme risk, while 25.81% was of high risk. The results also indicated that the most effective criterion was tree species, followed by tree stage, while aspect was the least effective criterion on forest fire risk. It was revealed that GIS techniques integrated with MCDA methods are effective tools to quickly estimate forest fire risk at low cost. The integration of these factors into GIS can be very useful for determining forested areas with high fire risk and also for planning forestry management after fire.
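
    The AHP step inside such a GIS-based MCDA derives criterion weights from a pairwise comparison matrix, typically via its principal eigenvector, and checks a consistency ratio. The comparison judgments below are invented (chosen only so that tree species dominates and aspect matters least, in line with the study's finding), not the study's actual judgments.

        # Hedged sketch: AHP criterion weights from a pairwise comparison matrix.
        import numpy as np

        criteria = ["tree species", "tree stage", "crown closure", "slope", "aspect"]

        # Saaty-scale pairwise comparisons (A[i, j] = importance of i relative to j), assumed
        A = np.array([
            [1,   2,   3,   5,   7],
            [1/2, 1,   2,   4,   6],
            [1/3, 1/2, 1,   3,   5],
            [1/5, 1/4, 1/3, 1,   3],
            [1/7, 1/6, 1/5, 1/3, 1],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        # Consistency ratio (random index RI = 1.12 for a 5x5 matrix)
        ci = (eigvals[k].real - len(A)) / (len(A) - 1)
        cr = ci / 1.12

        for name, w in zip(criteria, weights):
            print(f"{name:14s} weight = {w:.3f}")
        print(f"consistency ratio = {cr:.3f} (should be below ~0.10)")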

  13. Issues in benchmarking human reliability analysis methods : a literature review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  14. Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  15. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  16. Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems

    NASA Technical Reports Server (NTRS)

    Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks to the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
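
    A toy sketch of the decomposition step: a component-by-failure-mode count matrix is centered and decomposed with PCA (via SVD), giving a low-dimensional representation in which similar components and failure modes can be compared. The components, failure modes, and counts below are placeholders, not data from the NASA/NTSB reports.

        # Hedged sketch: PCA (via SVD) of a component-by-failure-mode count matrix.
        import numpy as np

        components = ["rotor", "gearbox", "hydraulics", "electrical", "avionics"]
        # Rows: components; columns: failure modes (e.g. fatigue, wear, leak, short) -- assumed
        M = np.array([
            [8, 2, 0, 0],
            [6, 5, 1, 0],
            [1, 3, 7, 0],
            [0, 0, 1, 6],
            [0, 1, 0, 5],
        ], dtype=float)

        Mc = M - M.mean(axis=0)                 # center columns
        U, S, Vt = np.linalg.svd(Mc, full_matrices=False)
        scores = U[:, :2] * S[:2]               # 2-D principal-component scores

        explained = (S**2 / (S**2).sum())[:2].sum()
        print(f"variance captured by 2 components: {explained:.1%}")
        for name, (pc1, pc2) in zip(components, scores):
            print(f"{name:11s} PC1 = {pc1:6.2f}  PC2 = {pc2:6.2f}")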

  17. Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations

    DTIC Science & Technology

    2013-03-01

    TITLE: Image Based Biomarker of Breast Cancer Risk: Analysis of Risk Disparity among Minority Populations. PRINCIPAL INVESTIGATOR: Fengshan Liu. ... identifying the prevalence of women with incomplete visualization of the breast. We developed a code to estimate the breast cancer risks using the

  18. Working session 5: Operational aspects and risk analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cizelj, L.; Donoghue, J.

    1997-02-01

    A general observation is that both operational aspects and risk analysis cannot be adequately discussed without information presented in other sessions. Some overlap of conclusions and recommendations is therefore to be expected. Further, it was assumed that recommendations concerning improvements in some related topics were generated by other sessions and are not repeated here. These include: (1) Knowledge on degradation mechanisms (initiation, progression, and failure). (2) Modeling of degradation (initiation, progression, and failure). (3) Capabilities of NDE methods. (4) Preventive maintenance and repair. One should note here, however, that all of these directly affect both operational and risk aspects of affected plants. A list of conclusions and recommendations is based on available presentations and discussions addressing risk and operational experience. The authors aimed at reaching as broad a consensus as possible. It should be noted here that there is no strict delineation between operational and safety aspects of degradation of steam generator tubes. This is caused by different risk perceptions in different countries/societies. The conclusions and recommendations were divided into four broad groups: human reliability; leakage monitoring; risk impact; and consequence assessment.

  19. Categorizing accident sequences in the external radiotherapy for risk analysis

    PubMed Central

    2013-01-01

    Purpose This study identifies accident sequences from past accidents in order to help the application of risk analysis to external radiotherapy. Materials and Methods This study reviews 59 accidental cases in two retrospective safety analyses that have collected incidents in external radiotherapy extensively. The two accident analysis reports are investigated to identify accident sequences, including initiating events, failure of safety measures, and consequences. This study classifies the accidents by treatment stage and source of error for initiating events, by type of failure in the safety measures, and by type of undesirable consequence and the number of affected patients. Then, the accident sequences are grouped into several categories on the basis of similarity of progression. As a result, these cases can be categorized into 14 groups of accident sequences. Results The results indicate that risk analysis needs to pay attention not only to the planning stage, but also to the calibration stage that is carried out prior to the main treatment process. They also show that human error is the largest contributor to initiating events as well as to the failure of safety measures. This study also illustrates an event tree analysis for an accident sequence initiated in calibration. Conclusion This study is expected to provide insights into accident sequences for prospective risk analysis through the review of experiences. PMID:23865005

  20. A stable systemic risk ranking in China's banking sector: Based on principal component analysis

    NASA Astrophysics Data System (ADS)

    Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing

    2018-02-01

    In this paper, we compare five popular systemic risk rankings and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to the factor loadings of the first component, the PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical as fundamentals-based ones. The PCA combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and helps banks to prepare for and cope with financial crises in advance.
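
    The combination step can be illustrated as follows; the bank-level risk measures are simulated stand-ins rather than the paper's data, and the sign of the first component would need to be oriented so that a larger score means higher systemic risk.

```python
# Illustrative sketch (hypothetical data): combining five systemic risk
# rankings into one score via the first principal component.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows: banks; columns: five different systemic risk measures (simulated).
risk_measures = rng.normal(size=(16, 5))

Z = StandardScaler().fit_transform(risk_measures)
pc1 = PCA(n_components=1).fit_transform(Z).ravel()

# Banks ordered by the combined (first-component) systemic risk score;
# check the component's sign so that larger values mean riskier banks.
combined_ranking = np.argsort(-pc1)
print(combined_ranking)
```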

  1. Three methods for integration of environmental risk into the benefit-risk assessment of veterinary medicinal products.

    PubMed

    Chapman, Jennifer L; Porsch, Lucas; Vidaurre, Rodrigo; Backhaus, Thomas; Sinclair, Chris; Jones, Glyn; Boxall, Alistair B A

    2017-12-15

    Veterinary medicinal products (VMPs) require, as part of the European Union (EU) authorization process, consideration of both risks and benefits. Uses of VMPs have multiple risks (e.g., risks to the animal being treated, to the person administering the VMP) including risks to the environment. Environmental risks are not directly comparable to therapeutic benefits; there is no standardized approach to compare both environmental risks and therapeutic benefits. We have developed three methods for communicating and comparing therapeutic benefits and environmental risks for the benefit-risk assessment that supports the EU authorization process. Two of these methods support independent product evaluation (i.e., a summative classification and a visual scoring matrix classification); the other supports a comparative evaluation between alternative products (i.e., a comparative classification). The methods and the challenges to implementing a benefit-risk assessment including environmental risk are presented herein; how these concepts would work in current policy is discussed. Adaptability to scientific and policy development is considered. This work is an initial step in the development of a standardized methodology for integrated decision-making for VMPs. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2010-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental level. A multifactor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled with finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also about 0.0001.
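
    The abstract does not spell out the simulation scheme, so the following is only a generic Monte Carlo reliability sketch that reproduces a failure probability on the order of 0.0001 from assumed load and capacity distributions; it is not the NASA method itself.

```python
# Generic Monte Carlo reliability sketch (assumed distributions, not the
# NASA simulation scheme): estimate a failure probability of roughly 1e-4
# by sampling a load and a capacity and counting load > capacity events.
import numpy as np

rng = np.random.default_rng(1)
n = 2_000_000
capacity = rng.normal(loc=100.0, scale=5.0, size=n)   # hypothetical strength
load = rng.normal(loc=65.0, scale=8.0, size=n)        # hypothetical demand

p_failure = np.mean(load > capacity)
print(f"estimated risk of failure: {p_failure:.1e}")
```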

  3. Probabilistic Methods for Structural Reliability and Risk

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.

    2008-01-01

    A probabilistic method is used to evaluate the structural reliability and risk of select metallic and composite structures. The method is multiscale and multifunctional, and is based at the most elemental level. A multi-factor interaction model is used to describe the material properties, which are subsequently evaluated probabilistically. The metallic structure is a two-rotor aircraft engine, while the composite structures consist of laminated plies (multiscale) and the properties of each ply are the multifunctional representation. The structural component is modeled with finite elements. The solution method for structural responses is an updated simulation scheme. The results show that the risk for the two-rotor engine is about 0.0001 and that for the composite built-up structure is also about 0.0001.

  4. Failure analysis in the identification of synergies between cleaning monitoring methods.

    PubMed

    Whiteley, Greg S; Derry, Chris; Glasbey, Trevor

    2015-02-01

    The 4 monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring failure risk arising through each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these 4 cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any 1 can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
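
    A generic FMEA-style ranking by Risk Priority Number can illustrate the approach; the failure descriptions and scores below are invented for the sketch and are not the study's data.

```python
# Generic FMEA sketch (hypothetical scores, not the study's data): rank
# failure modes of cleaning monitoring methods by Risk Priority Number,
# RPN = severity x occurrence x detectability (each scored 1-10).
failure_modes = [
    # (method, failure mode, severity, occurrence, detectability)
    ("visual inspection", "soil invisible to naked eye", 7, 8, 7),
    ("ATP bioluminometry", "non-microbial ATP inflates reading", 5, 6, 4),
    ("fluorescent marker", "marker removed without full cleaning", 6, 5, 5),
    ("microbial recovery", "delayed result, slow feedback", 6, 7, 3),
]

ranked = sorted(
    ((m, f, s * o * d) for m, f, s, o, d in failure_modes),
    key=lambda row: row[2],
    reverse=True,
)
for method, mode, rpn in ranked:
    print(f"{rpn:4d}  {method:20s}  {mode}")
```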

  5. PRA and Risk Informed Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernsen, Sidney A.; Simonen, Fredric A.; Balkey, Kenneth R.

    2006-01-01

    The Boiler and Pressure Vessel Code (BPVC) of the American Society of Mechanical Engineers (ASME) has introduced a risk based approach into Section XI that covers Rules for Inservice Inspection of Nuclear Power Plant Components. The risk based approach requires application of probabilistic risk assessment (PRA). Because no industry consensus standard existed for PRAs, ASME has developed a standard to evaluate the quality level of an available PRA needed to support a given risk based application. The paper describes the PRA standard, Section XI application of PRAs, and plans for broader applications of PRAs to other ASME nuclear codes and standards. The paper addresses several specific topics of interest to Section XI. Important considerations are the special methods (surrogate components) used to overcome the lack of PRA treatment of passive components. The approach allows calculation of conditional core damage probabilities both for component failures that cause initiating events and for failures in standby systems that decrease the availability of these systems. The paper relates the explicit risk based methods of the new Section XI code cases to the implicit consideration of risk used in the development of Section XI. Other topics include the needed interactions of ISI engineers, plant operating staff, PRA specialists, and members of expert panels that review the risk based programs.

  6. Comparison of cluster-based and source-attribution methods for estimating transmission risk using large HIV sequence databases.

    PubMed

    Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M

    2018-06-01

    Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support, and the statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and thereby quantify aspects of disease transmission. A simulation study was conducted to assess error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men who have sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than the source attribution method for identifying transmission risk factors, but neither method provides robust estimates of transmission risk ratios. The source attribution method can alleviate the drawbacks of phylogenetic clustering, but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Balancing precision and risk: should multiple detection methods be analyzed separately in N-mixture models?

    USGS Publications Warehouse

    Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.

    2012-01-01

    Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against

  8. Relative risk analysis of several manufactured nanomaterials: an insurance industry context.

    PubMed

    Robichaud, Christine Ogilvie; Tanzil, Dicksen; Weilenmann, Ulrich; Wiesner, Mark R

    2005-11-15

    A relative risk assessment is presented for the industrial fabrication of several nanomaterials. The production processes for five nanomaterials were selected for this analysis, based on their current or near-term potential for large-scale production and commercialization: single-walled carbon nanotubes, buckyballs (C60), one variety of quantum dots, alumoxane nanoparticles, and nano-titanium dioxide. The assessment focused on the activities surrounding the fabrication of nanomaterials, exclusive of any impacts or risks associated with the nanomaterials themselves. A representative synthesis method was selected for each nanomaterial based on its potential for scaleup. A list of input materials, output materials, and waste streams for each step of fabrication was developed and entered into a database that included key process characteristics such as temperature and pressure. The physical-chemical properties and quantities of the inventoried materials were used to assess relative risk based on factors such as volatility, carcinogenicity, flammability, toxicity, and persistence. These factors were first used to qualitatively rank risk, then combined using an actuarial protocol developed by the insurance industry for the purpose of calculating insurance premiums for chemical manufacturers. This protocol ranks three categories of risk relative to a 100 point scale (where 100 represents maximum risk): incident risk, normal operations risk, and latent contamination risk. Results from this analysis determined that the relative environmental risk from manufacturing each of these five materials was comparatively low in relation to other common industrial manufacturing processes.
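
    The actuarial protocol itself is not reproduced in the abstract, so the following is only a hedged sketch of a weighted 0-100 scoring over the listed factors; the weights and factor scores are assumptions made for illustration.

```python
# Hedged sketch (hypothetical weights and scores, not the insurers' protocol):
# combine qualitative hazard factors into a 0-100 relative risk score for
# each fabrication process.
factors = ["volatility", "carcinogenicity", "flammability", "toxicity", "persistence"]
weights = {"volatility": 0.15, "carcinogenicity": 0.30, "flammability": 0.15,
           "toxicity": 0.25, "persistence": 0.15}          # assumed weights, sum to 1

# Factor scores on a 0-10 scale for each process (illustrative values only).
processes = {
    "single-walled carbon nanotubes": {"volatility": 6, "carcinogenicity": 4, "flammability": 7, "toxicity": 5, "persistence": 6},
    "C60 (buckyballs)":               {"volatility": 5, "carcinogenicity": 3, "flammability": 6, "toxicity": 4, "persistence": 5},
    "nano-titanium dioxide":          {"volatility": 2, "carcinogenicity": 2, "flammability": 1, "toxicity": 3, "persistence": 4},
}

for name, scores in processes.items():
    total = 10 * sum(weights[f] * scores[f] for f in factors)   # scale to 0-100
    print(f"{name:32s} relative risk score: {total:5.1f} / 100")
```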

  9. Method of self-harm in adolescents and young adults and risk of subsequent suicide.

    PubMed

    Beckman, Karin; Mittendorfer-Rutz, Ellenor; Waern, Margda; Larsson, Henrik; Runeson, Bo; Dahlin, Marie

    2018-03-05

    Self-harm is common in youth and an important risk factor for suicide. Certain self-harm methods might indicate a higher risk of suicide. The main aim of this study was to determine whether some methods of self-harm in adolescents (10-17 years) and young adults (18-24 years) are associated with a particularly high risk of suicide. A secondary aim was to ascertain how different self-harm methods might affect the probability of psychiatric follow-up. Five Swedish registers were linked in a national population-based cohort study. All nonfatal self-harm events recorded in specialist health care, excluding psychiatry and primary care services, among 10-24 year olds between 2000 and 2009 were included. Methods were classified as poisoning, cutting/piercing, violent method (gassing, hanging, strangulation/suffocation, drowning, jumping and firearms), other and multiple methods. Hazard Ratios (HR) for suicide were calculated in Cox regression models for each method with poisoning as the reference. Odds Ratios (OR) for psychiatric inpatient care were determined in logistic regression models. Analyses were adjusted for important covariates and stratified by age group and treatment setting (inpatient/outpatient). Among adolescents with initial medical hospitalisation, use of a violent method was associated with a near eightfold increase in HR for suicide compared to self-poisoning in the adjusted analysis [HR 7.8; 95% confidence interval (CI) 3.2-19.0]. Among hospitalised young adult women, adjusted HRs were elevated fourfold for both cutting [4.0 (1.9-8.8)] and violent methods [3.9 (1.5-10.6)]. Method of self-harm did not affect suicide risk in young adult men. Adolescents using violent methods had an increased probability of psychiatric inpatient care following initial treatment for self-harm. Violent self-harm requiring medical hospitalisation may signal particularly high risk of future suicide in adolescents (both sexes) and in young adult women. For the latter group

  10. An Updated Meta-Analysis of Risk of Multiple Sclerosis following Infectious Mononucleosis

    PubMed Central

    Handel, Adam E.; Williamson, Alexander J.; Disanto, Giulio; Handunnetthi, Lahiru; Giovannoni, Gavin; Ramagopalan, Sreeram V.

    2010-01-01

    Background Multiple sclerosis (MS) appears to develop in genetically susceptible individuals as a result of environmental exposures. Epstein-Barr virus (EBV) infection is an almost universal finding among individuals with MS. Symptomatic EBV infection as manifested by infectious mononucleosis (IM) has been shown in a previous meta-analysis to be associated with the risk of MS; however, a number of much larger studies have since been published. Methods/Principal Findings We performed a Medline search to identify articles published since the original meta-analysis investigating MS risk following IM. A total of 18 articles were included in this study, including 19390 MS patients and 16007 controls. We calculated the relative risk of MS following IM using a generic inverse variance with random effects model. This showed that the risk of MS was strongly associated with IM (relative risk (RR) 2.17; 95% confidence interval 1.97–2.39; p < 10^-54). Discussion Our results establish firmly that a history of infectious mononucleosis significantly increases the risk of multiple sclerosis. Future work should focus on the mechanism of this association and interaction with other risk factors. PMID:20824132
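
    The pooling step can be sketched as a generic inverse-variance combination under a DerSimonian-Laird random-effects model; the per-study estimates below are toy values, not the meta-analysis data.

```python
# Minimal sketch of a generic inverse-variance pooled relative risk with a
# DerSimonian-Laird random-effects model (toy study data for illustration).
import numpy as np

log_rr = np.log(np.array([2.0, 2.4, 1.9, 2.3]))     # per-study log relative risks
se = np.array([0.10, 0.15, 0.12, 0.20])             # standard errors of the log RRs

w = 1.0 / se**2                                      # fixed-effect (inverse-variance) weights
mu_fixed = np.sum(w * log_rr) / np.sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2.
q = np.sum(w * (log_rr - mu_fixed) ** 2)
df = len(log_rr) - 1
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - df) / c)

w_re = 1.0 / (se**2 + tau2)                          # random-effects weights
mu_re = np.sum(w_re * log_rr) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

rr = np.exp(mu_re)
ci = np.exp([mu_re - 1.96 * se_re, mu_re + 1.96 * se_re])
print(f"pooled RR = {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
```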

  11. Use of labour induction and risk of cesarean delivery: a systematic review and meta-analysis

    PubMed Central

    Mishanina, Ekaterina; Rogozinska, Ewelina; Thatthi, Tej; Uddin-Khan, Rehan; Khan, Khalid S.; Meads, Catherine

    2014-01-01

    Background: Induction of labour is common, and cesarean delivery is regarded as its major complication. We conducted a systematic review and meta-analysis to investigate whether the risk of cesarean delivery is higher or lower following labour induction compared with expectant management. Methods: We searched 6 electronic databases for relevant articles published through April 2012 to identify randomized controlled trials (RCTs) in which labour induction was compared with placebo or expectant management among women with a viable singleton pregnancy. We assessed risk of bias and obtained data on rates of cesarean delivery. We used regression analysis techniques to explore the effect of patient characteristics, induction methods and study quality on risk of cesarean delivery. Results: We identified 157 eligible RCTs (n = 31 085). Overall, the risk of cesarean delivery was 12% lower with labour induction than with expectant management (pooled relative risk [RR] 0.88, 95% confidence interval [CI] 0.84–0.93; I2 = 0%). The effect was significant in term and post-term gestations but not in preterm gestations. Meta-regression analysis showed that initial cervical score, indication for induction and method of induction did not alter the main result. There was a reduced risk of fetal death (RR 0.50, 95% CI 0.25–0.99; I2 = 0%) and admission to a neonatal intensive care unit (RR 0.86, 95% CI 0.79–0.94), and no impact on maternal death (RR 1.00, 95% CI 0.10–9.57; I2 = 0%) with labour induction. Interpretation: The risk of cesarean delivery was lower among women whose labour was induced than among those managed expectantly in term and post-term gestations. There were benefits for the fetus and no increased risk of maternal death. PMID:24778358

  12. 49 CFR 260.17 - Credit risk premium analysis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    49 CFR § 260.17 Credit risk premium analysis. (a) When Federal appropriations are not available to cover the total subsidy cost, the Administrator will determine the Credit Risk Premium...

  13. Different type 2 diabetes risk assessments predict dissimilar numbers at 'high risk': a retrospective analysis of diabetes risk-assessment tools.

    PubMed

    Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W

    2015-12-01

    Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes(®), Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at 'high risk' followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. © British Journal of General Practice 2015.

  14. Assessing risk of draft survey by AHP method

    NASA Astrophysics Data System (ADS)

    Xu, Guangcheng; Zhao, Kuimin; Zuo, Zhaoying; Liu, Gang; Jian, Binguo; Lin, Yan; Fan, Yukun; Wang, Fei

    2018-04-01

    The paper assesses the risks to a vessel floating in seawater during a draft survey by using the analytic hierarchy process. On this basis, the paper establishes a draft survey risk index covering draft reading, ballast water, fresh water, the calculation process and so on. The paper then demonstrates the risk assessment method on one concrete example.
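
    An illustrative AHP step with a hypothetical pairwise comparison matrix over the four factor groups named above; the judgements are invented for the sketch, not taken from the paper.

```python
# Illustrative AHP sketch (hypothetical pairwise judgements): derive
# risk-factor weights from a pairwise comparison matrix and check consistency.
import numpy as np

factors = ["draft reading", "ballast water", "fresh water", "calculation process"]
# A[i, j] = how much more important factor i is than factor j (Saaty 1-9 scale).
A = np.array([
    [1,   3,   4,   2],
    [1/3, 1,   2,   1/2],
    [1/4, 1/2, 1,   1/3],
    [1/2, 2,   3,   1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                    # priority weights (principal eigenvector)

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)        # consistency index
cr = ci / 0.90                              # random index for n = 4 is 0.90
print(dict(zip(factors, np.round(weights, 3))), f"CR = {cr:.3f}")
```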

  15. Extensions to decision curve analysis, a novel method for evaluating diagnostic tests, prediction models and molecular markers

    PubMed Central

    Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat

    2008-01-01

    Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
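
    The core net-benefit calculation behind a decision curve can be sketched as follows, on simulated outcomes and predicted probabilities rather than a real data set.

```python
# Minimal decision-curve sketch (simulated data): net benefit of a prediction
# model versus "treat all" over a range of threshold probabilities.
import numpy as np

rng = np.random.default_rng(2)
n = 5000
y = rng.binomial(1, 0.2, size=n)                       # simulated outcomes
p = np.clip(0.2 + 0.25 * (y - 0.2) + rng.normal(0, 0.1, n), 0.01, 0.99)  # toy predicted probabilities

for pt in (0.05, 0.10, 0.20, 0.30):
    treat = p >= pt
    tp = np.mean(treat & (y == 1))                     # true-positive rate (per patient)
    fp = np.mean(treat & (y == 0))                     # false-positive rate (per patient)
    nb_model = tp - fp * pt / (1 - pt)                 # net benefit of the model
    nb_all = np.mean(y == 1) - np.mean(y == 0) * pt / (1 - pt)  # treat-all strategy
    print(f"threshold {pt:.2f}: model {nb_model:.3f}, treat all {nb_all:.3f}")
```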

  16. Insomnia and the risk of depression: a meta-analysis of prospective cohort studies.

    PubMed

    Li, Liqing; Wu, Chunmei; Gan, Yong; Qu, Xianguo; Lu, Zuxun

    2016-11-05

    Observational studies suggest that insomnia might be associated with an increased risk of depression, with inconsistent results. This study aimed at conducting a meta-analysis of prospective cohort studies to evaluate the association between insomnia and the risk of depression. Relevant cohort studies were comprehensively searched from the PubMed, Embase, Web of Science, and China National Knowledge Infrastructure databases (up to October 2014) and from the reference lists of retrieved articles. A random-effects model was used to calculate the pooled risk estimates and 95% confidence intervals (CIs). The I² statistic was used to assess the heterogeneity, and potential sources of heterogeneity were assessed with meta-regression. The potential publication bias was explored by using funnel plots, Egger's test, and the Duval and Tweedie trim-and-fill method. Thirty-four cohort studies involving 172,077 participants were included in this meta-analysis, with an average follow-up period of 60.4 months (ranging from 3.5 to 408). Statistical analysis suggested a positive relationship between insomnia and depression; the pooled RR was 2.27 (95% CI: 1.89-2.71), and high heterogeneity was observed (I² = 92.6%, P < 0.001). Visual inspection of the funnel plot revealed some asymmetry. The Egger's test identified evidence of substantial publication bias (P < 0.05), but correction for this bias using the trim-and-fill method did not alter the combined risk estimates. This meta-analysis indicates that insomnia is significantly associated with an increased risk of depression, which has implications for the prevention of depression in non-depressed individuals with insomnia symptoms.

  17. Method and apparatus for assessing cardiovascular risk

    NASA Technical Reports Server (NTRS)

    Albrecht, Paul (Inventor); Bigger, J. Thomas (Inventor); Cohen, Richard J. (Inventor)

    1998-01-01

    The method for assessing risk of an adverse clinical event includes detecting a physiologic signal in the subject and determining from the physiologic signal a sequence of intervals corresponding to time intervals between heart beats. The long-time structure of fluctuations in the intervals over a time period of more than fifteen minutes is analyzed to assess risk of an adverse clinical event. In a preferred embodiment, the physiologic signal is an electrocardiogram and the time period is at least fifteen minutes. A preferred method for analyzing the long-time structure variability in the intervals includes computing the power spectrum and fitting the power spectrum to a power law dependence on frequency over a selected frequency range such as 10^-4 to 10^-2 Hz. Characteristics of the long-time structure of fluctuations in the intervals are used to assess risk of an adverse clinical event.
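
    A hedged sketch of the spectral step, assuming a synthetic, evenly resampled interbeat-interval series; this is an illustration of the power-law fit only, not the patented implementation.

```python
# Hedged sketch (synthetic interval series): estimate the power-law exponent
# of interbeat-interval fluctuations by fitting log-power against
# log-frequency over 1e-4 to 1e-2 Hz.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(3)
fs = 0.5                                    # evenly resampled interval series [Hz]
n = int(24 * 3600 * fs)                     # 24 hours of samples

# Build a toy 1/f series by shaping white noise in the frequency domain.
spectrum = np.fft.rfft(rng.normal(size=n))
freqs = np.fft.rfftfreq(n, d=1 / fs)
spectrum[1:] /= np.sqrt(freqs[1:])          # amplitude ~ f^-1/2  =>  power ~ 1/f
spectrum[0] = 0.0
fluct = np.fft.irfft(spectrum, n)
rr = 0.8 + 0.05 * fluct / fluct.std()       # seconds, around a mean interval of 0.8 s

f, pxx = welch(rr, fs=fs, nperseg=2**15)
band = (f >= 1e-4) & (f <= 1e-2)
slope, _ = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
print(f"estimated power-law exponent beta ~ {-slope:.2f}")
```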

  18. Perceived Risks Associated with Contraceptive Method Use among Men and Women in Ibadan and Kaduna, Nigeria.

    PubMed

    Schwandt, Hilary M; Skinner, Joanna; Hebert, Luciana E; Saad, Abdulmumin

    2015-12-01

    Research shows that side effects are often the most common reason for contraceptive non-use in Nigeria; however, research to date has not explored the underlying factors that influence risk and benefit perceptions associated with specific contraceptive methods in Nigeria. A qualitative study design using focus group discussions was used to explore social attitudes and beliefs about family planning methods in Ibadan and Kaduna, Nigeria. A total of 26 focus group discussions were held in 2010 with men and women of reproductive age, disaggregated by city, sex, age, marital status, neighborhood socioeconomic status, and--for women only--family planning experience. A discussion guide was used that included specific questions about the perceived risks and benefits associated with the use of six different family planning methods. A thematic content analytic approach guided the analysis. Participants identified a spectrum of risks encompassing perceived threats to health (both real and fictitious) and social concerns, as well as benefits associated with each method. By exploring Nigerian perspectives on the risks and benefits associated with specific family planning methods, programs aiming to increase contraceptive use in Nigeria can be better equipped to highlight recognized benefits, address specific concerns, and work to dispel misperceptions associated with each family planning method.

  19. Breast Cancer Risk From Modifiable and Non-Modifiable Risk Factors among Women in Southeast Asia: A Meta-Analysis

    PubMed

    Nindrea, Ricvan Dana; Aryandono, Teguh; Lazuardi, Lutfan

    2017-12-28

    Objective: The aim of this study was to determine breast cancer risk from modifiable and non-modifiable factors among women in Southeast Asia. Methods: This meta-analysis was performed on research articles on breast cancer risk factors in the PubMed, ProQuest and EBSCO databases published between 1997 and October 2017. Pooled odds ratios (OR) were calculated using fixed and random-effect models. Data were processed using Review Manager 5.3 (RevMan 5.3). Results: From a total of 1,211 articles, 15 studies (1 cohort and 14 case-control studies) met the criteria for systematic review. Meta-analysis results showed that of the known modifiable risk factors for breast cancer, parity (nullipara) had the highest odds ratio (OR = 1.85 [95% CI 1.47-2.32]), followed by body mass index (overweight) (OR = 1.61 [95% CI 1.43-1.80]) and use of oral contraceptives (OR = 1.27 [95% CI 1.07-1.51]). Of the non-modifiable risk factors, family history of breast cancer had the highest odds ratio (OR = 2.53 [95% CI 1.25-5.09]), followed by age (≥ 40 years) (OR = 1.53 [95% CI 1.34-1.76]) and menopausal status (OR = 1.44 [95% CI 1.26-1.65]). Conclusion: This analysis confirmed associations between both modifiable risk factors (parity, body mass index and use of oral contraceptives) and non-modifiable risk factors (family history of breast cancer, age and menopausal status) with breast cancer.

  20. Putting problem formulation at the forefront of GMO risk analysis.

    PubMed

    Tepfer, Mark; Racovita, Monica; Craig, Wendy

    2013-01-01

    When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.

  1. Risk factors for baclofen pump infection in children: a multivariate analysis.

    PubMed

    Spader, Heather S; Bollo, Robert J; Bowers, Christian A; Riva-Cambrin, Jay

    2016-06-01

    OBJECTIVE Intrathecal baclofen infusion systems to manage severe spasticity and dystonia are associated with higher infection rates in children than in adults. Factors unique to this population, such as poor nutrition and physical limitations for pump placement, have been hypothesized as the reasons for this disparity. The authors assessed potential risk factors for infection in a multivariate analysis. METHODS Patients who underwent implantation of a programmable pump and intrathecal catheter for baclofen infusion at a single center between January 1, 2000, and March 1, 2012, were identified in this retrospective cohort study. The primary end point was infection. Potential risk factors investigated included preoperative (i.e., demographics, body mass index [BMI], gastrostomy tube, tracheostomy, previous spinal fusion), intraoperative (i.e., surgeon, antibiotics, pump size, catheter location), and postoperative (i.e., wound dehiscence, CSF leak, and number of revisions) factors. Univariate analysis was performed, and a multivariate logistic regression model was created to identify independent risk factors for infection. RESULTS A total of 254 patients were evaluated. The overall infection rate was 9.8%. Univariate analysis identified young age, shorter height, lower weight, dehiscence, CSF leak, and number of revisions within 6 months of pump placement as significantly associated with infection. Multivariate analysis identified young age, dehiscence, and number of revisions as independent risk factors for infection. CONCLUSIONS Young age, wound dehiscence, and number of revisions were independent risk factors for infection in this pediatric cohort. A low BMI and the presence of either a gastrostomy or tracheostomy were not associated with infection and may not be contraindications for this procedure.
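
    A minimal sketch of the multivariate step on a synthetic cohort; the covariate distributions and coefficients are invented, and only the variable names follow the abstract.

```python
# Minimal sketch (synthetic cohort, not the study's records): multivariate
# logistic regression for infection with age, dehiscence and revision count.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 254
age = rng.uniform(2, 20, n)                    # years (assumed range)
dehiscence = rng.binomial(1, 0.08, n)          # wound dehiscence indicator
revisions = rng.poisson(0.4, n)                # revisions within 6 months

# Simulate infections from an assumed true model, then refit it.
logit = -1.5 - 0.08 * age + 1.5 * dehiscence + 0.6 * revisions
infection = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([age, dehiscence, revisions]))
model = sm.Logit(infection, X).fit(disp=0)
print(np.exp(model.params[1:]))                # adjusted odds ratios
```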

  2. Streamlining project delivery through risk analysis.

    DOT National Transportation Integrated Search

    2015-08-01

    Project delivery is a significant area of concern and is subject to several risks throughout the Plan Development Process (PDP). These risks are attributed to major areas of project development, such as environmental analysis, right-of-way (ROW) acqu...

  3. Insomnia and risk of dementia in older adults: Systematic review and meta-analysis.

    PubMed

    de Almondes, Katie Moraes; Costa, Mônica Vieira; Malloy-Diniz, Leandro Fernandes; Diniz, Breno Satler

    2016-06-01

    There is cross-sectional evidence of an association between sleep disorders and cognitive impairment in older adults. However, there is no consensus based on longitudinal data regarding the increased risk of developing dementia related to insomnia. We conducted a systematic review and meta-analysis to evaluate the risk of incident all-cause dementia in individuals with insomnia in population-based prospective cohort studies. Five studies out of 5,242 retrieved references were included in the meta-analysis. We used the generic inverse variance method with a random effects model to calculate the pooled risk of dementia in older adults with insomnia. We assessed heterogeneity in the meta-analysis by means of the Q-test and I² index. Study quality was assessed with the Newcastle-Ottawa Scale. The results showed that insomnia was associated with a significant risk of all-cause dementia (RR = 1.53, 95% CI 1.07-2.18, z = 2.36, p = 0.02). There was evidence of significant heterogeneity in the analysis (Q-value = 2.4, p < 0.001, I² = 82%). Insomnia is associated with an increased risk for dementia. These results provide evidence that future studies should investigate dementia prevention among elderly individuals through screening and proper management of insomnia. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Spatiotemporal analysis of the agricultural drought risk in Heilongjiang Province, China

    NASA Astrophysics Data System (ADS)

    Pei, Wei; Fu, Qiang; Liu, Dong; Li, Tian-xiao; Cheng, Kun; Cui, Song

    2017-06-01

    Droughts are natural disasters that pose significant threats to agricultural production as well as living conditions, and a spatial-temporal difference analysis of agricultural drought risk can help determine the spatial distribution and temporal variation of the drought risk within a region. Moreover, this type of analysis can provide a theoretical basis for the identification, prevention, and mitigation of drought disasters. In this study, the overall dispersion and local aggregation of projection points were based on research by Friedman and Tukey (IEEE Trans on Computer 23:881-890, 1974). In this work, high-dimensional samples were clustered by cluster analysis. The clustering results were represented by the clustering matrix, which determined the local density in the projection index. This method avoids the problem of determining a cutoff radius. An improved projection pursuit model is proposed that combines cluster analysis and the projection pursuit model, which offer advantages for classification and assessment, respectively. The improved model was applied to analyze the agricultural drought risk of 13 cities in Heilongjiang Province over 6 years (2004, 2006, 2008, 2010, 2012, and 2014). The risk of an agricultural drought disaster was characterized by 14 indicators and the following four aspects: hazard, exposure, sensitivity, and resistance capacity. The spatial distribution and temporal variation characteristics of the agricultural drought risk in Heilongjiang Province were analyzed. The spatial distribution results indicated that Suihua, Qigihar, Daqing, Harbin, and Jiamusi are located in high-risk areas, Daxing'anling and Yichun are located in low-risk areas, and the differences among the regions were primarily caused by the aspects exposure and resistance capacity. The temporal variation results indicated that the risk of agricultural drought in most areas presented an initially increasing and then decreasing trend. A higher value for the exposure

  5. Construction risk assessment of deep foundation pit in metro station based on G-COWA method

    NASA Astrophysics Data System (ADS)

    You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying

    2018-05-01

    In order to gain an accurate understanding of the construction safety of deep foundation pits in metro stations and to reduce the probability and loss of risk occurrence, a risk assessment method based on G-COWA is proposed. Firstly, drawing on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system based on the five factors of "human, management, technology, material and environment" is established. Secondly, the C-OWA operator is introduced to weight the evaluation indices and to weaken the negative influence of subjective expert preference. Grey cluster analysis and the fuzzy comprehensive evaluation method are combined to construct the construction risk assessment model for deep foundation pits, which can effectively handle the uncertainties involved. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station; its construction risk rating is determined to be "medium", showing that the model is feasible and reasonable. Corresponding control measures are then put forward and a useful reference is provided.

  6. Multi-hazard risk analysis for management strategies

    NASA Astrophysics Data System (ADS)

    Kappes, M.; Keiler, M.; Bell, R.; Glade, T.

    2009-04-01

    Risk management very often operates in a reactive way, responding to an event, instead of proactively starting with risk analysis and building up the whole process of risk evaluation, prevention, event management and regeneration. Since damage and losses from natural hazards rise continuously, more and more studies, concepts (e.g. Switzerland or South Tyrol-Bolzano) and software packages (e.g. ARMAGEDOM, HAZUS or RiskScape) are developed to guide, standardize and facilitate risk analysis. But these approaches focus on different aspects and are mostly closely adapted to the situation (legislation, organization of the administration, specific processes etc.) of the specific country or region. In this study we propose the development of a flexible methodology for multi-hazard risk analysis, identifying the stakeholders and their needs, the processes and their characteristics, modeling approaches, as well as incoherencies occurring when combining all these different aspects. Based on this concept a flexible software package will be established, consisting of ArcGIS as the central base complemented by various modules for hazard modeling, vulnerability assessment and risk calculation. Not all modules will be developed from scratch; some will be taken from the current state of the art and connected to or integrated into ArcGIS. For this purpose two study sites, Valtellina in Italy and Barcelonnette in France, were chosen, and the hazard types debris flows, rockfalls, landslides, avalanches and floods are planned to be included in the tool for a regional multi-hazard risk analysis. Since the central idea of this tool is its flexibility, this will only be a first step; in the future further processes and scales can be included and the instrument thus adapted to any study site.

  7. A workshop on developing risk assessment methods for medical use of radioactive material. Volume 1: Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tortorelli, J.P.

    1995-08-01

    A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.

  8. Chemical Mixtures Health Risk Assessment of Environmental Contaminants: Concepts, Methods, Applications

    EPA Science Inventory

    This problems-based, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks from multic...

  9. [Evaluation of the risk related to repetitive work activities: testing of several methods proposed in the literature].

    PubMed

    Capodaglio, E M; Facioli, M; Bazzini, G

    2001-01-01

    Pathologies due to repetitive activity of the upper limbs constitute a growing share of work-related musculoskeletal disorders. At the moment, there are no universally accepted and validated methods for describing and assessing the related risks. Yet the criteria that fundamentally characterize the exposure are rather clear and uniform. This study reports a practical example of the application of several recent risk assessment methods proposed in the literature, combining objective and subjective measures obtained in the field with traditional activity analysis.

  10. An approximate method for determining of investment risk

    NASA Astrophysics Data System (ADS)

    Slavkova, Maria; Tzenova, Zlatina

    2016-12-01

    In this work a method for determining investment risk during all economic states is considered. It is based on two-player matrix games. A definition of risk in a matrix game is introduced and three properties are proven. An appropriate example is considered.

  11. Practical, transparent prospective risk analysis for the clinical laboratory.

    PubMed

    Janssens, Pim Mw

    2014-11-01

    Prospective risk analysis (PRA) is an essential element in quality assurance for clinical laboratories. Practical approaches to conducting PRA in laboratories, however, are scarce. On the basis of the classical Failure Mode and Effect Analysis method, an approach to PRA was developed for application to key laboratory processes. First, the separate, major steps of the process under investigation are identified. Scores are then given for the Probability (P) and Consequence (C) of predefined types of failures and the chances of Detecting (D) these failures. Based on the P and C scores (on a 10-point scale), an overall Risk score (R) is calculated. The scores for each process were recorded in a matrix table. Based on predetermined criteria for R and D, it was determined whether a more detailed analysis was required for potential failures and, ultimately, where risk-reducing measures were necessary, if any. As an illustration, this paper presents the results of the application of PRA to our pre-analytical and analytical activities. The highest R scores were obtained in the stat processes, the most common failure type in the collective process steps was 'delayed processing or analysis', the failure type with the highest mean R score was 'inappropriate analysis' and the failure type most frequently rated as suboptimal was 'identification error'. The PRA designed is a useful semi-objective tool to identify process steps with potential failures rated as risky. Its systematic design and convenient output in matrix tables makes it easy to perform, practical and transparent. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
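
    The abstract does not spell out the formula for R, so the sketch below assumes R = P x C and invented threshold criteria, purely to illustrate the matrix-table workflow; the process steps and scores are hypothetical.

```python
# Hedged sketch of the scoring matrix (R = P x C is an assumption; the
# criteria for R and D and all scores are illustrative only).
steps = [
    # (process step, failure type, P, C, D) -- P and C on a 10-point scale
    ("order entry",          "identification error",           3, 8, 4),
    ("centrifugation",       "delayed processing or analysis", 5, 4, 7),
    ("stat analysis",        "inappropriate analysis",         2, 9, 5),
    ("result authorisation", "delayed processing or analysis", 4, 5, 6),
]

for step, failure, p, c, d in steps:
    r = p * c                                        # assumed overall risk score
    flag = "review" if r >= 20 or d <= 4 else "ok"   # assumed criteria for R and D
    print(f"{step:22s} {failure:32s} R={r:3d} D={d}  -> {flag}")
```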

  12. [Impact of water pollution risk in water transfer project based on fault tree analysis].

    PubMed

    Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing

    2009-09-15

    The methods to assess water pollution risk for medium water transfer are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, built on the single-event calculations, was employed to evaluate the overall water pollution risk to the channel water body. The results indicate that the risk posed by pollutants from towns and villages along the line of the water transfer project to the channel water body is high, with a probability of 0.373, which would increase pollution of the channel water body by 64.53 mg/L COD, 4.57 mg/L NH4+-N and 0.066 mg/L volatile hydroxybenzene, respectively. The measurement of fault probability on the basis of the proportion method proved useful in assessing water pollution risk under large uncertainty.
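
    A generic fault-tree combination of basic-event probabilities can illustrate the top-event calculation; the events and probability values below are placeholders, not the paper's data.

```python
# Generic fault tree sketch (hypothetical basic-event probabilities):
# top-event probability from independent OR/AND gates.
def or_gate(*p):
    """P(at least one of several independent events occurs)."""
    q = 1.0
    for pi in p:
        q *= (1.0 - pi)
    return 1.0 - q

def and_gate(*p):
    """P(all independent events occur)."""
    out = 1.0
    for pi in p:
        out *= pi
    return out

# Hypothetical basic events along the transfer channel.
p_untreated_discharge = 0.20
p_interception_fails = 0.30
p_storm_overflow = 0.10

p_pollutant_reaches_channel = and_gate(p_untreated_discharge, p_interception_fails)
p_top = or_gate(p_pollutant_reaches_channel, p_storm_overflow)
print(f"top-event probability: {p_top:.3f}")
```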

  13. Proposal of a method for evaluating tsunami risk using response-surface methodology

    NASA Astrophysics Data System (ADS)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and computationally costly, since they require multiple tsunami numerical simulations, and therefore lack versatility. In this study, we proposed a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave-height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Subsequently, tsunami risk could be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a lognormal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, which was defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed proper probabilistic distributions for earthquake generation, inundation height, and vulnerability. Based on these probabilistic distributions, we conducted Monte Carlo simulations of 1,000,000 years. We clarified that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response
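
    The Monte Carlo step can be sketched with the response surface quoted above; the fault-parameter ranges and the lognormal fragility parameters below are placeholders chosen for illustration, not the paper's values.

```python
# Sketch of the Monte Carlo step using the quoted response surface
# (y = 0.2615*x1 + 3.1763*x2 - 1.1802); the sampling ranges and the
# fragility curve are assumptions, not the paper's distributions.
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(5)
n = 100_000                                   # scenarios, conditional on an earthquake

x1 = rng.uniform(5.0, 25.0, n)                # assumed fault depth range [km]
x2 = rng.uniform(0.5, 4.0, n)                 # assumed slip range [m]
depth = np.clip(0.2615 * x1 + 3.1763 * x2 - 1.1802, 0.0, None)   # inundation depth [m]

# Assumed lognormal fragility for a wood building (median 2 m, beta 0.5).
p_damage = lognorm.cdf(depth, s=0.5, scale=2.0)
print(f"expected damage probability given an earthquake: {p_damage.mean():.3f}")
```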

  14. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    PubMed

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are the key to obtaining accurate information to enable planning of CVD prevention programs, which is a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library (PubMed, CINAHL, PsycINFO and ProQuest) were searched for studies conducted between 2008 and 2012, published in English and conducted among humans. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were: review of the literature, either systematic or traditional; involvement of experts and/or the target population using focus group discussions or interviews; the clinical experience of the authors; and deductive reasoning of the authors. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other desirable psychometric properties.

  15. Comparison Of Intake Gate Closure Methods At Lower Granite, Little Goose, Lower Monumental, And Mcnary Dams Using Risk-Based Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.

    2001-01-19

    The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.

  16. [The methods of assessment of health risk from exposure to radon and radon daughters].

    PubMed

    Demin, V F; Zhukovskiy, M V; Kiselev, S M

    2014-01-01

    A critical analysis of existing dose-effect relationship (RDE) models for the effect of radon exposure on human health has been performed. It is concluded that these models can and should be improved. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating radon exposure doses, the improved RDE model, and the risk assessment methodology proper. The methodology is proposed for use in the territory of Russia.

  17. Disease Risk Analysis and Post-Release Health Surveillance for a Reintroduction Programme: the Pool Frog Pelophylax lessonae.

    PubMed

    Sainsbury, A W; Yu-Mei, R; Ågren, E; Vaughan-Higgins, R J; Mcgill, I S; Molenaar, F; Peniche, G; Foster, J

    2017-10-01

    There are risks from disease in undertaking wild animal reintroduction programmes. Methods of disease risk analysis have been advocated to assess and mitigate these risks, and post-release health and disease surveillance can be used to assess the effectiveness of the disease risk analysis, but results for a reintroduction programme have not to date been recorded. We carried out a disease risk analysis for the reintroduction of pool frogs (Pelophylax lessonae) to England, using information gained from the literature and from diagnostic testing of Swedish pool frogs and native amphibians. Ranavirus and Batrachochytrium dendrobatidis were considered high-risk disease threats for pool frogs at the destination site. Quarantine was used to manage risks from disease due to these two agents at the reintroduction site: the quarantine barrier surrounded the reintroduced pool frogs. Post-release health surveillance was carried out through regular health examinations of amphibians in the field at the reintroduction site and collection and examination of dead amphibians. No significant health or disease problems were detected, but the detection rate of dead amphibians was very low. Methods to detect a higher proportion of dead reintroduced animals and closely related species are required to better assess the effects of reintroduction on health and disease. © 2016 Blackwell Verlag GmbH.

  18. Risk Costs for New Dams: Economic Analysis and Effects of Monitoring

    NASA Astrophysics Data System (ADS)

    Paté-Cornell, M. Elisabeth; Tagaras, George

    1986-01-01

    This paper presents new developments and illustrations of the introduction of risk and costs in cost-benefit analysis for new dams. The emphasis is on a method of evaluation of the risk costs based on the structure of the local economy. Costs to agricultural property as well as residential, commercial, industrial, and public property are studied in detail. Of particular interest is the case of sequential dam failure and the evaluation of the risk costs attributable to a new dam upstream from an existing one. Three real cases are presented as illustrations of the method: the Auburn Dam, the Dickey-Lincoln School Project, and the Teton Dam, which failed in 1976. This last case provides a calibration tool for the estimation of loss ratios. For these three projects, the risk-modified benefit-cost ratios are computed to assess the effect of the risk on the economic performance of the project. The role of a warning system provided by systematic monitoring of the dam is analyzed: by reducing the risk costs, the warning system attenuates their effect on the benefit-cost ratio. The precursors, however, can be missed or misinterpreted: monitoring does not guarantee that the risks to human life can be reduced to zero. This study shows, in particular, that it is critical to consider the risk costs in the decision to build a new dam when the flood area is large and densely populated.
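
    As a rough numerical illustration of the kind of adjustment described above (not the authors' model; the simple annualized form and all figures are assumptions of this sketch), the expected annual loss p_f * C_f can be added to the cost side of the benefit-cost ratio, and a warning system acts by reducing the consequence term C_f without eliminating it:

    ```python
    # Minimal numerical sketch (all figures hypothetical) of how expected risk
    # costs and a warning system enter a dam benefit-cost ratio: the annual
    # expected loss p_fail * c_fail is added to costs; monitoring reduces the
    # consequence c_fail but cannot drive it to zero.
    def bc_ratio(benefit, cost, p_fail, c_fail, years=50, rate=0.05):
        pv = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
        return benefit * pv / (cost + (p_fail * c_fail) * pv)

    base = bc_ratio(benefit=40e6, cost=500e6, p_fail=1e-3, c_fail=2e9)
    with_warning = bc_ratio(benefit=40e6, cost=505e6, p_fail=1e-3, c_fail=0.5e9)
    print(f"risk-modified B/C without warning system: {base:.2f}")
    print(f"risk-modified B/C with warning system:    {with_warning:.2f}")
    ```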

  19. Advances in Chemical Mixtures Risk Methods

    EPA Science Inventory

    This presentation is an overview of emerging issues for dose addition in chemical mixtures risk assessment. It is intended to give the participants a perspective of recent developments in methods for dose addition. The workshop abstract is as follows: This problems-based, half-day...

  20. [Statistical prediction methods in violence risk assessment and its application].

    PubMed

    Liu, Yuan-Yuan; Hu, Jun-Mei; Yang, Min; Li, Xiao-Song

    2013-06-01

    How to improve violence risk assessment is an urgent problem worldwide. As a necessary part of risk assessment, statistical methods have a considerable impact. In this study, the prediction methods used in violence risk assessment are reviewed from a statistical point of view. The application of logistic regression as an example of a multivariate statistical model, the decision tree as an example of a data mining technique, and neural networks as an example of artificial intelligence technology are all reviewed. This study provides a basis for further research on violence risk assessment.
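
    As an illustration of the first class of methods reviewed, the sketch below fits a logistic regression risk model to simulated data; the predictors, coefficients and data are hypothetical and not taken from the study.

    ```python
    # Minimal sketch: fitting a logistic regression violence-risk model to
    # hypothetical data (all variable names are illustrative, not from the paper).
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 500

    # Hypothetical predictors: age, prior incidents, standardized clinical score.
    X = np.column_stack([
        rng.normal(35, 10, n),          # age
        rng.poisson(1.5, n),            # number of prior violent incidents
        rng.normal(0, 1, n),            # standardized clinical rating
    ])
    # Hypothetical outcome generated from a known linear predictor.
    logits = -2.0 + 0.02 * X[:, 0] + 0.6 * X[:, 1] + 0.8 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logits)))

    model = LogisticRegression(max_iter=1000).fit(X, y)
    risk = model.predict_proba(X)[:, 1]          # predicted violence risk per case
    print("coefficients:", model.coef_, "intercept:", model.intercept_)
    print("mean predicted risk:", risk.mean())
    ```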

  1. Association among Dietary Flavonoids, Flavonoid Subclasses and Ovarian Cancer Risk: A Meta-Analysis

    PubMed Central

    You, Ruxu; Yang, Yu; Liao, Jing; Chen, Dongsheng; Yu, Lixiu

    2016-01-01

    Background Previous studies have indicated that intake of dietary flavonoids or flavonoid subclasses is associated with ovarian cancer risk, but have presented conflicting results. Therefore, we conducted a meta-analysis to derive a more precise estimation of these associations. Methods We performed a search in PubMed, Google Scholar and ISI Web of Science from their inception to April 25, 2015 to select studies on the association among dietary flavonoids, flavonoid subclasses and ovarian cancer risk. The information was extracted by two independent authors. We assessed the heterogeneity, sensitivity, publication bias and quality of the articles. A random-effects model was used to calculate the pooled risk estimates. Results Five cohort studies and seven case-control studies were included in the final meta-analysis. We observed that intake of dietary flavonoids can decrease ovarian cancer risk, as demonstrated by the pooled RR (RR = 0.82, 95% CI = 0.68–0.98). In a subgroup analysis by flavonoid subtypes, ovarian cancer risk was also decreased for isoflavones (RR = 0.67, 95% CI = 0.50–0.92) and flavonols (RR = 0.68, 95% CI = 0.58–0.80). In contrast, there was no compelling evidence that consumption of flavones (RR = 0.86, 95% CI = 0.71–1.03) decreased ovarian cancer risk, which partly explained the sources of heterogeneity. The sensitivity analysis indicated stable results, and no publication bias was observed based on the results of funnel plot analysis and Egger's test (p = 0.26). Conclusions This meta-analysis suggested that consumption of dietary flavonoids and the subtypes isoflavones and flavonols has a protective effect against ovarian cancer, whereas no such effect was found for flavones. Nevertheless, further investigations on a larger population covering more flavonoid subclasses are warranted. PMID:26960146
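
    The pooled relative risks reported above come from a random-effects model; the sketch below shows one standard way such pooling is done (DerSimonian-Laird), using made-up study RRs and confidence intervals rather than the data of this meta-analysis.

    ```python
    # Minimal sketch of random-effects pooling of relative risks
    # (DerSimonian-Laird); the study RRs and CIs below are invented.
    import numpy as np

    rr = np.array([0.75, 0.90, 0.68, 1.05, 0.80])
    ci_low = np.array([0.55, 0.70, 0.50, 0.85, 0.60])
    ci_high = np.array([1.02, 1.16, 0.92, 1.30, 1.07])

    y = np.log(rr)                                    # log relative risks
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1 / se**2                                     # fixed-effect weights

    # Between-study variance (tau^2) via DerSimonian-Laird.
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    df = len(y) - 1
    tau2 = max(0.0, (Q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

    w_re = 1 / (se**2 + tau2)                         # random-effects weights
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1 / np.sum(w_re))
    print("pooled RR = %.2f (95%% CI %.2f-%.2f)"
          % (np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)))
    ```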

  2. Private participation in infrastructure: A risk analysis of long-term contracts in power sector

    NASA Astrophysics Data System (ADS)

    Ceran, Nisangul

    The objective of this dissertation is to assess whether private participation in the energy sector through long-term contracting, such as Build-Operate-Transfer (BOT) type investments, is an efficient way of promoting efficiency in the economy. To this end, the theoretical literature on the issue is discussed, the experience of several developing countries is examined, and a BOT project undertaken by the Enron company in Turkey is studied in depth as a case study. Different risk analysis techniques, including sensitivity analysis and probabilistic risk analysis with the Monte Carlo Simulation (MCS) method, have been applied to assess the financial feasibility and risks of the case study project, and to shed light on the level of rent-seeking in BOT agreements. Although data on rent-seeking and corruption are difficult to obtain, the analysis of the case study investment using the sensitivity and MCS methods provided some information that can be used in assessing the level of rent-seeking in BOT projects. The risk analysis made it possible to test the sustainability of the long-term BOT contracts by analysing the project's financial feasibility with and without government guarantees. The approach of testing the sustainability of the project under different scenarios is helpful for understanding the potential costs and contingent liabilities for the government and the project's impact on a country's overall economy. The results of the risk analysis made by the MCS method for the BOT project used as the case study strongly suggest that the BOT project does not serve the interest of society and transfers a substantial amount of public money to private companies, implying severe governance problems. It is found that not only the government but also the private sector may be reluctant about full privatization of infrastructure due to several factors such as involvement of large sunk costs, very long time period for returns to be received, political and
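
    The sketch below illustrates the general Monte Carlo simulation approach described here, applied to a hypothetical BOT-style cash flow with and without a revenue guarantee; the cost, demand and price assumptions are invented for illustration and are not from the case study.

    ```python
    # Minimal sketch of a Monte Carlo feasibility check for a BOT-style project:
    # NPV is simulated under uncertain demand and price; the guarantee scenario
    # replaces low revenues with a guaranteed floor. All figures are hypothetical.
    import numpy as np

    rng = np.random.default_rng(42)
    n_sim, years, rate = 10_000, 20, 0.10
    capex = 800.0                                     # up-front cost (hypothetical)

    demand = rng.normal(100, 20, (n_sim, years)).clip(min=0)
    price = rng.normal(1.2, 0.3, (n_sim, years)).clip(min=0)
    revenue = demand * price
    revenue_guaranteed = np.maximum(revenue, 110.0)   # take-or-pay style floor

    discount = (1 + rate) ** -np.arange(1, years + 1)
    npv = revenue @ discount - capex
    npv_g = revenue_guaranteed @ discount - capex

    for label, x in [("no guarantee", npv), ("with guarantee", npv_g)]:
        print(f"{label}: mean NPV={x.mean():.0f}, P(NPV<0)={np.mean(x < 0):.2%}")
    ```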

  3. Extensions to decision curve analysis, a novel method for evaluating diagnostic tests, prediction models and molecular markers.

    PubMed

    Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat

    2008-11-26

    Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
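
    A minimal sketch of the core decision-curve calculation, computing net benefit directly from predicted probabilities and observed outcomes across threshold probabilities (the data are simulated; the extensions described in the paper, such as overfit correction and censored data, are not shown):

    ```python
    # Minimal sketch of a decision curve: net benefit of a prediction model across
    # threshold probabilities, compared with a "treat all" strategy.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000
    p_hat = rng.beta(2, 5, n)                 # hypothetical predicted probabilities
    y = rng.binomial(1, p_hat)                # outcomes consistent with the model

    for pt in np.arange(0.05, 0.50, 0.05):    # threshold probabilities
        treat = p_hat >= pt
        tp = np.sum(treat & (y == 1)) / n
        fp = np.sum(treat & (y == 0)) / n
        net_benefit = tp - fp * pt / (1 - pt)
        nb_treat_all = y.mean() - (1 - y.mean()) * pt / (1 - pt)
        print(f"pt={pt:.2f}  model NB={net_benefit:.3f}  treat-all NB={nb_treat_all:.3f}")
    ```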

  4. Pedophilia: an evaluation of diagnostic and risk prediction methods.

    PubMed

    Wilson, Robin J; Abracen, Jeffrey; Looman, Jan; Picheca, Janice E; Ferguson, Meaghan

    2011-06-01

    One hundred thirty child sexual abusers were diagnosed using each of the following four methods: (a) phallometric testing, (b) strict application of Diagnostic and Statistical Manual of Mental Disorders (4th ed., text revision [DSM-IV-TR]) criteria, (c) Rapid Risk Assessment of Sex Offender Recidivism (RRASOR) scores, and (d) "expert" diagnoses rendered by a seasoned clinician. Comparative utility and intermethod consistency of these methods are reported, along with recidivism data indicating predictive validity for risk management. Results suggest that inconsistency exists in diagnosing pedophilia, leading to diminished accuracy in risk assessment. Although the RRASOR and DSM-IV-TR methods were significantly correlated with expert ratings, RRASOR and DSM-IV-TR were unrelated to each other. Deviant arousal was not associated with any of the other methods. Only the expert ratings and RRASOR scores were predictive of sexual recidivism. Logistic regression analyses showed that expert diagnosis did not add to prediction of sexual offence recidivism over and above RRASOR alone. Findings are discussed within a context of encouragement of clinical consistency and evidence-based practice regarding treatment and risk management of those who sexually abuse children.

  5. CUMULATIVE RISK ANALYSIS FOR ORGANOPHOSPHORUS PESTICIDES

    EPA Science Inventory

    Cumulative Risk Analysis for Organophosphorus Pesticides
    R. Woodrow Setzer, Jr. NHEERL MD-74, USEPA, RTP, NC 27711

    The US EPA has recently completed a risk assessment of the effects of exposure to 33 organophosphorous pesticides (OPs) through the diet, water, and resi...

  6. Risk analysis of landslide disaster in Ponorogo, East Java, Indonesia

    NASA Astrophysics Data System (ADS)

    Koesuma, S.; Saido, A. P.; Fukuda, Y.

    2016-11-01

    Ponorogo is a regency in the southwest of East Java Province, Indonesia, located in the subduction zone between the Eurasian and Australian tectonic plates. Much of its terrain is mountainous and prone to landslides. We collected landslide data from 305 villages in Ponorogo and converted them into a Hazard Index. We also calculated a Vulnerability Index, Economic Loss Index, Environmental Damage Index and Capacity Index. The risk analysis map is composed of three components: H (Hazard), V (Vulnerability, Economic Loss Index, Environmental Damage Index) and C (Capacity Index). The method is based on regulations of the National Disaster Management Authority (BNPB) number 02/2012 and number 03/2012. The risk index has three classes, i.e. Low, Medium and High. Ponorogo city has a medium landslide risk index.
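
    A minimal sketch of how hazard (H), vulnerability (V) and capacity (C) indices can be combined into risk classes in the spirit of the BNPB-style approach described here; the index values, the H*V/C composition and the class cut-offs are illustrative assumptions, not the regulation's exact formulas.

    ```python
    # Minimal sketch: combining H, V and C indices into a risk class per village.
    villages = {"A": (0.8, 0.6, 0.5), "B": (0.4, 0.3, 0.7), "C": (0.9, 0.7, 0.3)}

    def risk_class(h, v, c, low=0.3, high=0.6):
        r = min(h * v / max(c, 1e-6), 1.0)    # normalized composite risk index
        return r, ("Low" if r < low else "Medium" if r < high else "High")

    for name, (h, v, c) in villages.items():
        r, cls = risk_class(h, v, c)
        print(f"village {name}: risk index {r:.2f} -> {cls}")
    ```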

  7. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    PubMed

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2018-02-01

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
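
    The sketch below is a simplified illustration of the first type of analysis described here: an empirical loss distribution from past incidents, extended at the upper tail with rare, expert-judged high-consequence scenarios. All frequencies, loss magnitudes and scenario rates are hypothetical.

    ```python
    # Minimal sketch: simulated annual cyber losses combining ordinary incidents
    # (drawn from an empirical-style loss sample) with rare severe scenarios.
    import numpy as np

    rng = np.random.default_rng(11)
    observed = rng.lognormal(mean=10, sigma=1.2, size=400)     # past incident losses
    scenarios = np.array([5e7, 2e8])                           # rare, severe scenarios
    scenario_rate = np.array([0.02, 0.005])                    # assumed annual rates

    n_years = 10_000
    annual = np.zeros(n_years)
    counts = rng.poisson(12, n_years)                          # ordinary incidents/yr
    for i, k in enumerate(counts):
        annual[i] = rng.choice(observed, size=k, replace=True).sum()
    annual += (rng.random((n_years, 2)) < scenario_rate) @ scenarios

    for q in (0.5, 0.9, 0.99):
        print(f"annual loss at {q:.0%} quantile: {np.quantile(annual, q):,.0f}")
    ```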

  8. Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.

    PubMed

    Logue, E E; Wing, S

    1986-01-01

    Methodological investigation has suggested that age-risk factor interactions should be more evident in age of experience life tables than in follow-up time tables due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age of experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in absence of age interaction. The identification of the more appropriate life-table analysis should be ultimately guided by the nature of the age or time phenomena of scientific interest.

  9. Efficacy of different methods used for dry socket prevention and risk factor analysis: A systematic review

    PubMed Central

    Taberner-Vallverdú, Maria; Sánchez-Garcés, Mª Ángeles

    2017-01-01

    Background Dry socket is one of the most common complications that develops after the extraction of a permanent tooth, and its prevention is more effective than its treatment. Objectives Analyze the efficacy of different methods used in preventing dry socket in order to decrease its incidence after tooth extraction. Material and Methods A Cochrane and PubMed-MEDLINE database search was conducted with the search terms “dry socket”, “prevention”, “risk factors”, “alveolar osteitis” and “fibrynolitic alveolitis”, both individually and using the Boolean operator “AND”. The inclusion criteria were: clinical studies including at least 30 patients, articles published from 2005 to 2015 and written in English. The exclusion criteria were case reports and nonhuman studies. Results 30 publications were selected from a total of 250. Six of the 30 were excluded after reading the full text. The final review included 24 articles: 9 prospective studies, 2 retrospective studies and 13 clinical trials. They were stratified according to their level of scientific evidence using SIGN criteria (Scottish Intercollegiate Guidelines Network). Conclusions All treatments included in the review were aimed at decreasing the incidence of dry socket. Locally administering chlorhexidine or applying platelet-rich plasma reduces the likelihood of developing this complication. Antibiotic prescription does not avoid postoperative complications after lower third molar surgery. With regard to risk factors, all of the articles selected suggest that patient age, history of previous infection and the difficulty of the extraction are the most common predisposing factors for developing dry socket. There is no consensus that smoking, gender or menstrual cycles are risk factors. Taking the scientific quality of the articles evaluated into account, a level B recommendation has been given for the proposed procedures in the prevention of dry socket. Key words: Dry socket, prevention

  10. Wine drinking and epithelial ovarian cancer risk: a meta-analysis

    PubMed Central

    Kim, Hee Seung; Shouten, Leo J.; Larsson, Susanna C.; Chung, Hyun Hoon; Kim, Yong Beom; Ju, Woong; Park, Noh Hyun; Song, Yong Sang; Kim, Seung Cheol; Kang, Soon-Beom

    2010-01-01

    Objective Wine has been the focus in the prevention of epithelial ovarian cancer (EOC) development because resveratrol, abundant in wine, has anti-carcinogenic properties. However, epidemiologic results on the chemopreventive effect of wine on the development of EOC have been heterogeneous. Thus, we performed a meta-analysis comparing EOC risk between wine drinkers and never drinkers using previous related studies. Methods After an extensive search of the literature between January 1986 and December 2008, we analyzed 10 studies (3 cohort and 7 case-control studies) with 135,871 women, including 65,578 wine drinkers and 70,293 never drinkers. Results Across all studies, there was no significant difference in EOC risk between wine and never drinkers (odds ratio [OR], 1.13; 95% confidence interval [CI], 0.92 to 1.38; random effects). When we performed re-analysis according to the study design, the 3 cohort and 7 case-control studies likewise showed no significant differences in EOC risk between wine and never drinkers (OR, 1.44; 95% CI, 0.74 to 2.82 for cohort studies and OR, 1.04; 95% CI, 0.88 to 1.22 for case-control studies; random effects). In sub-analyses using 2 case-control studies, EOC risk was not different between former and never drinkers (OR, 1.12; 95% CI, 0.87 to 1.44; fixed effect), or between current and former drinkers (OR, 0.74; 95% CI, 0.41 to 1.34; random effects). Conclusion Although resveratrol, abundantly found in wine, is a promising naturally occurring compound with chemopreventive properties on EOC in preclinical studies, this meta-analysis suggests the epidemiologic evidence shows no association between wine drinking and EOC risk. PMID:20613902

  11. Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis

    PubMed Central

    Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F.; Mt-Isa, Shahrul; Luo, Sheng

    2018-01-01

    Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend producing a periodic benefit-risk evaluation report (PBRER) throughout the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) will be combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively and accounts for the uncertainty of clinical measurements and imprecise or incomplete preference information of decision makers. We apply our approaches to two real examples in a post-hoc way for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and thus be an important complement to the current structured benefit-risk assessment (sBRA) framework to improve the transparency and consistency of the decision-making process. PMID:29505866

  12. Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis.

    PubMed

    Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F; Mt-Isa, Shahrul; Luo, Sheng

    2018-04-01

    Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend producing a periodic benefit-risk evaluation report (PBRER) throughout the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) will be combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively and accounts for the uncertainty of clinical measurements and imprecise or incomplete preference information of decision makers. We apply our approaches to two real examples in a post-hoc way for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and thus be an important complement to the current structured benefit-risk assessment (sBRA) framework to improve the transparency and consistency of the decision-making process. Copyright © 2018 Elsevier Inc. All rights reserved.
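
    A minimal sketch of the SMAA step described above: criterion values and preference weights are sampled, and first-rank acceptability indices are tallied for two hypothetical drugs. The criteria, distributions and weights are assumptions for illustration, not the paper's data or its full Bayesian meta-analysis.

    ```python
    # Minimal sketch of stochastic multi-criteria acceptability analysis (SMAA).
    import numpy as np

    rng = np.random.default_rng(7)
    n_sim = 20_000

    # Criterion samples per alternative: (benefit, 1 - risk), higher is better.
    values = {
        "drug A": np.column_stack([rng.normal(0.60, 0.05, n_sim),
                                   rng.normal(0.90, 0.03, n_sim)]),
        "drug B": np.column_stack([rng.normal(0.55, 0.05, n_sim),
                                   rng.normal(0.95, 0.02, n_sim)]),
    }
    w = rng.dirichlet([1, 1], size=n_sim)             # uniform weights on the simplex

    scores = {k: np.sum(w * v, axis=1) for k, v in values.items()}
    best = np.column_stack(list(scores.values())).argmax(axis=1)
    for i, name in enumerate(scores):
        print(f"{name}: first-rank acceptability = {np.mean(best == i):.2%}")
    ```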

  13. Fracture risk in patients with type 2 diabetes mellitus and possible risk factors: a systematic review and meta-analysis

    PubMed Central

    Moayeri, Ardeshir; Mohamadpour, Mahmoud; Mousavi, Seyedeh Fatemeh; Shirzadpour, Ehsan; Mohamadpour, Safoura; Amraei, Mansour

    2017-01-01

    Aim Patients with type 2 diabetes mellitus (T2DM) have an increased risk of bone fractures. A variable increase in fracture risk has been reported depending on skeletal site, diabetes duration, study design, insulin use, and so on. The present meta-analysis aimed to investigate the association between T2DM with fracture risk and possible risk factors. Methods Different databases including PubMed, Institute for Scientific Information, and Scopus were searched up to May 2016. All epidemiologic studies on the association between T2DM and fracture risk were included. The relevant data obtained from these papers were analyzed by a random effects model and publication bias was assessed by funnel plot. All analyses were done by R software (version 3.2.1) and STATA (version 11.1). Results Thirty eligible studies were selected for the meta-analysis. We found a statistically significant positive association between T2DM and hip, vertebral, or foot fractures and no association between T2DM and wrist, proximal humerus, or ankle fractures. Overall, T2DM was associated with an increased risk of any fracture (summary relative risk =1.05, 95% confidence interval: 1.04, 1.06) and increased with age, duration of diabetes, and insulin therapy. Conclusion Our findings strongly support an association between T2DM and increased risk of overall fracture. These findings emphasize the need for fracture prevention strategies in patients with diabetes. PMID:28442913

  14. [The role of a specialised risk analysis group in the Veterinary Services of a developing country].

    PubMed

    Urbina-Amarís, M E

    2003-08-01

    Since the World Trade Organization (WTO) Agreement on the Application of Sanitary and Phytosanitary Measures was established, risk analysis in trade, and ultimately in Veterinary and Animal Health Services, has become strategically important. Irrespective of their concept (discipline, approach, method, process), all types of risk analysis in trade involve four periods or phases: risk identification, risk assessment, risk management, and risk information or communication. All veterinarians involved in a risk analysis unit must have in-depth knowledge of statistics and the epidemiology of transmissible diseases, as well as a basic knowledge of veterinary science, economics, mathematics, data processing and social communication, to enable them to work with professionals in these disciplines. Many developing countries do not have enough well-qualified professionals in these areas to support a risk analysis unit. This will need to be rectified by seeking strategic alliances with other public or private sectors that will provide the required support to run the unit properly. Due to the special nature of its risk analysis functions, its role in supporting decision-making, and the criteria of independence and transparency that are so crucial to its operations, the hierarchical position of the risk analysis unit should be close to the top management of the Veterinary Service. Due to the shortage of personnel in developing countries with the required training and scientific and technical qualifications, countries with organisations responsible for both animal and plant health protection would be advised to set up integrated plant and animal risk analysis units. In addition, these units could take charge of all activities relating to WTO agreements and regional agreements on animal and plant health management.

  15. Comparison of methods for estimating the attributable risk in the context of survival analysis.

    PubMed

    Gassama, Malamine; Bénichou, Jacques; Dartois, Laureen; Thiébaut, Anne C M

    2017-01-23

    The attributable risk (AR) measures the proportion of disease cases that can be attributed to an exposure in the population. Several definitions and estimation methods have been proposed for survival data. Using simulations, we compared four methods for estimating AR defined in terms of survival functions: two nonparametric methods based on Kaplan-Meier's estimator, one semiparametric based on Cox's model, and one parametric based on the piecewise constant hazards model, as well as one simpler method based on estimated exposure prevalence at baseline and Cox's model hazard ratio. We considered a fixed binary exposure with varying exposure probabilities and strengths of association, and generated event times from a proportional hazards model with constant or monotonic (decreasing or increasing) Weibull baseline hazard, as well as from a nonproportional hazards model. We simulated 1,000 independent samples of size 1,000 or 10,000. The methods were compared in terms of mean bias, mean estimated standard error, empirical standard deviation and 95% confidence interval coverage probability at four equally spaced time points. Under proportional hazards, all five methods yielded unbiased results regardless of sample size. Nonparametric methods displayed greater variability than other approaches. All methods showed satisfactory coverage, except the nonparametric methods at the end of follow-up, especially for a sample size of 1,000. With nonproportional hazards, nonparametric methods yielded similar results to those under proportional hazards, whereas semiparametric and parametric approaches that both relied on the proportional hazards assumption performed poorly. These methods were applied to estimate the AR of breast cancer due to menopausal hormone therapy in 38,359 women of the E3N cohort. In practice, our study suggests using the semiparametric or parametric approaches to estimate AR as a function of time in cohort studies if the proportional hazards assumption appears
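
    A minimal sketch of two of the flavors of AR estimation compared here: a Levin-type estimate from baseline exposure prevalence and a hazard ratio, and AR(t) computed from survival functions under a simple exponential-hazards assumption. All parameter values are illustrative.

    ```python
    # Minimal sketch of attributable-risk (AR) estimation for survival data.
    import numpy as np

    p_exp, hr = 0.3, 1.8                      # exposure prevalence, hazard ratio
    ar_levin = p_exp * (hr - 1) / (1 + p_exp * (hr - 1))
    print(f"prevalence/HR-based AR: {ar_levin:.3f}")

    # Survival-function-based AR(t) assuming exponential hazards.
    lam0 = 0.01                               # baseline hazard (unexposed)
    t = np.array([5.0, 10.0, 20.0])
    F_pop = 1 - (p_exp * np.exp(-lam0 * hr * t) + (1 - p_exp) * np.exp(-lam0 * t))
    F_unexp = 1 - np.exp(-lam0 * t)           # counterfactual: nobody exposed
    ar_t = (F_pop - F_unexp) / F_pop
    for ti, ai in zip(t, ar_t):
        print(f"AR(t={ti:g}) = {ai:.3f}")
    ```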

  16. Why Map Issues? On Controversy Analysis as a Digital Method

    PubMed Central

    2015-01-01

    This article takes stock of recent efforts to implement controversy analysis as a digital method in the study of science, technology, and society (STS) and beyond and outlines a distinctive approach to address the problem of digital bias. Digital media technologies exert significant influence on the enactment of controversy in online settings, and this risks undermining the substantive focus of controversy analysis conducted by digital means. To address this problem, I propose a shift in thematic focus from controversy analysis to issue mapping. The article begins by distinguishing between three broad frameworks that currently guide the development of controversy analysis as a digital method, namely, demarcationist, discursive, and empiricist. Each has been adopted in STS, but only the last one offers a digital “move beyond impartiality.” I demonstrate this approach by analyzing issues of Internet governance with the aid of the social media platform Twitter. PMID:26336325

  17. Climate change, land slide risks and sustainable development, risk analysis and decision support process tool

    NASA Astrophysics Data System (ADS)

    Andersson-sköld, Y. B.; Tremblay, M.

    2011-12-01

    Climate change is expected, in most parts of Sweden, to result in increased precipitation and increased sea water levels causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites, and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases to quantify the consequences, represented as ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of the risk at the local level, to assess the need of measures to mitigate the risk. The costs and environmental and social impacts to mitigate the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have expressed a need for tools to assess both the vulnerability and a more holistic picture of impacts of climate change adaptation measures. At SGI a tool for the inclusion of sustainability

  18. Personal hair dyes use and risk of glioma: a meta-analysis

    PubMed Central

    Shao, Chuan; Qi, Zhen-Yu; Hui, Guo-Zhen; Wang, Zhong

    2013-01-01

    Background and Objective: The association between use of hair dyes and glioma risk has been investigated in numerous epidemiological studies, but the evidence is inconsistent. Therefore, a meta-analysis was performed to estimate the association between hair dye use and glioma risk. Methods: We searched the PubMed and EMBASE databases without any limitations, covering all papers published by March 8, 2013. Cohort and case-control studies reporting relative risk estimates (RRs) with corresponding 95% confidence intervals (CIs) (or data to calculate them) on this issue were included. Random effects models were used to calculate the pooled RRs and corresponding 95% CIs. Results: Four case-control and two cohort studies were included in this meta-analysis. The summary RRs and 95% CIs for ever users of any hair dyes were 1.132 (0.887-1.446) for all studies, 1.291 (0.938-1.777) for case-control studies, and 0.903 (0.774-1.054) for cohort studies. In the subgroup analysis by geographic region and sex, similar results were detected. Likewise, no significant associations were observed among the studies reporting data on permanent hair dye use and on duration of any hair dye use. Conclusion: In summary, the results of our study demonstrate that hair dye use is not associated with risk of glioma. PMID:24179568

  19. Meta-analysis of association between mobile phone use and glioma risk.

    PubMed

    Wang, Yabo; Guo, Xiaqing

    2016-12-01

    The purpose of this study was to evaluate the association between mobile phone use and glioma risk by pooling the published data. By searching the Medline, EMBASE, and CNKI databases, we screened the openly published case-control or cohort studies on mobile phone use and glioma risk using a systematic search strategy. The pooled odds of mobile phone use in glioma patients versus healthy controls were calculated by the meta-analysis method. The statistical analysis was done with Stata 12.0 software (http://www.stata.com). After searching the Medline, EMBASE, and CNKI databases, we ultimately included 11 studies published from 2001 to 2008. For the ≥1 year group, the data were pooled by a random effects model. The combined data showed that there was no association between mobile phone use and glioma (odds ratio [OR] = 1.08, 95% confidence interval [CI]: 0.91-1.25, P > 0.05). However, a significant association was found between mobile phone use of more than 5 years and glioma risk (OR = 1.35, 95% CI: 1.09-1.62, P < 0.05). Publication bias was evaluated by funnel plot and linear regression test. The funnel plot and linear regression test (t = 0.25, P = 0.81) did not indicate any publication bias. Long-term mobile phone use may increase the risk of developing glioma according to this meta-analysis.

  20. The risk of kidney stones following bariatric surgery: a systematic review and meta-analysis.

    PubMed

    Thongprayoon, Charat; Cheungpasitporn, Wisit; Vijayvargiya, Priya; Anthanont, Pimjai; Erickson, Stephen B

    2016-01-01

    With the rising prevalence of morbid obesity, the number of bariatric surgeries performed each year has been increasing worldwide. The objective of this meta-analysis was to assess the risk of kidney stones following bariatric surgery. A literature search was performed using MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews from inception through July 2015. Only studies reporting relative risks, odds ratios or hazard ratios (HRs) to compare the risk of kidney stones in patients who underwent bariatric surgery versus no surgery were included. Pooled risk ratios (RR) and 95% confidence intervals (CI) were calculated using a random-effect, generic inverse variance method. Four studies (one randomized controlled trial and three cohort studies) with 11,348 patients were included in the analysis to assess the risk of kidney stones following bariatric surgery. The pooled RR of kidney stones in patients undergoing bariatric surgery was 1.22 (95% CI, 0.63-2.35). The subgroup analysis by type of bariatric surgery demonstrated an increased risk of kidney stones in patients following Roux-en-Y gastric bypass (RYGB) with a pooled RR of 1.73 (95% CI, 1.30-2.30) and a decreased risk of kidney stones in patients following restrictive procedures, including laparoscopic banding or sleeve gastrectomy, with a pooled RR of 0.37 (95% CI, 0.16-0.85). Our meta-analysis demonstrates an association between RYGB and increased risk of kidney stones. Restrictive bariatric surgery, on the other hand, may decrease kidney stone risk. Future studies with long-term follow-up data are needed to confirm this potential benefit of restrictive bariatric surgery.

  1. Semicompeting risks in aging research: methods, issues and needs

    PubMed Central

    Varadhan, Ravi; Xue, Qian-Li; Bandeen-Roche, Karen

    2015-01-01

    A semicompeting risks problem involves two types of events: a nonterminal and a terminal event (death). Typically, the nonterminal event is the focus of the study, but the terminal event can preclude the occurrence of the nonterminal event. Semicompeting risks are ubiquitous in studies of aging. Examples of semicompeting risk dyads include: dementia and death, frailty syndrome and death, disability and death, and nursing home placement and death. Semicompeting risk models can be divided into two broad classes: models based only on observable quantities (class O) and those based on potential (latent) failure times (class L). The classical illness-death model belongs to class O. This model is a special case of the multistate models, which have been an active area of methodology development. During the past decade and a half, there has also been a flurry of methodological activity on semicompeting risks based on latent failure times (L models). These advances notwithstanding, the semicompeting risks methodology has not penetrated biomedical research, in general, and gerontological research, in particular. Some possible reasons for this lack of uptake are: the methods are relatively new and sophisticated; conceptual problems associated with potential failure time models are difficult to overcome; there is a paucity of expository articles aimed at educating practitioners; and readily usable software is not available. The main goals of this review article are: (i) to describe the major types of semicompeting risks problems arising in aging research, (ii) to provide a brief survey of the semicompeting risks methods, (iii) to suggest appropriate methods for addressing the problems in aging research, (iv) to highlight areas where more work is needed, and (v) to suggest ways to facilitate the uptake of the semicompeting risks methodology by the broader biomedical research community. PMID:24729136

  2. [Assessment of health risk in view of changes in labor code and methods used for protecting the environment].

    PubMed

    Indulski, J A; Rolecki, R

    1994-01-01

    In view of the present and proposed amendments to the Labor Code, as well as bearing in mind the anticipated harmonization of regulations in this area with those of the EEC, the authors emphasize the need for a well developed methodology for assessing chemical safety in an occupational environment, with special reference to health effects in people exposed to chemicals. Methods for assessing health risk induced by work under conditions of exposure to chemicals were divided into: methods for assessing technological/processing risk, and methods for assessing health risk related to the toxic effect of chemicals. The need for developing means of risk communication in order to secure proper risk perception among people exposed to chemicals and among risk managers responsible for prevention of chemical hazards was also stressed. It is suggested to establish a centre for chemical substances in order to address all issues pertaining to human exposure to chemicals. The centre would be responsible, under the provisions of the Chemical Substances Act, for the qualitative and quantitative analysis of the present situation and for the development of guidelines on assessment of health risk among persons exposed to chemicals.

  3. A resilience perspective to water risk management: case-study application of the adaptation tipping point method

    NASA Astrophysics Data System (ADS)

    Gersonius, Berry; Ashley, Richard; Jeuken, Ad; Nasruddin, Fauzy; Pathirana, Assela; Zevenbergen, Chris

    2010-05-01

    start the identification and analysis of adaptive strategies at the end of PSIR scheme: impact and examine whether, and for how long, current risk management strategies will continue to be effective under different future conditions. The most noteworthy application of this approach is the adaptation tipping point method. Adaptation tipping points (ATP) are defined as the points where the magnitude of change is such that the current risk management strategy can no longer meet its objectives. In the ATP method, policy objectives, determining aspirational functioning, are taken as the starting point. Also, the current measures to achieve these objectives are described. This is followed by a sensitivity analysis to determine the optimal and critical boundary conditions (state). Lastly, the state is related to pressures in terms of future change. It should be noted that in the ATP method the driver for adopting a new risk management strategy is not future change as such, but rather failing to meet the policy objectives. In the current paper, the ATP method is applied to the case study of an existing stormwater system in Dordrecht (the Netherlands). This application shows the potential of the ATP method to reduce the complexity of implementing a resilience-focused approach to water risk management. It is expected that this will help foster greater practical relevance of resilience as a perspective for the planning of water management structures.

  4. 12 CFR 327.9 - Assessment risk categories and pricing methods.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... institution's financial condition and the risk posed to the Deposit Insurance Fund. The three Supervisory... method. Under the financial ratios method for Risk Category I institutions, each of six financial ratios... the Board under § 327.10(c), will equal an institution's assessment rate. The six financial ratios are...

  5. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing lengths of life in many cases and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate, and it is a special case of the Weibull distribution. In this paper our aim is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and to present the corresponding analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. The likelihood function is described, followed by the posterior function and the point, interval, hazard function, and reliability estimates. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
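
    A minimal sketch of the kind of analysis described: with independent exponential causes and a non-informative prior (proportional to 1/lambda), each cause-specific rate has a Gamma posterior, from which reliability and the net and crude failure probabilities can be computed. The failure counts and exposure time are invented for illustration.

    ```python
    # Minimal sketch of a Bayesian exponential competing-risks analysis with
    # independent causes; posterior draws of each cause-specific rate give
    # reliability, net and crude failure probabilities at a horizon t.
    import numpy as np

    rng = np.random.default_rng(3)
    d = np.array([12, 7])        # failures from cause 1 and cause 2 (hypothetical)
    T = 500.0                    # total time on test (hypothetical)
    t = 20.0                     # horizon for probabilities
    n_draws = 10_000

    # With prior ~ 1/lambda, the posterior of each rate is Gamma(d_j, rate=T).
    lam = rng.gamma(shape=d, scale=1.0 / T, size=(n_draws, len(d)))
    lam_tot = lam.sum(axis=1)

    reliability = np.exp(-lam_tot * t)                       # all causes acting
    net = 1 - np.exp(-lam * t)                               # only cause j present
    crude = (lam / lam_tot[:, None]) * (1 - np.exp(-lam_tot[:, None] * t))

    print("posterior mean reliability R(t):", reliability.mean().round(3))
    print("net failure probabilities:", net.mean(axis=0).round(3))
    print("crude failure probabilities:", crude.mean(axis=0).round(3))
    ```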

  6. Risk Analysis for Unintentional Slide Deployment During Airline Operations.

    PubMed

    Ayra, Eduardo S; Insua, David Ríos; Castellanos, María Eugenia; Larbi, Lydia

    2015-09-01

    We present a risk analysis undertaken to mitigate problems in relation to the unintended deployment of slides under normal operations within a commercial airline. This type of incident entails relevant costs for the airline industry. After assessing the likelihood and severity of its consequences, we conclude that such risks need to be managed. We then evaluate the effectiveness of various countermeasures, describing and justifying the chosen ones. We also discuss several issues faced when implementing and communicating the proposed measures, thus fully illustrating the risk analysis process. © 2015 Society for Risk Analysis.

  7. A quick inexpensive laboratory method in acute paracetamol poisoning could improve risk assessment, management and resource utilization

    PubMed Central

    Senarathna, S.M.D.K. Ganga; Ranganathan, Shalini S.; Buckley, Nick; Soysa, S.S.S.B.D. Preethi; Fernandopulle, B. M. Rohini

    2012-01-01

    Objectives: Acute paracetamol poisoning is an emerging problem in Sri Lanka. Management guidelines recommend ingested dose and serum paracetamol concentrations to assess the risk. Our aim was to determine the usefulness of the patient's history of an ingested dose of >150 mg/kg and paracetamol concentration obtained by a simple colorimetric method to assess risk in patients with acute paracetamol poisoning. Materials and Methods: Serum paracetamol concentrations were determined in 100 patients with a history of paracetamol overdose using High Performance Liquid Chromatography (HPLC); (reference method). The results were compared to those obtained with a colorimetric method. The utility of risk assessment by reported dose ingested and colorimetric analysis were compared. Results: The area under the receiver operating characteristic curve for the history of ingested dose was 0.578 and there was no dose cut-off providing useful risk categorization. Both analytical methods had less than 5% intra- and inter-batch variation and were accurate on spiked samples. The time from blood collection to result was six times faster and ten times cheaper for colorimetry (30 minutes, US$2) than for HPLC (180 minutes, US$20). The correlation coefficient between the paracetamol levels by the two methods was 0.85. The agreement on clinical risk categorization on the standard nomogram was also good (Kappa = 0.62, sensitivity 81%, specificity 89%). Conclusions: History of dose ingested alone greatly over-estimated the number of patients who need antidotes and it was a poor predictor of risk. Paracetamol concentrations by colorimetry are rapid and inexpensive. The use of these would greatly improve the assessment of risk and greatly reduce unnecessary expenditure on antidotes. PMID:23087506

  8. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MESH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  9. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions like what uncertainty is and how uncertainty can be quantified or treated in a reliable and reproducible way.

  10. Cost Risk Analysis Based on Perception of the Engineering Process

    NASA Technical Reports Server (NTRS)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering
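
    A minimal sketch of the idea described here: propagating uncertainty in technical factors through a parametric cost relationship by Monte Carlo, rather than assuming a shape for the total-cost distribution. The cost equation and factor ranges are hypothetical, not a NASA model.

    ```python
    # Minimal sketch: a cost risk curve built from parametric uncertainty in
    # technical factors (mass, complexity) propagated through a cost relationship.
    import numpy as np

    rng = np.random.default_rng(5)
    n = 50_000

    mass = rng.triangular(800, 1000, 1400, n)        # kg, technical uncertainty
    complexity = rng.triangular(0.9, 1.0, 1.3, n)    # dimensionless factor

    cost = 2.5 * mass**0.8 * complexity              # hypothetical parametric CER

    for p in (10, 30, 50, 70, 90):
        print(f"{p}% probability that cost <= {np.percentile(cost, p):,.0f}")
    ```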

  11. A Regional Decision Support Scheme for Pest Risk Analysis in Southeast Asia.

    PubMed

    Soliman, T; MacLeod, A; Mumford, J D; Nghiem, T P L; Tan, H T W; Papworth, S K; Corlett, R T; Carrasco, L R

    2016-05-01

    A key justification to support plant health regulations is the ability of quarantine services to conduct pest risk analyses (PRA). Despite the supranational nature of biological invasions and the close proximity and connectivity of Southeast Asian countries, PRAs are conducted at the national level. Furthermore, some countries have limited experience in the development of PRAs, which may result in inadequate phytosanitary responses that put their plant resources at risk to pests vectored via international trade. We review existing decision support schemes for PRAs and, following international standards for phytosanitary measures, propose new methods that adapt existing practices to suit the unique characteristics of Southeast Asia. Using a formal written expert elicitation survey, a panel of regional scientific experts was asked to identify and rate unique traits of Southeast Asia with respect to PRA. Subsequently, an expert elicitation workshop with plant protection officials was conducted to verify the potential applicability of the developed methods. Rich biodiversity, shortage of trained personnel, social vulnerability, tropical climate, agriculture-dependent economies, high rates of land-use change, and difficulties in implementing risk management options were identified as challenging Southeast Asian traits. The developed methods emphasize local Southeast Asian conditions and could help support authorities responsible for carrying out PRAs within the region. These methods could also facilitate the creation of other PRA schemes in low- and middle-income tropical countries. © 2016 Society for Risk Analysis.

  12. A multicriteria decision analysis model and risk assessment framework for carbon capture and storage.

    PubMed

    Humphries Choptiany, John Michael; Pelot, Ronald

    2014-09-01

    Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.

  13. A Western Dietary Pattern Increases Prostate Cancer Risk: A Systematic Review and Meta-Analysis.

    PubMed

    Fabiani, Roberto; Minelli, Liliana; Bertarelli, Gaia; Bacci, Silvia

    2016-10-12

    Dietary patterns were recently applied to examine the relationship between eating habits and prostate cancer (PC) risk. While the associations between PC risk with the glycemic index and Mediterranean score have been reviewed, no meta-analysis is currently available on dietary patterns defined by "a posteriori" methods. A literature search was carried out (PubMed, Web of Science) to identify studies reporting the relationship between dietary patterns and PC risk. Relevant dietary patterns were selected and the risks estimated were calculated by a random-effect model. Multivariable-adjusted odds ratios (ORs), for a first-percentile increase in dietary pattern score, were combined by a dose-response meta-analysis. Twelve observational studies were included in the meta-analysis which identified a "Healthy pattern" and a "Western pattern". The Healthy pattern was not related to PC risk (OR = 0.96; 95% confidence interval (CI): 0.88-1.04) while the Western pattern significantly increased it (OR = 1.34; 95% CI: 1.08-1.65). In addition, the "Carbohydrate pattern", which was analyzed in four articles, was positively associated with a higher PC risk (OR = 1.64; 95% CI: 1.35-2.00). A significant linear trend between the Western (p = 0.011) pattern, the Carbohydrate (p = 0.005) pattern, and the increment of PC risk was observed. The small number of studies included in the meta-analysis suggests that further investigation is necessary to support these findings.

  14. A Western Dietary Pattern Increases Prostate Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Fabiani, Roberto; Minelli, Liliana; Bertarelli, Gaia; Bacci, Silvia

    2016-01-01

    Dietary patterns were recently applied to examine the relationship between eating habits and prostate cancer (PC) risk. While the associations of PC risk with the glycemic index and the Mediterranean score have been reviewed, no meta-analysis is currently available on dietary patterns defined by “a posteriori” methods. A literature search was carried out (PubMed, Web of Science) to identify studies reporting the relationship between dietary patterns and PC risk. Relevant dietary patterns were selected and risk estimates were calculated using a random-effects model. Multivariable-adjusted odds ratios (ORs), for a first-percentile increase in dietary pattern score, were combined by a dose-response meta-analysis. Twelve observational studies were included in the meta-analysis, which identified a “Healthy pattern” and a “Western pattern”. The Healthy pattern was not related to PC risk (OR = 0.96; 95% confidence interval (CI): 0.88–1.04) while the Western pattern significantly increased it (OR = 1.34; 95% CI: 1.08–1.65). In addition, the “Carbohydrate pattern”, which was analyzed in four articles, was positively associated with a higher PC risk (OR = 1.64; 95% CI: 1.35–2.00). A significant linear trend between the Western (p = 0.011) pattern, the Carbohydrate (p = 0.005) pattern, and the increment of PC risk was observed. The small number of studies included in the meta-analysis suggests that further investigation is necessary to support these findings. PMID:27754328

  15. Empirical analysis on future-cash arbitrage risk with portfolio VaR

    NASA Astrophysics Data System (ADS)

    Chen, Rongda; Li, Cong; Wang, Weijin; Wang, Ze

    2014-03-01

    This paper constructs a positive arbitrage position by replacing the spot index with a Chinese Exchange Traded Fund (ETF) portfolio and estimating the arbitrage-free interval of futures with the latest trade data. Then, an improved Delta-normal method, which replaces the simple linear correlation coefficient with a tail-dependence correlation coefficient, is used to measure the VaR (Value-at-Risk) of the arbitrage position. Analysis of the VaR implies that the risk of future-cash arbitrage is less than that of investing entirely in either the futures or the spot market. According to the component (compositional) VaR and the marginal VaR, the futures position should be increased and the spot position decreased appropriately to minimize the VaR, thereby minimizing risk for a given level of revenue.
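
    A minimal sketch of the underlying delta-normal VaR calculation for a hedged futures-spot position is shown below, including the marginal and component VaR used to decide how to rebalance the two legs. The position sizes, volatilities, and the correlation value (which the paper would replace with a tail-dependence coefficient) are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

# Illustrative positions (CNY): long ETF basket (spot leg), short index futures.
positions = np.array([1_000_000.0, -1_000_000.0])
daily_vol = np.array([0.015, 0.016])          # assumed daily return volatilities
rho = 0.90                                    # dependence between the two legs; the
                                              # paper substitutes a tail-dependence
                                              # coefficient for the linear rho here.
corr = np.array([[1.0, rho], [rho, 1.0]])
cov = np.outer(daily_vol, daily_vol) * corr

alpha = 0.99
z = norm.ppf(alpha)

portfolio_sigma = np.sqrt(positions @ cov @ positions)
var_portfolio = z * portfolio_sigma
print(f"1-day {alpha:.0%} delta-normal VaR of the hedged position: {var_portfolio:,.0f}")

# Marginal VaR: sensitivity of portfolio VaR to a small change in each position,
# used to decide which leg to scale up or down to reduce total risk.
marginal_var = z * (cov @ positions) / portfolio_sigma
component_var = positions * marginal_var      # components sum to the portfolio VaR
print("Marginal VaR per leg:", marginal_var)
print("Component VaR per leg:", component_var, "sum =", component_var.sum())
```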

  16. Stratospheric Aerosol and Gas Experiment, SAGE III on ISS, An Earth Science Mission on the International Space Station, Schedule Risk Analysis, A Project Perspective

    NASA Technical Reports Server (NTRS)

    Bonine, Lauren

    2015-01-01

    The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station Project. The presentation focuses on the schedule risk analysis process highlighting the methods for identification of risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk informed decision making.
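
    Schedule risk analyses of this kind are typically driven by Monte Carlo simulation over activity-duration uncertainties. The sketch below shows the general idea with triangular three-point estimates for a few serial activities; the activity names and durations are hypothetical and are not taken from the SAGE III project schedule.

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims = 20_000

# Hypothetical serial activities with (optimistic, most likely, pessimistic)
# durations in work days; values are illustrative only.
activities = {
    "instrument integration": (30, 45, 70),
    "environmental testing":  (20, 25, 45),
    "pre-ship review":        (5, 10, 20),
    "launch site processing": (15, 20, 35),
}

total = np.zeros(n_sims)
for name, (a, m, b) in activities.items():
    total += rng.triangular(a, m, b, n_sims)   # risk input per activity

deterministic = sum(m for _, m, _ in activities.values())
print(f"Deterministic (most likely) duration: {deterministic} days")
print(f"Mean simulated duration: {total.mean():.1f} days")
for p in (50, 70, 90):
    print(f"P{p} completion: {np.percentile(total, p):.1f} days")
```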

  17. Association of N-acetyltransferase 1 polymorphism and bladder cancer risk: an updated meta-analysis and trial sequential analysis.

    PubMed

    Xu, Zicheng; Li, Xiao; Qin, Zhiqiang; Xue, Jianxin; Wang, Jingyuan; Liu, Zhentao; Cai, Hongzhou; Yu, Bin; Xu, Ting; Zou, Qin

    2017-07-24

    Individual studies of the association between the N-acetyltransferase 1 (NAT1)*10 allele and bladder cancer susceptibility have shown inconclusive results. To derive a more precise estimation of any such relationship, we performed this systematic review and updated meta-analysis based on 17 publications. A total of 17 studies were investigated with 4,322 bladder cancer cases and 4,944 controls. The pooled odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of the association. Subgroup analyses were conducted based on ethnicity, sex, source of controls, and detection methods. Trial sequential analysis was then performed to evaluate whether the evidence was sufficient and to reduce the risk of type I error. There was no association between the NAT1*10 allele and bladder cancer risk in a random-effects model (OR = 0.96, 95% CI, 0.84-1.10) or in a fixed-effects model (OR = 0.95, 95% CI, 0.87-1.03). In addition, no significantly increased risk of bladder cancer was found in any other subgroup analysis. Trial sequential analysis further indicated that our results need to be verified by additional studies. Despite its limitations, the present meta-analysis suggested that there was no association between the NAT1*10 allele and bladder cancer risk. More importantly, further validation is needed to determine whether the absence of the NAT1*10 allele could serve as a potential marker for bladder cancer risk.

  18. Alcohol Intake and Risk of Thyroid Cancer: A Meta-Analysis of Observational Studies

    PubMed Central

    Hong, Seung-Hee; Myung, Seung-Kwon; Kim, Hyeon Suk

    2017-01-01

    Purpose The purpose of this study was to assess whether alcohol intake is associated with the risk of thyroid cancer by a meta-analysis of observational studies. Materials and Methods We searched PubMed and EMBASE in June of 2015 to locate eligible studies. We included observational studies such as cross-sectional studies, case-control studies, and cohort studies reporting odds ratios (ORs) or relative risks (RRs) with 95% confidence intervals (CIs). Results We included 33 observational studies with two cross-sectional studies, 20 case-control studies, and 11 cohort studies, which involved a total of 7,725 thyroid cancer patients and 3,113,679 participants without thyroid cancer in the final analysis. In the fixed-effect model meta-analysis of all 33 studies, we found that alcohol intake was consistently associated with a decreased risk of thyroid cancer (OR or RR, 0.74; 95% CI, 0.67 to 0.83; I2=38.6%). In the subgroup meta-analysis by type of study, alcohol intake also decreased the risk of thyroid cancer in both case-control studies (OR, 0.77; 95% CI, 0.65 to 0.92; I2=29.5%; n=20) and cohort studies (RR, 0.70; 95% CI, 0.60 to 0.82; I2=0%; n=11). Moreover, subgroup meta-analyses by type of thyroid cancer, gender, amount of alcohol consumed, and methodological quality of study showed that alcohol intake was significantly associated with a decreased risk of thyroid cancer. Conclusion The current meta-analysis of observational studies found that, unlike most other types of cancer, alcohol intake decreased the risk of thyroid cancer. PMID:27456949

  19. Climate Risk Informed Decision Analysis: A Hypothetical Application to the Waas Region

    NASA Astrophysics Data System (ADS)

    Gilroy, Kristin; Mens, Marjolein; Haasnoot, Marjolijn; Jeuken, Ad

    2016-04-01

    More frequent and intense hydrologic events under climate change are expected to intensify water security and flood risk management challenges worldwide. Traditional planning approaches must be adapted to address climate change and develop solutions with an appropriate level of robustness and flexibility. The Climate Risk Informed Decision Analysis (CRIDA) method is a novel planning approach embodying a suite of complementary methods, including decision scaling and adaptation pathways. Decision scaling offers a bottom-up approach to assess risk and tailors the complexity of the analysis to the problem at hand and the available capacity. Through adaptation pathways, an array of future strategies towards climate robustness is developed, ranging in flexibility and immediacy of investments. Flexible pathways include transfer points to other strategies to ensure that the system can be adapted if future conditions vary from those expected. CRIDA combines these two approaches in a stakeholder driven process which guides decision makers through the planning and decision process, taking into account how the confidence in the available science, the consequences in the system, and the capacity of institutions should influence strategy selection. In this presentation, we will explain the CRIDA method and compare it to existing planning processes, such as the US Army Corps of Engineers Principles and Guidelines as well as Integrated Water Resources Management Planning. Then, we will apply the approach to a hypothetical case study for the Waas Region, a large downstream river basin facing rapid development threatened by increased flood risks. Through the case study, we will demonstrate how a stakeholder driven process can be used to evaluate system robustness to climate change; develop adaptation pathways for multiple objectives and criteria; and illustrate how varying levels of confidence, consequences, and capacity would play a role in the decision making process, specifically

  20. Cognitive impairment and risk of future stroke: a systematic review and meta-analysis

    PubMed Central

    Lee, Meng; Saver, Jeffrey L.; Hong, Keun-Sik; Wu, Yi-Ling; Liu, Hsing-Cheng; Rao, Neal M.; Ovbiagele, Bruce

    2014-01-01

    Background: Several studies have assessed the link between cognitive impairment and risk of future stroke, but results have been inconsistent. We conducted a systematic review and meta-analysis of cohort studies to determine the association between cognitive impairment and risk of future stroke. Methods: We searched MEDLINE and Embase (1966 to November 2013) and conducted a manual search of bibliographies of relevant retrieved articles and reviews. We included cohort studies that reported multivariable adjusted relative risks and 95% confidence intervals or standard errors for stroke with respect to baseline cognitive impairment. Results: We identified 18 cohort studies (total 121 879 participants) and 7799 stroke events. Pooled analysis of results from all studies showed that stroke risk increased among patients with cognitive impairment at baseline (relative risk [RR] 1.39, 95% confidence interval [CI] 1.24–1.56). The results were similar when we restricted the analysis to studies that used a widely adopted definition of cognitive impairment (i.e., Mini-Mental State Examination score < 25 or nearest equivalent) (RR 1.64, 95% CI 1.46–1.84). Cognitive impairment at baseline was also associated with an increased risk of fatal stroke (RR 1.68, 95% CI 1.21–2.33) and ischemic stroke (RR 1.65, 95% CI 1.41–1.93). Interpretation: Baseline cognitive impairment was associated with a significantly higher risk of future stroke, especially ischemic and fatal stroke. PMID:25157064

  1. Assessing environmental risks for high intensity agriculture using the material flow analysis method--a case study of the Dongting Lake basin in South Central China.

    PubMed

    Yin, Guanyi; Liu, Liming; Yuan, Chengcheng

    2015-07-01

    This study primarily examined the assessment of environmental risk in high intensity agricultural areas. The Dongting Lake basin, one of the major grain-producing areas in China, was taken as a case study. Using data obtained from 1989 to 2012, we applied Material Flow Analysis (MFA) to show the material consumption, pollutant output and production storage in the agricultural-environmental system and assessed the environmental risk index on the basis of the MFA results. The results predicted that the status of the environmental quality of the Dongting Lake area is unsatisfactory for the foreseeable future. The direct material input (DMI) declined by 13.9%, the domestic processed output (DPO) increased by 28.21%, the intensity of material consumption (IMC) decreased by 36.7%, the intensity of material discharge (IMD) increased by 10%, the material productivity (MP) increased by 27 times, the environmental efficiency (EE) increased by 15.31 times, and the material storage (PAS) increased by 0.23%. The DMI and DPO were higher in rural places on the edge of cities, whereas the risk from urban agriculture has grown because DMI and DPO increased faster in cities than in the counties. The composite environmental risk index increased from 0.33 to 0.96, indicating that the total environmental risk changed gradually but seriously during the 24 years assessed. The driving factors that affect environmental risk in high intensity agriculture can be divided into five classes: social, economic, human, natural and disruptive incidents. This study discussed a number of effective measures for protecting the environment while ensuring food production yields. Additional research in other areas and certain improvements of this method in future studies may be necessary to develop a more effective method of managing and controlling agricultural-environmental interactions.

  2. Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks

    NASA Technical Reports Server (NTRS)

    Brown, Richard Lee

    2008-01-01

    Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology

  3. FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES

    DTIC Science & Technology

    2017-06-01

    This thesis by Amanda Donnelly develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and

  4. Modeling Opponents in Adversarial Risk Analysis.

    PubMed

    Rios Insua, David; Banks, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis has been introduced as a framework to deal with risks derived from intentional actions of adversaries. The analysis supports one of the decisionmakers, who must forecast the actions of the other agents. Typically, this forecast must take account of random consequences resulting from the set of selected actions. The solution requires one to model the behavior of the opponents, which entails strategic thinking. The supported agent may face different kinds of opponents, who may use different rationality paradigms, for example, the opponent may behave randomly, or seek a Nash equilibrium, or perform level-k thinking, or use mirroring, or employ prospect theory, among many other possibilities. We describe the appropriate analysis for these situations, and also show how to model the uncertainty about the rationality paradigm used by the opponent through a Bayesian model averaging approach, enabling a fully decision-theoretic solution. We also show how as we observe an opponent's decision behavior, this approach allows learning about the validity of each of the rationality models used to predict his decision by computing the models' (posterior) probabilities, which can be understood as a measure of their validity. We focus on simultaneous decision making by two agents. © 2015 Society for Risk Analysis.
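
    The Bayesian model averaging step described above can be illustrated with a toy example: several candidate rationality models each assign probabilities to the opponent's actions, the posterior over models is updated from observed behavior, and the model-averaged forecast predicts the next action. The action set, candidate models, and probabilities below are invented for illustration and are not from the article.

```python
import numpy as np

actions = ["attack_site_1", "attack_site_2", "no_attack"]

# Candidate rationality models for the opponent: each gives a probability
# distribution over the opponent's actions (illustrative numbers only).
models = {
    "random":           np.array([1/3, 1/3, 1/3]),
    "payoff_maximizer": np.array([0.7, 0.2, 0.1]),   # best-responds to assumed payoffs
    "prospect_theory":  np.array([0.5, 0.4, 0.1]),   # overweights the salient target
}
prior = {name: 1/3 for name in models}

observed = ["attack_site_1", "attack_site_1", "attack_site_2"]  # past opponent choices

# Posterior model probabilities P(model | data) via Bayes' rule.
log_post = {m: np.log(prior[m]) for m in models}
for obs in observed:
    idx = actions.index(obs)
    for m, probs in models.items():
        log_post[m] += np.log(probs[idx])
norm = np.logaddexp.reduce(list(log_post.values()))
posterior = {m: np.exp(lp - norm) for m, lp in log_post.items()}
print("Posterior model probabilities:", {m: round(p, 3) for m, p in posterior.items()})

# Model-averaged forecast of the opponent's next action.
forecast = sum(posterior[m] * models[m] for m in models)
print("Predicted action distribution:", dict(zip(actions, np.round(forecast, 3))))
```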

  5. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  6. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  7. Hydrologic risk analysis in the Yangtze River basin through coupling Gaussian mixtures into copulas

    NASA Astrophysics Data System (ADS)

    Fan, Y. R.; Huang, W. W.; Huang, G. H.; Li, Y. P.; Huang, K.; Li, Z.

    2016-02-01

    In this study, a bivariate hydrologic risk framework is proposed through coupling Gaussian mixtures into copulas, leading to a coupled GMM-copula method. In the coupled GMM-copula method, the marginal distributions of flood peak, volume and duration are quantified through Gaussian mixture models and the joint probability distributions of flood peak-volume, peak-duration and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period of flood variable pairs. The proposed method is applied to the risk analysis for the Yichang station on the main stream of the Yangtze River, China. The results indicate that (i) the bivariate risk for flood peak-volume would remain constant for flood volumes less than 1.0 × 10⁵ m³/s·day, but would show a significant decreasing trend for flood volumes larger than 1.7 × 10⁵ m³/s·day; and (ii) the bivariate risk for flood peak-duration would not change significantly for flood durations less than 8 days, and then decrease significantly as the duration becomes larger. The probability density functions (pdfs) of the flood volume and duration conditional on flood peak can also be generated through the fitted copulas. The results indicate that the conditional pdfs of flood volume and duration follow bimodal distributions, with the occurrence frequency of the first vertex decreasing and that of the latter increasing as the flood peak increases. The conclusions obtained from the bivariate hydrologic analysis can provide decision support for flood control and mitigation.
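
    A simplified sketch of the GMM-copula idea is given below: Gaussian mixture models provide the marginal distributions of flood peak and volume, and a Gaussian copula (one of several copula families a real study would test) links them so that a joint exceedance probability and an "AND" joint return period can be computed. The synthetic flood series and the design event are illustrative assumptions, not Yichang station data.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic annual flood peaks (m3/s) and volumes (m3/s·day); real data would
# come from the gauging station record. Values are illustrative only.
n = 200
peak = np.exp(rng.normal(10.6, 0.25, n))
volume = 8.0 * peak * np.exp(rng.normal(0.0, 0.15, n))

def gmm_cdf(x, gmm):
    """Marginal CDF of a 1-D Gaussian mixture model."""
    w = gmm.weights_
    mu = gmm.means_.ravel()
    sd = np.sqrt(gmm.covariances_).ravel()
    return np.sum(w * norm.cdf((x[:, None] - mu) / sd), axis=1)

gmm_p = GaussianMixture(n_components=2, random_state=0).fit(peak.reshape(-1, 1))
gmm_v = GaussianMixture(n_components=2, random_state=0).fit(volume.reshape(-1, 1))

# Gaussian copula: transform to normal scores and estimate the dependence.
u = np.clip(gmm_cdf(peak, gmm_p), 1e-6, 1 - 1e-6)
v = np.clip(gmm_cdf(volume, gmm_v), 1e-6, 1 - 1e-6)
z = np.column_stack([norm.ppf(u), norm.ppf(v)])
rho = np.corrcoef(z.T)[0, 1]

# Joint exceedance probability and "AND" return period for a design event.
design_peak, design_vol = 60_000.0, 500_000.0
Fp = gmm_cdf(np.array([design_peak]), gmm_p)[0]
Fv = gmm_cdf(np.array([design_vol]), gmm_v)[0]
copula = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]])
joint_cdf = copula.cdf([norm.ppf(Fp), norm.ppf(Fv)])
p_exceed = 1.0 - Fp - Fv + joint_cdf        # P(peak > design AND volume > design)
print(f"rho (normal scores) = {rho:.2f}")
print(f"Joint exceedance probability = {p_exceed:.4f}; "
      f"joint return period about {1.0 / p_exceed:.0f} years")
```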

  8. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments.

    PubMed

    Conners, Erin E; West, Brooke S; Roth, Alexis M; Meckel-Parker, Kristen G; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D; Brouwer, Kimberly C

    2016-01-01

    Increasingly, 'place', including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions.

  9. Quantitative, Qualitative and Geospatial Methods to Characterize HIV Risk Environments

    PubMed Central

    Conners, Erin E.; West, Brooke S.; Roth, Alexis M.; Meckel-Parker, Kristen G.; Kwan, Mei-Po; Magis-Rodriguez, Carlos; Staines-Orozco, Hugo; Clapp, John D.; Brouwer, Kimberly C.

    2016-01-01

    Increasingly, ‘place’, including physical and geographical characteristics as well as social meanings, is recognized as an important factor driving individual and community health risks. This is especially true among marginalized populations in low and middle income countries (LMIC), whose environments may also be more difficult to study using traditional methods. In the NIH-funded longitudinal study Mapa de Salud, we employed a novel approach to exploring the risk environment of female sex workers (FSWs) in two Mexico/U.S. border cities, Tijuana and Ciudad Juárez. In this paper we describe the development, implementation, and feasibility of a mix of quantitative and qualitative tools used to capture the HIV risk environments of FSWs in an LMIC setting. The methods were: 1) Participatory mapping; 2) Quantitative interviews; 3) Sex work venue field observation; 4) Time-location-activity diaries; 5) In-depth interviews about daily activity spaces. We found that the mixed-methodology outlined was both feasible to implement and acceptable to participants. These methods can generate geospatial data to assess the role of the environment on drug and sexual risk behaviors among high risk populations. Additionally, the adaptation of existing methods for marginalized populations in resource constrained contexts provides new opportunities for informing public health interventions. PMID:27191846

  10. Chemical Mixtures Health Risk Assessment of Environmental Contaminants: Concepts, Methods, And Applications

    EPA Science Inventory

    This problems-based, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks from multic...

  11. Pleiotropic analysis of cancer risk loci on esophageal adenocarcinoma risk

    PubMed Central

    Lee, Eunjung; Stram, Daniel O.; Ek, Weronica E.; Onstad, Lynn E; MacGregor, Stuart; Gharahkhani, Puya; Ye, Weimin; Lagergren, Jesper; Shaheen, Nicholas J.; Murray, Liam J.; Hardie, Laura J; Gammon, Marilie D.; Chow, Wong-Ho; Risch, Harvey A.; Corley, Douglas A.; Levine, David M; Whiteman, David C.; Bernstein, Leslie; Bird, Nigel C.; Vaughan, Thomas L.; Wu, Anna H.

    2015-01-01

    Background Several cancer-associated loci identified from genome-wide association studies (GWAS) have been associated with risks of multiple cancer sites, suggesting pleiotropic effects. We investigated whether GWAS-identified risk variants for other common cancers are associated with risk of esophageal adenocarcinoma (EA) or its precursor, Barrett's esophagus (BE). Methods We examined the associations between risks of EA and BE and 387 single nucleotide polymorphisms (SNPs) that have been associated with risks of other cancers, by using genotype imputation data on 2,163 control participants and 3,885 (1,501 EA and 2,384 BE) case patients from the Barrett's and Esophageal Adenocarcinoma Genetic Susceptibility Study, and investigated effect modification by smoking history, body mass index (BMI), and reflux/heartburn. Results After correcting for multiple testing, none of the tested 387 SNPs were statistically significantly associated with risk of EA or BE. No evidence of effect modification by smoking, BMI, or reflux/heartburn was observed. Conclusions Genetic risk variants for common cancers identified from GWAS appear not to be associated with risks of EA or BE. Impact To our knowledge, this is the first investigation of pleiotropic genetic associations with risks of EA and BE. PMID:26364162

  12. Spatiotemporal analysis and mapping of oral cancer risk in changhua county (taiwan): an application of generalized bayesian maximum entropy method.

    PubMed

    Yu, Hwa-Lung; Chiang, Chi-Ting; Lin, Shu-De; Chang, Tsun-Kuo

    2010-02-01

    The incidence rate of oral cancer in Changhua County was the highest among the 23 counties of Taiwan during 2001. However, in health data analysis, crude or adjusted incidence rates of a rare event (e.g., cancer) for small populations often exhibit high variances and are, thus, less reliable. We proposed a generalized Bayesian Maximum Entropy (GBME) analysis of spatiotemporal disease mapping under conditions of considerable data uncertainty. GBME was used to study the oral cancer population incidence in Changhua County (Taiwan). Methodologically, GBME is based on an epistematics principles framework and generates spatiotemporal estimates of oral cancer incidence rates. In a way, it accounts for the multi-sourced uncertainty of rates, including small population effects, and the composite space-time dependence of rare events in terms of an extended Poisson-based semivariogram. The results showed that GBME analysis alleviates the noise in oral cancer data arising from the population size effect. Compared to the raw incidence data, the maps of GBME-estimated results can identify high risk oral cancer regions in Changhua County, where the prevalence of betel quid chewing and cigarette smoking is relatively higher than in the rest of the areas. The GBME method is a valuable tool for spatiotemporal disease mapping under conditions of uncertainty. 2010 Elsevier Inc. All rights reserved.

  13. Kaplan-Meier survival analysis overestimates cumulative incidence of health-related events in competing risk settings: a meta-analysis.

    PubMed

    Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter; Ghali, William A; Marshall, Deborah A

    2018-01-01

    Kaplan-Meier survival analysis overestimates cumulative incidence in competing risks (CRs) settings. The extent of overestimation (or its clinical significance) has been questioned, and CRs methods are infrequently used. This meta-analysis compares the Kaplan-Meier method to the cumulative incidence function (CIF), a CRs method. We searched MEDLINE, EMBASE, BIOSIS Previews, Web of Science (1992-2016), and article bibliographies for studies estimating cumulative incidence using the Kaplan-Meier method and CIF. For studies with sufficient data, we calculated pooled risk ratios (RRs) comparing Kaplan-Meier and CIF estimates using DerSimonian and Laird random effects models. We performed stratified meta-analyses by clinical area, rate of CRs (CRs/events of interest), and follow-up time. Of 2,192 identified abstracts, we included 77 studies in the systematic review and meta-analyzed 55. The pooled RR demonstrated the Kaplan-Meier estimate was 1.41 [95% confidence interval (CI): 1.36, 1.47] times higher than the CIF. Overestimation was highest among studies with high rates of CRs [RR = 2.36 (95% CI: 1.79, 3.12)], studies related to hepatology [RR = 2.60 (95% CI: 2.12, 3.19)], and obstetrics and gynecology [RR = 1.84 (95% CI: 1.52, 2.23)]. The Kaplan-Meier method overestimated the cumulative incidence across 10 clinical areas. Using CRs methods will ensure accurate results inform clinical and policy decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
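
    The overestimation discussed above is easy to reproduce on synthetic data: treating competing events as censoring inflates the Kaplan-Meier complement relative to the Aalen-Johansen cumulative incidence function. The sketch below uses made-up exponential event times and no independent censoring, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

# Synthetic competing-risks data: time to the event of interest (cause 1)
# vs. time to a competing event (cause 2); exponential scales are illustrative.
t_event = rng.exponential(scale=10.0, size=n)     # cause 1
t_compete = rng.exponential(scale=8.0, size=n)    # cause 2
time = np.minimum(t_event, t_compete)
cause = np.where(t_event <= t_compete, 1, 2)      # 1 = interest, 2 = competing

order = np.argsort(time)
time, cause = time[order], cause[order]

horizon = 15.0
at_risk = n
km_surv = 1.0          # Kaplan-Meier, competing events treated as censoring
aj_surv = 1.0          # overall (all-cause) survival for Aalen-Johansen
cif = 0.0              # cumulative incidence function for cause 1

for t, c in zip(time, cause):
    if t > horizon:
        break
    if c == 1:
        km_surv *= 1.0 - 1.0 / at_risk            # KM step at cause-1 events only
        cif += aj_surv * (1.0 / at_risk)          # CIF increment uses all-cause survival
    aj_surv *= 1.0 - 1.0 / at_risk                # any event removes a subject
    at_risk -= 1

print(f"1 - KM at t={horizon}:  {1.0 - km_surv:.3f}   (competing events as censoring)")
print(f"CIF at t={horizon}:     {cif:.3f}   (Aalen-Johansen, competing risks)")
print(f"Overestimation ratio: {(1.0 - km_surv) / cif:.2f}")
```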

  14. A performance-based approach to landslide risk analysis

    NASA Astrophysics Data System (ADS)

    Romeo, R. W.

    2009-04-01

    An approach for the risk assessment based on a probabilistic analysis of the performance of structures threatened by landslides is shown and discussed. The risk is a possible loss due to the occurrence of a potentially damaging event. Analytically the risk is the probability convolution of hazard, which defines the frequency of occurrence of the event (i.e., the demand), and fragility that defines the capacity of the system to withstand the event given its characteristics (i.e., severity) and those of the exposed goods (vulnerability), that is: Risk = P(D ≥ d | S, V). The inequality sets a damage (or loss) threshold beyond which the system's performance is no longer met. Therefore a consistent approach to risk assessment should: 1) adopt a probabilistic model which takes into account all the uncertainties of the involved variables (capacity and demand), 2) follow a performance approach based on given loss or damage thresholds. The proposed method belongs to the category of the semi-empirical ones: the theoretical component is given by the probabilistic capacity-demand model; the empirical component is given by the observed statistical behaviour of structures damaged by landslides. Two landslide properties alone are required: the area-extent and the type (or kinematism). All other properties required to determine the severity of landslides (such as depth, speed and frequency) are derived via probabilistic methods. The severity (or intensity) of landslides, in terms of kinetic energy, is the demand of resistance; the resistance capacity is given by the cumulative distribution functions of the limit state performance (fragility functions) assessed via damage surveys and cards compilation. The investigated limit states are aesthetic (of nominal concern alone), functional (interruption of service) and structural (economic and social losses). The damage probability is the probabilistic convolution of hazard (the probability mass function of the frequency of occurrence of given
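
    The hazard-fragility convolution described above can be sketched numerically as a sum over intensity bins of the annual event frequency times the probability of exceeding a limit state at that intensity. The kinetic-energy bins, frequencies, and lognormal fragility parameters below are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np
from scipy.stats import lognorm

# Annual frequency of landslide events by kinetic-energy class (kJ);
# discretization and numbers are illustrative only.
energy = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])          # intensity bins
annual_freq = np.array([0.10, 0.04, 0.02, 0.005, 0.001])       # events/year per bin

# Fragility: probability of exceeding a limit state given intensity, modeled
# as a lognormal CDF (median capacity and dispersion are assumptions).
def fragility(e, median, beta):
    return lognorm.cdf(e, s=beta, scale=median)

limit_states = {"aesthetic": (30.0, 0.6), "functional": (150.0, 0.5), "structural": (600.0, 0.4)}

for name, (median, beta) in limit_states.items():
    # Annual rate of exceeding the limit state = sum over bins of freq x fragility.
    rate = np.sum(annual_freq * fragility(energy, median, beta))
    prob_annual = 1.0 - np.exp(-rate)       # Poisson assumption for event arrivals
    print(f"{name:>10}: annual exceedance probability ~ {prob_annual:.4f}")
```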

  15. [Legal and methodical aspects of occupational risk management].

    PubMed

    2011-01-01

    Legal and methodical aspects of occupational risk management (ORM) are considered, taking into account new official documents. The introduction of the notions of risk and risk management into the Labor Code reflects changing forms of occupational health and safety. The role of hygienists and occupational medicine professionals in workplace conditions certification (WCC) and periodic medical examinations (PME) is strengthened. ORM could be improved by introducing a prognosis and causation module based on information technologies that links the WCC and PME systems, thus improving the effectiveness of preventive measures.

  16. Diagnostic accuracy of different caries risk assessment methods. A systematic review.

    PubMed

    Senneby, Anna; Mejàre, Ingegerd; Sahlin, Nils-Eric; Svensäter, Gunnel; Rohlin, Madeleine

    2015-12-01

    To evaluate the accuracy of different methods used to identify individuals with increased risk of developing dental coronal caries. Studies on the following methods were included: previous caries experience, tests using microbiota, buffering capacity, salivary flow rate, oral hygiene, dietary habits and sociodemographic variables. QUADAS-2 was used to assess risk of bias. Sensitivity, specificity, predictive values, and likelihood ratios (LR) were calculated. Quality of evidence based on ≥3 studies of a method was rated according to GRADE. PubMed, Cochrane Library, Web of Science and reference lists of included publications were searched up to January 2015. From 5776 identified articles, 18 were included. Assessment of study quality identified methodological limitations concerning study design, test technology and reporting. No study presented low risk of bias in all domains. Three or more studies were found only for previous caries experience and salivary mutans streptococci, and the quality of evidence for these methods was low. Evidence regarding other methods was lacking. For previous caries experience, sensitivity ranged between 0.21 and 0.94 and specificity between 0.20 and 1. Tests using salivary mutans streptococci resulted in low sensitivity and high specificity. For children with primary teeth at baseline, pooled LR for a positive test was 3 for previous caries experience and 4 for salivary mutans streptococci, given a threshold ≥10⁵ CFU/ml. Evidence on the validity of the analysed methods used for caries risk assessment is limited. As methodological quality was low, there is a need to improve study design. Low validity for the analysed methods may lead to patients with increased risk not being identified, whereas some are falsely identified as being at risk. As caries risk assessment guides individualized decisions on interventions and intervals for patient recall, improved performance based on best evidence is greatly needed. Copyright © 2015 Elsevier Ltd
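
    The accuracy measures reported in such reviews come directly from 2x2 tables of test result versus observed caries development. A minimal sketch, with invented counts, of how sensitivity, specificity, predictive values and likelihood ratios are computed is shown below.

```python
# Hypothetical 2x2 table for a caries risk test (e.g., a salivary bacterial count
# above a threshold) against observed caries development; counts are illustrative.
tp, fp, fn, tn = 40, 30, 20, 110

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)                       # positive predictive value
npv = tn / (tn + fn)                       # negative predictive value
lr_pos = sensitivity / (1 - specificity)   # likelihood ratio of a positive test
lr_neg = (1 - sensitivity) / specificity   # likelihood ratio of a negative test

print(f"Sensitivity = {sensitivity:.2f}, Specificity = {specificity:.2f}")
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
print(f"LR+ = {lr_pos:.1f}, LR- = {lr_neg:.2f}")
```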

  17. Chemical Mixtures Health Risk Assessment of Environmental Contaminants: Concepts, Methods, Applications: Handbook

    EPA Science Inventory

    This problems-based, half-day, introductory workshop focuses on methods to assess health risks posed by exposures to chemical mixtures in the environment. Chemical mixtures health risk assessment methods continue to be developed and evolve to address concerns over health risks f...

  18. A quick inexpensive laboratory method in acute paracetamol poisoning could improve risk assessment, management and resource utilization.

    PubMed

    Senarathna, S M D K Ganga; Ranganathan, Shalini S; Buckley, Nick; Soysa, S S S B D Preethi; Fernandopulle, B M Rohini

    2012-01-01

    Acute paracetamol poisoning is an emerging problem in Sri Lanka. Management guidelines recommend using the ingested dose and serum paracetamol concentrations to assess risk. Our aim was to determine the usefulness of the patient's history of an ingested dose of >150 mg/kg and paracetamol concentration obtained by a simple colorimetric method to assess risk in patients with acute paracetamol poisoning. Serum paracetamol concentrations were determined in 100 patients with a history of paracetamol overdose using High Performance Liquid Chromatography (HPLC) as the reference method. The results were compared to those obtained with a colorimetric method. The utility of risk assessment based on the reported ingested dose was compared with that based on colorimetric analysis. The area under the receiver operating characteristic curve for the history of ingested dose was 0.578 and there was no dose cut-off providing useful risk categorization. Both analytical methods had less than 5% intra- and inter-batch variation and were accurate on spiked samples. Colorimetry was six times faster from blood collection to result and ten times cheaper (30 minutes, US$2) than HPLC (180 minutes, US$20). The correlation coefficient between the paracetamol levels by the two methods was 0.85. The agreement on clinical risk categorization on the standard nomogram was also good (Kappa = 0.62, sensitivity 81%, specificity 89%). History of dose ingested alone greatly over-estimated the number of patients who needed antidotes and was a poor predictor of risk. Paracetamol concentrations by colorimetry are rapid and inexpensive. Their use would greatly improve the assessment of risk and greatly reduce unnecessary expenditure on antidotes.

  19. The risk of lung cancer among cooking adults: a meta-analysis of 23 observational studies.

    PubMed

    Jia, Peng-Li; Zhang, Chao; Yu, Jia-Jie; Xu, Chang; Tang, Li; Sun, Xin

    2018-02-01

    Cooking has been regarded as a potential risk factor for lung cancer. We aimed to investigate the evidence on cooking oil fume exposure and the risk of lung cancer. Medline and Embase were searched for eligible studies. We conducted a meta-analysis to summarize the evidence from case-control and cohort studies, with subgroup analyses for potential discrepancies. Sensitivity analysis was employed to test the robustness of the results. We included 23 observational studies, involving 9411 lung cancer cases. Our meta-analysis found that, for cooking females, the pooled OR of cooking oil fume exposure was 1.98 (95% CI 1.54, 2.54; I² = 79%, n = 15) among non-smoking populations and 2.00 (95% CI 1.46, 2.74; I² = 75%, n = 10) among partly smoking populations. For cooking males, the pooled OR of lung cancer was 1.15 (95% CI 0.71, 1.87; I² = 80%, n = 4). When subgrouped by ventilation condition, the pooled OR for poor ventilation was 1.20 (95% CI 1.10, 1.31; I² = 2%) compared to good ventilation. For different cooking methods, our results suggested that stir frying (OR = 1.89, 95% CI 1.23, 2.90; I² = 66%) was associated with an increased risk of lung cancer, whereas deep frying (OR = 1.41, 95% CI 0.87, 2.29; I² = 5%) was not. Sensitivity analysis suggested our results were stable. Cooking oil fume is likely to be a risk factor for lung cancer in females, regardless of smoking status. Poor ventilation may increase the risk of lung cancer. Cooking methods may affect lung cancer risk differently; deep frying may be healthier than stir frying.

  20. Dietary magnesium intake and risk of metabolic syndrome: a meta-analysis

    PubMed Central

    Dibaba, D. T.; Xun, P.; Fly, A. D.; Yokota, K.; He, K.

    2014-01-01

    Aims To estimate quantitatively the association between dietary magnesium intake and risk of metabolic syndrome by combining the relevant published articles using meta-analysis. Methods We reviewed the relevant literature in PubMed and EMBASE published up until August 2013 and obtained additional information through Google or a hand search of the references in relevant articles. A random-effects or fixed-effects model, as appropriate, was used to pool the effect sizes on metabolic syndrome comparing individuals with the highest dietary magnesium intake with those having the lowest intake. The dose–response relationship was assessed for every 100-mg/day increment in magnesium intake and risk of metabolic syndrome. Results Six cross-sectional studies, including a total of 24 473 individuals and 6311 cases of metabolic syndrome, were identified as eligible for the meta-analysis. A weighted inverse association was found between dietary magnesium intake and the risk of metabolic syndrome (odds ratio 0.69, 95% CI 0.59, 0.81) comparing the highest with the lowest group. For every 100-mg/day increment in magnesium intake, the overall risk of having metabolic syndrome was lowered by 17% (odds ratio 0.83, 95% CI 0.77, 0.89). Conclusion Findings from the present meta-analysis suggest that dietary magnesium intake is inversely associated with the prevalence of metabolic syndrome. Further studies, in particular well-designed longitudinal cohort studies and randomized placebo-controlled clinical trials, are warranted to provide solid evidence and to establish causal inference. PMID:24975384

  1. Bisphosphonates and risk of atrial fibrillation: a meta-analysis

    PubMed Central

    2010-01-01

    Introduction Bisphosphonates are the most commonly used drugs for the prevention and treatment of osteoporosis. Although a recent FDA review of the results of clinical trials reported no clear link between bisphosphonates and serious or non-serious atrial fibrillation (AF), some epidemiologic studies have suggested an association between AF and bisphosphonates. Methods We conducted a meta-analysis of non-experimental studies to evaluate the risk of AF associated with bisphosphonates. Studies were identified by searching MEDLINE and EMBASE using a combination of the Medical Subject Headings and keywords. Our search was limited to English language articles. The pooled estimates of odds ratios (OR) as a measure of effect size were calculated using a random effects model. Results Seven eligible studies with 266,761 patients were identified: three cohort, three case-control, and one self-controlled case series. Bisphosphonate exposure was not associated with an increased risk of AF [pooled multivariate OR 1.04, 95% confidence interval (CI) 0.92-1.16] after adjusting for known risk factors. Moderate heterogeneity was noted (I-squared score = 62.8%). Stratified analyses by study design, cohort versus case-control studies, yielded similar results. Egger's and Begg's tests did not suggest evidence of publication bias (P = 0.90 and 1.00, respectively). No clear asymmetry was observed in the funnel plot analysis. Few studies compared risk between bisphosphonates or by dosing. Conclusions Our study did not find an association between bisphosphonate exposure and AF. This finding is consistent with the FDA's statement. PMID:20170505
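
    The Egger test mentioned above regresses the standardized effect size on precision; an intercept that differs from zero suggests funnel-plot asymmetry and possible publication bias. The sketch below shows the computation on invented study-level log odds ratios and standard errors, not the meta-analysis data.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical study-level log odds ratios and standard errors (7 studies),
# purely illustrative.
log_or = np.array([0.05, -0.10, 0.12, 0.02, -0.04, 0.20, 0.00])
se = np.array([0.08, 0.15, 0.20, 0.06, 0.12, 0.25, 0.10])

# Egger's regression test: standardized effect vs. precision; a non-zero
# intercept suggests small-study effects / publication bias.
z = log_or / se
precision = 1.0 / se
model = sm.OLS(z, sm.add_constant(precision)).fit()
intercept, p_value = model.params[0], model.pvalues[0]
print(f"Egger intercept = {intercept:.3f}, p = {p_value:.3f}")
```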

  2. An evaluation of Computational Fluid dynamics model for flood risk analysis

    NASA Astrophysics Data System (ADS)

    Di Francesco, Silvia; Biscarini, Chiara; Montesarchio, Valeria

    2014-05-01

    This work presents an analysis of the hydrological-hydraulic engineering requisites for risk evaluation and efficient flood damage reduction plans. Most research efforts have been dedicated to the scientific and technical aspects of risk assessment, providing estimates of possible alternatives and of the associated risk. In the decision-making process for a mitigation plan, the contribution of scientists is crucial, because risk-damage analysis is based on evaluation of the flow field and of the hydraulic risk, as well as on economic and societal considerations. The present paper focuses on the first part of the process, the mathematical modelling of flood events, which is the basis for all further considerations. The evaluation of potential catastrophic damage consequent to a flood event, and in particular to dam failure, requires modelling the flood with sufficient detail to capture the spatial and temporal evolution of the event, as well as the velocity field. Thus, the selection of an appropriate mathematical model to correctly simulate flood routing is an essential step. In this work we present the application of two 3D computational fluid dynamics models to a synthetic and a real case study in order to evaluate the evolution of the flow field and the associated flood risk. The first model is based on an open-source CFD platform called OpenFOAM. Water flow is schematized with a classical continuum approach based on the Navier-Stokes equations coupled with the Volume of Fluid (VOF) method to take into account the multiphase character of river bottom-water-air systems. The second model is based on the Lattice Boltzmann method, an innovative numerical fluid dynamics scheme based on Boltzmann's kinetic equation that represents the flow dynamics at the macroscopic level by incorporating a microscopic kinetic approach: the fluid is seen as composed of particles that can move and collide with one another. Simulation results from both models are promising and congruent to

  3. Initial Risk Analysis and Decision Making Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.

    2012-02-01

    Commercialization of new carbon capture simulation initiative (CCSI) technology will include two key elements of risk management, namely, technical risk (will process and plant performance be effective, safe, and reliable) and enterprise risk (can project losses and costs be controlled within the constraints of market demand to maintain profitability and investor confidence). Both of these elements of risk are incorporated into the risk analysis subtask of Task 7. Thus far, this subtask has developed a prototype demonstration tool that quantifies risk based on the expected profitability of expenditures when retrofitting carbon capture technology on a stylized 650 MW pulverized coal electric power generator. The prototype is based on the selection of specific technical and financial factors believed to be important determinants of the expected profitability of carbon capture, subject to uncertainty. The uncertainty surrounding the technical performance and financial variables selected thus far is propagated in a model that calculates the expected profitability of investments in carbon capture and measures risk in terms of variability in expected net returns from these investments. Given the preliminary nature of the results of this prototype, additional work is required to expand the scope of the model to include additional risk factors, additional information on extant and proposed risk factors, the results of a qualitative risk factor elicitation process, and feedback from utilities and other interested parties involved in the carbon capture project. Additional information on proposed distributions of these risk factors will be integrated into a commercial implementation framework for the purpose of a comparative technology investment analysis.
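
    The kind of uncertainty propagation described above can be sketched as a Monte Carlo simulation that draws technical and financial factors from assumed distributions and computes the resulting distribution of net returns. Every number and distribution below is a placeholder for illustration and does not reflect CCSI data or results.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims = 50_000

# Illustrative retrofit economics for a 650 MW unit; all values are assumptions.
capex = rng.triangular(0.8e9, 1.0e9, 1.5e9, n_sims)          # capture retrofit cost ($)
capture_eff = rng.normal(0.90, 0.03, n_sims).clip(0.7, 0.99) # fraction of CO2 captured
energy_penalty = rng.triangular(0.20, 0.25, 0.35, n_sims)    # lost net output fraction
co2_price = rng.normal(60.0, 15.0, n_sims)                   # $/tonne CO2 avoided
annual_co2 = 4.0e6                                           # tonnes/year emitted pre-capture
power_price = 55.0                                           # $/MWh
annual_mwh = 650 * 8760 * 0.85                               # MWh/year at 85% capacity factor
years, discount = 20, 0.08

annuity = (1 - (1 + discount) ** -years) / discount
revenue_co2 = capture_eff * annual_co2 * co2_price            # annual value of captured CO2
lost_power = energy_penalty * annual_mwh * power_price        # annual cost of energy penalty
npv = annuity * (revenue_co2 - lost_power) - capex            # net present value per draw

print(f"Mean NPV: ${npv.mean() / 1e6:,.0f} M")
print(f"P(NPV < 0) = {(npv < 0).mean():.2f}")
print(f"5th / 95th percentile NPV: ${np.percentile(npv, 5)/1e6:,.0f} M / "
      f"${np.percentile(npv, 95)/1e6:,.0f} M")
```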

  4. School Health Promotion Policies and Adolescent Risk Behaviors in Israel: A Multilevel Analysis

    ERIC Educational Resources Information Center

    Tesler, Riki; Harel-Fisch, Yossi; Baron-Epel, Orna

    2016-01-01

    Background: Health promotion policies targeting risk-taking behaviors are being implemented across schools in Israel. This study identified the most effective components of these policies influencing cigarette smoking and alcohol consumption among adolescents. Methods: Logistic hierarchical linear model (HLM) analysis of data for 5279 students in…

  5. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components, part 2

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
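
    At its core, combining a structural response (stress) distribution with a resistance (strength) distribution to obtain a component reliability reduces to estimating P(resistance <= response). The sketch below does this by Monte Carlo and checks it against the closed-form result for lognormal stress and strength; the distributions are illustrative assumptions, not NESSUS resistance models.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
n_sims = 1_000_000

# Illustrative stress (response) and strength (resistance) models for a component.
stress = rng.lognormal(mean=np.log(300.0), sigma=0.20, size=n_sims)     # MPa
strength = rng.lognormal(mean=np.log(450.0), sigma=0.10, size=n_sims)   # MPa

p_fail_mc = np.mean(strength <= stress)

# Closed-form check: for lognormal R and S, ln(R) - ln(S) is normal, so
# P(failure) = Phi(-beta) with beta = (mu_lnR - mu_lnS) / sqrt(s_lnR^2 + s_lnS^2).
beta = (np.log(450.0) - np.log(300.0)) / np.sqrt(0.10**2 + 0.20**2)
p_fail_exact = norm.cdf(-beta)

print(f"Monte Carlo P(failure) = {p_fail_mc:.2e}")
print(f"Analytical  P(failure) = {p_fail_exact:.2e}  (reliability index beta = {beta:.2f})")
```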

  6. Electrocardiologic and related methods of non-invasive detection and risk stratification in myocardial ischemia: state of the art and perspectives

    PubMed Central

    Huebner, Thomas; Goernig, Matthias; Schuepbach, Michael; Sanz, Ernst; Pilgram, Roland; Seeck, Andrea; Voss, Andreas

    2010-01-01

    Background: Electrocardiographic methods still provide the bulk of cardiovascular diagnostics. Cardiac ischemia is associated with typical alterations in cardiac biosignals that have to be measured, analyzed by mathematical algorithms and allegorized for further clinical diagnostics. The fast growing fields of biomedical engineering and applied sciences are intensely focused on generating new approaches to cardiac biosignal analysis for diagnosis and risk stratification in myocardial ischemia. Objectives: To present and review the state of the art in and new approaches to electrocardiologic methods for non-invasive detection and risk stratification in coronary artery disease (CAD) and myocardial ischemia; secondarily, to explore the future perspectives of these methods. Methods: In follow-up to the Expert Discussion at the 2008 Workshop on "Biosignal Analysis" of the German Society of Biomedical Engineering in Potsdam, Germany, we comprehensively searched the pertinent literature and databases and compiled the results into this review. Then, we categorized the state-of-the-art methods and selected new approaches based on their applications in detection and risk stratification of myocardial ischemia. Finally, we compared the pros and cons of the methods and explored their future potentials for cardiology. Results: Resting ECG, particularly suited for detecting ST-elevation myocardial infarctions, and exercise ECG, for the diagnosis of stable CAD, are state-of-the-art methods. New exercise-free methods for detecting stable CAD include cardiogoniometry (CGM); methods for detecting acute coronary syndrome without ST elevation are Body Surface Potential Mapping, functional imaging and CGM. Heart rate variability and blood pressure variability analyses, microvolt T-wave alternans and signal-averaged ECG mainly serve in detecting and stratifying the risk for lethal arrythmias in patients with myocardial ischemia or previous myocardial infarctions. Telemedicine and ambient

  7. Dietary Flavonoid Intake and Smoking-Related Cancer Risk: A Meta-Analysis

    PubMed Central

    Woo, Hae Dong; Kim, Jeongseon

    2013-01-01

    Purpose To systematically investigate the effects of dietary flavonoids and flavonoid subclasses on the risk of smoking-related cancer in observational studies. Methods Summary estimates and corresponding standard errors were calculated using the multivariate-adjusted odds ratio (OR) or relative risk (RR) and 95% CI of selected studies and weighted by the inverse variance. Results A total of 35 studies, including 19 case-control studies (9,525 cases and 15,835 controls) and 15 cohort studies (988,082 subjects and 8,161 cases), were retrieved for the meta-analysis. Total dietary flavonoids and most of the flavonoid subclasses were inversely associated with smoking-related cancer risk (OR: 0.82, 95% CI: 0.72-0.93). In subgroup analyses by cancer site, significant associations were observed in aerodigestive tract and lung cancers. Total dietary flavonoid intake was significantly associated with aerodigestive tract cancer risk (OR: 0.67, 95% CI: 0.54-0.83) and marginally associated with lung cancer risk (OR: 0.84, 95% CI: 0.71-1.00). Subgroup analyses by smoking status showed significantly different results. The intake of total flavonoids, flavonols, flavones, and flavanones, as well as the flavonols quercetin and kaempferol, was significantly associated with decreased risk of smoking-related cancer in smokers, whereas no association was observed in non-smokers, except for flavanones. In meta-analysis for the effect of subclasses of dietary flavonoids by cancer type, aerodigestive tract cancer was inversely associated with most flavonoid subclasses. Conclusion The protective effects of flavonoids on smoking-related cancer risk varied across studies, but the overall results indicated that intake of dietary flavonoids, especially flavonols, was inversely associated with smoking-related cancer risk. The protective effects of flavonoids on smoking-related cancer risk were more prominent in smokers. PMID:24069431

  8. Inhalation Anthrax: Dose Response and Risk Analysis

    PubMed Central

    Thran, Brandolyn; Morse, Stephen S.; Hugh-Jones, Martin; Massulik, Stacey

    2008-01-01

    The notion that inhalation of a single Bacillus anthracis spore is fatal has become entrenched nearly to the point of urban legend, in part because of incomplete articulation of the scientific basis for microbial risk assessment, particularly dose-response assessment. Risk analysis (ie, risk assessment, risk communication, risk management) necessitates transparency: distinguishing scientific facts, hypotheses, judgments, biases in interpretations, and potential misinformation. The difficulty in achieving transparency for biothreat risk is magnified by misinformation and poor characterization of both dose-response relationships and the driving mechanisms that cause susceptibility or resistance to disease progression. Regrettably, this entrenchment unnecessarily restricts preparedness planning to a single response scenario: decontaminate until no spores are detectable in air, water, or on surfaces—essentially forcing a zero-tolerance policy inconsistent with the biology of anthrax. We present evidence about inhalation anthrax dose-response relationships, including reports from multiple studies documenting exposures insufficient to cause inhalation anthrax in laboratory animals and humans. The emphasis of the article is clarification about what is known from objective scientific evidence for doses of anthrax spores associated with survival and mortality. From this knowledge base, we discuss the need for future applications of more formal risk analysis processes to guide development of alternative non-zero criteria or standards based on science to inform preparedness planning and other risk management activities. PMID:18582166
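
    Dose-response assessment for inhalation anthrax is commonly framed with simple models such as the exponential model, P(response) = 1 - exp(-r * dose). The sketch below evaluates that model over a range of doses; the parameter r is a placeholder chosen only to illustrate the shape of the curve and should not be read as an estimate for B. anthracis.

```python
import numpy as np

# Exponential dose-response model: P(response) = 1 - exp(-r * dose), where r is
# the per-spore probability of initiating infection. The r value below is a
# placeholder for illustration only.
r = 1.0e-5
doses = np.array([1, 10, 100, 1_000, 8_000, 55_000, 1_000_000])  # inhaled spores

p_response = 1.0 - np.exp(-r * doses)
for d, p in zip(doses.tolist(), p_response):
    print(f"dose = {int(d):>9,} spores -> P(infection) = {p:.4%}")

# Dose corresponding to a given risk level (inverse of the model), e.g. 1-in-10,000.
target_risk = 1e-4
print(f"Dose for {target_risk:.0e} risk: {-np.log(1 - target_risk) / r:,.0f} spores")
```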

  9. Dietary Inflammatory Potential Score and Risk of Breast Cancer: Systematic Review and Meta-analysis.

    PubMed

    Zahedi, Hoda; Djalalinia, Shirin; Sadeghi, Omid; Asayesh, Hamid; Noroozi, Mehdi; Gorabi, Armita Mahdavi; Mohammadi, Rasool; Qorbani, Mostafa

    2018-02-07

    Several studies have been conducted on the relationship between dietary inflammatory potential (DIP) and breast cancer. However, the findings are conflicting. This systematic review and meta-analysis summarizes the findings on the association between DIP and the risk of breast cancer. We used relevant keywords and searched online international electronic databases, including PubMed and NLM Gateway (for Medline), Institute for Scientific Information (ISI), and Scopus for articles published through February 2017. All cross-sectional, case-control, and cohort studies were included in this meta-analysis. Meta-analysis was performed using the random effects meta-analysis method to address heterogeneity among studies. Findings were analyzed statistically. Nine studies were included in the present systematic review and meta-analysis. The total sample size of these studies was 296,102, and the number of participants varied from 1453 to 122,788. The random effects meta-analysis showed a positive and significant association between DIP and the risk of breast cancer (pooled odds ratio, 1.14; 95% confidence interval, 1.01-1.27). When stratified by study type, the pooled effect size was not statistically significant for either cohort (pooled relative risk, 1.04; 95% confidence interval, 0.98-1.10) or case-control (pooled odds ratio, 1.63; 95% confidence interval, 0.89-2.37) studies. We found a significant and positive association between higher DIP score and risk of breast cancer. Modifying the inflammatory characteristics of the diet can substantially reduce the risk of breast cancer. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Study of risk factors for gastric cancer by populational databases analysis

    PubMed Central

    Ferrari, Fangio; Reis, Marco Antonio Moura

    2013-01-01

    AIM: To study the association between the incidence of gastric cancer and populational exposure to risk/protective factors through an analysis of international databases. METHODS: Open-access global databases concerning the incidence of gastric cancer and its risk/protective factors were identified through an extensive search on the Web. As its distribution was neither normal nor symmetric, the cancer incidence of each country was categorized according to ranges of percentile distribution. The association of each risk/protective factor with exposure was measured between the extreme ranges of the incidence of gastric cancer (under the 25th percentile and above the 75th percentile) by the use of the Mann-Whitney test, considering a significance level of 0.05. RESULTS: A variable amount of data omission was observed among all of the factors under study. A weak or nonexistent correlation between the incidence of gastric cancer and the study variables was shown by a visual analysis of scatterplot dispersion. In contrast, an analysis of categorized incidence revealed that the countries with the highest human development index (HDI) values had the highest rates of obesity in males and the highest consumption of alcohol, tobacco, fruits, vegetables and meat, which were associated with higher incidences of gastric cancer. There was no significant difference for the risk factors of obesity in females and fish consumption. CONCLUSION: Higher HDI values, coupled with a higher prevalence of male obesity and a higher per capita consumption of alcohol, tobacco, fruits, vegetables and meat, are associated with a higher incidence of gastric cancer based on an analysis of populational global data. PMID:24409066
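
    The group comparison described above (exposure levels in countries below the 25th versus above the 75th incidence percentile, tested with the Mann-Whitney test at the 0.05 significance level) can be sketched as follows; the consumption values are invented placeholders, not values from the populational databases.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Per-capita alcohol consumption (litres/year) for countries grouped by gastric
# cancer incidence: below the 25th percentile vs. above the 75th percentile.
# The numbers are illustrative placeholders only.
low_incidence = np.array([3.1, 4.5, 2.0, 5.2, 6.8, 1.9, 4.0, 3.6])
high_incidence = np.array([7.5, 9.1, 6.2, 8.8, 10.4, 7.0, 9.6, 6.9])

stat, p = mannwhitneyu(low_incidence, high_incidence, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p:.4f}")
if p < 0.05:
    print("Exposure differs between low- and high-incidence country groups.")
```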

  11. The dissection of risk: a conceptual analysis.

    PubMed

    O'Byrne, Patrick

    2008-03-01

    Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has been largely assumed to be inherently neutral - meaning that its definition and discovery are seen as objective and impartial, and that risk avoidance is natural and logical. Such an oversight in evaluation requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly in relation to those practices surrounding bio-political nursing care, such as public health, as well as other more trendy nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk', and expose it as one mechanism of maintaining prescribed/proscribed social practices. An analysis of risk thereby expands the definitions and roles of the discipline and profession of nursing from being dedicated solely to patient care to also encompassing its function as a governmental body that unwittingly maintains hegemonic infrastructures.

  12. Proteomics analysis of human breast milk to assess breast cancer risk.

    PubMed

    Aslebagh, Roshanak; Channaveerappa, Devika; Arcaro, Kathleen F; Darie, Costel C

    2018-02-01

    Detection of breast cancer (BC) in young women is challenging because mammography, the most common tool for detecting BC, is not effective on the dense breast tissue characteristic of young women. In addition to the limited means for detecting their BC, young women face a transient increased risk of pregnancy-associated BC. As a consequence, reproductively active women could benefit significantly from a tool that provides them with accurate risk assessment and early detection of BC. One potential method for detection of BC is biochemical monitoring of proteins and other molecules in bodily fluids such as serum, nipple aspirate, ductal lavage, tear, urine, saliva and breast milk. Of all these fluids, only breast milk provides access to a large volume of breast tissue, in the form of exfoliated epithelial cells, and to the local breast environment, in the form of molecules in the milk. Thus, analysis of breast milk is a non-invasive method with significant potential for assessing BC risk. Here we analyzed human breast milk by mass spectrometry (MS)-based proteomics to build a biomarker signature for early detection of BC. Ten milk samples from eight women provided five paired-groups (cancer versus control) for analysis of dysregulated proteins: two within-woman comparisons (milk from a diseased breast versus a healthy breast of the same woman) and three across-women comparisons (milk from a woman with cancer versus a woman without cancer). Despite a wide range in the time between milk donation and cancer diagnosis (cancer diagnosis occurred from 1 month before to 24 months after milk donation), the levels of some proteins differed significantly between cancer and control in several of the five comparison groups. These pilot data are supportive of the idea that molecular analysis of breast milk will identify proteins informative for early detection and accurate assessment of BC risk, and warrant further research. Data are available via ProteomeXchange with identifier

  13. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 2 2011-10-01 2011-10-01 false Rail Risk Analysis Factors D Appendix D to Part... REQUIREMENTS, AND SECURITY PLANS Pt. 172, App. D Appendix D to Part 172—Rail Risk Analysis Factors A. This... safety and security risk analyses required by § 172.820. The risk analysis to be performed may be...

  14. 49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false Rail Risk Analysis Factors D Appendix D to Part... REQUIREMENTS, AND SECURITY PLANS Pt. 172, App. D Appendix D to Part 172—Rail Risk Analysis Factors A. This... safety and security risk analyses required by § 172.820. The risk analysis to be performed may be...

  15. Looking beyond borders: integrating best practices in benefit-risk analysis into the field of food and nutrition.

    PubMed

    Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H

    2012-01-01

    An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: Increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken into account in developing methods, tools, practices and policies, as well as refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. Thus, these opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk

  16. Reconnecting Stochastic Methods With Hydrogeological Applications: A Utilitarian Uncertainty Analysis and Risk Assessment Approach for the Design of Optimal Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Bode, Felix; Ferré, Ty; Zigelli, Niklas; Emmert, Martin; Nowak, Wolfgang

    2018-03-01

    Collaboration between academics and practitioners promotes knowledge transfer between research and industry, with both sides benefiting greatly. However, academic approaches are often not feasible given real-world limits on time, cost and data availability, especially for risk and uncertainty analyses. Although the need for uncertainty quantification and risk assessment are clear, there are few published studies examining how scientific methods can be used in practice. In this work, we introduce possible strategies for transferring and communicating academic approaches to real-world applications, countering the current disconnect between increasingly sophisticated academic methods and methods that work and are accepted in practice. We analyze a collaboration between academics and water suppliers in Germany who wanted to design optimal groundwater monitoring networks for drinking-water well catchments. Our key conclusions are: to prefer multiobjective over single-objective optimization; to replace Monte-Carlo analyses by scenario methods; and to replace data-hungry quantitative risk assessment by easy-to-communicate qualitative methods. For improved communication, it is critical to set up common glossaries of terms to avoid misunderstandings, use striking visualization to communicate key concepts, and jointly and continually revisit the project objectives. Ultimately, these approaches and recommendations are simple and utilitarian enough to be transferred directly to other practical water resource related problems.

  17. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
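
    Conditional inference trees are not available in scikit-learn, so the sketch below uses an ordinary CART classifier as an illustrative stand-in for tree-based risk stratification on BMI, age and comorbidity. All variable names and data are hypothetical and are only meant to show how leaves of a fitted tree define risk groups.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "bmi": rng.normal(27, 4, n),
    "age": rng.normal(73, 4, n),
    "comorbidities": rng.poisson(1.5, n),
})
# Hypothetical outcome: risk rises with BMI, age and comorbidity count
logit = -8 + 0.12 * df["bmi"] + 0.05 * df["age"] + 0.4 * df["comorbidities"]
df["functional_limitation"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Shallow tree so that each leaf can be read as a risk stratum
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(df[["bmi", "age", "comorbidities"]], df["functional_limitation"])

# Each leaf defines a risk group; predicted probabilities rank the groups
print(export_text(tree, feature_names=["bmi", "age", "comorbidities"]))
```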

  18. Evaluation and applications of the clinically significant change method with the Violence Risk Scale-Sexual Offender version: implications for risk-change communication.

    PubMed

    Olver, Mark E; Beggs Christofferson, Sarah M; Wong, Stephen C P

    2015-02-01

    We examined the use of the clinically significant change (CSC) method with the Violence Risk Scale-Sexual Offender version (VRS-SO), and its implications for risk communication, in a combined sample of 945 treated sexual offenders from three international settings, followed up for a minimum 5 years post-release. The reliable change (RC) index was used to identify thresholds of clinically meaningful change and to create four CSC groups (already okay, recovered, improved, unchanged) based on VRS-SO dynamic scores and amount of change made. Outcome analyses demonstrated important CSC-group differences in 5-year rates of sexual and violent recidivism. However, when baseline risk was controlled via Cox regression survival analysis, the pattern and magnitude of CSC-group differences in sexual and violent recidivism changed to suggest that observed variation in recidivism base rates could be at least partly explained by pre-existing group differences in risk level. Implications for communication of risk-change information and applications to clinical practice are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
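
    The reliable change (RC) index underlying the CSC groups follows the standard Jacobson-Truax formulation. The sketch below shows the arithmetic with hypothetical pre/post scores, standard deviation and reliability; these are placeholder values, not VRS-SO norms.

```python
import numpy as np

def reliable_change(pre, post, sd_pre, reliability):
    """Jacobson-Truax reliable change index for a pre/post score pair."""
    sem = sd_pre * np.sqrt(1.0 - reliability)   # standard error of measurement
    se_diff = np.sqrt(2.0) * sem                # standard error of the difference
    return (post - pre) / se_diff

# Hypothetical values for illustration: pretreatment score 40, posttreatment 32,
# SD of pretreatment scores 9, reliability 0.85
rc = reliable_change(pre=40.0, post=32.0, sd_pre=9.0, reliability=0.85)
print(rc, "reliable improvement" if rc <= -1.96 else "no reliable change")
```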

  19. The effects of napping on the risk of hypertension: a systematic review and meta-analysis.

    PubMed

    Cheungpasitporn, Wisit; Thongprayoon, Charat; Srivali, Narat; Vijayvargiya, Priya; Andersen, Carl A; Kittanamongkolchai, Wonngarm; Sathick, Insara J Jaffer; Caples, Sean M; Erickson, Stephen B

    2016-11-01

    The risk of hypertension in adults who regularly take a nap is controversial. The objective of this meta-analysis was to assess the associations between napping and hypertension. A literature search was performed using MEDLINE, EMbase and The Cochrane Database of Systematic Reviews from inception through October, 2015. Studies that reported relative risks, odds ratios or hazard ratios comparing the risk of hypertension in individuals who regularly take a nap were included. Pooled risk ratios (RR) and 95% confidence interval (CI) were calculated using a random-effect, generic inverse variance method. Nine observational studies with 112,267 individuals were included in the analysis to assess the risk of hypertension in nappers. The pooled RR of hypertension in nappers was 1.13 with 95% CI (0.98 to 1.30). When meta-analysis was limited only to studies assessing the risk of hypertension in daytime nappers, the pooled RR of hypertension was 1.19 with 95% CI (1.06 to 1.35). The data on the association between nighttime napping and hypertension in individuals who work night shifts were limited; only one observational study reported a reduced risk of hypertension in nighttime nappers, with an odds ratio of 0.79 and 95% CI (0.63 to 1.00). Our meta-analysis demonstrates a significant association between daytime napping and hypertension. Future studies are needed to assess the potential benefits of hypertension screening for daytime nappers. © 2016 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.

  20. A quantitative risk analysis approach to port hydrocarbon logistics.

    PubMed

    Ronza, A; Carol, S; Espejo, V; Vílchez, J A; Arnaldos, J

    2006-01-16

    A method is presented that allows quantitative risk analysis to be performed on marine hydrocarbon terminals sited in ports. A significant gap was identified in the technical literature on QRA for the handling of hazardous materials in harbours published prior to this work. The analysis is extended to tanker navigation through port waters and loading and unloading facilities. The steps of the method are discussed, beginning with data collecting. As to accident scenario identification, an approach is proposed that takes into account minor and massive spills due to loading arm failures and tank rupture. Frequency estimation is thoroughly reviewed and a shortcut approach is proposed for frequency calculation. This allows for the two-fold possibility of a tanker colliding/grounding at/near the berth or while navigating to/from the berth. A number of probability data defining the possibility of a cargo spill after an external impact on a tanker are discussed. As to consequence and vulnerability estimates, a scheme is proposed for the use of ratios between the numbers of fatal victims, injured and evacuated people. Finally, an example application is given, based on a pilot study conducted in the Port of Barcelona, where the method was tested.

  1. A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas

    PubMed Central

    Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan

    2016-01-01

    Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. Urban Flood Simulation Model (UFSM) and Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, with the flood risk reduced by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely relying on structural measures. The R-D function is suitable for describing changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
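
    The S-shaped return-period/damage (R-D) relationship and the resulting risk estimate can be sketched numerically as below. The logistic parameters and the damage cap are hypothetical, not calibrated Pudong values; expected annual damage is approximated by integrating damage over annual exceedance probability.

```python
import numpy as np

def damage(return_period, d_max=1000.0, steepness=0.08, t_mid=66.0):
    """Hypothetical S-shaped damage curve versus flood return period."""
    return d_max / (1.0 + np.exp(-steepness * (return_period - t_mid)))

T = np.logspace(0, 3, 500)      # return periods from 1 to 1000 years
p = 1.0 / T                     # annual exceedance probabilities (decreasing)
D = damage(T)

# Expected annual damage: trapezoidal integration of damage over exceedance probability
ead = np.sum(0.5 * (D[1:] + D[:-1]) * (p[:-1] - p[1:]))
print(f"Expected annual damage: {ead:.1f} (hypothetical monetary units)")
```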

  2. Principal Component and Linkage Analysis of Cardiovascular Risk Traits in the Norfolk Isolate

    PubMed Central

    Cox, Hannah C.; Bellis, Claire; Lea, Rod A.; Quinlan, Sharon; Hughes, Roger; Dyer, Thomas; Charlesworth, Jac; Blangero, John; Griffiths, Lyn R.

    2009-01-01

    Objective(s) An individual's risk of developing cardiovascular disease (CVD) is influenced by genetic factors. This study focussed on mapping genetic loci for CVD-risk traits in a unique population isolate derived from Norfolk Island. Methods This investigation focussed on 377 individuals descended from the population founders. Principal component analysis was used to extract orthogonal components from 11 cardiovascular risk traits. Multipoint variance component methods, implemented in SOLAR, were used to assess genome-wide linkage to the derived factors. A total of 285 of the 377 related individuals were informative for linkage analysis. Results A total of 4 principal components accounting for 83% of the total variance were derived. Principal component 1 was loaded with body size indicators; principal component 2 with body size, cholesterol and triglyceride levels; principal component 3 with the blood pressures; and principal component 4 with LDL-cholesterol and total cholesterol levels. Suggestive evidence of linkage for principal component 2 (h2 = 0.35) was observed on chromosome 5q35 (LOD = 1.85; p = 0.0008), while peak regions on chromosome 10p11.2 (LOD = 1.27; p = 0.005) and 12q13 (LOD = 1.63; p = 0.003) segregated with principal components 1 (h2 = 0.33) and 4 (h2 = 0.42), respectively. Conclusion(s): This study investigated a number of CVD risk traits in a unique isolated population. Findings support the clustering of CVD risk traits and provide interesting evidence of a region on chromosome 5q35 segregating with weight, waist circumference, HDL-c and total triglyceride levels. PMID:19339786

  3. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
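
    Variance-based indices such as those from Sobol's method can be computed with the SALib package. SALib was not used in the study itself, so this is only an assumed tooling choice for illustration, and the three-input toy model below stands in for the SHEDS testbed.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Toy stand-in for an exposure model with three uncertain inputs
problem = {
    "num_vars": 3,
    "names": ["air_conc", "contact_rate", "body_weight"],
    "bounds": [[0.1, 1.0], [0.5, 2.0], [50.0, 100.0]],
}

X = saltelli.sample(problem, 1024)              # Saltelli sampling design
Y = X[:, 0] * X[:, 1] / X[:, 2]                 # hypothetical dose metric

Si = sobol.analyze(problem, Y)
print(dict(zip(problem["names"], Si["S1"])))    # first-order (main) indices
print(dict(zip(problem["names"], Si["ST"])))    # total-order indices
```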

  4. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    NASA Astrophysics Data System (ADS)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  5. Dietary patterns and depression risk: A meta-analysis.

    PubMed

    Li, Ye; Lv, Mei-Rong; Wei, Yan-Jin; Sun, Ling; Zhang, Ji-Xiang; Zhang, Huai-Guo; Li, Bin

    2017-07-01

    Although some studies have reported potential associations of dietary patterns with depression risk, no consistent conclusion has been reached to date. Therefore, we conducted this meta-analysis to evaluate the relation between dietary patterns and the risk of depression. A literature search was conducted in the MEDLINE and EMBASE databases up to September 2016. In total, 21 studies from ten countries met the inclusion criteria and were included in the present meta-analysis. A dietary pattern characterized by high intakes of fruit, vegetables, whole grain, fish, olive oil, low-fat dairy and antioxidants and low intakes of animal foods was apparently associated with a decreased risk of depression. A dietary pattern characterized by a high consumption of red and/or processed meat, refined grains, sweets, high-fat dairy products, butter, potatoes and high-fat gravy, and low intakes of fruits and vegetables was associated with an increased risk of depression. The results of this meta-analysis suggest that a healthy dietary pattern may decrease the risk of depression, whereas a Western-style pattern may increase it. However, more randomized controlled trials and cohort studies are urgently required to confirm these findings. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  6. Seawater intrusion risk analysis under climate change conditions for the Gaza Strip aquifer (Palestine)

    NASA Astrophysics Data System (ADS)

    Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta

    2014-05-01

    Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where this problem is exacerbated by the lack of appropriate groundwater resources management and with serious potential impacts from projected climate changes. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.
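
    The origin-pathway-target overlay described above can be expressed very compactly on gridded data: each map is rescaled to a common range and combined cell by cell. The sketch below is a generic numpy illustration with random rasters, not the actual Gaza Strip model; the product is used here as one possible overlay operator.

```python
import numpy as np

def rescale(a):
    """Scale a raster to 0-1 so maps with different units can be overlaid."""
    return (a - a.min()) / (a.max() - a.min())

rng = np.random.default_rng(42)
hazard = rng.random((100, 100))         # origin: seawater intrusion hazard
vulnerability = rng.random((100, 100))  # pathway: aquifer vulnerability
elements = rng.random((100, 100))       # target: elements at risk (e.g., wells)

# Overlay principle: combine the scaled maps cell by cell
risk = rescale(hazard) * rescale(vulnerability) * rescale(elements)
print(risk.mean(), risk.max())
```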

  7. Flood risk assessment in France: comparison of extreme flood estimation methods (EXTRAFLO project, Task 7)

    NASA Astrophysics Data System (ADS)

    Garavaglia, F.; Paquet, E.; Lang, M.; Renard, B.; Arnaud, P.; Aubert, Y.; Carre, J.

    2013-12-01

    In flood risk assessment the methods can be divided into two families: deterministic methods and probabilistic methods. In the French hydrologic community the probabilistic methods are historically preferred to the deterministic ones. Presently a French research project named EXTRAFLO (RiskNat Program of the French National Research Agency, https://extraflo.cemagref.fr) deals with the design values for extreme rainfall and floods. The objective of this project is to carry out a comparison of the main methods used in France for estimating extreme values of rainfall and floods, to obtain a better grasp of their respective fields of application. In this framework we present the results of Task 7 of the EXTRAFLO project. Focusing on French watersheds, we compare the main extreme flood estimation methods used in the French context: (i) standard flood frequency analysis (Gumbel and GEV distribution), (ii) regional flood frequency analysis (regional Gumbel and GEV distribution), (iii) local and regional flood frequency analysis improved by historical information (Naulet et al., 2005), (iv) simplified probabilistic methods based on rainfall information (i.e. Gradex method (CFGB, 1994), Agregee method (Margoum, 1992) and Speed method (Cayla, 1995)), (v) flood frequency analysis by a continuous simulation approach based on rainfall information (i.e. Schadex method (Paquet et al., 2013, Garavaglia et al., 2010), Shyreg method (Lavabre et al., 2003)) and (vi) a multifractal approach. The main result of this comparative study is that probabilistic methods based on additional information (i.e. regional, historical and rainfall information) provide better estimations than the standard flood frequency analysis. Another interesting result is that the differences between the extreme flood quantile estimates of the compared methods increase with the return period, while staying relatively moderate up to the 100-year return level. Results and discussions are here illustrated throughout with the example
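
    A minimal version of the standard flood frequency analysis step (family (i) above) fits Gumbel and GEV distributions to annual maxima and reads off design quantiles for chosen return periods. The synthetic annual-maximum series below is only for illustration and does not come from the EXTRAFLO watersheds.

```python
import numpy as np
from scipy.stats import genextreme, gumbel_r

rng = np.random.default_rng(7)
annual_max_flow = rng.gumbel(loc=300.0, scale=80.0, size=60)  # synthetic AMAX series (m3/s)

# Fit GEV and Gumbel distributions to the annual maxima
gev_params = genextreme.fit(annual_max_flow)
gum_params = gumbel_r.fit(annual_max_flow)

for T in (10, 100, 1000):                        # return periods in years
    q_gev = genextreme.ppf(1 - 1 / T, *gev_params)
    q_gum = gumbel_r.ppf(1 - 1 / T, *gum_params)
    print(f"T={T:5d}  GEV={q_gev:7.1f}  Gumbel={q_gum:7.1f}")
```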

  8. A comprehensive risk analysis of coastal zones in China

    NASA Astrophysics Data System (ADS)

    Wang, Guanghui; Liu, Yijun; Wang, Hongbing; Wang, Xueying

    2014-03-01

    Although coastal zones occupy an important position in world development, they face high risks and vulnerability to natural disasters because of their special locations and their high population density. In order to estimate their capability for crisis-response, various models have been established. However, those studies mainly focused on natural factors or conditions, which could not reflect the social vulnerability and regional disparities of coastal zones. Drawing lessons from the experiences of the United Nations Environment Programme (UNEP), this paper presents a comprehensive assessment strategy based on the mechanism of the Risk Matrix Approach (RMA), which includes two aspects that are further composed of five second-class indicators. The first aspect, the probability phase, consists of indicators of economic conditions, social development, and living standards, while the second one, the severity phase, comprises geographic exposure and natural disasters. After weighting all of the above indicators by applying the Analytic Hierarchy Process (AHP) and Delphi Method, the paper uses the comprehensive assessment strategy to analyze the risk indices of 50 coastal cities in China. The analytical results are presented in ESRI ArcGIS 10.1, which generates six different risk maps covering the aspects of economy, society, life, environment, disasters, and an overall assessment of the five areas. Furthermore, the study also investigates the spatial pattern of these risk maps, with detailed discussion and analysis of different risks in coastal cities.
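
    The AHP weighting step can be reproduced from a pairwise comparison matrix: the indicator weights are the normalized principal eigenvector, and the consistency ratio checks the coherence of the judgments. The comparison matrix below is hypothetical, not the one elicited in the study.

```python
import numpy as np

# Hypothetical pairwise comparisons of five indicators (Saaty 1-9 scale)
A = np.array([
    [1,   3,   5,   2,   4],
    [1/3, 1,   3,   1/2, 2],
    [1/5, 1/3, 1,   1/4, 1/2],
    [1/2, 2,   4,   1,   3],
    [1/4, 1/2, 2,   1/3, 1],
], dtype=float)

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority weights

# Consistency ratio CR = CI / RI, with RI = 1.12 for a 5x5 matrix (Saaty's table)
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
print(weights, "CR =", ci / 1.12)
```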

  9. The risk of bladder cancer in patients with urinary calculi: a meta-analysis.

    PubMed

    Yu, Zhang; Yue, Wu; Jiuzhi, Li; Youtao, Jiang; Guofei, Zhang; Wenbin, Guo

    2018-01-05

    The objective of this meta-analysis was to evaluate the association between a history of urinary calculi (UC) and the risk of bladder cancer (BC). A literature search was performed from inception until July 2017. Studies that reported odds ratios (OR), relative risks or hazard ratios comparing the risk of BC in patients with a history of UC vs those without were included. Pooled odds ratios and 95% confidence interval (CI) were calculated using a random-effect or fixed-effect method. Thirteen studies were included in our analysis to assess the association between a history of UC and risk of BC. The pooled OR of BC in patients with UC was 1.87 (95% CI, 1.45-2.41). Bladder calculi [OR, 2.17 (95% CI, 1.52-3.08)] had a higher risk of BC than kidney calculi [OR, 1.39 (95% CI, 1.06-1.82)]. A history of UC was associated with increased BC risk both in males [OR, 2.04 (95% CI, 1.41-2.96)] and in females [OR, 2.99 (95% CI, 2.37-3.76)]. The subgroup analysis demonstrated that UC increased the risk of BC in both case-control studies [OR, 1.75 (95% CI, 1.25-2.45)] and cohort studies [OR, 2.27 (95% CI, 1.55-3.32)]. The pooled OR of BC risk in patients with UC was 1.60 (95% CI, 1.15-2.24) in America, 1.36 (95% CI, 1.14-1.64) in Europe and 3.05 (95% CI, 2.21-4.21) in Asia, respectively. Our study demonstrates a significantly increased risk of BC in patients with prior UC. This finding suggests that a history of UC is associated with BC and may impact clinical management and cancer surveillance. Further studies are still needed to confirm these findings.

  10. Benefit-Risk Analysis for Decision-Making: An Approach.

    PubMed

    Raju, G K; Gurumurthi, K; Domike, R

    2016-12-01

    The analysis of benefit and risk is an important aspect of decision-making throughout the drug lifecycle. In this work, the use of a benefit-risk analysis approach to support decision-making was explored. The proposed approach builds on the qualitative US Food and Drug Administration (FDA) approach to include a more explicit analysis based on international standards and guidance that enables aggregation and comparison of benefit and risk on a common basis and a lifecycle focus. The approach is demonstrated on six decisions over the lifecycle (e.g., accelerated approval, withdrawal, and traditional approval) using two case studies: natalizumab for multiple sclerosis (MS) and bedaquiline for multidrug-resistant tuberculosis (MDR-TB). © 2016 American Society for Clinical Pharmacology and Therapeutics.

  11. The “Dry-Run” Analysis: A Method for Evaluating Risk Scores for Confounding Control

    PubMed Central

    Wyss, Richard; Hansen, Ben B.; Ellis, Alan R.; Gagne, Joshua J.; Desai, Rishi J.; Glynn, Robert J.; Stürmer, Til

    2017-01-01

    A propensity score (PS) model's ability to control confounding can be assessed by evaluating covariate balance across exposure groups after PS adjustment. The optimal strategy for evaluating a disease risk score (DRS) model's ability to control confounding is less clear. DRS models cannot be evaluated through balance checks within the full population, and they are usually assessed through prediction diagnostics and goodness-of-fit tests. A proposed alternative is the “dry-run” analysis, which divides the unexposed population into “pseudo-exposed” and “pseudo-unexposed” groups so that differences on observed covariates resemble differences between the actual exposed and unexposed populations. With no exposure effect separating the pseudo-exposed and pseudo-unexposed groups, a DRS model is evaluated by its ability to retrieve an unconfounded null estimate after adjustment in this pseudo-population. We used simulations and an empirical example to compare traditional DRS performance metrics with the dry-run validation. In simulations, the dry run often improved assessment of confounding control, compared with the C statistic and goodness-of-fit tests. In the empirical example, PS and DRS matching gave similar results and showed good performance in terms of covariate balance (PS matching) and controlling confounding in the dry-run analysis (DRS matching). The dry-run analysis may prove useful in evaluating confounding control through DRS models. PMID:28338910

  12. Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method

    PubMed Central

    Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng

    2016-01-01

    Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes that are likely to happen for a specific site are treated as the risk. The aggregation of the crash occurrence probability for all exposure vehicles is estimated based on the empirical Bayesian method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damages) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct losses and indirect losses are uniformly monetized to be considered as the consequences of this risk. The potential costs of crashes, as a criterion to rank high-risk sites, can be explicitly expressed as the sum of the crash probability for all passing vehicles and the corresponding consequences of crashes. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
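
    The empirical Bayes step referenced above combines a model-predicted crash frequency with the observed count, weighted by the overdispersion of the safety performance function. The sketch below follows the generic Highway Safety Manual form of the adjustment with hypothetical numbers; it is not the Shanghai expressway model itself.

```python
def empirical_bayes(observed, predicted, overdispersion):
    """EB-adjusted expected crash frequency for a site.

    observed       -- crash count recorded at the site
    predicted      -- crashes predicted by the safety performance function (SPF)
    overdispersion -- negative-binomial dispersion parameter of the SPF
    """
    w = 1.0 / (1.0 + overdispersion * predicted)   # weight on the SPF estimate
    return w * predicted + (1.0 - w) * observed

# Hypothetical site: 9 observed crashes, 5.2 predicted, dispersion 0.24
print(empirical_bayes(observed=9, predicted=5.2, overdispersion=0.24))
```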

  13. Research Synthesis Methods in an Age of Globalized Risks: Lessons from the Global Burden of Foodborne Disease Expert Elicitation.

    PubMed

    2016-02-01

    We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease, a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative, yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts' time in the elicitation. This article was written as part of a workshop, "Methods for Research Synthesis: A Cross-Disciplinary Approach" held at the Harvard Center for Risk Analysis on October 13, 2013. © 2016 Society for Risk Analysis.

  14. Mendelian randomisation analysis strongly implicates adiposity with risk of developing colorectal cancer

    PubMed Central

    Jarvis, David; Mitchell, Jonathan S; Law, Philip J; Palin, Kimmo; Tuupanen, Sari; Gylfe, Alexandra; Hänninen, Ulrika A; Cajuso, Tatiana; Tanskanen, Tomas; Kondelin, Johanna; Kaasinen, Eevi; Sarin, Antti-Pekka; Kaprio, Jaakko; Eriksson, Johan G; Rissanen, Harri; Knekt, Paul; Pukkala, Eero; Jousilahti, Pekka; Salomaa, Veikko; Ripatti, Samuli; Palotie, Aarno; Järvinen, Heikki; Renkonen-Sinisalo, Laura; Lepistö, Anna; Böhm, Jan; Meklin, Jukka-Pekka; Al-Tassan, Nada A; Palles, Claire; Martin, Lynn; Barclay, Ella; Farrington, Susan M; Timofeeva, Maria N; Meyer, Brian F; Wakil, Salma M; Campbell, Harry; Smith, Christopher G; Idziaszczyk, Shelley; Maughan, Timothy S; Kaplan, Richard; Kerr, Rachel; Kerr, David; Buchanan, Daniel D; Win, Aung K; Hopper, John L; Jenkins, Mark A; Lindor, Noralane M; Newcomb, Polly A; Gallinger, Steve; Conti, David; Schumacher, Fred; Casey, Graham; Taipale, Jussi; Aaltonen, Lauri A; Cheadle, Jeremy P; Dunlop, Malcolm G; Tomlinson, Ian P; Houlston, Richard S

    2016-01-01

    Background: Observational studies have associated adiposity with an increased risk of colorectal cancer (CRC). However, such studies do not establish a causal relationship. To minimise bias from confounding we performed a Mendelian randomisation (MR) analysis to examine the relationship between adiposity and CRC. Methods: We used SNPs associated with adult body mass index (BMI), waist-hip ratio (WHR), childhood obesity and birth weight as instrumental variables in an MR analysis of 9254 CRC cases and 18 386 controls. Results: In the MR analysis, the odds ratios (ORs) of CRC risk per unit increase in BMI, WHR and childhood obesity were 1.23 (95% CI: 1.02–1.49, P=0.033), 1.59 (95% CI: 1.08–2.34, P=0.019) and 1.07 (95% CI: 1.03–1.13, P=0.018), respectively. There was no evidence for association between birth weight and CRC (OR=1.22, 95% CI: 0.89–1.67, P=0.22). Combining these data with a concurrent MR-based analysis for BMI and WHR with CRC risk (totalling 18 190 cases and 27 617 controls) provided increased support; the ORs for BMI and WHR were 1.26 (95% CI: 1.10–1.44, P=7.7 × 10−4) and 1.40 (95% CI: 1.14–1.72, P=1.2 × 10−3), respectively. Conclusions: These data provide further evidence for a strong causal relationship between adiposity and the risk of developing CRC, highlighting the urgent need for prevention and treatment of adiposity. PMID:27336604
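
    The core MR calculation is a ratio of the SNP-outcome association to the SNP-exposure association for each instrument, combined across instruments by inverse-variance weighting. The per-SNP estimates below are hypothetical placeholders, not values from this paper, and serve only to show the arithmetic.

```python
import numpy as np

# Hypothetical per-SNP associations (outcome on the log-odds scale)
beta_exposure = np.array([0.10, 0.08, 0.12, 0.05])      # SNP -> BMI
beta_outcome = np.array([0.025, 0.018, 0.030, 0.010])   # SNP -> CRC (log OR)
se_outcome = np.array([0.010, 0.012, 0.011, 0.009])

wald = beta_outcome / beta_exposure                      # per-SNP causal ratio estimates
se_wald = se_outcome / np.abs(beta_exposure)             # first-order standard errors

w = 1.0 / se_wald**2                                     # inverse-variance weights
beta_ivw = np.sum(w * wald) / np.sum(w)
se_ivw = np.sqrt(1.0 / np.sum(w))

or_per_unit = np.exp(beta_ivw)                           # OR per unit increase in exposure
print(or_per_unit, np.exp([beta_ivw - 1.96 * se_ivw, beta_ivw + 1.96 * se_ivw]))
```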

  15. Homocysteine Level and Risk of Abdominal Aortic Aneurysm: A Meta-Analysis

    PubMed Central

    Cao, Hui; Hu, Xinhua; Zhang, Qiang; Li, Jun; Wang, Junpeng; Shao, Yang; Liu, Bing; Xin, Shijie

    2014-01-01

    Objectives Previous studies have reported inconsistent findings regarding the association between elevated plasma homocysteine (Hcy) levels and abdominal aortic aneurysm (AAA). We investigated this association between Hcy levels in patients with AAA and unaffected controls by conducting a meta-analysis and systematic review. Methods We conducted a systematic literature search (up to August 2013) of the PubMed database and Embase. We selected observational studies that evaluated Hcy levels in subjects with AAA compared to unaffected controls. Criteria for inclusion were the assessment of baseline Hcy and risk of AAA as an outcome. The results were presented as odds ratios (OR) and corresponding 95% confidence intervals (CI) comparing AAA patients to the control subjects. Results Seven studies with 6,445 participants were identified and analyzed. Overall, elevated plasma Hcy was associated with an increased risk of AAA (OR 3.29; 95% CI 1.66–6.51). The pooled adjusted OR from a random-effects model restricted to male participants, comparing the AAA group with the control group, was 2.36 (95% CI 0.63–8.82). Conclusion This meta-analysis and systematic review suggested that Hcy significantly increased the risk of AAA. PMID:24465733

  16. GPFA-AB_Phase1RiskAnalysisTask5DataUpload

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster, except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.

  17. Qualitative risk assessment during polymer mortar test specimens preparation - methods comparison

    NASA Astrophysics Data System (ADS)

    Silva, F.; Sousa, S. P. B.; Arezes, P.; Swuste, P.; Ribeiro, M. C. S.; Baptista, J. S.

    2015-05-01

    Polymer binder modification with inorganic nanomaterials (NM) could be a potential and efficient solution to control matrix flammability of polymer concrete (PC) materials without sacrificing other important properties. Occupational exposures can occur all along the life cycle of a NM and “nanoproducts” from research through scale-up, product development, manufacturing, and end of life. The main objective of the present study is to analyse and compare different qualitative risk assessment methods during the production of polymer mortars (PM) with NM. The laboratory scale production process was divided in 3 main phases (pre-production, production and post-production), which allow testing the assessment methods in different situations. The risk assessment involved in the manufacturing process of PM was made by using the qualitative analyses based on: French Agency for Food, Environmental and Occupational Health & Safety method (ANSES); Control Banding Nanotool (CB Nanotool); Ecole Polytechnique Fédérale de Lausanne method (EPFL); Guidance working safely with nanomaterials and nanoproducts (GWSNN); Istituto Superiore per la Prevenzione e la Sicurezza del Lavoro, Italy method (ISPESL); Precautionary Matrix for Synthetic Nanomaterials (PMSN); and Stoffenmanager Nano. It was verified that the different methods applied also produce different final results. In phases 1 and 3 the risk assessment tends to be classified as medium-high risk, while for phase 2 the more common result is medium level. It is necessary to improve the use of qualitative methods by defining narrow criteria for the methods selection for each assessed situation, bearing in mind that the uncertainties are also a relevant factor when dealing with the risk related to nanotechnologies field.

  18. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
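
    The loss functions named above map a process deviation from its target into a monetary loss. The sketch below implements a quadratic (Taguchi-style) loss and an inverted normal loss function with hypothetical parameters; it is a generic illustration of these loss-function families, not the exact revised/modified forms calibrated in the paper's case studies.

```python
import numpy as np

def taguchi_loss(y, target, k):
    """Quadratic (Taguchi-style) loss: grows without bound away from the target."""
    return k * (y - target) ** 2

def inverted_normal_loss(y, target, max_loss, shape):
    """Inverted normal loss: saturates at max_loss for large deviations."""
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * shape**2)))

# Hypothetical process deviations around a target value of 100
y = np.linspace(90.0, 110.0, 5)
print(taguchi_loss(y, target=100.0, k=2.0))
print(inverted_normal_loss(y, target=100.0, max_loss=50.0, shape=3.0))
```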

  19. Screening-Level Ecological Risk Assessment Methods, Revision 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mirenda, Richard J.

    2012-08-16

    This document provides guidance for screening-level assessments of potential adverse impacts to ecological resources from release of environmental contaminants at the Los Alamos National Laboratory (LANL or the Laboratory). The methods presented are based on two objectives, namely: to provide a basis for reaching consensus with regulators, managers, and other interested parties on how to conduct screening-level ecological risk investigations at the Laboratory; and to provide guidance for ecological risk assessors under the Environmental Programs (EP) Directorate. This guidance promotes consistency, rigor, and defensibility in ecological screening investigations and in reporting those investigation results. The purpose of the screening assessment is to provide information to the risk managers so that informed risk-management decisions can be made. This document provides examples of recommendations and possible risk-management strategies.

  20. A long-recommended but seldom-used method of analysis for fall injuries found a unique pattern of risk factors in the youngest-old.

    PubMed

    Legrand, Helen; Pihlsgård, Mats; Nordell, Eva; Elmståhl, Sölve

    2015-08-01

    Few studies on fall risk factors use long-recommended methods for analysis of recurrent events. Previous falls are the biggest risk factor for future falls, but few fall studies focus on the youngest-old. This study's objective was to apply Cox regression for recurrent events to identify factors associated with injurious falls in the youngest-old. Participants were community-dwelling residents of southern Sweden (n = 1,133), aged 59-67 at baseline (median 61.2), from the youngest cohorts of the larger Good Aging in Skåne (GÅS) study. Exposure variable data were collected from baseline study visits and medical records. Injurious falls, defined as emergency, inpatient, or specialist visits associated with ICD-10 fall codes during the follow-up period (2001-2011), were gathered from national and regional registries. Analysis was conducted using time to event Cox Regression for recurrent events. A majority (77.1 %) of injurious falls caused serious injuries such as fractures and open wounds. Exposure to nervous system medications [hazard ratio (HR) 1.40, 95 % confidence interval (CI) 1.03-1.89], central nervous system disease (HR 1.79, CI 1.18-2.70), and previous injurious fall(s) (HR 2.00, CI 1.50-2.68) were associated with increased hazard of injurious fall. Regression for recurrent events is feasible with typical falls' study data. The association of certain exposures with increased hazard of injurious falls begins earlier than previously studied. Different patterns of risk factors by age can provide insight into the progression of frailty. Tailored fall prevention screening and intervention may be of value in populations younger than those traditionally screened.
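
    Cox regression for recurrent events is typically set up in counting-process (start-stop) format, with one row per at-risk interval and repeated rows per person. A minimal sketch using the lifelines package is shown below; lifelines is an assumed tool rather than the software used in the study, and the tiny data frame and variable names are hypothetical.

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Hypothetical counting-process data: one row per at-risk interval,
# event = 1 if an injurious fall ended that interval (toy example only).
df = pd.DataFrame({
    "id":        [1, 1, 1, 2, 2, 3],
    "start":     [0.0, 2.1, 4.0, 0.0, 3.5, 0.0],
    "stop":      [2.1, 4.0, 7.2, 3.5, 8.0, 9.1],
    "event":     [1, 1, 0, 1, 0, 0],
    "cns_drug":  [1, 1, 1, 0, 0, 0],
    "prev_fall": [0, 1, 1, 0, 1, 0],
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()   # hazard ratios are exp(coef) for each covariate
```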

  1. Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.

    PubMed

    Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi

    2015-10-01

    In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
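
    In the simplest setting, the failure probability defined above can be approximated by Monte Carlo: simulate the risk process on a time grid and count the paths that reach the critical level before the horizon. The sketch below uses a drifted Brownian-motion-like risk process and a constant barrier purely for illustration; it is not one of the paper's dually connected models.

```python
import numpy as np

def failure_probability(n_paths=20_000, horizon=1.0, n_steps=250,
                        drift=0.5, vol=1.0, critical_level=2.0, seed=0):
    """Estimate P(risk process crosses critical_level before the horizon) by simulation."""
    rng = np.random.default_rng(seed)
    dt = horizon / n_steps
    increments = drift * dt + vol * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps))
    paths = np.cumsum(increments, axis=1)            # risk process starting at 0
    crossed = paths.max(axis=1) >= critical_level    # did the path reach the level?
    return crossed.mean()

print(failure_probability())
```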

  2. Methods for measuring risk-aversion: problems and solutions

    NASA Astrophysics Data System (ADS)

    Thomas, P. J.

    2013-09-01

    Risk-aversion is a fundamental parameter determining how humans act when required to operate in situations of risk. Its general applicability has been discussed in a companion presentation, and this paper examines methods that have been used in the past to measure it and their attendant problems. It needs to be borne in mind that risk-aversion varies with the size of the possible loss, growing strongly as the possible loss becomes comparable with the decision maker's assets. Hence measuring risk-aversion when the potential loss or gain is small will produce values close to the risk-neutral value of zero, irrespective of who the decision maker is. It will also be shown how the generally accepted practice of basing a measurement on the results of a three-term Taylor series will estimate a limiting value, minimum or maximum, rather than the value utilised in the decision. A solution is to match the correct utility function to the results instead.
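
    The suggestion to "match the correct utility function to the results" can be illustrated with a small root-finding exercise: given a gamble, a wealth level and a stated certainty equivalent, solve for the risk-aversion coefficient that makes the utility of the certainty equivalent equal the expected utility. The CRRA form and all numbers below are hypothetical choices for illustration, not the paper's own measurements.

```python
import numpy as np
from scipy.optimize import brentq

def crra_utility(w, gamma):
    """Constant relative risk aversion utility of wealth w."""
    return np.log(w) if abs(gamma - 1.0) < 1e-9 else w ** (1.0 - gamma) / (1.0 - gamma)

def implied_risk_aversion(outcomes, probs, certainty_equivalent, wealth):
    """Find gamma such that u(wealth + CE) equals E[u(wealth + X)]."""
    def gap(gamma):
        expected_u = np.dot(probs, crra_utility(wealth + outcomes, gamma))
        return crra_utility(wealth + certainty_equivalent, gamma) - expected_u
    return brentq(gap, 0.01, 15.0)   # bracket chosen to contain the root

# Hypothetical 50/50 gamble of +/- 10,000 against wealth of 50,000,
# for which the decision maker states a certainty equivalent of -1,500.
gamma = implied_risk_aversion(np.array([10_000.0, -10_000.0]),
                              np.array([0.5, 0.5]), -1_500.0, 50_000.0)
print(gamma)
```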

  3. The Effect of XPD Polymorphisms on Digestive Tract Cancers Risk: A Meta-Analysis

    PubMed Central

    Zhang, Qian; Chen, Zhipeng; Lu, Kai; Shu, Yongqian; Chen, Tao; Zhu, Lingjun

    2014-01-01

    Background The Xeroderma pigmentosum group D gene (XPD) plays a key role in nucleotide excision repair. Single nucleotide polymorphisms (SNPs) located in its functional region may alter DNA repair capacity phenotype and cancer risk. Many studies have demonstrated that XPD polymorphisms are significantly associated with digestive tract cancers risk, but the results are inconsistent. We conducted a comprehensive meta-analysis to assess the association between XPD Lys751Gln polymorphism and digestive tract cancers risk. The digestive tract cancers that our study referred to include oral cancer, esophageal cancer, gastric cancer and colorectal cancer. Methods We searched PubMed and EmBase up to December 31, 2012 to identify eligible studies. A total of 37 case-control studies including 9027 cases and 16072 controls were involved in this meta-analysis. Statistical analyses were performed with Stata software (version 11.0, USA). Odds ratios (ORs) with 95% confidence intervals (CIs) were used to assess the strength of the association. Results The results showed that XPD Lys751Gln polymorphism was associated with the increased risk of digestive tract cancers (homozygote comparison (GlnGln vs. LysLys): OR = 1.12, 95% CI = 1.01–1.24, P = 0.029, P heterogeneity = 0.133). We found no statistical evidence for a significantly increased digestive tract cancers risk in the other genetic models. In the subgroup analysis, we also found the homozygote comparison increased the susceptibility of the Asian population (OR = 1.28, 95% CI = 1.01–1.63, P = 0.045, P heterogeneity = 0.287). Stratified by cancer type and source of control, no significantly increased cancer risk was found in these subgroups. Additionally, risk estimates from hospital-based studies and esophageal studies were heterogeneous. Conclusions Our meta-analysis suggested that the XPD 751Gln/Gln genotype was a low-penetrance risk factor for developing digestive tract cancers, especially

  4. Wood dust exposure and lung cancer risk: a meta-analysis.

    PubMed

    Hancock, David G; Langley, Mary E; Chia, Kwan Leung; Woodman, Richard J; Shanahan, E Michael

    2015-12-01

    Occupational lung cancers represent a major health burden due to their increasing prevalence and poor long-term outcomes. While wood dust is a confirmed human carcinogen, its association with lung cancer remains unclear due to inconsistent findings in the literature. We aimed to clarify this association using meta-analysis. We performed a search of 10 databases to identify studies published until June 2014. We assessed the lung cancer risk associated with wood dust exposure as the primary outcome and with wood dust-related occupations as a secondary outcome. Random-effects models were used to pool summary risk estimates. In total, 85 publications were included in the meta-analysis. A significantly increased risk for developing lung cancer was observed among studies that directly assessed wood dust exposure (RR 1.21, 95% CI 1.05 to 1.39, n=33) and that assessed wood dust-related occupations (RR 1.15, 95% CI 1.07 to 1.23, n=59). In contrast, a reduced risk for lung cancer was observed among wood dust (RR 0.63, 95% CI 0.39 to 0.99, n=5) and occupation (RR 0.96, 95% CI 0.95 to 0.98, n=1) studies originating in Nordic countries, where softwood dust is the primary exposure. These results were independent of the presence of adjustment for smoking and exposure classification methods. Only minor differences in risk between the histological subtypes were identified. This meta-analysis provides strong evidence for an association between wood dust and lung cancer, which is critically influenced by the geographic region of the study. The reasons for these region-specific effect estimates remain to be clarified, but may suggest a differential effect for hardwood and softwood dusts. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  5. Mendelian randomization analysis of a time-varying exposure for binary disease outcomes using functional data analysis methods.

    PubMed

    Cao, Ying; Rajan, Suja S; Wei, Peng

    2016-12-01

    A Mendelian randomization (MR) analysis is performed to analyze the causal effect of an exposure variable on a disease outcome in observational studies, by using genetic variants that affect the disease outcome only through the exposure variable. This method has recently gained popularity among epidemiologists given the success of genetic association studies. Many exposure variables of interest in epidemiological studies are time varying, for example, body mass index (BMI). Although longitudinal data have been collected in many cohort studies, current MR studies only use one measurement of a time-varying exposure variable, which cannot adequately capture the long-term time-varying information. We propose using the functional principal component analysis method to recover the underlying individual trajectory of the time-varying exposure from the sparsely and irregularly observed longitudinal data, and then conduct MR analysis using the recovered curves. We further propose two MR analysis methods. The first assumes a cumulative effect of the time-varying exposure variable on the disease risk, while the second assumes a time-varying genetic effect and employs functional regression models. We focus on statistical testing for a causal effect. Our simulation studies mimicking the real data show that the proposed functional data analysis based methods incorporating longitudinal data have substantial power gains compared to standard MR analysis using only one measurement. We used the Framingham Heart Study data to demonstrate the promising performance of the new methods as well as inconsistent results produced by the standard MR analysis that relies on a single measurement of the exposure at some arbitrary time point. © 2016 WILEY PERIODICALS, INC.

  6. A free and open source QGIS plugin for flood risk analysis: FloodRisk

    NASA Astrophysics Data System (ADS)

    Albano, Raffaele; Sole, Aurelia; Mancusi, Leonardo

    2016-04-01

    An analysis of global statistics shows a substantial increase in flood damage over the past few decades. Moreover, it is expected that flood risk will continue to rise due to the combined effect of increasing numbers of people and economic assets in risk-prone areas and the effects of climate change. In order to increase the resilience of European economies and societies, the improvement of risk assessment and management has been pursued in recent years. This has resulted in a wide range of flood analysis models of different complexities, with substantial differences in the underlying components needed for their implementation, as geographical, hydrological and social differences demand specific approaches in the different countries. At present, there is an emerging need to promote the creation of open, transparent, reliable and extensible tools for a comprehensive, context-specific and applicable flood risk analysis. In this context, the free and open-source Quantum GIS (QGIS) plugin "FloodRisk" is a good starting point to address this objective. The vision of the developers of this free and open source software (FOSS) is to combine the main features of state-of-the-art science, collaboration, transparency and interoperability in an initiative to assess and communicate flood risk worldwide and to assist authorities to facilitate the quality and fairness of flood risk management at multiple scales. Among the scientific community, this type of activity can be labelled as "participatory research", intended as adopting a set of techniques that "are interactive and collaborative" and reproducible, "providing a meaningful research experience that both promotes learning and generates knowledge and research data through a process of guided discovery" (Albano et al., 2015). Moreover, this FOSS geospatial approach can lower the financial barriers to understanding risks at national and sub-national levels through a spatio-temporal domain and can provide better and more complete

  7. Analysis of risk factors for central venous port failure in cancer patients

    PubMed Central

    Hsieh, Ching-Chuan; Weng, Hsu-Huei; Huang, Wen-Shih; Wang, Wen-Ke; Kao, Chiung-Lun; Lu, Ming-Shian; Wang, Chia-Siu

    2009-01-01

    AIM: To analyze the risk factors for central port failure in cancer patients administered chemotherapy, using univariate and multivariate analyses. METHODS: A total of 1348 totally implantable venous access devices (TIVADs) were implanted into 1280 cancer patients in this cohort study. A Cox proportional hazard model was applied to analyze risk factors for failure of TIVADs. Log-rank test was used to compare actuarial survival rates. Infection, thrombosis, and surgical complication rates (χ2 test or Fisher’s exact test) were compared in relation to the risk factors. RESULTS: Increasing age, male gender and open-ended catheter use were significant risk factors reducing survival of TIVADs as determined by univariate and multivariate analyses. Hematogenous malignancy decreased the survival time of TIVADs; this reduction was not statistically significant by univariate analysis [hazard ratio (HR) = 1.336, 95% CI: 0.966-1.849, P = 0.080)]. However, it became a significant risk factor by multivariate analysis (HR = 1.499, 95% CI: 1.079-2.083, P = 0.016) when correlated with variables of age, sex and catheter type. Close-ended (Groshong) catheters had a lower thrombosis rate than open-ended catheters (2.5% vs 5%, P = 0.015). Hematogenous malignancy had higher infection rates than solid malignancy (10.5% vs 2.5%, P < 0.001). CONCLUSION: Increasing age, male gender, open-ended catheters and hematogenous malignancy were risk factors for TIVAD failure. Close-ended catheters had lower thrombosis rates and hematogenous malignancy had higher infection rates. PMID:19787834
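
    A hedged sketch of how a multivariate Cox proportional hazards model of device failure could be fitted with the lifelines package; the column names and the simulated data below are hypothetical stand-ins for the cohort described, not the study data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "age":         rng.integers(30, 85, n),
    "male":        rng.binomial(1, 0.5, n),
    "open_ended":  rng.binomial(1, 0.4, n),   # catheter type (1 = open-ended)
    "hematologic": rng.binomial(1, 0.3, n),   # hematogenous vs solid malignancy
})

# Assumed hazard increasing with age, male sex, open-ended catheters and hematologic disease
lin = 0.02 * (df.age - 60) + 0.3 * df.male + 0.4 * df.open_ended + 0.4 * df.hematologic
t_event = rng.exponential(1.0 / (0.002 * np.exp(lin.to_numpy())))
t_censor = rng.uniform(100, 900, n)
df["days_in_place"] = np.minimum(t_event, t_censor)
df["failed"] = (t_event <= t_censor).astype(int)   # 1 = device failure observed

cph = CoxPHFitter()
cph.fit(df, duration_col="days_in_place", event_col="failed")
cph.print_summary()   # hazard ratios (exp(coef)) with 95% CIs for each candidate risk factor
```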

  8. Meta-analysis for aggregated survival data with competing risks: a parametric approach using cumulative incidence functions.

    PubMed

    Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido

    2016-09-01

    Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may no longer translate into changes in survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
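
    Under the constant cause-specific hazards assumption used in this record, the cumulative incidence of event type k at time t is CIF_k(t) = (lambda_k / lambda) * (1 - exp(-lambda * t)), where lambda is the all-cause hazard. A small illustration with invented aggregated counts from a single two-arm comparison:

```python
import numpy as np

def cif(lam_event, lam_all, t):
    """Cumulative incidence of one event type under constant cause-specific hazards."""
    return (lam_event / lam_all) * (1.0 - np.exp(-lam_all * t))

# Hypothetical aggregated data: events of interest, competing events, total person-years per arm
d1_trt, d2_trt, py_trt = 30, 12, 800.0
d1_ctl, d2_ctl, py_ctl = 45, 10, 780.0

# Constant cause-specific hazards estimated as events / person-time
l1_trt, l2_trt = d1_trt / py_trt, d2_trt / py_trt
l1_ctl, l2_ctl = d1_ctl / py_ctl, d2_ctl / py_ctl

t = np.array([1.0, 3.0, 5.0])                      # follow-up horizons in years
cif_ratio = cif(l1_trt, l1_trt + l2_trt, t) / cif(l1_ctl, l1_ctl + l2_ctl, t)
csh_ratio = l1_trt / l1_ctl                        # cause-specific hazard ratio for comparison
print("CIF ratios at t =", t, ":", np.round(cif_ratio, 3), "| cause-specific HR:", round(csh_ratio, 3))
```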

  9. Approach to proliferation risk assessment based on multiple objective analysis framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrianov, A.; Kuptsov, I.

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities) to be taken into account. Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  10. Methods to Develop Inhalation Cancer Risk Estimates for ...

    EPA Pesticide Factsheets

    This document summarizes the approaches and rationale for the technical and scientific considerations used to derive inhalation cancer risks for emissions of chromium and nickel compounds from electric utility steam generating units. The purpose of this document is to discuss the methods used to develop inhalation cancer risk estimates associated with emissions of chromium and nickel compounds from coal- and oil-fired electric utility steam generating units (EGUs) in support of EPA's recently proposed Air Toxics Rule.

  11. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum numbers of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total 'failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user

  12. New risk metrics and mathematical tools for risk analysis: Current and future challenges

    NASA Astrophysics Data System (ADS)

    Skandamis, Panagiotis N.; Andritsos, Nikolaos; Psomas, Antonios; Paramythiotis, Spyridon

    2015-01-01

    The current status of the food safety supply worldwide has led the Food and Agriculture Organization (FAO) and the World Health Organization (WHO) to establish Risk Analysis as the single framework for building food safety control programs. A series of guidelines and reports that detail the various steps in Risk Analysis, namely Risk Management, Risk Assessment and Risk Communication, is available. The Risk Analysis approach enables integration between operational food management systems, such as Hazard Analysis Critical Control Points, public health and governmental decisions. To do that, a series of new Risk Metrics has been established as follows: i) the Appropriate Level of Protection (ALOP), which indicates the maximum numbers of illnesses in a population per annum, defined by quantitative risk assessments, and used to establish ii) the Food Safety Objective (FSO), which sets the maximum frequency and/or concentration of a hazard in a food at the time of consumption that provides or contributes to the ALOP. Given that ALOP is rather a metric of the tolerable public health burden (it addresses the total `failure' that may be handled at a national level), it is difficult to translate into control measures applied at the manufacturing level. Thus, a series of specific objectives and criteria for the performance of individual processes and products have been established, all of them assisting in the achievement of the FSO and hence the ALOP. In order to achieve the FSO, tools quantifying the effect of processes and intrinsic properties of foods on the survival and growth of pathogens are essential. In this context, predictive microbiology and risk assessment have offered important assistance to Food Safety Management. Predictive modelling is the basis of exposure assessment and the development of stochastic and kinetic models, which are also available in the form of Web-based applications (e.g., COMBASE and the Microbial Responses Viewer), or introduced into user-friendly software
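
    The FSO concept described above is commonly operationalised (for example in ICMSF guidance) through the inequality H0 - ΣR + ΣI ≤ FSO, where H0 is the initial hazard level, ΣR the total log reduction and ΣI the total log increase, all in log10 cfu/g. A toy compliance check with invented values:

```python
# Hypothetical worked example of the food safety objective inequality
# H0 - sum(R) + sum(I) <= FSO, with all terms in log10 cfu/g.
h0 = 3.0                 # initial level of the hazard in the raw material
reductions = [5.0]       # e.g. a thermal processing step giving a 5-log reduction
increases = [0.5, 1.0]   # e.g. recontamination and growth during storage
fso = -1.0               # maximum frequency/concentration allowed at consumption

outcome = h0 - sum(reductions) + sum(increases)
print(f"expected level at consumption: {outcome:.1f} log10 cfu/g")
print("meets FSO" if outcome <= fso else "does not meet FSO: add controls or larger reductions")
```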

  13. Fall-Risk-Increasing Drugs: A Systematic Review and Meta-Analysis: I. Cardiovascular Drugs.

    PubMed

    de Vries, Max; Seppala, Lotta J; Daams, Joost G; van de Glind, Esther M M; Masud, Tahir; van der Velde, Nathalie

    2018-04-01

    Use of certain medications is recognized as a major and modifiable risk factor for falls. Although the literature on psychotropic drugs is compelling, the literature on cardiovascular drugs as potential fall-risk-increasing drugs is conflicting. The aim of this systematic review and meta-analysis is to provide a comprehensive overview of the associations between cardiovascular medications and fall risk in older adults. Design: A systematic review and meta-analysis. Medline, Embase, and PsycINFO. Key search concepts were "fall," "aged," "causality," and "medication." Studies that investigated cardiovascular medications as risk factors for falls in participants ≥60 years old or participants with a mean age of 70 or older were included. A meta-analysis was performed using the generic inverse variance method, pooling unadjusted and adjusted odds ratios (ORs) separately. In total, 131 studies were included in the qualitative synthesis. Meta-analysis using adjusted ORs showed significant results (pooled OR [95% confidence interval]) for loop diuretics, OR 1.36 (1.17, 1.57), and beta-blocking agents, OR 0.88 (0.80, 0.97). Meta-analysis using unadjusted ORs showed significant results for digitalis, OR 1.60 (1.08, 2.36); digoxin, OR 2.06 (1.56, 2.74); and statins, OR 0.80 (0.65, 0.98). Most of the meta-analyses resulted in substantial heterogeneity that mostly did not disappear after stratification for population and setting. In a descriptive synthesis, consistent associations were not observed. Loop diuretics were significantly associated with increased fall risk, whereas beta-blockers were significantly associated with decreased fall risk. Digitalis and digoxin may increase the risk of falling, and statins may reduce it. For the majority of cardiovascular medication groups, outcomes were inconsistent. Furthermore, recent studies indicate that specific drug properties, such as selectivity of beta-blockers, may affect fall risk, and drug-disease interaction also may play

  14. Efficacy of ACL injury risk screening methods in identifying high-risk landing patterns during a sport-specific task.

    PubMed

    Fox, A S; Bonacci, J; McLean, S G; Saunders, N

    2017-05-01

    Screening methods sensitive to movement strategies that increase anterior cruciate ligament (ACL) loads are likely to be effective in identifying athletes at risk of ACL injury. Current ACL injury risk screening methods are yet to be evaluated for their ability to identify athletes who exhibit high-risk lower limb mechanics during sport-specific maneuvers associated with ACL injury occurrences. The purpose of this study was to examine the efficacy of two ACL injury risk screening methods in identifying high-risk lower limb mechanics during a sport-specific landing task. Thirty-two female athletes were screened using the Landing Error Scoring System (LESS) and Tuck Jump Assessment. Participants also completed a sport-specific landing task, during which three-dimensional kinematic and kinetic data were collected. One-dimensional statistical parametric mapping was used to examine the relationships between screening method scores and the three-dimensional hip and knee joint rotation and moment data from the sport-specific landing. Higher LESS scores were associated with reduced knee flexion from 30 to 57 ms after initial contact (P = 0.003) during the sport-specific landing; however, no additional relationships were found. These findings suggest the LESS and Tuck Jump Assessment may have minimal applicability in identifying athletes who exhibit high-risk landing postures in the sport-specific task examined. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Applicability of the Common Safety Method for Risk Evaluation and Assessment (CSM-RA) to the Space Domain

    NASA Astrophysics Data System (ADS)

    Moreira, Francisco; Silva, Nuno

    2016-08-01

    Safety-critical systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially concerning completeness of the hazards, formalization, and timely analysis able to influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. When and how to apply them, and the relations and similarities of these processes to industry standards and system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.

  16. Method and system for dynamic probabilistic risk assessment

    NASA Technical Reports Server (NTRS)

    Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)

    2013-01-01

    The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, and supports all common PRA analysis functions, including cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.

  17. Quality-by-Design II: Application of Quantitative Risk Analysis to the Formulation of Ciprofloxacin Tablets.

    PubMed

    Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W

    2016-04-01

    Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
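
    A bare-bones Monte Carlo illustration of "probability of failure for a given severity" over a design space: sample process parameters from assumed operating distributions, push them through an assumed response model, and count out-of-specification outcomes. The model form, parameters, and limits below are invented and far simpler than the Bayesian treatment in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical process parameters with assumed operating distributions
compression_force = rng.normal(12.0, 1.0, n)     # kN
binder_level = rng.normal(3.0, 0.3, n)           # % w/w

# Assumed response model for a critical quality attribute (dissolution at 30 min, %)
dissolution = 60 + 2.5 * binder_level - 1.2 * (compression_force - 12) + rng.normal(0, 3, n)

# Probability of failing the specification, i.e. quantitative risk at this design point
spec_lower = 60.0
p_fail = np.mean(dissolution < spec_lower)
print(f"P(dissolution < {spec_lower}%) ≈ {p_fail:.4f}")
```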

  18. Supplemental Hazard Analysis and Risk Assessment - Hydrotreater

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Peter P.; Wagner, Katie A.

    A supplemental hazard analysis was conducted and quantitative risk assessment performed in response to an independent review comment received by the Pacific Northwest National Laboratory (PNNL) from the U.S. Department of Energy Pacific Northwest Field Office (PNSO) against the Hydrotreater/Distillation Column Hazard Analysis Report issued in April 2013. The supplemental analysis used the hazardous conditions documented by the previous April 2013 report as a basis. The conditions were screened and grouped for the purpose of identifying whether additional prudent, practical hazard controls could be identified, using a quantitative risk evaluation to assess the adequacy of the controls and establish a lower level of concern for the likelihood of potential serious accidents. Calculations were performed to support conclusions where necessary.

  19. Pharmaceutical supply chain risk assessment in Iran using analytic hierarchy process (AHP) and simple additive weighting (SAW) methods.

    PubMed

    Jaberidoost, Mona; Olfat, Laya; Hosseini, Alireza; Kebriaeezadeh, Abbas; Abdollahi, Mohammad; Alaeddini, Mahdi; Dinarvand, Rassoul

    2015-01-01

    Pharmaceutical supply chain is a significant component of the health system in supplying medicines, particularly in countries where the main drugs are provided by local pharmaceutical companies. No previous studies exist assessing risks and disruptions in pharmaceutical companies while assessing the pharmaceutical supply chain. Any risks affecting the pharmaceutical companies could disrupt the supply of medicines and health system efficiency. The goal of this study was risk assessment in the pharmaceutical industry in Iran, considering the priority of processes and the hazard and probability of risks. The study was carried out in four phases: risk identification through a literature review, risk identification in Iranian pharmaceutical companies through interviews with experts, risk analysis through a questionnaire and consultation with experts using the group analytic hierarchy process (AHP) method and a rating scale (RS), and risk evaluation using the simple additive weighting (SAW) method. In total, 86 main risks were identified in the pharmaceutical supply chain from the perspective of pharmaceutical companies, classified into 11 classes. The majority of risks described in this study were related to the financial and economic category. Also, financial management was found to be the most important factor for consideration. Although the pharmaceutical industry and supply chain were affected by the political conditions in Iran at the time of the study, half of the total risks in the pharmaceutical supply chain were found to be internal risks that could be fixed by the companies internally. Likewise, the political situation and related risks forced companies to focus more on financial and supply management, resulting in less attention to quality management.
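
    A compact, hypothetical illustration of the two techniques named in this record: AHP weights taken as the principal eigenvector of a pairwise comparison matrix, followed by SAW ranking of risks as the weighted sum of normalised criterion scores. The judgements and risk items below are invented.

```python
import numpy as np

# --- AHP: derive criteria weights from a pairwise comparison matrix (hypothetical judgements) ---
# criteria: [hazard severity, probability, process priority]
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                       # principal eigenvector, normalised to sum to 1

# --- SAW: rank risks by the weighted sum of normalised criterion scores ---
risks = ["currency fluctuation", "API import delay", "GMP non-compliance", "demand shock"]
scores = np.array([[9, 7, 6],         # rows = risks, columns = criteria (1-10 scale)
                   [7, 8, 9],
                   [8, 4, 7],
                   [6, 6, 5]], dtype=float)
norm = scores / scores.max(axis=0)    # benefit-type normalisation per criterion
saw = norm @ w
for name, score in sorted(zip(risks, saw), key=lambda x: -x[1]):
    print(f"{name:22s} {score:.3f}")
```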

  20. The impact of communicating genetic risks of disease on risk-reducing health behaviour: systematic review with meta-analysis.

    PubMed

    Hollands, Gareth J; French, David P; Griffin, Simon J; Prevost, A Toby; Sutton, Stephen; King, Sarah; Marteau, Theresa M

    2016-03-15

    To assess the impact of communicating DNA based disease risk estimates on risk-reducing health behaviours and motivation to engage in such behaviours. Systematic review with meta-analysis, using Cochrane methods. Medline, Embase, PsycINFO, CINAHL, and the Cochrane Central Register of Controlled Trials up to 25 February 2015. Backward and forward citation searches were also conducted. Randomised and quasi-randomised controlled trials involving adults in which one group received personalised DNA based estimates of disease risk for conditions where risk could be reduced by behaviour change. Eligible studies included a measure of risk-reducing behaviour. We examined 10,515 abstracts and included 18 studies that reported on seven behavioural outcomes, including smoking cessation (six studies; n=2663), diet (seven studies; n=1784), and physical activity (six studies; n=1704). Meta-analysis revealed no significant effects of communicating DNA based risk estimates on smoking cessation (odds ratio 0.92, 95% confidence interval 0.63 to 1.35, P=0.67), diet (standardised mean difference 0.12, 95% confidence interval -0.00 to 0.24, P=0.05), or physical activity (standardised mean difference -0.03, 95% confidence interval -0.13 to 0.08, P=0.62). There were also no effects on any other behaviours (alcohol use, medication use, sun protection behaviours, and attendance at screening or behavioural support programmes) or on motivation to change behaviour, and no adverse effects, such as depression and anxiety. Subgroup analyses provided no clear evidence that communication of a risk-conferring genotype affected behaviour more than communication of the absence of such a genotype. However, studies were predominantly at high or unclear risk of bias, and evidence was typically of low quality. The expectation that communicating DNA based risk estimates changes behaviour is not supported by existing evidence. These results do not support use of genetic testing or the search for risk

  1. A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems

    PubMed Central

    Juhnke, Christin; Bethge, Susanne

    2016-01-01

    Introduction: Effective risk adjustment is an aspect that is increasingly given weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment as well as of crucial weights in risk adjustment. Moreover, the predictive performance of selected methods in international healthcare systems was to be analysed. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted in terms of an encompassing, interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: in terms of risk horizons, in terms of risk factors, or in terms of the combination of indicators included. Within these, another differentiation by three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, as well as those based on information on (self-reported) health status. Conclusions and discussion: After the final examination of different methods of risk adjustment it was shown that the methodology used to adjust risks varies. The models differ greatly in terms of their included morbidity indicators. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can be integrated into quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts. PMID:28316544

  2. Influence of dorsolateral prefrontal cortex and ventral striatum on risk avoidance in addiction: a mediation analysis*

    PubMed Central

    Yamamoto, Dorothy J.; Woo, Choong-Wan; Wager, Tor D.; Regner, Michael F.; Tanabe, Jody

    2015-01-01

    Background Alterations in frontal and striatal function are hypothesized to underlie risky decision-making in drug users, but how these regions interact to affect behavior is incompletely understood. We used mediation analysis to investigate how prefrontal cortex and ventral striatum together influence risk avoidance in abstinent drug users. Method Thirty-seven abstinent substance-dependent individuals (SDI) and 43 controls underwent fMRI while performing a decision-making task involving risk and reward. Analyses of a priori regions-of-interest tested whether activity in dorsolateral prefrontal cortex (DLPFC) and ventral striatum (VST) explained group differences in risk avoidance. Whole-brain analysis was conducted to identify brain regions influencing the negative VST-risk avoidance relationship. Results Right DLPFC (RDLPFC) positively mediated the group-risk avoidance relationship (p < 0.05); RDLPFC activity was higher in SDI and predicted higher risk avoidance across groups, controlling for SDI vs. controls. Conversely, VST activity negatively influenced risk avoidance (p < 0.05); it was higher in SDI, and predicted lower risk avoidance. Whole-brain analysis revealed that, across group, RDLPFC and left temporal-parietal junction positively (p ≤ 0.001) while right thalamus and left middle frontal gyrus negatively (p < 0.005) mediated the VST activity-risk avoidance relationship. Conclusion RDLPFC activity mediated less risky decision-making while VST mediated more risky decision-making across drug users and controls. These results suggest a dual pathway underlying decision-making, which, if imbalanced, may adversely influence choices involving risk. Modeling contributions of multiple brain systems to behavior through mediation analysis could lead to a better understanding of mechanisms of behavior and suggest neuromodulatory treatments for addiction. PMID:25736619
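
    A minimal sketch of the mediation logic described here (path a: group to mediator; path b: mediator to outcome controlling for group; indirect effect a*b with a bootstrap interval), run on simulated data and without any of the fMRI-specific machinery of the study.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 80
group = rng.binomial(1, 0.5, n)                                  # 1 = substance-dependent, 0 = control
mediator = 0.5 * group + rng.normal(0, 1, n)                     # e.g. standardised RDLPFC activity
outcome = 0.4 * mediator + 0.1 * group + rng.normal(0, 1, n)     # e.g. risk-avoidance score

def indirect_effect(g, m, y):
    """Product-of-coefficients estimate of the mediated (indirect) effect."""
    a = sm.OLS(m, sm.add_constant(g)).fit().params[1]                             # group -> mediator
    b = sm.OLS(y, sm.add_constant(np.column_stack([g, m]))).fit().params[2]       # mediator -> outcome | group
    return a * b

point = indirect_effect(group, mediator, outcome)

# Non-parametric bootstrap interval for the indirect effect
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(group[idx], mediator[idx], outcome[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b = {point:.3f}, 95% bootstrap CI = ({lo:.3f}, {hi:.3f})")
```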

  3. Using risk elasticity to prioritize risk reduction strategies for geographical areas and industry sectors.

    PubMed

    Li, Pei-Chiun; Ma, Hwong-Wen

    2016-01-25

    The total quantity of chemical emissions does not take into account their chemical toxicity, and fails to be an accurate indicator of the potential impact on human health. The sources of released contaminants, and therefore, the potential risk, also differ based on geography. Because of the complexity of the risk, there is no integrated method to evaluate the effectiveness of risk reduction. Therefore, this study developed a method to incorporate the spatial variability of emissions into human health risk assessment to evaluate how to effectively reduce risk using risk elasticity analysis. Risk elasticity analysis, the percentage change in risk in response to the percentage change in emissions, was adopted in this study to evaluate the effectiveness and efficiency of risk reduction. The results show that the main industry sectors are different in each area, and that high emission in an area does not correspond to high risk. Decreasing the high emissions of certain sectors in an area does not result in efficient risk reduction in this area. This method can provide more holistic information for risk management, prevent the development of increased risk, and prioritize the risk reduction strategies. Copyright © 2015 Elsevier B.V. All rights reserved.
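
    Risk elasticity as defined in this record is simply the percentage change in risk divided by the percentage change in emissions; a toy calculation with invented numbers:

```python
def risk_elasticity(risk_base, risk_new, emis_base, emis_new):
    """Percentage change in risk divided by percentage change in emissions."""
    pct_risk = (risk_new - risk_base) / risk_base
    pct_emis = (emis_new - emis_base) / emis_base
    return pct_risk / pct_emis

# Hypothetical sector in one area: a 10% cut in emissions lowers the modelled risk by 4%
e = risk_elasticity(risk_base=2.0e-5, risk_new=1.92e-5, emis_base=100.0, emis_new=90.0)
print(e)   # 0.4: risk responds weakly to this sector, so cutting it is a relatively inefficient strategy
```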

  4. Probabilistic Risk Analysis of Groundwater Related Problems in Subterranean Excavation Sites

    NASA Astrophysics Data System (ADS)

    Sanchez-Vila, X.; Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Bolster, D.; Fernandez-Garcia, D.; Tartakovsky, D. M.

    2009-12-01

    Construction of subterranean excavations in densely populated areas is inherently hazardous. The number of construction sites (e.g., subway lines, railways and highway tunnels) has increased in recent years. These sites can pose risks to workers at the site as well as cause damage to surrounding buildings. The presence of groundwater makes the excavation even more complicated. We develop a probabilistic risk assessment (PRA) model to estimate the likelihood of occurrence of certain risks during a subway station construction. While PRA is widely used in many engineering fields, its applications to underground constructions in general and to an underground station construction in particular are scarce if not nonexistent. This method enables us not only to evaluate the probability of failure, but also to quantify the uncertainty of the different events considered. The risk analysis was carried out using a fault tree analysis that made it possible to study a complex system in a structured and straightforward manner. As an example we consider an underground station for the new subway line in the Barcelona metropolitan area (Línia 9) through the town of Prat de Llobregat in the Llobregat River Delta, which is currently under development. A typical station on the L9 line lies partially between the shallow and the main aquifer. Specifically, it is located in the middle layer which is made up of silts and clays. By presenting this example we aim to illustrate PRA as an effective methodology for estimating and minimising risks and to demonstrate its utility as a potential tool for decision making.
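
    A toy fault tree quantification in the spirit of this record: independent basic events combined through AND/OR gates up to a top-event probability. The event structure and probabilities below are invented, not those of the L9 station analysis.

```python
# Minimal fault tree evaluation with independent basic events (hypothetical probabilities).
def p_and(*ps):                      # AND gate: all inputs must occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):                       # OR gate: at least one input occurs (independence assumed)
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Basic events for a flooding-of-excavation top event (illustrative numbers only)
p_pump_failure = 0.05
p_power_loss = 0.02
p_backup_fails = 0.10
p_wall_leak = 0.03
p_high_water_table = 0.30

dewatering_lost = p_or(p_pump_failure, p_and(p_power_loss, p_backup_fails))
flooding = p_and(p_or(p_wall_leak, dewatering_lost), p_high_water_table)
print(f"P(top event: flooding during construction) ≈ {flooding:.4f}")
```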

  5. Assessment of Methods for Estimating Risk to Birds from ...

    EPA Pesticide Factsheets

    The U.S. EPA Ecological Risk Assessment Support Center (ERASC) announced the release of the final report entitled, Assessment of Methods for Estimating Risk to Birds from Ingestion of Contaminated Grit Particles. This report evaluates approaches for estimating the probability of ingestion by birds of contaminated particles such as pesticide granules or lead particles (i.e. shot or bullet fragments). In addition, it presents an approach for using this information to estimate the risk of mortality to birds from ingestion of lead particles. Response to ERASC Request #16

  6. Genetic Risk Score Analysis in Early-Onset Bipolar Disorder

    PubMed Central

    Croarkin, Paul E.; Luby, Joan L.; Cercy, Kelly; Geske, Jennifer R.; Veldic, Marin; Simonson, Matthew; Joshi, Paramjit T.; Wagner, Karen Dineen; Walkup, John T.; Nassan, Malik M.; Cuellar-Barboza, Alfredo B.; Casuto, Leah; McElroy, Susan L.; Jensen, Peter S.; Frye, Mark A.; Biernacka, Joanna M.

    2018-01-01

    Objective In this study, we performed a candidate genetic risk score (GRS) analysis of early-onset bipolar disorder. Method Treatment of Early Age Mania (TEAM) study enrollment and sample collection took place from 2003–2008. Mayo Clinic Bipolar Biobank samples were collected from 2009–2013. Genotyping and analyses for the present study took place from 2013–2014. The diagnosis of bipolar disorder was based on Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision criteria. Eight single-nucleotide polymorphisms (SNPs), previously reported in genome-wide association studies to be associated with bipolar disorder, were chosen for GRS analysis in early-onset bipolar disease. These SNPs map to 3 genes: CACNA1C (calcium channel, voltage-dependent, L type, alpha 1C subunit), ANK3 (ankyrin-3, node of Ranvier [ankyrin G]), and ODZ4 (teneurin transmembrane protein 4 [formerly “odz, odd Oz/ten-m homolog 4 {Drosophila}, ODZ4”]). The 8 candidate SNPs were genotyped in patients from the TEAM study (n=69), adult patients with bipolar disorder (n=732) including a subset with early-onset illness [n=192]), and healthy controls (n=776). GRS analyses were performed comparing early-onset cases with controls. In addition, associations of early-onset BD with individual SNPs and haplotypes were explored. Results GRS analysis revealed associations of the risk score with early-onset bipolar disorder (P=.01). Gene-level haplotype analysis comparing TEAM patients with controls suggested association of early-onset bipolar disorder with a CACNA1C haplotype (global test, P=.01). At the level of individual SNPs, comparison of TEAM cases with healthy controls provided nominally significant evidence for association of SNP rs10848632 in CACNA1C with early-onset bipolar disorder (P=.017), which did not remain significant after correction for multiple comparisons. Conclusion These preliminary analyses suggest that previously identified bipolar disorder risk loci
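
    A generic sketch of a weighted genetic risk score analysis of the kind named here: the score is the sum over SNPs of risk-allele counts times previously published log-OR weights, tested against case/control status by logistic regression. The SNP weights, allele frequencies, and samples are simulated and unrelated to the TEAM or Mayo Clinic data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_cases, n_controls, n_snps = 200, 500, 8

# Hypothetical per-SNP log-odds-ratio weights from prior GWAS (illustration only)
weights = rng.normal(0.1, 0.05, n_snps)

# Simulated allele counts (0/1/2), with cases mildly enriched for risk alleles
maf = rng.uniform(0.2, 0.4, n_snps)
controls = rng.binomial(2, maf, size=(n_controls, n_snps))
cases = rng.binomial(2, np.clip(maf + 0.05, 0, 1), size=(n_cases, n_snps))

geno = np.vstack([cases, controls])
status = np.r_[np.ones(n_cases), np.zeros(n_controls)]

# Weighted genetic risk score = sum over SNPs of (risk allele count x log-OR weight)
grs = geno @ weights

# Association of the score with case/control status via logistic regression
model = sm.Logit(status, sm.add_constant(grs)).fit(disp=0)
print(model.summary2().tables[1])
```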

  7. Risk Factor Analysis in Low-Temperature Geothermal Play Fairway Analysis for the Appalachian Basin (GPFA-AB)

    DOE Data Explorer

    Teresa E. Jordan

    2015-09-30

    This submission contains information used to compute the risk factors for the GPFA-AB project (DE-EE0006726). The risk factors are natural reservoir quality, thermal resource quality, potential for induced seismicity, and utilization. The methods used to combine the risk factors included taking the product, sum, and minimum of the four risk factors. The files are divided into images, rasters, shapefiles, and supporting information. The image files show what the raster and shapefiles should look like. The raster files contain the input risk factors, calculation of the scaled risk factors, and calculation of the combined risk factors. The shapefiles include definition of the fairways, definition of the US Census Places, the center of the raster cells, and locations of industries. Supporting information contains details of the calculations or processing used in generating the files. An image of a raster has the same name as the raster, except with *.png as the file ending instead of *.tif. Images with “fairways” or “industries” added to the name are composed of a raster with the relevant shapefile added. The file About_GPFA-AB_Phase1RiskAnalysisTask5DataUpload.pdf contains information on the citation, special use considerations, authorship, etc. More details on each file are given in the spreadsheet “list_of_contents.csv” in the folder “SupportingInfo”. Code used to calculate values is available at https://github.com/calvinwhealton/geothermal_pfa under the folder “combining_metrics”.
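
    The three combination rules mentioned (product, sum, and minimum of the scaled risk factors) reduce to element-wise raster algebra; a small stand-in with random arrays in place of the GPFA-AB rasters:

```python
import numpy as np

# Four hypothetical risk-factor rasters already scaled to [0, 1] (rows x cols grid cells)
rng = np.random.default_rng(3)
reservoir, thermal, seismicity, utilization = rng.random((4, 5, 5))

# Three combination rules analogous to those described: product, sum, and minimum
combined_product = reservoir * thermal * seismicity * utilization
combined_sum = reservoir + thermal + seismicity + utilization
combined_minimum = np.minimum.reduce([reservoir, thermal, seismicity, utilization])

# The minimum rule flags cells limited by their weakest factor; the product compounds all four
print(combined_minimum.round(2))
```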

  8. Efficacy of anthropometric measures for identifying cardiovascular disease risk in adolescents: review and meta-analysis.

    PubMed

    Lichtenauer, Michael; Wheatley, Sean D; Martyn-St James, Marrissa; Duncan, Michael J; Cobayashi, Fernanda; Berg, Gabriela; Musso, Carla; Graffigna, Mabel; Soutelo, Jimena; Bovet, Pascal; Kollias, Anastasios; Stergiou, George S; Grammatikos, Evangelos; Griffiths, Claire; Ingle, Lee; Jung, Christian

    2018-04-12

    To compare the ability of body mass index (BMI), waist circumference (WC) and waist-to-height ratio (WHtR) to estimate cardiovascular disease (CVD) risk levels in adolescents. A systematic review and meta-analysis was performed after a database search for relevant literature (Cochrane, Centre for Review and Dissemination, PubMed, British Nursing Index, CINAHL, BIOSIS citation index, ChildData, metaRegister). 117 records representing 96 studies with 994,595 participants were included in the systematic review, 14 of which (13 studies, n=14,610) were eligible for the meta-analysis. The results of the meta-analysis showed that BMI was a strong indicator of systolic blood pressure, diastolic blood pressure, triglycerides, high-density lipoprotein cholesterol and insulin, but not of total cholesterol, low-density lipoprotein or glucose. Few studies were eligible for inclusion in the meta-analysis considering WC or WHtR (n≤2). The narrative synthesis found measures of central adiposity to be consistently valid indicators of the same risk factors as BMI. BMI was an indicator of CVD risk. WC and WHtR were efficacious for indicating the same risk factors for which BMI performed strongly, though there was insufficient evidence to judge the relative strength of each measure, possibly due to heterogeneity in the methods for measuring and classifying WC.

  9. Fall-Risk-Increasing Drugs: A Systematic Review and Meta-analysis: III. Others.

    PubMed

    Seppala, Lotta J; van de Glind, Esther M M; Daams, Joost G; Ploegmakers, Kimberley J; de Vries, Max; Wermelink, Anne M A T; van der Velde, Nathalie

    2018-04-01

    The use of psychotropic medication and cardiovascular medication has been associated with an increased risk of falling. However, other frequently prescribed medication classes are still under debate as potential risk factors for falls in the older population. The aim of this systematic review and meta-analysis is to evaluate the associations between fall risk and nonpsychotropic and noncardiovascular medications. A systematic review and meta-analysis. A search was conducted in Medline, PsycINFO, and Embase. Key search concepts were "falls," "aged," "medication," and "causality." Studies were included that investigated nonpsychotropic and noncardiovascular medications as risk factors for falls in participants ≥60 years or participants with a mean age ≥70 years. A meta-analysis was performed using the generic inverse variance method, pooling unadjusted and adjusted odds ratio (OR) estimates separately. In a qualitative synthesis, 281 studies were included. The results of meta-analysis using adjusted data were as follows (a pooled OR [95% confidence interval]): analgesics, 1.42 (0.91-2.23); nonsteroidal anti-inflammatory drugs (NSAIDs), 1.09 (0.96-1.23); opioids, 1.60 (1.35-1.91); anti-Parkinson drugs, 1.54 (0.99-2.39); antiepileptics, 1.55 (1.25-1.92); and polypharmacy, 1.75 (1.27-2.41). Most of the meta-analyses resulted in substantial heterogeneity that did not disappear after stratification for population and setting in most cases. In a descriptive synthesis, consistent associations with falls were observed for long-term proton pump inhibitor use and opioid initiation. Laxatives showed inconsistent associations with falls (7/20 studies showing a positive association). Opioid and antiepileptic use and polypharmacy were significantly associated with increased risk of falling in the meta-analyses. Long-term use of proton pump inhibitors and opioid initiation might increase the fall risk. Future research is necessary because the causal role of some medication

  10. Alzheimer disease and cancer risk: a meta-analysis.

    PubMed

    Shi, Hai-bin; Tang, Bo; Liu, Yao-Wen; Wang, Xue-Feng; Chen, Guo-Jun

    2015-03-01

    Alzheimer disease (AD) and cancer are seemingly two opposite ends of one spectrum. Studies have suggested that patients with AD show a reduced risk of cancer and vice versa. However, the available evidence is not conclusive, so we conducted a meta-analysis of the published literature to systematically examine cancer risk in AD patients. A PubMed, EMBASE, and Web of Science search was conducted in May 2014. Pooled risk ratios (RRs) with their corresponding 95 % confidence intervals (CIs) were obtained using random-effects meta-analysis. We tested for publication bias and heterogeneity, and stratified for study characteristics, smoking-related cancers versus nonsmoking-related cancers, and site-specific cancers. Nine studies were included in this meta-analysis. Compared with controls, the pooled RR of cancer in AD patients was 0.55 (95 % CI 0.41-0.75), with significant heterogeneity among these studies (P < 0.001, I(2) = 83.5 %). The reduced cancer risk was more substantial when we restricted analyses to cohort studies, studies with adjusted estimates, studies defining AD by generally accepted criteria, and studies with a longer length of follow-up. In sub-analyses for site-specific cancers, only lung cancer showed a significantly decreased risk (RR 0.72; 95 % CI 0.56-0.91). We did not find significant publication bias (P = 0.251 for Begg and Mazumdar's test and P = 0.143 for Egger's regression asymmetry test). These results support an association between AD and decreased cancer risk.

  11. Arsenic metabolism and cancer risk: A meta-analysis.

    PubMed

    Gamboa-Loira, Brenda; Cebrián, Mariano E; Franco-Marina, Francisco; López-Carrillo, Lizbeth

    2017-07-01

    To describe the studies that have reported association measures between risk of cancer and the percentage distribution of urinary inorganic arsenic (iAs) metabolites by anatomical site, in non-ecological epidemiological studies. Studies were identified in the PubMed database in the period from 1990 to 2015. Inclusion criteria were: non-ecological epidemiological study, with histologically confirmed cancer cases, reporting the percentage distribution of inorganic arsenic (iAs), monomethylated (MMA) and dimethylated (DMA) metabolites, as well as association measures with confidence intervals (CI) between cancer and %iAs and/or %MMA and/or %DMA. A descriptive meta-analysis was performed using the inverse-variance method for the fixed effects model and the DerSimonian and Laird method for the random effects model. Heterogeneity was tested using the Q statistic, stratifying for epidemiological design and total As in urine. The possibility of publication bias was assessed through Begg's test. A total of 13 eligible studies were found, most of them performed in Taiwan and focused on skin and bladder cancer. The positive association between %MMA and various types of cancer was consistent, in contrast to the negative relationship between %DMA and cancer, which was inconsistent. The summary risk of bladder (OR=1.79; 95% CI: 1.42, 2.26, n=4 studies) and lung (OR=2.44; 95% CI: 1.57, 3.80, n=2 studies) cancer increased significantly with increasing %MMA, without statistical heterogeneity. In contrast, lung cancer risk was inversely related to %DMA (OR=0.58; 95% CI: 0.36, 0.93, n=2 studies), also without significant heterogeneity. These results were similar after stratifying by epidemiological design and total As in urine. No evidence of publication bias was found. These findings provide additional support that methylation needs to be taken into account when assessing the potential iAs carcinogenicity risk. Copyright © 2017. Published by Elsevier Inc.

  12. Developing an objective evaluation method to estimate diabetes risk in community-based settings.

    PubMed

    Kenya, Sonjia; He, Qing; Fullilove, Robert; Kotler, Donald P

    2011-05-01

    Exercise interventions often aim to affect abdominal obesity and glucose tolerance, two significant risk factors for type 2 diabetes. Because of limited financial and clinical resources in community and university-based environments, intervention effects are often measured with interviews or questionnaires and correlated with weight loss or body fat indicated by body bioimpedance analysis (BIA). However, self-reported assessments are subject to high levels of bias and low levels of reliability. Because obesity and body fat are correlated with diabetes at different levels in various ethnic groups, data reflecting changes in weight or fat do not necessarily indicate changes in diabetes risk. To determine how exercise interventions affect diabetes risk in community and university-based settings, improved evaluation methods are warranted. We compared a noninvasive, objective measurement technique--regional BIA--with whole-body BIA for its ability to assess abdominal obesity and predict glucose tolerance in 39 women. To determine regional BIA's utility in predicting glucose, we tested the association between the regional BIA method and blood glucose levels. Regional BIA estimates of abdominal fat area were significantly correlated (r = 0.554, P < 0.003) with fasting glucose. When waist circumference and family history of diabetes were added to abdominal fat in multiple regression models, the association with glucose increased further (r = 0.701, P < 0.001). Regional BIA estimates of abdominal fat may predict fasting glucose better than whole-body BIA as well as provide an objective assessment of changes in diabetes risk achieved through physical activity interventions in community settings.

  13. Methods Development for a Spatially Explicit Population-Level Risk Assessment, Uncertainty Analysis, and Comparison with Risk Quotient Approaches

    EPA Science Inventory

    The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...

  14. Risk analysis of hematopoietic stem cell transplant process: failure mode, effect, and criticality analysis and hazard analysis critical control point methods integration based on guidelines to good manufacturing practice for medicinal product ANNEX 20 (February 2008).

    PubMed

    Gianassi, S; Bisin, S; Bindi, B; Spitaleri, I; Bambi, F

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management approach can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to "weigh" identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical Control Point method, recognized as the most appropriate technique to identify risks associated with physical, chemical, and biological hazards for cellular products, consists of classifying finished product specifications and limits of acceptability, identifying all off-specifications, defining activities that can cause them, and finally establishing both a monitoring system for each Critical Control Point and corrective actions for deviations. The severity of possible effects on patients, as well as the occurrence and detectability of critical parameters, are measured on quantitative scales (Risk Priority Number [RPN]). Risk analysis was performed with this technique on the HPC manipulation process carried out at our blood center. The data analysis showed that the hazards with the highest RPN values and the greatest impact on the process are loss of dose and loss of tracking; the technical skills of operators and manual transcription of data were the most critical parameters. Problems related to operator skills are handled by defining targeted training programs, while other critical parameters can be mitigated with the use of continuous control systems. The blood center management software was complemented by a labeling system with forms designed to be in compliance with the standards in force, and by starting implementation of a cryopreservation management module. Copyright 2010 Elsevier Inc. All rights reserved.
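
    The RPN scoring referred to here is the product of severity, occurrence, and detectability ratings; a minimal ranking example with invented scores for hazards of the kind discussed:

```python
# Minimal RPN (Risk Priority Number) ranking for hazards in a cell-handling process.
# Severity, occurrence and detectability are scored on 1-10 scales; values are illustrative only.
hazards = [
    # (hazard,                            severity, occurrence, detectability)
    ("loss of cell dose",                  9,         3,          6),
    ("loss of tracking/labelling",         8,         4,          5),
    ("manual data transcription error",    6,         6,          4),
    ("microbial contamination",           10,         2,          3),
]

ranked = sorted(hazards, key=lambda h: h[1] * h[2] * h[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN={s * o * d:4d}  {name}")
```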

  15. What patient characteristics guide nurses' clinical judgement on pressure ulcer risk? A mixed methods study.

    PubMed

    Balzer, K; Kremer, L; Junghans, A; Halfens, R J G; Dassen, T; Kottner, J

    2014-05-01

    Nurses' clinical judgement plays a vital role in pressure ulcer risk assessment, but evidence is lacking on which patient characteristics are important for nurses' perception of patients' risk exposure. To explore which patient characteristics nurses employ when assessing pressure ulcer risk without use of a risk assessment scale. Mixed methods design triangulating observational data from the control group of a quasi-experimental trial and data from semi-structured interviews with nurses. Two traumatological wards at a university hospital. Quantitative data: A consecutive sample of 106 patients matching the eligibility criteria (age ≥ 18 years, no pressure ulcers category ≥ 2 at admission and ≥ 5 days expected length of stay). Qualitative data: A purposive sample of 16 nurses. Quantitative data: Predictor variables for pressure ulcer risk were measured by study assistants at the bedside every second day. Concurrently, nurses documented their clinical judgement on patients' pressure ulcer risk by means of a 4-step global judgement scale. Bivariate correlations between predictor variables and nurses' risk estimates were established. Qualitative data: In interviews, nurses were asked to assess fictitious patients' pressure ulcer risk and to justify their risk estimates. Patient characteristics perceived as relevant for nurses' judgements were thematically clustered. Triangulation: Firstly, predictors of nurses' risk estimates identified in bivariate analysis were cross-mapped with interview findings. Secondly, three models to predict nurses' risk estimates underwent multiple linear regression analysis. Nurses consider multiple patient characteristics for pressure ulcer risk assessment, but regard some conditions as more important than others. Triangulation showed that these are measures reflecting patients' exposure to pressure or overall care dependency. Qualitative data furthermore indicate that nurses are likely to trade off risk-enhancing conditions against

  16. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    PubMed Central

    Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model, including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified a fourth risk factor, the maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of populations with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
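
    One CHAID-style step (choosing the categorical predictor whose split yields the smallest chi-squared p-value) can be sketched as below; real CHAID additionally merges categories with Bonferroni adjustment and recurses on the resulting nodes. The toy data here are invented, not the Beijing survey data.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical infant records: anemia status plus binary risk factors (illustration only)
df = pd.DataFrame({
    "anemia":              [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
    "maternal_anemia":     [1, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 0, 1, 1, 0, 0],
    "exclusive_bf_6m":     [1, 1, 0, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0],
    "floating_population": [1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
})

def best_split(data, outcome="anemia"):
    """Return the predictor with the smallest chi-squared p-value against the outcome."""
    pvals = {}
    for col in data.columns.drop(outcome):
        table = pd.crosstab(data[col], data[outcome])
        _, p, _, _ = chi2_contingency(table)
        pvals[col] = p
    return min(pvals.items(), key=lambda kv: kv[1]), pvals

(best, p), all_p = best_split(df)
print("split on:", best, "p =", round(p, 4))
print("all p-values:", {k: round(v, 3) for k, v in all_p.items()})
```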

  17. Adversarial Risk Analysis for Urban Security Resource Allocation.

    PubMed

    Gil, César; Rios Insua, David; Rios, Jesus

    2016-04-01

    Adversarial risk analysis (ARA) provides a framework to deal with risks originating from intentional actions of adversaries. We show how ARA may be used to allocate security resources in the protection of urban spaces. We take into account the spatial structure and consider both proactive and reactive measures, in that we aim both to reduce criminality and to recover from it as well as possible, should it happen. We deal with the problem by deploying an ARA model over each spatial unit, coordinating the models through resource constraints, value aggregation, and proximity. We illustrate our approach with an example that uncovers several relevant policy issues. © 2016 Society for Risk Analysis.

  18. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  19. Plasma urate concentration and risk of coronary heart disease: a Mendelian randomisation analysis

    PubMed Central

    White, Jon; Sofat, Reecha; Hemani, Gibran; Shah, Tina; Engmann, Jorgen; Dale, Caroline; Shah, Sonia; Kruger, Felix A; Giambartolomei, Claudia; Swerdlow, Daniel I; Palmer, Tom; McLachlan, Stela; Langenberg, Claudia; Zabaneh, Delilah; Lovering, Ruth; Cavadino, Alana; Jefferis, Barbara; Finan, Chris; Wong, Andrew; Amuzu, Antoinette; Ong, Ken; Gaunt, Tom R; Warren, Helen; Davies, Teri-Louise; Drenos, Fotios; Cooper, Jackie; Ebrahim, Shah; Lawlor, Debbie A; Talmud, Philippa J; Humphries, Steve E; Power, Christine; Hypponen, Elina; Richards, Marcus; Hardy, Rebecca; Kuh, Diana; Wareham, Nicholas; Ben-Shlomo, Yoav; Day, Ian N; Whincup, Peter; Morris, Richard; Strachan, Mark W J; Price, Jacqueline; Kumari, Meena; Kivimaki, Mika; Plagnol, Vincent; Whittaker, John C; Smith, George Davey; Dudbridge, Frank; Casas, Juan P; Holmes, Michael V; Hingorani, Aroon D

    2016-01-01

    Summary Background Increased circulating plasma urate concentration is associated with an increased risk of coronary heart disease, but the extent of any causative effect of urate on risk of coronary heart disease is still unclear. In this study, we aimed to clarify any causal role of urate on coronary heart disease risk using Mendelian randomisation analysis. Methods We first did a fixed-effects meta-analysis of the observational association of plasma urate and risk of coronary heart disease. We then used a conventional Mendelian randomisation approach to investigate the causal relevance using a genetic instrument based on 31 urate-associated single nucleotide polymorphisms (SNPs). To account for potential pleiotropic associations of certain SNPs with risk factors other than urate, we additionally did both a multivariable Mendelian randomisation analysis, in which the genetic associations of SNPs with systolic and diastolic blood pressure, HDL cholesterol, and triglycerides were included as covariates, and an Egger Mendelian randomisation (MR-Egger) analysis to estimate a causal effect accounting for unmeasured pleiotropy. Findings In the meta-analysis of 17 prospective observational studies (166 486 individuals; 9784 coronary heart disease events) a 1 SD higher urate concentration was associated with an odds ratio (OR) for coronary heart disease of 1·07 (95% CI 1·04–1·10). The corresponding OR estimates from the conventional, multivariable adjusted, and Egger Mendelian randomisation analysis (58 studies; 198 598 individuals; 65 877 events) were 1·18 (95% CI 1·08–1·29), 1·10 (1·00–1·22), and 1·05 (0·92–1·20), respectively, per 1 SD increment in plasma urate. Interpretation Conventional and multivariate Mendelian randomisation analysis implicates a causal role for urate in the development of coronary heart disease, but these estimates might be inflated by hidden pleiotropy. Egger Mendelian randomisation analysis, which accounts for
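
    The conventional Mendelian randomisation estimate referred to above is commonly computed by combining per-SNP ratio estimates with inverse-variance weights. The sketch below shows that calculation on made-up per-SNP summary statistics; it is not the study's data or code, and a real analysis would typically use a dedicated package (e.g. MendelianRandomization in R).

```python
# Inverse-variance-weighted (IVW) Mendelian randomisation from per-SNP summary
# statistics. All input values are hypothetical illustrations.
import numpy as np

beta_xg = np.array([0.10, 0.08, 0.15, 0.05])      # SNP effects on exposure (urate)
beta_yg = np.array([0.012, 0.010, 0.020, 0.004])  # SNP effects on outcome (log OR of CHD)
se_yg = np.array([0.004, 0.005, 0.006, 0.003])    # standard errors of beta_yg

ratio = beta_yg / beta_xg            # per-SNP causal (Wald) ratio estimates
weights = (beta_xg / se_yg) ** 2     # inverse-variance weights for those ratios

ivw_beta = np.sum(weights * ratio) / np.sum(weights)
ivw_se = 1.0 / np.sqrt(np.sum(weights))

print(f"IVW log-OR = {ivw_beta:.3f} (SE {ivw_se:.3f}), "
      f"OR per unit higher urate = {np.exp(ivw_beta):.2f}")
```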

  20. A Systematic Evaluation of Field-Based Screening Methods for the Assessment of Anterior Cruciate Ligament (ACL) Injury Risk.

    PubMed

    Fox, Aaron S; Bonacci, Jason; McLean, Scott G; Spittle, Michael; Saunders, Natalie

    2016-05-01

    further studies met all criteria, resulting in 20 studies being included for review. Five ACL-screening methods-the Landing Error Scoring System (LESS), Clinic-Based Algorithm, Observational Screening of Dynamic Knee Valgus (OSDKV), 2D-Cam Method, and Tuck Jump Assessment-were identified. There was limited evidence supporting the use of field-based screening methods in predicting ACL injuries across a range of populations. Differences relating to the equipment and time required to complete screening methods were identified. Only screening methods for ACL injury risk were included for review. Field-based screening methods developed for lower-limb injury risk in general may also incorporate, and be useful in, screening for ACL injury risk. Limited studies were available relating to the OSDKV and 2D-Cam Method. The LESS showed predictive validity in identifying ACL injuries, however only in a youth athlete population. The LESS also appears practical for community-wide use due to the minimal equipment and set-up/analysis time required. The Clinic-Based Algorithm may have predictive value for ACL injury risk as it identifies athletes who exhibit high frontal plane knee loads during a landing task, but requires extensive additional equipment and time, which may limit its application to wider community settings.

  1. Risk assessment of underpass infrastructure project based on ISO 31000 and ISO 21500 using fishbone diagram and RFMEA (project risk failure mode and effects analysis) method

    NASA Astrophysics Data System (ADS)

    Purwanggono, Bambang; Margarette, Anastasia

    2017-12-01

    Timely completion of highway construction is critical for smooth transportation, particularly as the number of motor vehicles is expected to increase each year. This study was therefore conducted to analyze the constraints encountered in an infrastructure development project. The research was carried out on the Jatingaleh Underpass Project in Semarang while the project was under way; during implementation, the project was experiencing delays. The aim was to find out which constraints occur in the execution of a road infrastructure project, in particular those that cause delays. A fishbone diagram was used to find the root causes and to derive possible means of mitigation, coupled with the RFMEA method to determine the critical risks that must be addressed immediately in a road infrastructure project. The tabulated data indicate that the most feasible mitigation measure is to issue Standard Operating Procedure (SOP) recommendations for handling utility disruptions that interfere with project implementation. The risk assessment process was carried out systematically based on ISO 31000:2009 on risk management, and the delay variables were determined using the process-group requirements of ISO 21500:2013 on project management.
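
    RFMEA prioritises project risks in much the same way as the risk priority number used in classical FMEA: each risk is rated on ordinal scales and the ratings are multiplied into a score used for ranking. The sketch below illustrates this with a few hypothetical delay risks and assumed 1-10 ratings; it is not the study's data.

```python
# Illustrative RFMEA-style ranking: score each project risk by probability,
# impact and detection (assumed 1-10 scales), then sort by the resulting score.
risks = [
    {"risk": "utility relocation delay", "probability": 9, "impact": 8, "detection": 6},
    {"risk": "late material delivery", "probability": 6, "impact": 7, "detection": 4},
    {"risk": "design change request", "probability": 4, "impact": 9, "detection": 5},
]

for r in risks:
    r["score"] = r["probability"] * r["impact"] * r["detection"]  # RPN-like score

for r in sorted(risks, key=lambda item: item["score"], reverse=True):
    print(f'{r["risk"]:28s} score = {r["score"]}')
```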

  2. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  3. Parenchymal Texture Analysis in Digital Breast Tomosynthesis for Breast Cancer Risk Estimation: A Preliminary Study

    PubMed Central

    Kontos, Despina; Bakic, Predrag R.; Carton, Ann-Katherine; Troxel, Andrea B.; Conant, Emily F.; Maidment, Andrew D.A.

    2009-01-01

    Rationale and Objectives Studies have demonstrated a relationship between mammographic parenchymal texture and breast cancer risk. Although promising, texture analysis in mammograms is limited by tissue superimposition. Digital breast tomosynthesis (DBT) is a novel tomographic x-ray breast imaging modality that alleviates the effect of tissue superimposition, offering superior parenchymal texture visualization compared to mammography. Our study investigates the potential advantages of DBT parenchymal texture analysis for breast cancer risk estimation. Materials and Methods DBT and digital mammography (DM) images of 39 women were analyzed. Texture features, shown in studies with mammograms to correlate with cancer risk, were computed from the retroareolar breast region. We compared the relative performance of DBT and DM texture features in correlating with two measures of breast cancer risk: (i) the Gail and Claus risk estimates, and (ii) mammographic breast density. Linear regression was performed to model the association between texture features and increasing levels of risk. Results No significant correlation was detected between parenchymal texture and the Gail and Claus risk estimates. Significant correlations were observed between texture features and breast density. Overall, the DBT texture features demonstrated stronger correlations with breast percent density (PD) than DM (p ≤0.05). When dividing our study population in groups of increasing breast PD, the DBT texture features appeared to be more discriminative, having regression lines with overall lower p-values, steeper slopes, and higher R2 estimates. Conclusion Although preliminary, our results suggest that DBT parenchymal texture analysis could provide more accurate characterization of breast density patterns, which could ultimately improve breast cancer risk estimation. PMID:19201357

  4. 38 CFR 75.115 - Risk analysis.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... preparation of the risk analysis may include data mining if necessary for the development of relevant... degree of protection for the data, e.g., unencrypted, plain text; (6) Time the data has been out of VA...

  5. Seismic risk analysis for the Babcock and Wilcox facility, Leechburg, Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-10-21

    The results of a detailed seismic risk analysis of the Babcock and Wilcox Plutonium Fuel Fabrication facility at Leechburg, Pennsylvania are presented. This report focuses on earthquakes; the other natural hazards, being addressed in separate reports, are severe weather (strong winds and tornados) and floods. The calculational method used is based on Cornell's work (1968); it has been previously applied to safety evaluations of major projects. The historical seismic record was established after a review of available literature, consultation with operators of local seismic arrays and examination of appropriate seismic data bases. Because of the aseismicity of the region around the site, an analysis different from the conventional closest approach in a tectonic province was adopted. Earthquakes as far from the site as 1,000 km were included, as was the possibility of earthquakes at the site. In addition, various uncertainties in the input were explicitly considered in the analysis. The results of the risk analysis, which include a Bayesian estimate of the uncertainties, are presented, expressed as return period accelerations. The best estimate curve indicates that the Babcock and Wilcox facility will experience 0.05 g every 220 years and 0.10 g every 1400 years. The bounding curves roughly represent the one standard deviation confidence limits about the best estimate, reflecting the uncertainty in certain of the input. Detailed examination of the results shows that the accelerations are very insensitive to the details of the source region geometries or the historical earthquake statistics in each region and that each of the source regions contributes almost equally to the cumulative risk at the site. If required for structural analysis, acceleration response spectra for the site can be constructed by scaling the mean response spectrum for alluvium in WASH 1255 by these peak accelerations.
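
    Return-period accelerations of this kind convert directly into exceedance probabilities under the usual assumption of Poisson-distributed earthquake occurrences. The short sketch below performs that conversion using the best-estimate return periods quoted above; the 50-year exposure time is an illustrative assumption, not a figure from the report.

```python
# Convert return periods into exceedance probabilities over an exposure time,
# assuming Poisson occurrences: P = 1 - exp(-t / T_return).
import math

return_periods = {0.05: 220.0, 0.10: 1400.0}  # peak acceleration (g) -> years
exposure_time = 50.0                          # years (assumed design life)

for pga, t_return in return_periods.items():
    annual_rate = 1.0 / t_return
    p_exceed = 1.0 - math.exp(-annual_rate * exposure_time)
    print(f"PGA {pga:.2f} g: annual rate {annual_rate:.4f}/yr, "
          f"P(exceedance in {exposure_time:.0f} yr) = {p_exceed:.1%}")
```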

  6. Antidepressant use and risk of coronary heart disease: meta-analysis of observational studies

    PubMed Central

    Oh, Seung-Won; Kim, Joonseok; Myung, Seung-Kwon; Hwang, Seung-Sik; Yoon, Dae-Hyun

    2014-01-01

    Aims Our goal was to evaluate the association between antidepressant use and the risk of coronary heart disease (CHD) among subjects with no history of coronary heart disease. Methods A search of Medline, EMBASE, PsycINFO and the Cochrane Library was performed in January 2013. Two authors independently reviewed and selected eligible observational studies, based on predetermined selection criteria. Pooled relative risks (RRs) with confidence intervals (CIs) were calculated using random-effects or fixed-effects models. Results Sixteen observational studies (seven case–control studies and nine cohort studies) were included in the final analysis. There was no association between selective serotonin reuptake inhibitor use and the risk of CHD overall [odds ratio (OR), 0.93; 95% CI, 0.65–1.33] or in subgroup meta-analysis of case–control studies (OR, 0.91; 95% CI, 0.60–1.37) and cohort studies (RR, 0.96; 95% CI, 0.59–1.55). The use of tricyclic antidepressant was associated with an increased risk of CHD overall (OR, 1.51; 95% CI, 1.07–2.12), but it was observed only in case–control studies (OR, 1.56; 95% CI, 1.24–1.96) and low-quality studies (OR, 1.49; 95% CI, 1.20–1.85) in the subgroup meta-analyses. Conclusions This meta-analysis of observational studies in subjects with no history of CHD suggests that neither selective serotonin reuptake inhibitor nor tricyclic antidepressant use is associated with an increased risk of CHD. PMID:24646010

  7. The Use and Abuse of Risk Analysis in Policy Debate.

    ERIC Educational Resources Information Center

    Herbeck, Dale A.; Katsulas, John P.

    The best check on the preposterous claims of crisis rhetoric is an appreciation of the nature of risk analysis and how it functions in argumentation. The use of risk analysis is common in policy debate. While the stock issues paradigm focused the debate exclusively on the affirmative case, the advent of policy systems analysis has transformed…

  8. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China.

    PubMed

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-12-02

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.

  9. Accidental Water Pollution Risk Analysis of Mine Tailings Ponds in Guanting Reservoir Watershed, Zhangjiakou City, China

    PubMed Central

    Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke

    2015-01-01

    Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the “source-pathway-target” in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Guanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Guanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method. PMID:26633450

  10. Matrix analysis and risk management to avert depression and suicide among workers

    PubMed Central

    2010-01-01

    Suicide is among the most tragic outcomes of all mental disorders, and the prevalence of suicide has risen dramatically during the last decade, particularly among workers. This paper reviews and proposes strategies to avert suicide and depression, drawing on the mind-body medicine equation hypothesis and a matrix analysis of mental health problems from both a public health and a clinical medicine viewpoint. In occupational fields, the mind-body medicine hypothesis has to deal with the working environment, working conditions, and workers' health. These three factors were chosen based on the concept of risk control, called San-kanri, which has traditionally been used in Japanese companies, and on the causation concepts of host, agent, and environment. The working environment and working conditions were given special focus with regard to tackling suicide problems. Matrix analysis was conducted by dividing the problem of working conditions into nine cells: three prevention levels (primary, secondary, and tertiary) were proposed for each of the three factors of the mind-body medicine hypothesis (working environment, working condition, and workers' health). After describing these main strategies (mind-body medicine analysis and matrix analysis) for tackling suicide problems, the paper discusses the versatility of case-method teaching, "Hiyari-Hat activity," routine inspections by professionals, risk assessment analysis, and mandatory health check-ups focusing on sleep and depression. In the risk assessment analysis, an exact assessment model was suggested using a formula based on the multiplication of the following three factors: (1) severity, (2) frequency, and (3) possibility. Mental health problems, including suicide, are rather tricky to deal with because they involve the evaluation of individual cases. The mind-body medicine hypothesis and matrix analysis would be appropriate tactics for suicide prevention because they help evaluate this issue as a tangible problem. PMID

  11. Ontology-based specification, identification and analysis of perioperative risks.

    PubMed

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and usable by the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  12. Analysis of risk factors for T. brucei rhodesiense sleeping sickness within villages in south-east Uganda

    PubMed Central

    Zoller, Thomas; Fèvre, Eric M; Welburn, Susan C; Odiit, Martin; Coleman, Paul G

    2008-01-01

    Background Sleeping sickness (HAT) caused by T.b. rhodesiense is a major veterinary and human public health problem in Uganda. Previous studies have investigated spatial risk factors for T.b. rhodesiense at large geographic scales, but none have properly investigated such risk factors at small scales, i.e. within affected villages. In the present work, we use a case-control methodology to analyse both behavioural and spatial risk factors for HAT in an endemic area. Methods The present study investigates behavioural and occupational risk factors for infection with HAT within villages using a questionnaire-based case-control study conducted in 17 villages endemic for HAT in SE Uganda, and spatial risk factors in 4 high risk villages. For the spatial analysis, the location of homesteads with one or more cases of HAT up to three years prior to the beginning of the study was compared to all non-case homesteads. Analysing spatial associations with respect to irregularly shaped geographical objects required the development of a new approach to geographical analysis in combination with a logistic regression model. Results The study was able to identify, among other behavioural risk factors, having a family member with a history of HAT (p = 0.001) as well as proximity of a homestead to a nearby wetland area (p < 0.001) as strong risk factors for infection. The novel method of analysing complex spatial interactions used in the study can be applied to a range of other diseases. Conclusion Spatial risk factors for HAT are maintained across geographical scales; this consistency is useful in the design of decision support tools for intervention and prevention of the disease. Familial aggregation of cases was confirmed for T. b. rhodesiense HAT in the study and probably results from shared behavioural and spatial risk factors among members of a household. PMID:18590541

  13. Risk of myocardial infarction and stroke in bipolar disorder: a systematic review and exploratory meta-analysis

    PubMed Central

    Prieto, M.L.; Cuéllar-Barboza, A.B.; Bobo, W.V.; Roger, V.L.; Bellivier, F.; Leboyer, M.; West, C.P.; Frye, M.A.

    2016-01-01

    Objective To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. Method A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946 – May, 2013) was conducted. Case-control and cohort studies of bipolar disorder patients age 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Results Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction: [relative risk (RR): 1.09, 95% CI 0.96–1.24, P = 0.20; I2 = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29–2.35; P = 0.0003; I2 = 83%). Conclusion There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. PMID:24850482

  14. Prospective risk analysis prior to retrospective incident reporting and analysis as a means to enhance incident reporting behaviour: a quasi-experimental field study.

    PubMed

    Kessels-Habraken, Marieke; De Jonge, Jan; Van der Schaaf, Tjerk; Rutte, Christel

    2010-05-01

    Hospitals can apply prospective and retrospective methods to reduce the large number of medical errors. Retrospective methods are used to identify errors after they occur and to facilitate learning. Prospective methods aim to determine, assess and minimise risks before incidents happen. This paper questions whether the order of implementation of those two methods influences the resultant impact on incident reporting behaviour. From November 2007 until June 2008, twelve wards of two Dutch general hospitals participated in a quasi-experimental reversed-treatment non-equivalent control group design. The six units of Hospital 1 first conducted a prospective analysis, after which a sophisticated incident reporting and analysis system was implemented. On the six units of Hospital 2 the two methods were implemented in reverse order. Data from the incident reporting and analysis system and from a questionnaire were used to assess between-hospital differences regarding the number of reported incidents, the spectrum of reported incident types, and the profession of reporters. The results show that carrying out a prospective analysis first can improve incident reporting behaviour in terms of a wider spectrum of reported incident types and a larger proportion of incidents reported by doctors. However, the proposed order does not necessarily yield a larger number of reported incidents. This study fills an important gap in safety management research regarding the order of the implementation of prospective and retrospective methods, and contributes to literature on incident reporting. This research also builds on the network theory of social contagion. The results might indicate that health care employees can disseminate their risk perceptions through communication with their direct colleagues. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. Risk analysis for renewable energy projects due to constraints arising

    NASA Astrophysics Data System (ADS)

    Prostean, G.; Vasar, C.; Prostean, O.; Vartosu, A.

    2016-02-01

    Starting from the European Union (EU) target of a binding 20% share of renewable energy in final energy consumption by 2020, this article illustrates the identification of risks in the implementation of wind energy projects in Romania, which can have complex technical, social, and administrative implications. In the specific projects analyzed in this paper, critical bottlenecks in the future wind power supply chain were identified, together with the delays that may reasonably arise. Renewable energy technologies face a number of constraints that delay the scaling-up of their production processes, their transport processes, their equipment reliability, and so on, so implementing these types of projects requires complex specialized teams, whose coordination also involves specific risks. The research team applied an analytical risk approach to identify the major risks encountered within a wind farm project developed in isolated regions of Romania with different particularities, configured for different geographical areas (hill and mountain locations in Romania). The identification of major risks was based on the conceptual model set up for the entire project implementation process, throughout which specific constraints of the process were identified. Integration risks were examined in an empirical study based on the HAZOP (Hazard and Operability) method. The discussion analyzes our results in the implementation context of renewable energy projects in Romania and creates a framework for assessing energy supply from renewable sources to any entity.

  16. Cardiovascular risk from water arsenic exposure in Vietnam: Application of systematic review and meta-regression analysis in chemical health risk assessment.

    PubMed

    Phung, Dung; Connell, Des; Rutherford, Shannon; Chu, Cordia

    2017-06-01

    A systematic review (SR) and meta-analysis cannot provide the endpoint answer for a chemical risk assessment (CRA). The objective of this study was to apply SR and meta-regression (MR) analysis to address this limitation, using a case study of cardiovascular risk from arsenic exposure in Vietnam. Published studies were retrieved from PubMed using the keywords arsenic exposure and cardiovascular diseases (CVD). Random-effects meta-regression was applied to model the linear relationship between arsenic concentration in water and risk of CVD, and the no-observable-adverse-effect level (NOAEL) was then identified from the regression function. The probabilistic risk assessment (PRA) technique was applied to characterize the risk of CVD due to arsenic exposure by estimating the overlapping coefficient between the dose-response and exposure distribution curves. The risks were evaluated for groundwater, treated water, and drinking water. A total of 8 high-quality studies for dose-response and 12 studies for exposure data were included in the final analyses. The results of MR suggested a NOAEL of 50 μg/L and a guideline of 5 μg/L for arsenic in water, values that are half the NOAEL and guideline recommended by previous studies and authorities. The results of PRA indicated that the proportion of observed exposure levels exceeding the CVD risk threshold was 52% for groundwater, 24% for treated water, and 10% for drinking water in Vietnam. The study found that systematic review and meta-regression can be considered an ideal approach to chemical risk assessment because of its ability to answer the endpoint question of a CRA. Copyright © 2017 Elsevier Ltd. All rights reserved.
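
    The dose-response step described here is essentially a weighted regression of per-study effect sizes on arsenic concentration. The sketch below illustrates the idea with invented study values and a simple inverse-variance weighted least-squares fit, as a stand-in for the full random-effects meta-regression (which would additionally estimate between-study heterogeneity, e.g. with statsmodels in Python or metafor in R).

```python
# Rough dose-response meta-regression sketch: regress per-study log relative
# risks on water arsenic concentration with inverse-variance weights.
# All study values are invented for illustration.
import numpy as np

arsenic_ugL = np.array([10.0, 50.0, 100.0, 300.0, 600.0])  # exposure levels
log_rr = np.array([0.02, 0.05, 0.12, 0.30, 0.55])          # per-study log RR
se_log_rr = np.array([0.05, 0.04, 0.06, 0.08, 0.10])       # their standard errors

w = 1.0 / se_log_rr**2
X = np.column_stack([np.ones_like(arsenic_ugL), arsenic_ugL])
W = np.diag(w)

# Weighted least squares: beta = (X' W X)^-1 X' W y
intercept, slope = np.linalg.solve(X.T @ W @ X, X.T @ W @ log_rr)
print(f"log RR ≈ {intercept:.3f} + {slope:.5f} × [As, µg/L]")
```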

  17. Aircraft Conceptual Design and Risk Analysis Using Physics-Based Noise Prediction

    NASA Technical Reports Server (NTRS)

    Olson, Erik D.; Mavris, Dimitri N.

    2006-01-01

    An approach was developed which allows for design studies of commercial aircraft using physics-based noise analysis methods while retaining the ability to perform the rapid trade-off and risk analysis studies needed at the conceptual design stage. A prototype integrated analysis process was created for computing the total aircraft EPNL at the Federal Aviation Regulations Part 36 certification measurement locations using physics-based methods for fan rotor-stator interaction tones and jet mixing noise. The methodology was then used in combination with design of experiments to create response surface equations (RSEs) for the engine and aircraft performance metrics, geometric constraints and take-off and landing noise levels. In addition, Monte Carlo analysis was used to assess the expected variability of the metrics under the influence of uncertainty, and to determine how the variability is affected by the choice of engine cycle. Finally, the RSEs were used to conduct a series of proof-of-concept conceptual-level design studies demonstrating the utility of the approach. The study found that a key advantage to using physics-based analysis during conceptual design lies in the ability to assess the benefits of new technologies as a function of the design to which they are applied. The greatest difficulty in implementing physics-based analysis proved to be the generation of design geometry at a sufficient level of detail for high-fidelity analysis.
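
    The Monte Carlo step in this kind of approach is conceptually simple: draw samples of the uncertain inputs, evaluate the response surface equations, and examine the spread of the resulting noise metric. The sketch below shows that pattern with a made-up quadratic RSE and assumed input distributions standing in for the fitted noise model.

```python
# Generic Monte Carlo uncertainty propagation through a response surface
# equation (RSE). The quadratic RSE and the input distributions are invented
# placeholders, not the models from the study.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Uncertain engine-cycle inputs (assumed normal distributions).
fan_pressure_ratio = rng.normal(1.6, 0.05, n)
jet_velocity_ratio = rng.normal(0.7, 0.03, n)

def epnl_rse(fpr, vjr):
    """Hypothetical quadratic response surface for cumulative EPNL (EPNdB)."""
    return 270.0 + 12.0 * (fpr - 1.6) - 25.0 * (vjr - 0.7) + 40.0 * (fpr - 1.6) ** 2

samples = epnl_rse(fan_pressure_ratio, jet_velocity_ratio)
print(f"mean EPNL = {samples.mean():.1f} EPNdB, "
      f"95th percentile = {np.percentile(samples, 95):.1f} EPNdB")
```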

  18. Metformin therapy and the risk of colorectal adenoma in patients with type 2 diabetes: A meta-analysis

    PubMed Central

    Hou, Yi-Chao; Hu, Qiang; Huang, Jiao; Fang, Jing-Yuan; Xiong, Hua

    2017-01-01

    Background Existing data evaluating the impact of metformin on the colorectal adenoma (CRA) risk in patients suffering from type 2 diabetes (T2D) are limited and controversial. We therefore summarized the studies currently available and assessed the relationship between metformin treatment and risk of CRA in T2D patients. Methods We systematically searched databases for eligible studies that explored the impact of metformin treatment on the occurrence of CRA in T2D patients from inception to June 2016. The summary odds ratio (OR) estimates with their 95% confidence interval (CI) were derived using random-effect, generic inverse variance methods. Sensitivity analysis and subgroup analysis were performed. Results Seven studies involving 7178 participants met the inclusion criteria. The pooled analysis showed that metformin therapy was associated with a 27% decrease in CRA risk (OR, 0.73; 95% CI, 0.58 - 0.90). In subgroup analysis, we detected that metformin exhibits significant chemopreventive effects in the Asian region (OR, 0.68; 95% CI, 0.48 - 0.96). Similar results were identified in both studies with adjusted ORs and high-quality studies (OR, 0.66; 95% CI, 0.50 - 0.86 and OR, 0.70; 95% CI, 0.58 - 0.84, respectively). Of note, metformin therapy was also associated with a significant decrease in advanced adenoma risk (OR, 0.52; 95% CI, 0.38 - 0.72). Low heterogeneity was observed, and the results remained robust in multiple sensitivity analyses. Conclusions This meta-analysis indicates that metformin therapy is correlated with a significant decrease in the risk of CRA and advanced adenoma in T2D patients. Further confirmatory studies are warranted. PMID:27903961

  19. New Multi-HAzard and MulTi-RIsk Assessment MethodS for Europe (MATRIX): A research program towards mitigating multiple hazards and risks in Europe

    NASA Astrophysics Data System (ADS)

    Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium

    2011-12-01

    Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between the different hazards and risks, e.g., earthquakes and volcanos, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe or MATRIX project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanos, wild fires, storms and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequence of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaption, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program that will involve national platforms for disaster management, as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.

  20. Meta-analysis of studies on breast cancer risk and diet in Chinese women

    PubMed Central

    Wu, Ying-Chao; Zheng, Dong; Sun, Jin-Jie; Zou, Zhi-Kang; Ma, Zhong-Li

    2015-01-01

    Objective: A meta-analysis was carried out to summarize published data on the relationship between breast cancer and dietary factors. Methods: Databases in Chinese (China National Knowledge Infrastructure [CNKI], China Biology Medicine [CBM], WanFang, VIP) and in English (PubMed and Web of Science) were searched for articles analyzing vegetable, fruit, soy food and fat consumption and breast cancer risk published through June 30, 2013. Random effects models were used to estimate summary odds ratios (OR) based on high versus low intake, and subgroup analysis was conducted according to region, study design, paper quality and adjustment for confounding factors to detect the potential source of heterogeneity. Every study was screened according to the inclusion criteria and exclusion criteria, evaluated in accordance with the Newcastle-Ottawa Scale. RevMan 5.2 software was used for analysis. Results: Of 785 studies retrieved, 22 met inclusion criteria (13 in Chinese and 9 in English), representing 23,201 patients: 10,566 in the experimental group and 12,635 in the control group. Thirteen included studies showed vegetables consumption to be a relevant factor in breast cancer risk, OR = 0.77 (95% CI [confidence interval] 0.62-0.96). Eleven studies showed fruits consumption to be relevant, OR = 0.68 (95% CI 0.49-0.93). Significant differences were also found between those who consumed soy foods, OR = 0.68 (95% CI 0.50-0.93) and those who ate a high-fat diet, OR = 1.15 (95% CI 1.01-1.30). Conclusion: This analysis confirms the association between intake of vegetables, fruits, soy foods and fat and the risk of breast cancer from published sources. It’s suggested that high consumption of vegetables, fruits and soy foods may reduce the risk of breast cancer, while increasing fat consumption may increase the risk. PMID:25784976

  1. Comparison of the effects of streptokinase and tissue plasminogen activator on regional wall motion after first myocardial infarction: analysis by the centerline method with correction for area at risk.

    PubMed

    Cross, D B; Ashton, N G; Norris, R M; White, H D

    1991-04-01

    In a trial of streptokinase versus recombinant tissue-type plasminogen activator (rt-PA) for a first myocardial infarction, 270 patients were randomized. Regional left ventricular function was assessed in 214 patients at 3 weeks. The infarct-related artery was the left anterior descending artery in 78 patients, the right coronary artery in 122 and a dominant left circumflex artery in 14. Analysis was by the centerline method with a novel correction for the area of myocardium at risk, whereby the search region was determined by the anatomic distribution of the infarct-related artery. Infarct-artery patency at 3 weeks was 73% in the streptokinase group and 71% in the rt-PA group. Global left ventricular function did not differ between the two groups. Mean chord motion (± SD) in the most hypokinetic half of the defined search region was similar in the streptokinase and rt-PA groups (-2.4 ± 1.5 versus -2.3 ± 1.3, p = 0.63). There were no differences in hyperkinesia of the noninfarct zone. Compared with conventional centerline analysis, regional wall motion in the defined area at risk was significantly more abnormal. The two methods correlated strongly, however (r = 0.99, p < 0.0001), and both methods produced similar overall results. Patients with a patent infarct-related artery and those with an occluded artery at the time of catheterization had similar levels of global function (ejection fraction 58 ± 12% versus 57 ± 12%, p = 0.58). (ABSTRACT TRUNCATED AT 250 WORDS)

  2. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    PubMed

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each non-conformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
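
    The risk-index screening described above multiplies ordinal ratings for occurrence, severity and detectability and flags the points whose index exceeds a threshold, much like the RPN ranking in classical FMEA. The snippet below sketches that calculation; the control points, ratings and threshold are hypothetical, not taken from the study.

```python
# Sketch of a risk-index screen for critical control points: multiply assumed
# occurrence, severity and detectability ratings and flag high-risk points.
critical_points = [
    ("dose calculation check", 3, 5, 4),
    ("aseptic transfer", 2, 5, 2),
    ("product labelling", 4, 4, 3),
]
THRESHOLD = 40  # assumed cut-off for "higher importance" points

for name, occurrence, severity, detectability in critical_points:
    risk_index = occurrence * severity * detectability
    flag = "HIGH" if risk_index >= THRESHOLD else "ok"
    print(f"{name:24s} risk index = {risk_index:3d}  [{flag}]")
```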

  3. 14 CFR Appendix C to Part 420 - Risk Analysis

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Risk Analysis C Appendix C to Part 420... TRANSPORTATION LICENSING LICENSE TO OPERATE A LAUNCH SITE Pt. 420, App. C Appendix C to Part 420—Risk Analysis (a... table C-1. Table C-1 provides sources where an applicant may obtain data acceptable to the FAA. An...

  4. Method of Breast Reconstruction Determines Venous Thromboembolism Risk Better Than Current Prediction Models

    PubMed Central

    Patel, Niyant V.; Wagner, Douglas S.

    2015-01-01

    Background: Venous thromboembolism (VTE) risk models including the Davison risk score and the 2005 Caprini risk assessment model have been validated in plastic surgery patients. However, their utility and predictive value in breast reconstruction has not been well described. We sought to determine the utility of current VTE risk models in this population and the VTE rate observed in various methods of breast reconstruction. Methods: A retrospective review of breast reconstructions by a single surgeon was performed. One hundred consecutive transverse rectus abdominis myocutaneous (TRAM) patients, 100 consecutive implant patients, and 100 consecutive latissimus dorsi patients were identified over a 10-year period. Patient demographics and presence of symptomatic VTE were collected. 2005 Caprini risk scores and Davison risk scores were calculated for each patient. Results: The TRAM reconstruction group was found to have a higher VTE rate (6%) than the implant (0%) and latissimus (0%) reconstruction groups (P < 0.01). Mean Davison risk scores and 2005 Caprini scores were similar across all reconstruction groups (P > 0.1). The vast majority of patients were stratified as high risk (87.3%) by the VTE risk models. However, only TRAM reconstruction patients demonstrated significant VTE risk. Conclusions: TRAM reconstruction appears to have a significantly higher risk of VTE than both implant and latissimus reconstruction. Current risk models do not effectively stratify breast reconstruction patients at risk for VTE. The method of breast reconstruction appears to have a significant role in patients’ VTE risk. PMID:26090287

  5. Consumption of Yogurt and the Incident Risk of Cardiovascular Disease: A Meta-Analysis of Nine Cohort Studies.

    PubMed

    Wu, Lei; Sun, Dali

    2017-03-22

    Previous systematic reviews and meta-analyses have evaluated the association of dairy consumption and the risk of cardiovascular disease (CVD). However, the findings were inconsistent. No quantitative analysis has specifically assessed the effect of yogurt intake on the incident risk of CVD. We searched the PubMed and the Embase databases from inception to 10 January 2017. A generic inverse-variance method was used to pool the fully-adjusted relative risks (RRs) and the corresponding 95% confidence intervals (CIs) with a random-effects model. A generalized least squares trend estimation model was used to calculate the specific slopes in the dose-response analysis. The present systematic review and meta-analysis identified nine prospective cohort articles involving a total of 291,236 participants. Compared with the lowest category, highest category of yogurt consumption was not significantly related with the incident risk of CVD, and the RR (95% CI) was 1.01 (0.95, 1.08) with an evidence of significant heterogeneity (I² = 52%). However, intake of ≥200 g/day yogurt was significantly associated with a lower risk of CVD in the subgroup analysis. There was a trend that a higher level of yogurt consumption was associated with a lower incident risk of CVD in the dose-response analysis. A daily dose of ≥200 g yogurt intake might be associated with a lower incident risk of CVD. Further cohort studies and randomized controlled trials are still demanded to establish and confirm the observed association in populations with different characteristics.
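
    The generic inverse-variance pooling with a random-effects model mentioned here typically follows the DerSimonian-Laird approach. A compact sketch on invented per-study relative risks is given below; real analyses would normally use an established package (e.g. metafor in R or statsmodels in Python) rather than hand-rolled code.

```python
# DerSimonian-Laird random-effects pooling of per-study log relative risks.
# The study estimates are invented for illustration only.
import numpy as np

rr = np.array([0.95, 1.05, 1.10, 0.98, 1.02])   # per-study relative risks
se = np.array([0.06, 0.08, 0.05, 0.07, 0.04])   # standard errors of the log RRs

y = np.log(rr)
w_fixed = 1.0 / se**2

# Heterogeneity: Cochran's Q and the DerSimonian-Laird tau^2 estimate.
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
q = np.sum(w_fixed * (y - y_fixed) ** 2)
df = len(y) - 1
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - df) / c)

w_random = 1.0 / (se**2 + tau2)
y_pooled = np.sum(w_random * y) / np.sum(w_random)
se_pooled = 1.0 / np.sqrt(np.sum(w_random))

lo, hi = np.exp(y_pooled - 1.96 * se_pooled), np.exp(y_pooled + 1.96 * se_pooled)
i2 = max(0.0, (q - df) / q) if q > 0 else 0.0
print(f"pooled RR = {np.exp(y_pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), I² = {i2:.0%}")
```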

  6. Consumption of Yogurt and the Incident Risk of Cardiovascular Disease: A Meta-Analysis of Nine Cohort Studies

    PubMed Central

    Wu, Lei; Sun, Dali

    2017-01-01

    Previous systematic reviews and meta-analyses have evaluated the association of dairy consumption and the risk of cardiovascular disease (CVD). However, the findings were inconsistent. No quantitative analysis has specifically assessed the effect of yogurt intake on the incident risk of CVD. We searched the PubMed and the Embase databases from inception to 10 January 2017. A generic inverse-variance method was used to pool the fully-adjusted relative risks (RRs) and the corresponding 95% confidence intervals (CIs) with a random-effects model. A generalized least squares trend estimation model was used to calculate the specific slopes in the dose-response analysis. The present systematic review and meta-analysis identified nine prospective cohort articles involving a total of 291,236 participants. Compared with the lowest category, highest category of yogurt consumption was not significantly related with the incident risk of CVD, and the RR (95% CI) was 1.01 (0.95, 1.08) with an evidence of significant heterogeneity (I2 = 52%). However, intake of ≥200 g/day yogurt was significantly associated with a lower risk of CVD in the subgroup analysis. There was a trend that a higher level of yogurt consumption was associated with a lower incident risk of CVD in the dose-response analysis. A daily dose of ≥200 g yogurt intake might be associated with a lower incident risk of CVD. Further cohort studies and randomized controlled trials are still demanded to establish and confirm the observed association in populations with different characteristics. PMID:28327514

  7. Quantitative Method for Analyzing the Allocation of Risks in Transportation Construction

    DOT National Transportation Integrated Search

    1979-04-01

    The report presents a conceptual model of risk that was developed to analyze the impact on owner's cost of alternate allocations of risk among owner and contractor in mass transit construction. A model and analysis procedure are developed, based on d...

  8. Proton-pump inhibitors and risk of fractures: an update meta-analysis.

    PubMed

    Zhou, B; Huang, Y; Li, H; Sun, W; Liu, J

    2016-01-01

    To identify the relationship between proton-pump inhibitors (PPIs) and the risk of fracture, we conducted an update meta-analysis of observational studies. Results showed that PPI use was associated with a modestly increased risk of hip, spine, and any-site fracture. Many studies have investigated the association of proton-pump inhibitors (PPIs) with fracture risk, but the results have been inconsistent. To evaluate this question, we performed a meta-analysis of relevant observational studies. A systematic literature search up to February 2015 was performed in PubMed. We combined relative risks (RRs) for fractures using random-effects models and conducted subgroup and stratified analyses. Eighteen studies involving a total of 244,109 fracture cases were included in this meta-analysis. Pooled analysis showed that PPI use could moderately increase the risk of hip fracture [RR = 1.26, 95% confidence interval (CI) 1.16–1.36]. There was statistically significant heterogeneity among studies (p < 0.001; I² = 71.9%). After limiting to cohort studies, there was also a moderate increase in hip fracture risk without evidence of study heterogeneity. Pooling revealed that short-term use (<1 year) and longer use (>1 year) were similarly associated with increased risk of hip fracture. Furthermore, a moderately increased risk of spine (RR = 1.58, 95% CI 1.38–1.82) and any-site fracture (RR = 1.33, 95% CI 1.15–1.54) was also found among PPI users. In this update meta-analysis of observational studies, PPI use modestly increased the risk of hip, spine, and any-site fracture, but there was no evidence of a duration effect in subgroup analysis.

  9. A risk analysis for production processes with disposable bioreactors.

    PubMed

    Merseburger, Tobias; Pahl, Ina; Müller, Daniel; Tanner, Markus

    2014-01-01

    Quality management systems are, as a rule, tightly defined systems that conserve existing processes and therefore guarantee compliance with quality standards. But maintaining quality also includes introducing new enhanced production methods and making use of the latest findings of bioscience. The advances in biotechnology and single-use manufacturing methods for producing new drugs especially impose new challenges on quality management, as quality standards have not yet been set. New methods to ensure patient safety have to be established, as it is insufficient to rely only on current rules. A concept of qualification, validation, and manufacturing procedures based on risk management needs to be established and realized in pharmaceutical production. The chapter starts with an introduction to the regulatory background of the manufacture of medicinal products. It then continues with key methods of risk management. Hazards associated with the production of medicinal products with single-use equipment are described with a focus on bioreactors, storage containers, and connecting devices. The hazards are subsequently evaluated and criteria for risk evaluation are presented. This chapter concludes with aspects of industrial application of quality risk management.

  10. Hypothyroidism as a risk factor for open angle glaucoma: A systematic review and meta-analysis

    PubMed Central

    Liu, Yue; Zheng, Guangying

    2017-01-01

    Purpose The relationship between hypothyroidism and primary open angle glaucoma (POAG) has attracted intense interest recently, but the reported results have been controversial. This meta-analysis was carried out to determine the association between hypothyroidism and POAG. Methods The literature was identified from three databases (Web of Science, Embase, and PubMed). The meta-analyses were performed using random-effects models, with results reported as adjusted odds ratios (ORs) with 95% confidence intervals (CI 95%). Results A total of 11 studies meeting the inclusion criteria were included in the final meta-analysis. The pooled OR based on 11 risk estimates showed a statistically significant increased risk of POAG prevalence among individuals with hypothyroidism (OR = 1.64, 95% CI = 1.27–2.13). Substantial heterogeneity among these studies was detected (P < 0.001; I2 = 83.2%). Sub-group analysis revealed that the cohort studies and case–control studies showed a significant association between hypothyroidism and POAG, which was not observed in cross-sectional studies. There was no significant publication bias in this study. Conclusions The findings of this meta-analysis indicate that individuals with hypothyroidism have an increased risk of developing POAG. PMID:29069095

  11. Risk-based cost-benefit analysis for evaluating microbial risk mitigation in a drinking water system.

    PubMed

    Bergion, Viktor; Lindhe, Andreas; Sokolova, Ekaterina; Rosén, Lars

    2018-04-01

    Waterborne outbreaks of gastrointestinal diseases can cause large costs to society. Risk management needs to be holistic and transparent in order to reduce these risks in an effective manner. Microbial risk mitigation measures in a drinking water system were investigated using a novel approach combining probabilistic risk assessment and cost-benefit analysis. Lake Vomb in Sweden was used to exemplify and illustrate the risk-based decision model. Four mitigation alternatives were compared, where the first three alternatives, A1-A3, represented connecting 25, 50 and 75%, respectively, of on-site wastewater treatment systems in the catchment to the municipal wastewater treatment plant. The fourth alternative, A4, represented installing a UV-disinfection unit in the drinking water treatment plant. Quantitative microbial risk assessment was used to estimate the positive health effects in terms of quality adjusted life years (QALYs), resulting from the four mitigation alternatives. The health benefits were monetised using a unit cost per QALY. For each mitigation alternative, the net present value of health and environmental benefits and investment, maintenance and running costs was calculated. The results showed that only A4 can reduce the risk (probability of infection) below the World Health Organization guidelines of 10⁻⁴ infections per person per year (looking at the 95th percentile). Furthermore, all alternatives resulted in a negative net present value. However, the net present value would be positive (looking at the 50th percentile using a 1% discount rate) if non-monetised benefits (e.g. increased property value divided evenly over the studied time horizon and reduced microbial risks posed to animals), estimated at 800-1200 SEK (€100-150) per connected on-site wastewater treatment system per year, were included. This risk-based decision model creates a robust and transparent decision support tool. It is flexible enough to be tailored and applied to local
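
    The economic comparison described above amounts to monetising annual health benefits (QALYs gained multiplied by a unit cost per QALY), subtracting annual and investment costs, and discounting over the time horizon into a net present value. The sketch below shows that calculation with assumed cash flows; none of the numbers come from the study.

```python
# Simplified net-present-value calculation for one mitigation alternative.
# QALY gain, unit cost per QALY, costs, horizon and discount rate are assumed.
def npv(annual_benefit: float, annual_cost: float, investment: float,
        years: int, discount_rate: float) -> float:
    total = -investment
    for t in range(1, years + 1):
        total += (annual_benefit - annual_cost) / (1.0 + discount_rate) ** t
    return total

qalys_gained_per_year = 0.8        # assumed health effect of the measure
cost_per_qaly = 500_000            # assumed monetary value per QALY (SEK)
annual_benefit = qalys_gained_per_year * cost_per_qaly

result = npv(annual_benefit, annual_cost=150_000, investment=5_000_000,
             years=30, discount_rate=0.01)
print(f"NPV over 30 years at a 1% discount rate: {result:,.0f} SEK")
```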

  12. Applicability of risk-based management and the need for risk-based economic decision analysis at hazardous waste contaminated sites.

    PubMed

    Khadam, Ibrahim; Kaluarachchi, Jagath J

    2003-07-01

    Decision analysis in subsurface contamination management is generally carried out from a traditional engineering economic viewpoint. However, new advances in human health risk assessment, namely probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process, require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability in subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.

  13. Application of texture analysis method for mammogram density classification

    NASA Astrophysics Data System (ADS)

    Nithya, R.; Santhi, B.

    2017-07-01

    Mammographic density is considered a major risk factor for developing breast cancer. This paper proposes an automated approach to classify breast tissue types in digital mammogram. The main objective of the proposed Computer-Aided Diagnosis (CAD) system is to investigate various feature extraction methods and classifiers to improve the diagnostic accuracy in mammogram density classification. Texture analysis methods are used to extract the features from the mammogram. Texture features are extracted by using histogram, Gray Level Co-Occurrence Matrix (GLCM), Gray Level Run Length Matrix (GLRLM), Gray Level Difference Matrix (GLDM), Local Binary Pattern (LBP), Entropy, Discrete Wavelet Transform (DWT), Wavelet Packet Transform (WPT), Gabor transform and trace transform. These extracted features are selected using Analysis of Variance (ANOVA). The features selected by ANOVA are fed into the classifiers to characterize the mammogram into two-class (fatty/dense) and three-class (fatty/glandular/dense) breast density classification. This work has been carried out by using the mini-Mammographic Image Analysis Society (MIAS) database. Five classifiers are employed namely, Artificial Neural Network (ANN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). Experimental results show that ANN provides better performance than LDA, NB, KNN and SVM classifiers. The proposed methodology has achieved 97.5% accuracy for three-class and 99.37% for two-class density classification.
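
    As a rough illustration of the pipeline described above (texture feature extraction, ANOVA selection, ANN classification), the following sketch computes one texture family, GLCM descriptors, and feeds ANOVA-selected features to a small neural network. It assumes scikit-image and scikit-learn are available and that mammogram regions and density labels are loaded elsewhere; it is not the authors' implementation.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import cross_val_score

    def glcm_features(roi):
        """A few GLCM texture descriptors for one 8-bit mammogram region."""
        glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                            levels=256, symmetric=True, normed=True)
        props = ("contrast", "homogeneity", "energy", "correlation")
        return np.hstack([graycoprops(glcm, p).ravel() for p in props])

    def density_classification_accuracy(rois, labels, k=5):
        """ANOVA-selected GLCM features fed to a small ANN, scored by 5-fold CV.

        `rois` and `labels` (e.g. fatty/glandular/dense) are assumed to be
        loaded elsewhere, for instance from the mini-MIAS database.
        """
        X = np.array([glcm_features(r) for r in rois])
        X = SelectKBest(f_classif, k=k).fit_transform(X, labels)
        ann = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
        return cross_val_score(ann, X, labels, cv=5).mean()
    ```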

  14. Risk of colorectal cancer in Asian patients with ulcerative colitis: a systematic review and meta-analysis

    PubMed Central

    Bopanna, Sawan; Ananthakrishnan, Ashwin N; Kedia, Saurabh; Yajnik, Vijay; Ahuja, Vineet

    2017-01-01

    Summary Background The increased risk of colorectal cancer in ulcerative colitis is well known. The risk of sporadic colorectal cancer in Asian populations is considered low and risk estimates of colorectal cancer related to ulcerative colitis from Asia vary. This meta-analysis is an Asian perspective on the risk of colorectal cancer related to ulcerative colitis. Methods We searched PubMed and Embase for terms related to colorectal cancer in ulcerative colitis from inception to July 1, 2016. The search for published articles was done by country for all countries in Asia. We included studies with information on the prevalence and cumulative risk of colorectal cancer at various timepoints. A random-effects meta-analysis was done to calculate the pooled prevalence as well as a cumulative risk at 10 years, 20 years, and 30 years of disease. Findings Our search identified 2575 articles; of which 44 were eligible for inclusion. Our analysis included a total of 31 287 patients with ulcerative colitis with a total of 293 reported colorectal cancers. Using pooled prevalence estimates from various studies, the overall prevalence was 0·85% (95% CI 0·65–1·04). The risks for colorectal cancer were 0·02% (95% CI 0·00–0·04) at 10 years, 4·81% (3·26–6·36) at 20 years, and 13·91% (7·09–20·72) at 30 years. Subgroup analysis by stratifying the studies according to region or period of the study did not reveal any significant differences. Interpretation We found the risk of colorectal cancer in Asian patients with ulcerative colitis was similar to recent estimates in Europe and North America. Adherence to screening is therefore necessary. Larger population-based, prospective studies are required for better estimates of the risk. PMID:28404156
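
    For readers unfamiliar with random-effects pooling, the sketch below shows a bare-bones DerSimonian-Laird estimate of a pooled prevalence with a 95% CI. It pools raw proportions for simplicity, whereas published meta-analyses typically transform prevalences before pooling, so the numbers it produces are only indicative and the input values are invented.

    ```python
    import numpy as np

    def pooled_prevalence(events, totals):
        """DerSimonian-Laird random-effects pooling of study prevalences.

        Assumes every study proportion is strictly between 0 and 1; pooling
        is done on the raw proportion scale for brevity.
        """
        p = np.asarray(events, dtype=float) / np.asarray(totals)
        var = p * (1 - p) / np.asarray(totals)            # within-study variance
        w = 1 / var
        fixed = np.sum(w * p) / np.sum(w)
        q = np.sum(w * (p - fixed) ** 2)                  # Cochran's Q
        tau2 = max(0.0, (q - (len(p) - 1)) /
                   (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
        w_star = 1 / (var + tau2)                         # random-effects weights
        pooled = np.sum(w_star * p) / np.sum(w_star)
        se = np.sqrt(1 / np.sum(w_star))
        return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

    # Three invented studies: colorectal cancers / ulcerative colitis patients.
    print(pooled_prevalence([12, 30, 8], [1500, 3200, 900]))
    ```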

  15. THE LIQUEFACTION RISK ANALYSIS OF CEMENT-TREATED SANDY GROUND CONSIDERING THE SPATIAL VARIABILITY OF SOIL STRENGTH

    NASA Astrophysics Data System (ADS)

    Kataoka, Norio; Kasama, Kiyonobu; Zen, Kouki; Chen, Guangqi

    This paper presents a probabilistic method for assessing the liquefaction risk of cement-treated ground, an anti-liquefaction ground improved by cement-mixing. In this study, the liquefaction potential of cement-treated ground is analyzed statistically using Monte Carlo simulation based on nonlinear earthquake response analysis that considers the spatial variability of soil properties. The seismic bearing capacity of partially liquefied ground is analyzed in order to estimate the damage costs induced by partial liquefaction. Finally, the annual liquefaction risk is calculated by multiplying the liquefaction potential by the damage costs. The results indicate that the proposed method makes it possible to evaluate the probability of liquefaction, to estimate the damage costs using the liquefaction-induced hazard and fragility curves, and to derive the liquefaction risk curve.
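
    The core calculation, annual risk = liquefaction probability x damage cost with the probability obtained from Monte Carlo sampling of variable soil strength, can be sketched as below. The lognormal strength model, the single-valued seismic demand and all numbers are illustrative assumptions, far simpler than the nonlinear earthquake response analysis used in the paper.

    ```python
    import numpy as np

    def annual_liquefaction_risk(n_sims=100_000, mean_strength=1.2, cov=0.3,
                                 seismic_demand=1.0, damage_cost=5.0e8,
                                 annual_quake_prob=0.01, seed=0):
        """Monte Carlo sketch of annual risk = P(liquefaction) x damage cost.

        Soil strength (a cyclic resistance ratio) is sampled lognormally to
        mimic variability; liquefaction is assumed when strength falls below
        the seismic demand. All parameter values are invented.
        """
        rng = np.random.default_rng(seed)
        sigma = np.sqrt(np.log(1 + cov ** 2))
        mu = np.log(mean_strength) - 0.5 * sigma ** 2
        strength = rng.lognormal(mu, sigma, n_sims)
        p_liq_given_quake = np.mean(strength < seismic_demand)
        return annual_quake_prob * p_liq_given_quake * damage_cost

    print(annual_liquefaction_risk())
    ```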

  16. Analysis and Risk Evaluation on the Case of Alteration, Revitalization and Conversion of a Historic Building in Gdańsk

    NASA Astrophysics Data System (ADS)

    Grzyl, Beata; Kristowski, Adam; Miszewska-Urbańska, Emilia

    2017-10-01

    Every investment plan, including one concerning a building, is exposed to the consequences of various types of threats. Therefore, in the case of large-scale, atypical and complicated building ventures, the actions included in the risk management procedure should be taken (identification, analysis, measurement, control and supervision of risk). This allows the risk to be eliminated or limited. While preparing a building venture, an investor does not possess full information about the course of events at each stage of investment completion. Identifying the above-mentioned unknowns, subjecting them to quantification and specifying the method of dealing with them allows an investor to increase the effectiveness of the intended plan. The enterprise discussed in this article and analyzed in the context of risk concerns the alteration, revitalization and conversion for office purposes of two buildings located in Gdańsk at 1 and 2 Lastadia Street. These buildings are situated within the historical urban layout of Gdańsk, in the north-eastern part of the Stare Przedmieście District (Old Suburb), about 800 meters south of Dlugi Targ Street and 200 meters west of the Old Motława River. The investor is “Gdańskie Melioracje Ltd.”, a limited liability company, which belongs to the Council of Gdańsk. In order to increase the effectiveness of the intended investment venture, while organizing the investment process the investor commissioned an analysis and risk evaluation connected with the above-mentioned intention. Based on an on-site visit, the opinions of experts who have been involved in preparing the investment, studies of the available monographs on the technical condition of the buildings at 1 and 2 Lastadia Street and their own experience, the authors identified 54 types of relevant risks, which have been systematized into 10 subject groups (among others

  17. Prediction of psychosis across protocols and risk cohorts using automated language analysis

    PubMed Central

    Corcoran, Cheryl M.; Carrillo, Facundo; Fernández‐Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C.; Bearden, Carrie E.; Cecchi, Guillermo A.

    2018-01-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer‐based natural language processing analyses, we previously showed that, among English‐speaking clinical (e.g., ultra) high‐risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross‐validate these automated linguistic analytic methods in a second larger risk cohort, also English‐speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine‐learning speech classifier – comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns – that had an 83% accuracy in predicting psychosis onset (intra‐protocol), a cross‐validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross‐protocol), and a 72% accuracy in discriminating the speech of recent‐onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at‐risk youths and identify

  18. Prediction of psychosis across protocols and risk cohorts using automated language analysis.

    PubMed

    Corcoran, Cheryl M; Carrillo, Facundo; Fernández-Slezak, Diego; Bedi, Gillinder; Klim, Casimir; Javitt, Daniel C; Bearden, Carrie E; Cecchi, Guillermo A

    2018-02-01

    Language and speech are the primary source of data for psychiatrists to diagnose and treat mental disorders. In psychosis, the very structure of language can be disturbed, including semantic coherence (e.g., derailment and tangentiality) and syntactic complexity (e.g., concreteness). Subtle disturbances in language are evident in schizophrenia even prior to first psychosis onset, during prodromal stages. Using computer-based natural language processing analyses, we previously showed that, among English-speaking clinical (e.g., ultra) high-risk youths, baseline reduction in semantic coherence (the flow of meaning in speech) and in syntactic complexity could predict subsequent psychosis onset with high accuracy. Herein, we aimed to cross-validate these automated linguistic analytic methods in a second larger risk cohort, also English-speaking, and to discriminate speech in psychosis from normal speech. We identified an automated machine-learning speech classifier - comprising decreased semantic coherence, greater variance in that coherence, and reduced usage of possessive pronouns - that had an 83% accuracy in predicting psychosis onset (intra-protocol), a cross-validated accuracy of 79% of psychosis onset prediction in the original risk cohort (cross-protocol), and a 72% accuracy in discriminating the speech of recent-onset psychosis patients from that of healthy individuals. The classifier was highly correlated with previously identified manual linguistic predictors. Our findings support the utility and validity of automated natural language processing methods to characterize disturbances in semantics and syntax across stages of psychotic disorder. The next steps will be to apply these methods in larger risk cohorts to further test reproducibility, also in languages other than English, and identify sources of variability. This technology has the potential to improve prediction of psychosis outcome among at-risk youths and identify linguistic targets for remediation
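
    One of the classifier's inputs, first-order semantic coherence, can be approximated as the average cosine similarity between vector representations of consecutive sentences. The sketch below does this with a simple word-vector average and assumes a pretrained embedding lookup (`embeddings`) is loaded elsewhere; the published classifier derives coherence from latent semantic analysis, so this is a stand-in rather than the authors' feature.

    ```python
    import numpy as np

    def sentence_vector(sentence, embeddings):
        """Average word vectors; `embeddings` maps a word to a numpy vector
        (e.g. pretrained word2vec/GloVe vectors loaded elsewhere)."""
        vecs = [embeddings[w] for w in sentence.lower().split() if w in embeddings]
        return np.mean(vecs, axis=0) if vecs else None

    def semantic_coherence(sentences, embeddings):
        """Mean cosine similarity between vectors of consecutive sentences."""
        sims = []
        for a, b in zip(sentences[:-1], sentences[1:]):
            va, vb = sentence_vector(a, embeddings), sentence_vector(b, embeddings)
            if va is None or vb is None:
                continue
            sims.append(np.dot(va, vb) /
                        (np.linalg.norm(va) * np.linalg.norm(vb)))
        return float(np.mean(sims)) if sims else float("nan")
    ```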

  19. Occupational risk for Legionella infection among dental healthcare workers: meta-analysis in occupational epidemiology.

    PubMed

    Petti, Stefano; Vitali, Matteo

    2017-07-13

    The occupational risk for Legionella infection among dental healthcare workers (DHCWs) is conjectured because of the risk of routine inhalation of potentially contaminated aerosols produced by the dental instruments. Nevertheless, occupational epidemiology studies are contrasting. This meta-analysis assessed the level of scientific evidence regarding the relative occupational risk for Legionella infection among DHCWs. Literature search was performed without time and language restrictions, using broad data banks (PubMed, Scopus, Web of Science, GOOGLE Scholar) and generic keywords ('legionella' AND 'dent*'). Analytical cross-sectional studies comparing prevalence of high serum Legionella antibody levels in DHCWs and occupationally unexposed individuals were considered. The relative occupational risk was assessed through prevalence ratio (PR) with 95% CI. Between-study heterogeneity was assessed (Cochran's Q test) and was used to choose the meta-analytic method. Study quality (modified Newcastle-Ottawa Scale) and publication bias (Begg and Mazumdar's test, Egger and colleagues' test, trim-and-fill R0 method) were assessed formally and considered for the sensitivity analysis. Sensitivity analysis to study inclusion, subgroup analyses (dental staff categories; publication year, before vs after 1998, ie, 5 years after the release by the Centers for Disease Control and Prevention of the infection control guidelines in dental healthcare setting) were performed. Seven studies were included (2232 DHCWs, 1172 occupationally unexposed individuals). No evidence of publication bias was detected. The pooled PR estimate was statistically non-significant at 95% level (1.7; 95% CI 0.8 to 3.2), study-quality adjustment did not change the PR considerably (PR, 1.5; 95% CI 0.5 to 4.1). PR was statistically significant before 1998 and no longer significant after 1998. Subgroup analysis according to DHCW categories was inconclusive. There is no scientific evidence that DHCWs are

  20. Risk analysis for roadways subjected to multiple landslide-related hazards

    NASA Astrophysics Data System (ADS)

    Corominas, Jordi; Mavrouli, Olga

    2014-05-01

    Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in both risk analysis and risk assessment. Risk analysis has to consider the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at small (for example, national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods exists for assessing the probability of occurrence. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters. The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations for increase of two
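
    Expressing all hazards through their probability of occurrence makes it straightforward to combine them for a single roadway section. A minimal sketch, assuming the hazards act independently and using invented probabilities and consequence costs:

    ```python
    def combined_annual_risk(hazards):
        """Combine hazards for one roadway section.

        `hazards` maps a hazard name to (annual probability of occurrence,
        expected consequence cost if it occurs). The probability of at least
        one event is 1 - prod(1 - p), assuming independence; expected annual
        cost is the sum of p * cost. All numbers below are invented.
        """
        p_none = 1.0
        expected_cost = 0.0
        for p, cost in hazards.values():
            p_none *= (1.0 - p)
            expected_cost += p * cost
        return 1.0 - p_none, expected_cost

    p_any, annual_cost = combined_annual_risk({
        "rockfall":     (0.05, 200_000),
        "debris_flow":  (0.02, 500_000),
        "wall_failure": (0.005, 1_000_000),
    })
    print(p_any, annual_cost)
    ```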

  1. Risk factors for technical failure of endoscopic double self-expandable metallic stent placement by partial stent-in-stent method.

    PubMed

    Kawakubo, Kazumichi; Kawakami, Hiroshi; Toyokawa, Yoshihide; Otani, Koichi; Kuwatani, Masaki; Abe, Yoko; Kawahata, Shuhei; Kubo, Kimitoshi; Kubota, Yoshimasa; Sakamoto, Naoya

    2015-01-01

    Endoscopic double self-expandable metallic stent (SEMS) placement by the partial stent-in-stent (PSIS) method has been reported to be useful for the management of unresectable hilar malignant biliary obstruction. However, it is technically challenging, and the optimal SEMS for the procedure remains unknown. The aim of this study was to identify the risk factors for technical failure of endoscopic double SEMS placement for unresectable malignant hilar biliary obstruction (MHBO). Between December 2009 and May 2013, 50 consecutive patients with MHBO underwent endoscopic double SEMS placement by the PSIS method. We retrospectively evaluated the rate of successful double SEMS placement and identified the risk factors for technical failure. The technical success rate for double SEMS placement was 82.0% (95% confidence interval [CI]: 69.2-90.2). On univariate analysis, the rate of technical failure was high in patients with metastatic disease and unilateral placement. Multivariate analysis revealed that metastatic disease was a significant risk factor for technical failure (odds ratio: 9.63, 95% CI: 1.11-105.5). The subgroup analysis after double guidewire insertion showed that the rate of technical success was higher in the laser-cut type SEMS with a large mesh and thick delivery system than in the braided type SEMS with a small mesh and thick delivery system. Metastatic disease was a significant risk factor for technical failure of double SEMS placement for unresectable MHBO. The laser-cut type SEMS with a large mesh and thin delivery system might be preferable for the PSIS procedure. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.

  2. Cardiometabolic Risk Clustering in Spinal Cord Injury: Results of Exploratory Factor Analysis

    PubMed Central

    2013-01-01

    Background: Evidence suggests an elevated prevalence of cardiometabolic risks among persons with spinal cord injury (SCI); however, the unique clustering of risk factors in this population has not been fully explored. Objective: The purpose of this study was to describe unique clustering of cardiometabolic risk factors differentiated by level of injury. Methods: One hundred twenty-one subjects (mean 37 ± 12 years; range, 18–73) with chronic C5 to T12 motor complete SCI were studied. Assessments included medical histories, anthropometrics and blood pressure, and fasting serum lipids, glucose, insulin, and hemoglobin A1c (HbA1c). Results: The most common cardiometabolic risk factors were overweight/obesity, high levels of low-density lipoprotein (LDL-C), and low levels of high-density lipoprotein (HDL-C). Risk clustering was found in 76.9% of the population. Exploratory principal component factor analysis using varimax rotation revealed a 3–factor model in persons with paraplegia (65.4% variance) and a 4–factor solution in persons with tetraplegia (73.3% variance). The differences between groups were emphasized by the varied composition of the extracted factors: Lipid Profile A (total cholesterol [TC] and LDL-C), Body Mass-Hypertension Profile (body mass index [BMI], systolic blood pressure [SBP], and fasting insulin [FI]); Glycemic Profile (fasting glucose and HbA1c), and Lipid Profile B (TG and HDL-C). BMI and SBP formed a separate factor only in persons with tetraplegia. Conclusions: Although the majority of the population with SCI has risk clustering, the composition of the risk clusters may be dependent on level of injury, based on a factor analysis group comparison. This is clinically plausible and relevant as tetraplegics tend to be hypo- to normotensive and more sedentary, resulting in lower HDL-C and a greater propensity toward impaired carbohydrate metabolism. PMID:23960702

  3. Using multiscale texture and density features for near-term breast cancer risk analysis

    PubMed Central

    Sun, Wenqing; Tseng, Tzu-Liang (Bill); Qian, Wei; Zhang, Jianying; Saltzstein, Edward C.; Zheng, Bin; Lure, Fleming; Yu, Hui; Zhou, Shi

    2015-01-01

    Purpose: To help improve efficacy of screening mammography by eventually establishing a new optimal personalized screening paradigm, the authors investigated the potential of using the quantitative multiscale texture and density feature analysis of digital mammograms to predict near-term breast cancer risk. Methods: The authors’ dataset includes digital mammograms acquired from 340 women. Among them, 141 were positive and 199 were negative/benign cases. The negative digital mammograms acquired from the “prior” screening examinations were used in the study. Based on the intensity value distributions, five subregions at different scales were extracted from each mammogram. Five groups of features, including density and texture features, were developed and calculated on every one of the subregions. Sequential forward floating selection was used to search for the effective combinations. Using the selected features, a support vector machine (SVM) was optimized using a tenfold validation method to predict the risk of each woman having image-detectable cancer in the next sequential mammography screening. The area under the receiver operating characteristic curve (AUC) was used as the performance assessment index. Results: From a total number of 765 features computed from multiscale subregions, an optimal feature set of 12 features was selected. Applying this feature set, a SVM classifier yielded performance of AUC = 0.729 ± 0.021. The positive predictive value was 0.657 (92 of 140) and the negative predictive value was 0.755 (151 of 200). Conclusions: The study results demonstrated a moderately high positive association between risk prediction scores generated by the quantitative multiscale mammographic image feature analysis and the actual risk of a woman having an image-detectable breast cancer in the next subsequent examinations. PMID:26127038
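
    A compact sketch of the classification step (feature selection followed by an SVM evaluated with ROC AUC) using scikit-learn. The feature matrix `X` and next-screening outcome labels `y` are assumed to be prepared elsewhere, and scikit-learn's sequential selector is plain forward selection rather than the floating variant used in the paper.

    ```python
    from sklearn.svm import SVC
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    def near_term_risk_auc(X, y, n_features=12):
        """Forward feature selection + RBF SVM, scored by 10-fold ROC AUC.

        X holds per-image multiscale texture/density features and y the
        cancer outcome of the next screening round (prepared elsewhere).
        """
        svm = SVC(kernel="rbf")
        model = make_pipeline(
            StandardScaler(),
            SequentialFeatureSelector(svm, n_features_to_select=n_features,
                                      direction="forward", cv=5),
            svm,
        )
        return cross_val_score(model, X, y, cv=10, scoring="roc_auc").mean()
    ```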

  4. Holistic stakeholder-oriented and case study-based risk analysis

    NASA Astrophysics Data System (ADS)

    Heisterkamp, Tobias

    2013-04-01

    by evaluating the results and their correspondence to reality continuously. Using case studies can help to identify important stakeholders, notably potentially affected groups. To cover the essential interests of all important stakeholders, a wide range of vulnerabilities, regarding physical and social aspects and including their resiliences, has to be assessed. The case studies of storm events offer a solid base for investigations, no matter which method is used. They expose shortcomings such as gaps in the warning chain or misunderstandings in warning communication. Case studies of extreme events are very interesting for many stakeholders; insurers and fire brigades use them in their daily work. Thus a case study-based approach is a further chance to integrate practitioners into the analysis process and to get results which they easily understand and can transfer into application. There could be a second advantage in taking many data sets into account: each data set, like the meteorological observations of wind gust speeds, has inherent shortcomings such as limited expressiveness, significance or, especially, uncertainties. Using various approaches could frame the final result and prevent expanding biases or misinterpretations. Altogether, this work stresses the role of transdisciplinary holistic approaches in vulnerability assessments for risk analyses.

  5. Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles

    NASA Astrophysics Data System (ADS)

    Knox, Lenora A.

    The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges; which include, defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) opposed to being contained in a single UAV (monolithic). The case study based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.

  6. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  7. Cut set-based risk and reliability analysis for arbitrarily interconnected networks

    DOEpatents

    Wyss, Gregory D.

    2000-01-01

    Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
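
    The analytic cut-set approach in the patent is involved; for orientation, the sketch below estimates all-terminal reliability of a small network by direct Monte Carlo sampling of independent link failures (node failures are ignored, mirroring the patent's focus on link failures only). It uses networkx and an invented failure probability, and is a simulation-based check rather than the patented method.

    ```python
    import networkx as nx
    import numpy as np

    def all_terminal_reliability(graph, p_link_fail, n_sims=20_000, seed=1):
        """Monte Carlo estimate of all-terminal reliability under independent
        link failures (nodes are assumed perfectly reliable)."""
        rng = np.random.default_rng(seed)
        edges = list(graph.edges)
        connected = 0
        for _ in range(n_sims):
            survived = nx.Graph()
            survived.add_nodes_from(graph.nodes)
            survived.add_edges_from(e for e in edges
                                    if rng.random() > p_link_fail)
            connected += nx.is_connected(survived)
        return connected / n_sims

    ring = nx.cycle_graph(6)     # toy non-hierarchical network
    print(all_terminal_reliability(ring, p_link_fail=0.05))
    ```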

  8. Risk analysis approach. [of carbon fiber release

    NASA Technical Reports Server (NTRS)

    Huston, R. J.

    1979-01-01

    The assessment of the carbon fiber hazard is outlined. Program objectives, requirements of the risk analysis, and elements associated with the physical phenomena of the accidental release are described.

  9. An object-oriented approach to risk and reliability analysis : methodology and aviation safety applications.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane

    2003-09-01

    This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.

  10. Lagged segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.

    PubMed

    Voss, Andreas; Fischer, Claudia; Schroeder, Rico; Figulla, Hans R; Goernig, Matthias

    2012-07-01

    The objectives of this study were to introduce a new type of heart-rate variability analysis improving risk stratification in patients with idiopathic dilated cardiomyopathy (DCM) and to provide additional information about impaired heart beat generation in these patients. Beat-to-beat intervals (BBI) of 30-min ECGs recorded from 91 DCM patients and 21 healthy subjects were analyzed applying the lagged segmented Poincaré plot analysis (LSPPA) method. LSPPA includes the Poincaré plot reconstruction with lags of 1-100, rotating the cloud of points, its normalized segmentation adapted to their standard deviations, and finally, a frequency-dependent clustering. The lags were combined into eight different clusters representing specific frequency bands within 0.012-1.153 Hz. Statistical differences between low- and high-risk DCM could be found within the clusters II-VIII (e.g., cluster IV: 0.033-0.038 Hz; p = 0.0002; sensitivity = 85.7 %; specificity = 71.4 %). The multivariate statistics led to a sensitivity of 92.9 %, specificity of 85.7 % and an area under the curve of 92.1 % discriminating these patient groups. We introduced the LSPPA method to investigate time correlations in BBI time series. We found that LSPPA contributes considerably to risk stratification in DCM and yields the highest discriminant power in the low and very low-frequency bands.
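
    The starting point of LSPPA, a Poincaré plot built with a configurable lag, is easy to reproduce. The sketch below returns the usual SD1/SD2 descriptors for a given lag; the rotation, normalized segmentation and frequency-dependent clustering steps of the published method are not shown, and the test series is synthetic.

    ```python
    import numpy as np

    def lagged_poincare(bbi, lag):
        """SD1/SD2 descriptors of a Poincare plot built with a given lag.

        `bbi` is a beat-to-beat interval series in ms; SD1 measures spread
        across the identity line, SD2 spread along it.
        """
        x, y = bbi[:-lag], bbi[lag:]
        sd1 = np.std((y - x) / np.sqrt(2), ddof=1)
        sd2 = np.std((y + x) / np.sqrt(2), ddof=1)
        return sd1, sd2

    bbi = np.random.default_rng(2).normal(800, 50, 2200)   # synthetic series
    print([lagged_poincare(bbi, lag) for lag in (1, 10, 100)])
    ```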

  11. Failure mode and effect analysis: improving intensive care unit risk management processes.

    PubMed

    Askari, Roohollah; Shafii, Milad; Rafiei, Sima; Abolhassani, Mohammad Sadegh; Salarikhah, Elaheh

    2017-04-18

    Purpose Failure modes and effects analysis (FMEA) is a practical tool to evaluate risks, discover failures in a proactive manner and propose corrective actions to reduce or eliminate potential risks. The purpose of this paper is to apply the FMEA technique to examine the hazards associated with the process of service delivery in the intensive care unit (ICU) of a tertiary hospital in Yazd, Iran. Design/methodology/approach This was a before-after study conducted between March 2013 and December 2014. By forming a FMEA team, all potential hazards associated with ICU services - their frequency and severity - were identified. Then a risk priority number was calculated for each activity as an indicator representing high-priority areas that need special attention and resource allocation. Findings Eight failure modes with the highest priority scores, including endotracheal tube defect, wrong placement of endotracheal tube, EVD interface, aspiration failure during suctioning, chest tube failure, tissue injury and deep vein thrombosis, were selected for improvement. Findings affirmed that improvement strategies were generally satisfactory and significantly decreased total failures. Practical implications Application of FMEA in ICUs proved to be effective in proactively decreasing the risk of failures and corrected the control measures up to acceptable levels in all eight areas of function. Originality/value Using a prospective risk assessment approach, such as FMEA, could be beneficial in dealing with potential failures through proposing preventive actions in a proactive manner. The method could be used as a tool for healthcare continuous quality improvement so that the method identifies both systemic and human errors, and offers practical advice to deal effectively with them.
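
    The prioritisation step of FMEA reduces to computing a Risk Priority Number per failure mode and sorting. A minimal sketch with invented severity/occurrence/detection scores (the study's actual ratings are not reproduced here):

    ```python
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int      # 1-10
        occurrence: int    # 1-10
        detection: int     # 1-10, 10 = hardest to detect

        @property
        def rpn(self) -> int:
            """Risk Priority Number = severity x occurrence x detection."""
            return self.severity * self.occurrence * self.detection

    # Illustrative ICU failure modes with made-up scores.
    modes = [
        FailureMode("endotracheal tube defect", 8, 4, 6),
        FailureMode("chest tube failure", 7, 3, 5),
        FailureMode("deep vein thrombosis", 9, 2, 7),
    ]
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.name}: RPN = {m.rpn}")
    ```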

  12. Thinking beyond Opisthorchis viverrini for risk of cholangiocarcinoma in the lower Mekong region: a systematic review and meta-analysis.

    PubMed

    Steele, Jennifer A; Richter, Carsten H; Echaubard, Pierre; Saenna, Parichat; Stout, Virginia; Sithithaworn, Paiboon; Wilcox, Bruce A

    2018-05-17

    Cholangiocarcinoma (CCA) is a fatal bile duct cancer associated with infection by the liver fluke, Opisthorchis viverrini, in the lower Mekong region. Numerous public health interventions have focused on reducing exposure to O. viverrini, but incidence of CCA in the region remains high. While this may indicate the inefficacy of public health interventions due to complex social and cultural factors, it may further indicate other risk factors or interactions with the parasite are important in pathogenesis of CCA. This systematic review aims to provide a comprehensive analysis of described risk factors for CCA in addition to O. viverrini to guide future integrative interventions. We searched five international and seven Thai research databases to identify studies relevant to risk factors for CCA in the lower Mekong region. Selected studies were assessed for risk of bias and quality in terms of study design, population, CCA diagnostic methods, and statistical methods. The final 18 included studies reported numerous risk factors which were grouped into behaviors, socioeconomics, diet, genetics, gender, immune response, other infections, and treatment for O. viverrini. Seventeen risk factors were reported by two or more studies and were assessed with random effects models during meta-analysis. This meta-analysis indicates that the combination of alcohol and smoking (OR = 11.1, 95% CI: 5.63-21.92, P <  0.0001) is most significantly associated with increased risk for CCA and is an even greater risk factor than O. viverrini exposure. This analysis also suggests that family history of cancer, consumption of raw cyprinoid fish, consumption of high nitrate foods, and praziquantel treatment are associated with significantly increased risk. These risk factors may have complex relationships with the host, parasite, or pathogenesis of CCA, and many of these risk factors were found to interact with each other in one or more studies. Our findings suggest that a complex

  13. Application of meta-analysis methods for identifying proteomic expression level differences.

    PubMed

    Amess, Bob; Kluge, Wolfgang; Schwarz, Emanuel; Haenisch, Frieder; Alsaif, Murtada; Yolken, Robert H; Leweke, F Markus; Guest, Paul C; Bahn, Sabine

    2013-07-01

    We present new statistical approaches for identification of proteins with expression levels that are significantly changed when applying meta-analysis to two or more independent experiments. We showed that the Euclidean distance measure has reduced risk of false positives compared to the rank product method. Our Ψ-ranking method has advantages over the traditional fold-change approach by incorporating both the fold-change direction as well as the p-value. In addition, the second novel method, Π-ranking, considers the ratio of the fold-change and thus integrates all three parameters. We further improved the latter by introducing our third technique, Σ-ranking, which combines all three parameters in a balanced nonparametric approach. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
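
    For context, the baseline rank product method that the paper compares against can be written in a few lines: each protein is ranked by fold change within every experiment and the geometric mean of its ranks is taken across experiments. This sketch illustrates that baseline only, not the authors' Ψ-, Π- or Σ-ranking schemes.

    ```python
    import numpy as np

    def rank_product(fold_changes):
        """Rank product across independent experiments.

        `fold_changes` is an (experiments x proteins) array; within each
        experiment proteins are ranked (1 = largest up-regulation) and the
        per-protein geometric mean of ranks is returned. Small values flag
        proteins that change consistently across experiments.
        """
        fc = np.asarray(fold_changes, dtype=float)
        ranks = np.argsort(np.argsort(-fc, axis=1), axis=1) + 1
        return np.exp(np.mean(np.log(ranks), axis=0))

    # Two invented experiments, four proteins.
    print(rank_product([[2.1, 0.5, 1.8, 1.0],
                        [1.9, 0.7, 2.5, 0.9]]))
    ```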

  14. Will HIV Vaccination Reshape HIV Risk Behavior Networks? A Social Network Analysis of Drug Users' Anticipated Risk Compensation

    PubMed Central

    Young, April M.; Halgin, Daniel S.; DiClemente, Ralph J.; Sterk, Claire E.; Havens, Jennifer R.

    2014-01-01

    Background An HIV vaccine could substantially impact the epidemic. However, risk compensation (RC), or post-vaccination increase in risk behavior, could present a major challenge. The methodology used in previous studies of risk compensation has been almost exclusively individual-level in focus, and has not explored how increased risk behavior could affect the connectivity of risk networks. This study examined the impact of anticipated HIV vaccine-related RC on the structure of high-risk drug users' sexual and injection risk network. Methods A sample of 433 rural drug users in the US provided data on their risk relationships (i.e., those involving recent unprotected sex and/or injection equipment sharing). Dyad-specific data were collected on likelihood of increasing/initiating risk behavior if they, their partner, or they and their partner received an HIV vaccine. Using these data and social network analysis, a "post-vaccination network" was constructed and compared to the current network on measures relevant to HIV transmission, including network size, cohesiveness (e.g., diameter, component structure, density), and centrality. Results Participants reported 488 risk relationships. Few reported an intention to decrease condom use or increase equipment sharing (4% and 1%, respectively). RC intent was reported in 30 existing risk relationships and vaccination was anticipated to elicit the formation of five new relationships. RC resulted in a 5% increase in risk network size (n = 142 to n = 149) and a significant increase in network density. The initiation of risk relationships resulted in the connection of otherwise disconnected network components, with the largest doubling in size from five to ten. Conclusions This study demonstrates a new methodological approach to studying RC and reveals that behavior change following HIV vaccination could potentially impact risk network connectivity. These data will be valuable in parameterizing future network models
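
    The network comparison itself is straightforward with a graph library. A sketch, assuming each risk relationship is an undirected edge and that anticipated post-vaccination relationships are supplied as extra edges (toy data, not the study's network):

    ```python
    import networkx as nx

    def network_change(current_edges, anticipated_edges):
        """Compare a risk network before and after anticipated risk compensation.

        Edges are (person_a, person_b) risk relationships; `anticipated_edges`
        are ties respondents said they might initiate after vaccination.
        """
        def stats(g):
            return {"size": g.number_of_nodes(),
                    "density": nx.density(g),
                    "largest_component": len(max(nx.connected_components(g),
                                                 key=len))}
        before = nx.Graph(current_edges)
        after = nx.Graph(list(current_edges) + list(anticipated_edges))
        return stats(before), stats(after)

    # Toy data: one anticipated tie merges two previously separate components.
    print(network_change([(1, 2), (2, 3), (4, 5)], [(3, 4)]))
    ```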

  15. A multifactorial analysis of obesity as CVD risk factor: use of neural network based methods in a nutrigenetics context.

    PubMed

    Valavanis, Ioannis K; Mougiakakou, Stavroula G; Grimaldi, Keith A; Nikita, Konstantina S

    2010-09-08

    Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy of 61.46% in the 3-CV testing sets. The ANN based methods revealed factors

  16. A multifactorial analysis of obesity as CVD risk factor: Use of neural network based methods in a nutrigenetics context

    PubMed Central

    2010-01-01

    Background Obesity is a multifactorial trait, which comprises an independent risk factor for cardiovascular disease (CVD). The aim of the current work is to study the complex etiology beneath obesity and identify genetic variations and/or factors related to nutrition that contribute to its variability. To this end, a set of more than 2300 white subjects who participated in a nutrigenetics study was used. For each subject a total of 63 factors describing genetic variants related to CVD (24 in total), gender, and nutrition (38 in total), e.g. average daily intake in calories and cholesterol, were measured. Each subject was categorized according to body mass index (BMI) as normal (BMI ≤ 25) or overweight (BMI > 25). Two artificial neural network (ANN) based methods were designed and used towards the analysis of the available data. These corresponded to i) a multi-layer feed-forward ANN combined with a parameter decreasing method (PDM-ANN), and ii) a multi-layer feed-forward ANN trained by a hybrid method (GA-ANN) which combines genetic algorithms and the popular back-propagation training algorithm. Results PDM-ANN and GA-ANN were comparatively assessed in terms of their ability to identify the most important factors among the initial 63 variables describing genetic variations, nutrition and gender, able to classify a subject into one of the BMI related classes: normal and overweight. The methods were designed and evaluated using appropriate training and testing sets provided by 3-fold Cross Validation (3-CV) resampling. Classification accuracy, sensitivity, specificity and area under receiver operating characteristics curve were utilized to evaluate the resulting predictive ANN models. The most parsimonious set of factors was obtained by the GA-ANN method and included gender, six genetic variations and 18 nutrition-related variables. The corresponding predictive model was characterized by a mean accuracy of 61.46% in the 3-CV testing sets. Conclusions The ANN

  17. Development of Optimization method about Capital Structure and Senior-Sub Structure by considering Project-Risk

    NASA Astrophysics Data System (ADS)

    Kawamoto, Shigeru; Ikeda, Yuichi; Fukui, Chihiro; Tateshita, Fumihiko

    Private finance initiative is a business scheme that delivers social infrastructure and public services by utilizing private-sector resources. In this paper we propose a new method to optimize the capital structure, which is the ratio of capital to debt, and the senior-sub structure, which is the ratio of senior loan to subordinated loan, for private finance initiative projects. We perform a quantitative analysis of a private finance initiative project using the proposed method. We analyze the trade-off structure between risk and return in the project, and optimize the capital structure and senior-sub structure. The proposed method helps to improve the financial stability of the project and to draw up a fund-raising plan that is expected to be reasonable for both the project sponsor and the lenders.

  18. Integrated flood hazard assessment based on spatial ordered weighted averaging method considering spatial heterogeneity of risk preference.

    PubMed

    Xiao, Yangfan; Yi, Shanzhen; Tang, Zhongqian

    2017-12-01

    Flood is the most common natural hazard in the world and has caused serious loss of life and property. Assessment of flood prone areas is of great importance for watershed management and reduction of potential loss of life and property. In this study, a framework of multi-criteria analysis (MCA) incorporating geographic information system (GIS), fuzzy analytic hierarchy process (AHP) and spatial ordered weighted averaging (OWA) method was developed for flood hazard assessment. The factors associated with geographical, hydrological and flood-resistant characteristics of the basin were selected as evaluation criteria. The relative importance of the criteria was estimated through fuzzy AHP method. The OWA method was utilized to analyze the effects of different risk attitudes of the decision maker on the assessment result. The spatial ordered weighted averaging method with spatially variable risk preference was implemented in the GIS environment to integrate the criteria. The advantage of the proposed method is that it has considered spatial heterogeneity in assigning risk preference in the decision-making process. The presented methodology has been applied to the area including Hanyang, Caidian and Hannan of Wuhan, China, where flood events occur frequently. The outcome of flood hazard distribution presents a tendency of high risk towards populated and developed areas, especially the northeast part of Hanyang city, which has suffered frequent floods in history. The result indicates where the enhancement projects should be carried out first under the condition of limited resources. Finally, sensitivity of the criteria weights was analyzed to measure the stability of results with respect to the variation of the criteria weights. The flood hazard assessment method presented in this paper is adaptable for hazard assessment of a similar basin, which is of great significance to establish counterplan to mitigate life and property losses. Copyright © 2017 Elsevier B.V. All
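
    The OWA aggregation at the heart of the method is a reordering-then-weighting step applied cell by cell. A generic sketch (not the paper's exact formulation, and omitting the fuzzy AHP importance weights and GIS layers) showing how different order weights express different risk attitudes:

    ```python
    import numpy as np

    def owa(criteria, order_weights):
        """Ordered weighted averaging for one raster cell.

        `criteria` are standardised criterion scores in [0, 1]; `order_weights`
        sum to 1 and encode the risk attitude: weighting the smallest ordered
        values is pessimistic (risk-averse), weighting the largest optimistic.
        Spatially variable preference means using different order weights per
        location.
        """
        ordered = np.sort(np.asarray(criteria, dtype=float))[::-1]   # descending
        return float(np.dot(np.asarray(order_weights, dtype=float), ordered))

    cell = [0.8, 0.4, 0.6]              # e.g. slope, rainfall, land-use scores
    print(owa(cell, [0.6, 0.3, 0.1]))   # optimistic attitude
    print(owa(cell, [0.1, 0.3, 0.6]))   # risk-averse attitude
    ```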

  19. High coffee consumption and different brewing methods in relation to postmenopausal endometrial cancer risk in the Norwegian Women and Cancer Study: a population-based prospective study

    PubMed Central

    2014-01-01

    Background Coffee and its compounds have been proposed to inhibit endometrial carcinogenesis. Studies in the Norwegian population can be especially interesting due to the high coffee consumption and increasing incidence of endometrial cancer in the country. Methods A total of 97 926 postmenopausal Norwegian women from the population-based prospective Norwegian Women and Cancer (NOWAC) Study, were included in the present analysis. We evaluated the general association between total coffee consumption and endometrial cancer risk as well as the possible impact of brewing method. Multivariate Cox regression analysis was used to estimate risks, and heterogeneity tests were performed to compare brewing methods. Results During an average of 10.9 years of follow-up, 462 incident endometrial cancer cases were identified. After multivariate adjustment, significant risk reduction was found among participants who drank ≥8 cups/day of coffee with a hazard ratio of 0.52 (95% confidence interval, CI 0.34-0.79). However, we did not observe a significant dose-response relationship. No significant heterogeneity in risk was found when comparing filtered and boiled coffee brewing methods. A reduction in endometrial cancer risk was observed in subgroup analyses among participants who drank ≥8 cups/day and had a body mass index ≥25 kg/m2, and in current smokers. Conclusions These data suggest that in this population with high coffee consumption, endometrial cancer risk decreases in women consuming ≥8 cups/day, independent of brewing method. PMID:24666820

  20. The effectiveness of risk management: an analysis of project risk planning across industries and countries.

    PubMed

    Zwikael, Ofer; Ahn, Mark

    2011-01-01

    This article examines the effectiveness of current risk management practices to reduce project risk using a multinational, multi-industry study across different scenarios and cultures. A survey was administered to 701 project managers, and their supervisors, in seven industries and three diverse countries (New Zealand, Israel, and Japan), in multiple languages during the 2002-2007 period. Results of this study show that project context--industry and country where a project is executed--significantly impacts perceived levels of project risk, and the intensity of risk management processes. Our findings also suggest that risk management moderates the relationship between risk level and project success. Specifically, we found that even moderate levels of risk management planning are sufficient to reduce the negative effect risk levels have on project success. © 2010 Society for Risk Analysis.

  1. Osteoporosis risk prediction using machine learning and conventional methods.

    PubMed

    Kim, Sung Kean; Yoo, Tae Keun; Oh, Ein; Kim, Deok Won

    2013-01-01

    A number of clinical decision tools for osteoporosis risk assessment have been developed to select postmenopausal women for the measurement of bone mineral density. We developed and validated machine learning models with the aim of more accurately identifying the risk of osteoporosis in postmenopausal women, and compared with the ability of a conventional clinical decision tool, osteoporosis self-assessment tool (OST). We collected medical records from Korean postmenopausal women based on the Korea National Health and Nutrition Surveys (KNHANES V-1). The training data set was used to construct models based on popular machine learning algorithms such as support vector machines (SVM), random forests (RF), artificial neural networks (ANN), and logistic regression (LR) based on various predictors associated with low bone density. The learning models were compared with OST. SVM had significantly better area under the curve (AUC) of the receiver operating characteristic (ROC) than ANN, LR, and OST. Validation on the test set showed that SVM predicted osteoporosis risk with an AUC of 0.827, accuracy of 76.7%, sensitivity of 77.8%, and specificity of 76.0%. We were the first to perform comparisons of the performance of osteoporosis prediction between the machine learning and conventional methods using population-based epidemiological data. The machine learning methods may be effective tools for identifying postmenopausal women at high risk for osteoporosis.

  2. Predicting complication risk in spine surgery: a prospective analysis of a novel risk assessment tool.

    PubMed

    Veeravagu, Anand; Li, Amy; Swinney, Christian; Tian, Lu; Moraff, Adrienne; Azad, Tej D; Cheng, Ivan; Alamin, Todd; Hu, Serena S; Anderson, Robert L; Shuer, Lawrence; Desai, Atman; Park, Jon; Olshen, Richard A; Ratliff, John K

    2017-07-01

    OBJECTIVE The ability to assess the risk of adverse events based on known patient factors and comorbidities would provide more effective preoperative risk stratification. Present risk assessment in spine surgery is limited. An adverse event prediction tool was developed to predict the risk of complications after spine surgery and tested on a prospective patient cohort. METHODS The spinal Risk Assessment Tool (RAT), a novel instrument for the assessment of risk for patients undergoing spine surgery that was developed based on an administrative claims database, was prospectively applied to 246 patients undergoing 257 spinal procedures over a 3-month period. Prospectively collected data were used to compare the RAT to the Charlson Comorbidity Index (CCI) and the American College of Surgeons National Surgery Quality Improvement Program (ACS NSQIP) Surgical Risk Calculator. Study end point was occurrence and type of complication after spine surgery. RESULTS The authors identified 69 patients (73 procedures) who experienced a complication over the prospective study period. Cardiac complications were most common (10.2%). Receiver operating characteristic (ROC) curves were calculated to compare complication outcomes using the different assessment tools. Area under the curve (AUC) analysis showed comparable predictive accuracy between the RAT and the ACS NSQIP calculator (0.670 [95% CI 0.60-0.74] in RAT, 0.669 [95% CI 0.60-0.74] in NSQIP). The CCI was not accurate in predicting complication occurrence (0.55 [95% CI 0.48-0.62]). The RAT produced mean probabilities of 34.6% for patients who had a complication and 24% for patients who did not (p = 0.0003). The generated predicted values were stratified into low, medium, and high rates. For the RAT, the predicted complication rate was 10.1% in the low-risk group (observed rate 12.8%), 21.9% in the medium-risk group (observed 31.8%), and 49.7% in the high-risk group (observed 41.2%). The ACS NSQIP calculator consistently

  3. Assessment and uncertainty analysis of groundwater risk.

    PubMed

    Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei

    2018-01-01

    Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-exploitation of groundwater, problems such as groundwater funnels, land subsidence and saltwater intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-exploited regions, it is necessary to conduct an assessment of groundwater risk. In this paper, the risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet the decision maker's demand for risk information and overcome the limitation of previous risk assessments expressed as deterministic point estimates, which ignore the uncertainty of the assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. A review of recent advances in risk analysis for wildfire management

    Treesearch

    Carol Miller; Alan A. Ager

    2012-01-01

    Risk analysis evolved out of the need to make decisions concerning highly stochastic events, and is well suited to analyze the timing, location and potential effects of wildfires. Over the past 10 years, the application of risk analysis to wildland fire management has seen steady growth with new risk-based analytical tools that support a wide range of fire and fuels...

  5. A simple rapid approach using coupled multivariate statistical methods, GIS and trajectory models to delineate areas of common oil spill risk

    NASA Astrophysics Data System (ADS)

    Guillen, George; Rainey, Gail; Morin, Michelle

    2004-04-01

    Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated from other hydrological models. OSRAM and many other models produce output matrices of average, maximum and minimum contact probabilities to specific landfall or target segments (columns) from oil spills at specific points (rows). Analysts and managers are often interested in identifying geographic areas or groups of facilities that pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, due to the potentially large matrix generated by many spill models, this question is difficult to answer without the use of data reduction and visualization methods. In our study we utilized a multivariate statistical method called cluster analysis to group areas of similar risk based on potential distribution of landfall target trajectory probabilities. We also utilized ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers, statistical and GIS software programmers to closely collaborate to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
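
    The post-processing step itself, grouping launch points whose landfall-contact probability profiles are similar, can be done with standard hierarchical clustering before mapping the groups in a GIS. A sketch with an invented trajectory probability matrix (a real one would come from OSRAM output):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Rows: spill launch points; columns: probability of contacting each
    # landfall segment. Values below are illustrative only.
    traj = np.array([[0.6, 0.3, 0.0, 0.1],
                     [0.5, 0.4, 0.1, 0.0],
                     [0.0, 0.1, 0.7, 0.2],
                     [0.1, 0.0, 0.6, 0.3]])

    dist = pdist(traj, metric="euclidean")       # pairwise launch-point distances
    tree = linkage(dist, method="ward")          # agglomerative clustering
    groups = fcluster(tree, t=2, criterion="maxclust")
    print(groups)   # launch points grouped by similar landfall-contact profiles
    ```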

  6. Risk analysis of new oral anticoagulants for gastrointestinal bleeding and intracranial hemorrhage in atrial fibrillation patients: a systematic review and network meta-analysis.

    PubMed

    Xu, Wei-Wei; Hu, Shen-Jiang; Wu, Tao

    2017-07-01

    Antithrombotic therapy using new oral anticoagulants (NOACs) in patients with atrial fibrillation (AF) has been generally shown to have a favorable risk-benefit profile. Since there has been dispute about the risks of gastrointestinal bleeding (GIB) and intracranial hemorrhage (ICH), we sought to conduct a systematic review and network meta-analysis using Bayesian inference to analyze the risks of GIB and ICH in AF patients taking NOACs. We analyzed data from 20 randomized controlled trials of 91 671 AF patients receiving anticoagulants, antiplatelet drugs, or placebo. Bayesian network meta-analysis of two different evidence networks was performed using a binomial likelihood model, based on a network in which different agents (and doses) were treated as separate nodes. Odds ratios (ORs) and 95% confidence intervals (CIs) were modeled using Markov chain Monte Carlo methods. Indirect comparisons with the Bayesian model confirmed that aspirin+clopidogrel significantly increased the risk of GIB in AF patients compared to the placebo (OR 0.33, 95% CI 0.01-0.92). Warfarin was identified as greatly increasing the risk of ICH compared to edoxaban 30 mg (OR 3.42, 95% CI 1.22-7.24) and dabigatran 110 mg (OR 3.56, 95% CI 1.10-8.45). We further ranked the NOACs for the lowest risk of GIB (apixaban 5 mg) and ICH (apixaban 5 mg, dabigatran 110 mg, and edoxaban 30 mg). Bayesian network meta-analysis of treatment of non-valvular AF patients with anticoagulants suggested that NOACs do not increase risks of GIB and/or ICH, compared to each other.
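
    As a flavour of the Bayesian machinery involved, the fragment below fits a single pairwise comparison with a binomial likelihood on the log-odds scale using PyMC and MCMC sampling; a full network meta-analysis adds a node per treatment and dose plus consistency constraints, which are omitted here. The event counts and arm sizes are placeholders, not trial data.

    ```python
    import numpy as np
    import pymc as pm

    with pm.Model():
        mu = pm.Normal("mu", 0.0, 10.0)          # log-odds of GIB on comparator
        delta = pm.Normal("delta", 0.0, 10.0)    # log odds ratio, NOAC vs comparator
        # Placeholder 2-arm data: bleeding events out of patients per arm.
        pm.Binomial("events_comp", n=3000, p=pm.math.invlogit(mu), observed=55)
        pm.Binomial("events_noac", n=3000, p=pm.math.invlogit(mu + delta), observed=40)
        trace = pm.sample(2000, tune=1000, chains=2, random_seed=0)

    # Posterior odds ratio for the NOAC arm relative to the comparator.
    or_samples = np.exp(trace.posterior["delta"].values.ravel())
    print(np.percentile(or_samples, [2.5, 50, 97.5]))
    ```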

  7. A method to determine the protection zone of chemical industrial park considering air quality, health risk and environmental risk: a case study.

    PubMed

    Shi, Jingang; Zhang, Mingbo; Li, Dong; Liu, Jia

    2018-04-01

    In China, chemical enterprises are required to cluster into a large number of chemical industrial parks (CIPs), which increases risks and threats to the environment and to human health because of the aggregation of complex chemical processes and large unit scales. Setting a scientific and reasonable protection zone around a CIP is an efficient way to protect the health of the surrounding population. A method was designed to determine the comprehensive protection zone of a CIP, taking into account multiple factors: air quality, health risk and environmental risk. By establishing a comprehensive, multi-level index system, the protection zone and the corresponding environmental risk management countermeasures can be proposed hierarchically, which is very important to the development and environmental risk management of a CIP. A CIP located in the coastal area of Shandong Province was studied, and the results showed that the proposed method for determining the protection zone, considering air quality, health risk and environmental risk, has clear advantages over other methods.

  8. Multidimensional analysis of the effect of occupational exposure to organic solvents on lung cancer risk: the ICARE study

    PubMed Central

    Mattei, Francesca; Liverani, Silvia; Guida, Florence; Matrat, Mireille; Cenée, Sylvie; Azizi, Lamiae; Menvielle, Gwenn; Sanchez, Marie; Pilorget, Corinne; Lapôtre-Ledoux, Bénédicte; Luce, Danièle; Richardson, Sylvia; Stücker, Isabelle

    2016-01-01

    Background The association between lung cancer and occupational exposure to organic solvents is still debated. Since different solvents are often used simultaneously, it is difficult to assess the role of individual substances. Objectives The present study focuses on an in-depth investigation of the potential association between lung cancer risk and occupational exposure to a large group of organic solvents, taking into account the well-known risk factors for lung cancer, tobacco smoking and occupational exposure to asbestos. Methods We analysed data from the Investigation of occupational and environmental causes of respiratory cancers (ICARE) study, a large French population-based case–control study set up between 2001 and 2007. A total of 2276 male cases and 2780 male controls were interviewed, and lifelong occupational history was collected. In order to overcome the analytical difficulties created by multiple correlated exposures, we carried out a novel type of analysis based on Bayesian profile regression. Results After analysis with conventional logistic regression methods, none of the 11 solvents examined were associated with lung cancer risk. Through a profile regression approach, we did not observe any significant association between solvent exposure and lung cancer. However, we identified high-risk clusters related to occupations known to be at risk of developing lung cancer, such as painters. Conclusions Organic solvents do not appear to be substantial contributors to the occupational risk of lung cancer for the occupations known to be at risk. PMID:26911986

  9. Walking the line: Understanding pedestrian behaviour and risk at rail level crossings with cognitive work analysis.

    PubMed

    Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A

    2016-03-01

    Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. Risk assessment for juvenile justice: a meta-analysis.

    PubMed

    Schwalbe, Craig S

    2007-10-01

    Risk assessment instruments are increasingly employed by juvenile justice settings to estimate the likelihood of recidivism among delinquent juveniles. In concert with their increased use, validation studies documenting their predictive validity have increased in number. The purpose of this study was to assess the average predictive validity of juvenile justice risk assessment instruments and to identify risk assessment characteristics that are associated with higher predictive validity. A search of the published and grey literature yielded 28 studies that estimated the predictive validity of 28 risk assessment instruments. Findings of the meta-analysis were consistent with effect sizes obtained in larger meta-analyses of criminal justice risk assessment instruments and showed that brief risk assessment instruments had smaller effect sizes than other types of instruments. However, this finding is tentative owing to limitations of the literature.

  11. Risk analysis for dry snow slab avalanche release by skier triggering

    NASA Astrophysics Data System (ADS)

    McClung, David

    2013-04-01

    Risk analysis is of primary importance for skier triggering of avalanches since human triggering is responsible for about 90% of deaths from slab avalanches in Europe and North America. Two key measurable quantities about dry slab avalanche release prior to initiation are the depth to the weak layer (D) and the slope angle. Both are important in risk analysis. As the slope angle increases, the probability of avalanche release increases dramatically. As the slab depth increases, the consequences increase if an avalanche releases. Among the simplest risk definitions is (Vick, 2002): Risk = (Probability of failure) x (Consequences of failure). Here, these two components of risk are the probability or chance of avalanche release and the consequences given avalanche release. In this paper, for the first time, skier-triggered avalanches were analyzed from probability theory and its relation to risk for both D and the slope angle. The data consisted of two quantities, slope angle and D, taken from avalanche fracture line profiles after an avalanche has taken place. Two data sets from accidentally skier-triggered avalanches were considered: (1) 718 values of slope angle and (2) a set of 1242 values of D, which represent average values along the fracture line. The values of D were both estimated (about 2/3) and measured (about 1/3) by ski guides from Canadian Mountain Holidays (CMH). I also analyzed 1231 accidentally skier-triggered avalanches reported by CMH ski guides for avalanche size (representing destructive potential) on the Canadian scale. The size analysis provided a second analysis of consequences to verify the consequence analysis based on D. The results showed that there is an intermediate range of both D and slope angle with highest risk. For D, the risk (product of consequences and probability of occurrence) is highest in the approximate range 0.6 m - 1.0 m. The consequences are low for lower values of D and the chance of release is low for higher values of D. Thus, the highest product is in the intermediate range. For slope angles
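
    A small numerical illustration of the risk definition the abstract uses, Risk = (probability of release) x (consequence given release), evaluated over slab depth D. The two curves are made-up stand-ins chosen only so that triggering probability falls and consequence rises with depth; they are not the paper's fitted distributions.

    ```python
    import numpy as np

    D = np.linspace(0.2, 2.0, 10)                 # slab depth (m)
    p_release = np.exp(-1.5 * (D - 0.2))          # hypothetical: deeper slabs are harder to trigger
    consequence = 1 - np.exp(-2.0 * D)            # hypothetical: deeper slabs have worse outcomes

    risk = p_release * consequence                # Risk = probability x consequence
    for depth, r in zip(D, risk):
        print(f"D = {depth:.2f} m  relative risk = {r:.3f}")
    print("highest relative risk near D =", round(float(D[np.argmax(risk)]), 2), "m")
    # The product peaks at an intermediate depth: shallow slabs have small consequences,
    # deep slabs a small chance of release, which mirrors the qualitative finding above.
    ```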

  12. Risk-based planning analysis for a single levee

    NASA Astrophysics Data System (ADS)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
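
    A minimal sketch of the optimization described, under stated assumptions: annual expected total cost = (probability of failure) x (damage potential) + annualized construction cost, minimized by grid search over levee height and crown width. The overtopping and through-seepage fragility functions, the cost model, and all numbers are hypothetical.

    ```python
    import numpy as np

    heights = np.linspace(2.0, 8.0, 25)        # candidate levee heights (m)
    widths = np.linspace(3.0, 12.0, 25)        # candidate crown widths (m)
    damage_potential = 5e7                     # $ of damage if the levee fails (assumed)
    unit_cost = 2e4                            # $ per m height per m width, annualized (assumed)

    def p_overtop(h):
        # Hypothetical annual overtopping probability, decreasing with height.
        return np.exp(-h / 1.5)

    def p_seepage(w):
        # Hypothetical annual through-seepage failure probability from a fragility
        # curve, decreasing with crown width.
        return 0.05 * np.exp(-w / 4.0)

    best = None
    for h in heights:
        for w in widths:
            p_fail = 1 - (1 - p_overtop(h)) * (1 - p_seepage(w))   # either failure mode
            expected_cost = p_fail * damage_potential + unit_cost * h * w
            if best is None or expected_cost < best[0]:
                best = (expected_cost, h, w, p_overtop(h), p_seepage(w))

    cost, h, w, po, ps = best
    print(f"optimal height {h:.1f} m, crown width {w:.1f} m, annual expected cost ${cost:,.0f}")
    print(f"residual annual probabilities: overtopping {po:.3f}, through-seepage {ps:.4f}")
    ```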

  13. Body mass index and risk of BPH: a meta-analysis.

    PubMed

    Wang, S; Mao, Q; Lin, Y; Wu, J; Wang, X; Zheng, X; Xie, L

    2012-09-01

    Epidemiological studies have reported conflicting results relating obesity to BPH. A meta-analysis of cohort and case-control studies was conducted to pool the risk estimates of the association between obesity and BPH. Eligible studies were retrieved by both computer searches and review of references. We analyzed abstracted data with random effects models to obtain the summary risk estimates. Dose-response meta-analysis was performed for studies reporting categorical risk estimates for a series of exposure levels. A total of 19 studies met the inclusion criteria of the meta-analysis. A positive association with body mass index (BMI) was observed in the combined BPH and lower urinary tract symptoms (LUTS) group (odds ratio=1.27, 95% confidence intervals 1.05-1.53). In subgroup analysis, BMI exhibited a positive dose-response relationship with BPH/LUTS in population-based case-control studies, and a marginal positive association was observed between risk of BPH and increased BMI. However, no association between BPH/LUTS and BMI was observed in other subgroups stratified by study design, geographical region or primary outcome. Overall, the current literature suggests that BMI is associated with an increased risk of BPH. Further efforts should be made to confirm these findings and clarify the underlying biological mechanisms.

  14. Adversarial risk analysis with incomplete information: a level-k approach.

    PubMed

    Rothschild, Casey; McLay, Laura; Guikema, Seth

    2012-07-01

    This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
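
    A toy sketch of level-k reasoning in a defend-attack setting of the kind the article studies: two countermeasures, two attack options, and partial revelation of the defender's move. The payoff matrix, the revelation probability, the level-0 behaviors, and the zero-sum assumption are all illustrative choices, not the article's model.

    ```python
    import numpy as np

    # Hypothetical defend-attack game: the defender picks countermeasure d in {0, 1},
    # the attacker picks attack a in {0, 1}. With probability `reveal` the attacker
    # observes d before attacking (partial revelation of the early move).
    reveal = 0.6
    # loss[d, a]: defender's loss when defense d faces attack a (illustrative numbers);
    # the attacker's gain is assumed equal to the defender's loss.
    loss = np.array([[4.0, 1.0],
                     [2.0, 3.0]])

    def defender_level_k(k):
        """Defense chosen by a level-k defender facing a level-(k-1) attacker."""
        if k == 0:
            return 0                          # level-0 defender: fixed default choice (assumption)
        atk = attacker_level_k(k - 1)
        expected = []
        for d in (0, 1):
            seen = loss[d].max()              # if revealed, the attacker best-responds to d
            unseen = loss[d] @ atk            # otherwise the attacker plays its level-(k-1) mix
            expected.append(reveal * seen + (1 - reveal) * unseen)
        return int(np.argmin(expected))

    def attacker_level_k(k):
        """Attack distribution of a level-k attacker who has not observed the defense."""
        if k == 0:
            return np.array([0.5, 0.5])       # level-0 attacker: uniform attack (assumption)
        d_guess = defender_level_k(k - 1)     # best-respond to the level-(k-1) defender
        dist = np.zeros(2)
        dist[int(np.argmax(loss[d_guess]))] = 1.0
        return dist

    for k in (1, 2, 3):
        print(f"level-{k} defender chooses countermeasure {defender_level_k(k)}")
    ```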

  15. Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.

    PubMed

    Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian

    2011-01-01

    Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
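
    A simplified stand-in for the uncertainty handling the abstract describes: interval-valued basic-event likelihoods propagated through fault-tree gates. The gate formulas below assume independent events and plain interval arithmetic; the paper's fuzzy-set, evidence-theory, and dependency-coefficient treatments are not reproduced, and all numbers are hypothetical.

    ```python
    from math import prod

    # Interval-valued basic-event likelihoods (all numbers hypothetical).
    pump_fails   = (0.010, 0.030)
    valve_sticks = (0.005, 0.020)
    alarm_fails  = (0.001, 0.004)

    def and_gate(*events):
        """P(all occur) under independence; interval bounds multiply."""
        return prod(lo for lo, _ in events), prod(hi for _, hi in events)

    def or_gate(*events):
        """P(at least one occurs) under independence: 1 - prod(1 - p)."""
        return (1 - prod(1 - lo for lo, _ in events),
                1 - prod(1 - hi for _, hi in events))

    release = or_gate(pump_fails, valve_sticks)      # intermediate event
    top_event = and_gate(release, alarm_fails)       # top event also needs the alarm to fail
    print("release likelihood interval:", tuple(round(x, 5) for x in release))
    print("top-event likelihood interval:", tuple(round(x, 6) for x in top_event))
    ```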

  16. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  17. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Methods of analysis. 2.19 Section 2.19 Food and... ADMINISTRATIVE RULINGS AND DECISIONS General Provisions § 2.19 Methods of analysis. Where the method of analysis... enforcement programs to utilize the methods of analysis of the AOAC INTERNATIONAL (AOAC) as published in the...

  18. Multi-criteria decision analysis and environmental risk assessment for nanomaterials

    NASA Astrophysics Data System (ADS)

    Linkov, Igor; Satterstrom, F. Kyle; Steevens, Jeffery; Ferguson, Elizabeth; Pleus, Richard C.

    2007-08-01

    Nanotechnology is a broad and complex discipline that holds great promise for innovations that can benefit mankind. Yet, one must not overlook the wide array of factors involved in managing nanomaterial development, ranging from the technical specifications of the material to possible adverse effects in humans. Other opportunities to evaluate benefits and risks are inherent in environmental health and safety (EHS) issues related to nanotechnology. However, there is currently no structured approach for making justifiable and transparent decisions with explicit trade-offs between the many factors that need to be taken into account. While many possible decision-making approaches exist, we believe that multi-criteria decision analysis (MCDA) is a powerful and scientifically sound decision analytical framework for nanomaterial risk assessment and management. This paper combines state-of-the-art research in MCDA methods applicable to nanotechnology with a hypothetical case study for nanomaterial management. The example shows how MCDA application can balance societal benefits against unintended side effects and risks, and how it can also bring together multiple lines of evidence to estimate the likely toxicity and risk of nanomaterials given limited information on physical and chemical properties. The essential contribution of MCDA is to link this performance information with decision criteria and weightings elicited from scientists and managers, allowing visualization and quantification of the trade-offs involved in the decision-making process.
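
    A minimal weighted-sum MCDA sketch under stated assumptions: hypothetical nanomaterial alternatives scored on a 0-1 scale (higher is better) for criteria and weights of the kind the paper says are elicited from scientists and managers. Real MCDA applications often use richer methods (outranking, multi-attribute utility), so this only illustrates the linking of performance information to weighted decision criteria.

    ```python
    import numpy as np

    alternatives = ["nanomaterial A", "nanomaterial B", "nanomaterial C"]
    criteria = ["societal benefit", "toxicity evidence", "exposure potential", "cost"]
    weights = np.array([0.35, 0.30, 0.20, 0.15])     # elicited weights (assumed), summing to 1

    # Performance matrix: rows = alternatives, columns = criteria (illustrative scores).
    scores = np.array([[0.8, 0.6, 0.5, 0.4],
                       [0.6, 0.9, 0.7, 0.5],
                       [0.9, 0.4, 0.3, 0.8]])

    overall = scores @ weights
    for i in np.argsort(overall)[::-1]:
        print(f"{alternatives[i]}: weighted score {overall[i]:.2f}")
    ```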

  19. Soy and isoflavone consumption and risk of gastrointestinal cancer: a systematic review and meta-analysis.

    PubMed

    Tse, Genevieve; Eslick, Guy D

    2016-02-01

    Evidence suggests that soy foods have chemoprotective properties that may reduce the risk of certain cancers such as breast and prostate cancer. However, data involving gastrointestinal (GI) cancers have been limited, and the evidence remains controversial. This study aims to determine the potential relationship between dietary soy intake and GI cancer risk, with an evaluation of the effects of isoflavone as an active soy constituent. Relevant studies were identified after a literature search via electronic databases through May 2014. Subgroup analysis for isoflavone intake (n = 10 studies) was performed. Covariates including gender, anatomical subsite and preparation method were also evaluated. Pooled adjusted odds ratios (ORs) comparing the highest and lowest categories of dietary pattern scores were calculated using a random effects model. Twenty-two case-control and 18 cohort studies were included for meta-analysis, containing a total of 633,476 participants and 13,639 GI cancer cases. The combined OR was 0.93 (95% CI 0.87-0.99; p value heterogeneity = 0.01), showing only a slight decrease in risk; the association was stronger for colon cancer (OR 0.92; 95% CI 0.96-0.99; p value heterogeneity = 0.163) and colorectal cancer (CRC) (OR 0.92; 95% CI 0.87-0.97; p value heterogeneity = 0.3). Subgroup analysis for isoflavone intake showed a statistically significant risk reduction, with a risk estimate of 0.73 (95% CI 0.59-0.92; p value heterogeneity = 0), particularly for CRC (OR 0.76; 95% CI 0.59-0.98; p value heterogeneity = 0). This study provides evidence that soy intake as a food group is associated with only a small reduction in GI cancer risk. Separate analysis of dietary isoflavone intake suggests a stronger inverse association.

  20. Impact of having a high-risk pregnancy on future postpartum contraceptive method choice.

    PubMed

    Kiykac Altinbas, Sadiman; Bayoglu Tekin, Yesim; Dilbaz, Berna; Kilic, Selim; Khalil, Susan S; Kandemir, Omer

    2014-12-01

    To compare knowledge and preference of preconceptional contraception with future postpartum contraceptive method choice in high-risk pregnancies: does a high-risk pregnancy condition affect future postpartum contraceptive method choice? Women hospitalised at the High Risk Pregnancy unit of a tertiary research and training hospital were asked to complete a self-reported questionnaire that included demographic characteristics, presence of unintended pregnancy, contraceptive method of choice before the current pregnancy, plans for contraceptive use following delivery and requests for any contraceptive counselling in the postpartum period. A total of 655 pregnant women were recruited. The mean age, gravidity and parity of the women were 27.48 ± 6.25 years, 2.81 ± 2.15 and 1.40 ± 1.77, respectively. High-risk pregnancy indications included 207 (31.6%) maternal, 396 (60.5%) foetal and 52 (7.9%) uterine factors. All postpartum contraceptive choices except for combined oral contraceptive (COC) use were significantly different from preconceptional contraceptive preferences (p<0.001). High-risk pregnancy indications, future child bearing, ideal number of children, income and education levels were the most important factors influencing postpartum contraceptive choices. While the leading contraceptive methods in the postpartum period were long-acting reversible contraceptives (the non-hormonal copper intrauterine device (Cu-IUD) and the levonorgestrel-releasing intrauterine system (LNG-IUS); 40%), the least preferred method was COC use (5.2%), and preference for COC use showed no difference between the preconceptional and postpartum periods (p=0.202). Overall, 73.7% of the women wanted to receive contraceptive counselling before their discharge. A high-risk pregnancy condition may change opinions and preferences regarding contraceptive use, and also seems to affect awareness of family planning methods. Copyright © 2014 Australian College of Midwives. All rights reserved.

  1. Examining the Nature of the Association Between Attention-Deficit/Hyperactivity Disorder and Nicotine Dependence: A Familial Risk Analysis

    PubMed Central

    Biederman, Joseph; Petty, Carter R.; Hammerness, Paul; Woodworth, K. Yvonne; Faraone, Stephen V.

    2013-01-01

    Objective The main aim of this study was to use familial risk analysis to examine the association between attention-deficit/hyperactivity disorder (ADHD) and nicotine dependence. Methods Subjects were children with (n = 257) and without (n = 229) ADHD of both sexes, ascertained from pediatric and psychiatric referral sources, and their first-degree relatives (N = 1627). Results Nicotine dependence in probands increased the risk for nicotine dependence in relatives irrespective of ADHD status. There was no evidence of cosegregation or assortative mating between these disorders. Patterns of familial risk analysis suggest that the association between ADHD and nicotine dependence is most consistent with the hypothesis of independent transmission of these disorders. Conclusions These findings may have important implications for the identification of a subgroup of children with ADHD at high risk for nicotine dependence based on parental history of nicotine dependence. PMID:23461889

  2. Tools and Methods for Risk Management in Multi-Site Engineering Projects

    NASA Astrophysics Data System (ADS)

    Zhou, Mingwei; Nemes, Laszlo; Reidsema, Carl; Ahmed, Ammar; Kayis, Berman

    In today's highly global business environment, engineering and manufacturing projects often involve two or more geographically dispersed units or departments, research centers or companies. This paper attempts to identify the requirements for risk management in a multi-site engineering project environment, and presents a review of the state-of-the-art tools and methods that can be used to manage risks in multi-site engineering projects. This leads to the development of a risk management roadmap, which will underpin the design and implementation of an intelligent risk mapping system.

  3. Impact of Domain Analysis on Reuse Methods

    DTIC Science & Technology

    1989-11-06

    return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increases the risks of a...importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or...are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality

  4. Celiac disease and the risk of kidney diseases: A systematic review and meta-analysis.

    PubMed

    Wijarnpreecha, Karn; Thongprayoon, Charat; Panjawatanan, Panadeekarn; Thamcharoen, Natanong; Pachariyanon, Pavida; Nakkala, Kiran; Cheungpasitporn, Wisit

    2016-12-01

    Previous epidemiologic studies attempting to demonstrate the risk of kidney diseases among patients with celiac disease (CD) have yielded inconsistent results. This meta-analysis was conducted with the aim of summarizing all available evidence. A literature search was performed using MEDLINE and EMBASE from inception to May 2016. Studies that provided relative risks, odds ratios, or hazard ratios examining the risk of kidney diseases among patients with CD versus individuals without CD were included. Pooled risk ratios (RR) and 95% confidence intervals (CI) were calculated using a random-effect, generic inverse variance method. Eight studies met our eligibility criteria and were included in our analysis. The pooled RR of overall kidney diseases in patients with CD was 2.01 (95% CI, 1.44-2.81, I² = 76%). The pooled RR of end-stage renal disease in patients with CD was 2.57 (95% CI, 2.03-3.24). Subgroup analyses showed that significant risks were increased for diabetic nephropathy (pooled RR of 1.49, 95% CI, 1.09-2.02) and IgA nephropathy (pooled RR of 2.62, 95% CI, 1.27-5.42) in patients with CD. Our study demonstrates a significantly increased risk of kidney diseases among patients with CD. These findings may influence clinical management and primary prevention of kidney diseases in patients with CD. Copyright © 2016 Editrice Gastroenterologica Italiana S.r.l. Published by Elsevier Ltd. All rights reserved.
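
    A minimal sketch of the random-effect, generic inverse variance pooling named in the abstract, using the DerSimonian-Laird estimate of between-study variance. The per-study log risk ratios and standard errors below are hypothetical, not the meta-analysis data.

    ```python
    import numpy as np

    # Hypothetical per-study estimates: log risk ratios and their standard errors.
    log_rr = np.array([0.59, 0.83, 0.47, 0.95, 0.70])
    se = np.array([0.25, 0.30, 0.20, 0.40, 0.35])

    # Fixed-effect (inverse-variance) pooling, used to compute heterogeneity Q.
    w_fixed = 1 / se**2
    pooled_fixed = np.sum(w_fixed * log_rr) / np.sum(w_fixed)
    q = np.sum(w_fixed * (log_rr - pooled_fixed) ** 2)
    df = len(log_rr) - 1

    # DerSimonian-Laird between-study variance tau^2, then random-effects weights.
    c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
    tau2 = max(0.0, (q - df) / c)
    w_rand = 1 / (se**2 + tau2)

    pooled = np.sum(w_rand * log_rr) / np.sum(w_rand)
    se_pooled = np.sqrt(1 / np.sum(w_rand))
    rr = np.exp(pooled)
    ci = np.exp([pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled])
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    print(f"pooled RR {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 = {i2:.0f}%")
    ```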

  5. Control of Risks Through the Use of Procedures: A Method for Evaluating the Change in Risk

    NASA Technical Reports Server (NTRS)

    Praino, Gregory T.; Sharit, Joseph

    2010-01-01

    This paper considers how procedures can be used to control risks faced by an organization and proposes a means of recognizing if a particular procedure reduces risk or contributes to the organization's exposure. The proposed method was developed out of the review of work documents and the governing procedures performed in the wake of the Columbia accident by NASA and the Space Shuttle prime contractor, United Space Alliance, LLC. A technique was needed to understand the rules, or procedural controls, in place at the time in the context of how important the role of each rule was. The proposed method assesses procedural risks, the residual risk associated with a hazard after a procedure's influence is accounted for, by considering each clause of a procedure as a unique procedural control that may be beneficial or harmful. For procedural risks with consequences severe enough to threaten the survival of the organization, the method measures the characteristics of each risk on a scale that is an alternative to the traditional consequence/likelihood couple. The dual benefits of the substitute scales are that they eliminate both the need to quantify a relationship between different consequence types and the need for the extensive history a probabilistic risk assessment would require. Control Value is used as an analog for the consequence, where the value of a rule is based on how well the control reduces the severity of the consequence when operating successfully. This value is composed of two parts: the inevitability of the consequence in the absence of the control, and the opportunity to intervene before the consequence is realized. High value controls will be ones where there is minimal need for intervention but maximum opportunity to actively prevent the outcome. Failure Likelihood is used as the substitute for the conventional likelihood of the outcome. For procedural controls, a failure is considered to be any non-malicious violation of the rule, whether intended or

  6. A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.

    PubMed

    Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet

    2018-01-01

    Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR) which are different than benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  7. Command Process Modeling & Risk Analysis

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  8. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Analysis. 2.71 Section 2.71 Agriculture Office of the Secretary of Agriculture DELEGATIONS OF AUTHORITY BY... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations..., Office of Risk Assessment and Cost-Benefit Analysis: (1) Responsible for assessing the risks to human...

  9. 7 CFR 2.71 - Director, Office of Risk Assessment and Cost-Benefit Analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Analysis. 2.71 Section 2.71 Agriculture Office of the Secretary of Agriculture DELEGATIONS OF AUTHORITY BY... Chief Economist § 2.71 Director, Office of Risk Assessment and Cost-Benefit Analysis. (a) Delegations..., Office of Risk Assessment and Cost-Benefit Analysis: (1) Responsible for assessing the risks to human...

  10. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent 7.0 M earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. In particular in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally lagged behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling, one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability correlates with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype for a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high earthquake risk, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map, and only contemplates resident population from the census as a proxy for human exposure. In the present work we map and analyze the spatio-temporal distribution of

  11. Is adaptation or transformation needed? Active nanomaterials and risk analysis

    NASA Astrophysics Data System (ADS)

    Kuzma, Jennifer; Roberts, John Patrick

    2016-07-01

    Nanotechnology has been a key area of funding and policy for the United States and globally for the past two decades. Since nanotechnology research and development became a focus and nanoproducts began to permeate the market, scholars and scientists have been concerned about how to assess the risks that they may pose to human health and the environment. The newest generation of nanomaterials includes biomolecules that can respond to and influence their environments, and there is a need to explore whether and how existing risk-analysis frameworks are challenged by such novelty. To fill this niche, we used a modified approach of upstream oversight assessment (UOA), a subset of anticipatory governance. We first selected case studies of "active nanomaterials," that are early in research and development and designed for use in multiple sectors, and then considered them under several, key risk-analysis frameworks. We found two ways in which the cases challenge the frameworks. The first category relates to how to assess risk under a narrow framing of the term (direct health and environmental harm), and the second involves the definition of what constitutes a "risk" worthy of assessment and consideration in decision making. In light of these challenges, we propose some changes for risk analysis in the face of active nanostructures in order to improve risk governance.

  12. Association between dietary vitamin C intake and risk of esophageal cancer: A dose-response meta-analysis.

    PubMed

    Bo, Yacong; Lu, Yan; Zhao, Yan; Zhao, Erjiang; Yuan, Ling; Lu, Weiquan; Cui, Lingling; Lu, Quanjun

    2016-04-15

    While several epidemiological studies have investigated the association between vitamin C and risk of esophageal cancer, the results remain inconsistent. In the present study, a meta-analysis was conducted to assess the impact of dietary vitamin C intake on esophageal cancer risk. Online databases were searched up to March 29, 2015, for studies on the association between dietary vitamin C intake and esophageal cancer risk. Pooled risk ratios (RRs) or odds ratios (ORs) and 95% confidence intervals (CIs) were calculated using a random-effects model. Dose-response analyses were performed using the method of restricted cubic splines with four knots at percentiles of 5, 35, 65 and 95% of the distribution. Publication bias was estimated using Egger's tests and funnel plots. In all, 15 articles comprising 20 studies were included in this meta-analysis, containing 7063 controls and 3955 cases of esophageal cancer. Comparing the highest vs. the lowest categories of vitamin C intake, we found that vitamin C was inversely associated with the risk of esophageal cancer (overall OR = 0.58, 95% CI = 0.49-0.68, I² = 56%). A linear dose-response relationship was found. With an increase in dietary vitamin C intake of 50 mg/day, the risk of esophageal cancer decreased by a statistically significant 13% (OR = 0.87, 95% CI = 0.80-0.93, p(linearity) = 0.0002). In conclusion, our analysis suggests that a higher intake of dietary vitamin C might have a protective effect against esophageal cancer. © 2015 UICC.
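
    A small worked check of how the reported per-increment estimate rescales, assuming the log-linear dose-response relationship stated in the abstract; the per-50 mg/day OR of 0.87 is taken from the abstract, and the other increments are illustrative.

    ```python
    import numpy as np

    # Assuming log-linearity, an OR of 0.87 per 50 mg/day rescales multiplicatively.
    or_per_50mg = 0.87
    for increment in (50, 100, 150):
        or_inc = np.exp(np.log(or_per_50mg) * increment / 50)
        print(f"OR per {increment} mg/day of vitamin C ~ {or_inc:.2f}")
    ```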

  13. Comparing 2 Adhesive Methods on Skin Integrity in the High-Risk Neonate.

    PubMed

    Boswell, Nicole; Waker, Cheryl L

    2016-12-01

    Nurses have a primary role in promoting neonatal skin integrity and in the skin care management of the critically ill neonate. Adhesive products are essential to secure needed medical devices but can be a significant factor contributing to skin breakdown. Current literature does not offer a definitive answer regarding which products most safely and effectively secure needed devices in the high-risk neonatal population. The objective was to determine which adhesive method best and most safely secures lines and tubes in the high-risk neonate population. The only significant main effect was age group on mean skin scores. Subjects in the younger group (24-28 weeks) had higher skin scores than those in the older group (28-34 weeks), validating that younger gestations are at higher risk of breakdown with the use of adhesives. The findings did not clearly identify which product was superior for securing tubes and lines, or was the least injurious to the skin of the high-risk neonate. Neither the transparent-dressing-only method nor the transparent-dressing-over-hydrocolloid method clearly demonstrated an advantage in the high-risk, preterm neonate. Anecdotal comments suggested staff preferred the transparent dressing over hydrocolloid method as providing better adhesion while protecting skin integrity. The findings validated that younger gestations are at higher risk of breakdown with the use of adhesives and therefore require close vigilance to maintain skin integrity.

  14. Review of methods to prevent and reduce the risk of Lyme disease.

    PubMed

    Lindsay, L R; Ogden, N H; Schofield, S W

    2015-06-04

    Cases of Lyme disease and areas with self-sustaining populations of vector ticks are increasing in Canada. This trend is expected to continue. Preventing Lyme disease will therefore become relevant to an increasing number of Canadians. To summarize methods for reducing the risk of tick bites and preventing transmission once a tick is feeding. A literature search was conducted to identify methods to reduce the risk of tick bites and the abundance of vector ticks, as well as the risk of becoming infected with the Lyme disease pathogen, Borrelia burgdorferi (BB), if bitten by a vector tick. Current approaches to reducing the risk of tick bites or preventing infection with BB once bitten are largely reliant on the individual. They include use of topical repellents, use of protective clothing, avoidance of risk areas and removing ticks soon (ideally within a day) after they attach. These methods are efficacious, but constrained by user adherence. Other approaches, such as landscape modification or the use of acaricides to control ticks, have shown promise in other countries, but have not been widely adopted in Canada. Lyme disease will continue to present a threat in Canada. In addition to the existing interventions for prevention of tick bites and Lyme disease, there is a need for new tools to help reduce the risk of Lyme disease to Canadians.

  15. Advanced statistical methods for improved data analysis of NASA astrophysics missions

    NASA Technical Reports Server (NTRS)

    Feigelson, Eric D.

    1992-01-01

    The investigators under this grant studied ways to improve the statistical analysis of astronomical data. They looked at existing techniques, the development of new techniques, and the production and distribution of specialized software to the astronomical community. Abstracts of nine papers that were produced are included, as well as brief descriptions of four software packages. The articles that are abstracted discuss analytical and Monte Carlo comparisons of six different linear least squares fits, a (second) paper on linear regression in astronomy, two reviews of public domain software for the astronomer, subsample and half-sample methods for estimating sampling distributions, a nonparametric estimation of survival functions under dependent competing risks, censoring in astronomical data due to nondetections, an astronomy survival analysis computer package called ASURV, and improving the statistical methodology of astronomical data analysis.

  16. Coffee Consumption and Risk of Biliary Tract Cancers and Liver Cancer: A Dose–Response Meta-Analysis of Prospective Cohort Studies

    PubMed Central

    Micek, Agnieszka; Marranzano, Marina; Ray, Sumantra

    2017-01-01

    Background: A meta-analysis was conducted to summarize the evidence from prospective cohort and case-control studies regarding the association between coffee intake and biliary tract cancer (BTC) and liver cancer risk. Methods: Eligible studies were identified by searches of PubMed and EMBASE databases from the earliest available online indexing year to March 2017. The dose–response relationship was assessed by a restricted cubic spline model and multivariate random-effect meta-regression. A stratified and subgroup analysis by smoking status and hepatitis was performed to identify potential confounding factors. Results: We identified five studies on BTC risk and 13 on liver cancer risk eligible for meta-analysis. A linear dose–response meta-analysis did not show a significant association between coffee consumption and BTC risk. However, there was evidence of inverse correlation between coffee consumption and liver cancer risk. The association was consistent throughout the various potential confounding factors explored including smoking status, hepatitis, etc. Increasing coffee consumption by one cup per day was associated with a 15% reduction in liver cancer risk (RR 0.85; 95% CI 0.82 to 0.88). Conclusions: The findings suggest that increased coffee consumption is associated with decreased risk of liver cancer, but not BTC. PMID:28846640

  17. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    NASA Technical Reports Server (NTRS)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight Optimization System program) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  18. Risk Evaluation of Bogie System Based on Extension Theory and Entropy Weight Method

    PubMed Central

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly. PMID:25574159

  19. Risk evaluation of bogie system based on extension theory and entropy weight method.

    PubMed

    Du, Yanping; Zhang, Yuan; Zhao, Xiaogang; Wang, Xiaohui

    2014-01-01

    A bogie system is the key equipment of railway vehicles. Rigorous practical evaluation of bogies is still a challenge. Presently, there is overreliance on part-specific experiments in practice. In the present work, a risk evaluation index system of a bogie system has been established based on the inspection data and experts' evaluation. Then, considering quantitative and qualitative aspects, the risk state of a bogie system has been evaluated using an extension theory and an entropy weight method. Finally, the method has been used to assess the bogie system of four different samples. Results show that this method can assess the risk state of a bogie system exactly.
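
    A minimal sketch of the entropy weight method named in these two records, under stated assumptions: a hypothetical decision matrix with rows as bogie samples and columns as risk evaluation indexes (larger value taken as worse condition here). Indexes that vary more across samples carry more information and therefore receive larger weights; the extension-theory matter-element evaluation in the paper is not reproduced.

    ```python
    import numpy as np

    # Hypothetical decision matrix: rows = bogie samples, columns = risk indexes.
    x = np.array([[0.20, 0.35, 0.10, 0.50],
                  [0.60, 0.30, 0.20, 0.40],
                  [0.40, 0.55, 0.15, 0.45],
                  [0.80, 0.25, 0.30, 0.35]])

    # Normalize each column so its entries sum to 1.
    p = x / x.sum(axis=0)

    # Shannon entropy of each index; k = 1/ln(n) keeps entropies in [0, 1].
    n = x.shape[0]
    k = 1.0 / np.log(n)
    e = -k * np.sum(p * np.log(p), axis=0)

    # Degree of divergence and the resulting entropy weights.
    d = 1 - e
    w = d / d.sum()
    print("entropy weights per index:", np.round(w, 3))

    # A simple weighted risk score per sample (illustrative aggregation only).
    print("weighted risk scores:", np.round(x @ w, 3))
    ```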

  20. NASA's Agency-Wide Strategy for Environmental Regulatory Risk Analysis and Communication

    NASA Technical Reports Server (NTRS)

    Scroggins, Sharon; Duda, Kristen

    2008-01-01

    This viewgraph presentation gives an overview of NASA's risk analysis communication programs associated with changing environmental policies. The topics include: 1) NASA Program Transition; 2) Principal Center for Regulatory Risk Analysis and Communication (RRAC PC); and 3) Regulatory Tracking and Communication Process.

  1. Risk Stratification Methods and Provision of Care Management Services in Comprehensive Primary Care Initiative Practices.

    PubMed

    Reddy, Ashok; Sessums, Laura; Gupta, Reshma; Jin, Janel; Day, Tim; Finke, Bruce; Bitton, Asaf

    2017-09-01

    Risk-stratified care management is essential to improving population health in primary care settings, but evidence is limited on the type of risk stratification method and its association with care management services. We describe risk stratification patterns and their association with care management services for primary care practices in the Comprehensive Primary Care (CPC) initiative. We undertook a qualitative approach to categorize risk stratification methods being used by CPC practices and tested whether these stratification methods were associated with delivery of care management services. CPC practices reported using 4 primary methods to stratify risk for their patient populations: a practice-developed algorithm (n = 215), the American Academy of Family Physicians' clinical algorithm (n = 155), payer claims and electronic health records (n = 62), and clinical intuition (n = 52). CPC practices using a practice-developed algorithm identified the largest number of high-risk patients per primary care physician (282 patients, P = .006). CPC practices using clinical intuition had the most high-risk patients in care management and the greatest proportion of high-risk patients receiving care management per primary care physician (91 patients and 48%, P =.036 and P =.128, respectively). CPC practices used 4 primary methods to identify high-risk patients. Although practices that developed their own algorithm identified the greatest number of high-risk patients, practices that used clinical intuition connected the greatest proportion of patients to care management services. © 2017 Annals of Family Medicine, Inc.

  2. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    NASA Technical Reports Server (NTRS)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as, hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend their PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.

  3. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help in actively dealing with climate change and ensuring food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated by the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. The maximum yield decrease and its probability were 3.5 and 64.6%, respectively, for the maximum temperature increase of 88.3%, and its risk was 2.2%. The maximum yield decrease and its probability were 14.1 and 56.1%, respectively, for the maximum precipitation decrease of 35.2%, and its risk was 7.9%. For the comprehensive impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will both increase accordingly, and the risk will also grow markedly.
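
    A brief worked check of the risk definition used in this record, risk = (degree of loss) x (probability of occurrence); the loss and probability figures are taken from the abstract, and small differences from the reported risk values (e.g. 2.26% vs 2.2%) reflect rounding of the inputs.

    ```python
    # Risk = degree of loss (%) x probability of occurrence (%), expressed as a percentage.
    cases = {
        "temperature only":   (3.5, 64.6),
        "precipitation only": (14.1, 56.1),
        "combined impacts":   (17.6, 53.4),
    }
    for name, (loss_pct, prob_pct) in cases.items():
        risk = loss_pct * prob_pct / 100.0
        print(f"{name}: {loss_pct}% loss x {prob_pct}% probability = {risk:.2f}% risk")
    ```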

  4. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management

    PubMed Central

    Huq, M. Saiful; Fraass, Benedick A.; Dunscombe, Peter B.; Gibbons, John P.; Mundt, Arno J.; Mutic, Sasa; Palta, Jatinder R.; Rath, Frank; Thomadsen, Bruce R.; Williamson, Jeffrey F.; Yorke, Ellen D.

    2016-01-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for “intensity modulated radiation therapy (IMRT)” as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation

  5. The report of Task Group 100 of the AAPM: Application of risk analysis methods to radiation therapy quality management.

    PubMed

    Huq, M Saiful; Fraass, Benedick A; Dunscombe, Peter B; Gibbons, John P; Ibbott, Geoffrey S; Mundt, Arno J; Mutic, Sasa; Palta, Jatinder R; Rath, Frank; Thomadsen, Bruce R; Williamson, Jeffrey F; Yorke, Ellen D

    2016-07-01

    The increasing complexity of modern radiation therapy planning and delivery challenges traditional prescriptive quality management (QM) methods, such as many of those included in guidelines published by organizations such as the AAPM, ASTRO, ACR, ESTRO, and IAEA. These prescriptive guidelines have traditionally focused on monitoring all aspects of the functional performance of radiotherapy (RT) equipment by comparing parameters against tolerances set at strict but achievable values. Many errors that occur in radiation oncology are not due to failures in devices and software; rather they are failures in workflow and process. A systematic understanding of the likelihood and clinical impact of possible failures throughout a course of radiotherapy is needed to direct limited QM resources efficiently to produce maximum safety and quality of patient care. Task Group 100 of the AAPM has taken a broad view of these issues and has developed a framework for designing QM activities, based on estimates of the probability of identified failures and their clinical outcome through the RT planning and delivery process. The Task Group has chosen a specific radiotherapy process required for "intensity modulated radiation therapy (IMRT)" as a case study. The goal of this work is to apply modern risk-based analysis techniques to this complex RT process in order to demonstrate to the RT community that such techniques may help identify more effective and efficient ways to enhance the safety and quality of our treatment processes. The task group generated by consensus an example quality management program strategy for the IMRT process performed at the institution of one of the authors. This report describes the methodology and nomenclature developed, presents the process maps, FMEAs, fault trees, and QM programs developed, and makes suggestions on how this information could be used in the clinic. The development and implementation of risk-assessment techniques will make radiation therapy

  6. Silent Brain Infarction and Risk of Future Stroke: A Systematic Review and Meta-Analysis

    PubMed Central

    Gupta, Ajay; Giambrone, Ashley E.; Gialdini, Gino; Finn, Caitlin; Delgado, Diana; Gutierrez, Jose; Wright, Clinton; Beiser, Alexa S.; Seshadri, Sudha; Pandya, Ankur; Kamel, Hooman

    2016-01-01

    Background and Purpose Silent brain infarction (SBI) on magnetic resonance imaging (MRI) has been proposed as a subclinical risk marker for future symptomatic stroke. We performed a systematic review and meta-analysis to summarize the association between MRI-defined SBI and future stroke risk. Methods We searched the medical literature to identify cohort studies involving adults with MRI detection of SBI who were subsequently followed for incident clinically defined stroke. Study data and quality assessment were recorded in duplicate, with disagreements in data extraction resolved by a third reader. The strength of association between MRI-detected SBI and future symptomatic stroke was measured by a hazard ratio (HR). Results The meta-analysis included 13 studies (14,764 subjects) with a mean follow-up ranging from 25.7 to 174 months. SBI predicted the occurrence of stroke with a random effects crude relative risk of 2.94 (95% CI 2.24–3.86, P<0.001; Q=39.65, P<0.001). In the eight studies of 10,427 subjects providing HRs adjusted for cardiovascular risk factors, SBI was an independent predictor of incident stroke (HR 2.08 [95% CI 1.69–2.56, P<0.001]; Q=8.99, P=0.25). In a subgroup analysis pooling 9,483 stroke-free individuals from large population-based studies, SBI was present in ~18% of participants and remained a strong predictor of future stroke (HR 2.06 [95% CI 1.64–2.59], p<0.01). Conclusions SBI is present in approximately one in five stroke-free older adults and is associated with a 2-fold increased risk of future stroke. Future studies of in-depth stroke risk evaluations and intensive prevention measures are warranted in patients with clinically unrecognized but radiologically evident brain infarctions. PMID:26888534

  7. Using cognitive pre-testing methods in the development of a new evidenced-based pressure ulcer risk assessment instrument.

    PubMed

    Coleman, S; Nixon, J; Keen, J; Muir, D; Wilson, L; McGinnis, E; Stubbs, N; Dealey, C; Nelson, E A

    2016-11-16

    Variation in the development methods of Pressure Ulcer Risk Assessment Instruments has led to inconsistent inclusion of risk factors and concerns about content validity. A new evidence-based Risk Assessment Instrument, the Pressure Ulcer Risk Primary Or Secondary Evaluation Tool - PURPOSE-T, was developed as part of a National Institute for Health Research (NIHR) funded Pressure Ulcer Research Programme (PURPOSE: RP-PG-0407-10056). This paper reports the pre-test phase to assess and improve PURPOSE-T acceptability and usability and to confirm content validity. A descriptive study incorporating cognitive pre-testing methods and integration of service user views was undertaken over 3 cycles comprising PURPOSE-T training, a focus group and one-to-one think-aloud interviews. Clinical nurses from 2 acute and 2 community NHS Trusts were grouped according to job role. Focus group participants used 3 vignettes to complete PURPOSE-T assessments and then participated in the focus group. Think-aloud participants were interviewed during their completion of PURPOSE-T. After each pre-test cycle, analysis was undertaken and adjustments/improvements were made to PURPOSE-T in an iterative process. This incorporated the use of descriptive statistics for data completeness and decision rule compliance, and directed content analysis for interview and focus group data. Data were collected April 2012-June 2012. Thirty-four nurses participated in 3 pre-test cycles. Data from 3 focus groups and 12 think-aloud interviews, incorporating 101 PURPOSE-T assessments, led to changes to improve instrument content and design, flow and format, decision support and item-specific wording. Acceptability and usability were demonstrated by improved data completion and appropriate risk pathway allocation. The pre-test also confirmed content validity with clinical nurses. The pre-test was an important step in the development of the preliminary PURPOSE-T, and the methods used may have wider instrument development application.

  8. Risk management for outsourcing biomedical waste disposal – Using the failure mode and effects analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liao, Ching-Jong; Ho, Chao Chung, E-mail: ho919@pchome.com.tw

    Highlights: • This study is based on a real case in a hospital in Taiwan. • We use Failure Mode and Effects Analysis (FMEA) as the evaluation method. • We successfully identify the evaluation factors of bio-medical waste disposal risk. - Abstract: Using the failure mode and effects analysis, this study examined biomedical waste companies through risk assessment. Moreover, it evaluated the supervisors of biomedical waste units in hospitals, and factors relating to the outsourcing risk assessment of biomedical waste in hospitals by referring to waste disposal acts. An expert questionnaire survey was conducted on the personnel involved in waste disposal units in hospitals, in order to identify important factors relating to the outsourcing risk of biomedical waste in hospitals. This study calculated the risk priority number (RPN) and selected items with an RPN value higher than 80 for improvement. These items included “availability of freezing devices”, “availability of containers for sharp items”, “disposal frequency”, “disposal volume”, “disposal method”, “vehicles meeting the regulations”, and “declaration of three lists”. This study also aimed to identify important selection factors of biomedical waste disposal companies by hospitals in terms of risk. These findings can serve as references for hospitals in the selection of outsourcing companies for biomedical waste disposal.

  9. Efficacy of different methods used for dry socket prevention and risk factor analysis: A systematic review.

    PubMed

    Taberner-Vallverdú, M; Sánchez-Garcés, M-Á; Gay-Escoda, C

    2017-11-01

    Dry socket is one of the most common complications that develops after the extraction of a permanent tooth, and its prevention is more effective than its treatment. The aim was to analyze the efficacy of different methods used in preventing dry socket in order to decrease its incidence after tooth extraction. A Cochrane and PubMed-MEDLINE database search was conducted with the search terms "dry socket", "prevention", "risk factors", "alveolar osteitis" and "fibrynolitic alveolitis", both individually and using the Boolean operator "AND". The inclusion criteria were: clinical studies including at least 30 patients, articles published from 2005 to 2015 and written in English. The exclusion criteria were case reports and nonhuman studies. Thirty publications were selected from a total of 250. Six of the 30 were excluded after reading the full text. The final review included 24 articles: 9 prospective studies, 2 retrospective studies and 13 clinical trials. They were stratified according to their level of scientific evidence using SIGN criteria (Scottish Intercollegiate Guidelines Network). All treatments included in the review were aimed at decreasing the incidence of dry socket. Locally administering chlorhexidine or applying platelet-rich plasma reduces the likelihood of developing this complication. Antibiotic prescription does not avoid postoperative complications after lower third molar surgery. With regard to risk factors, all of the articles selected suggest that patient age, history of previous infection and the difficulty of the extraction are the most common predisposing factors for developing dry socket. There is no consensus that smoking, gender or menstrual cycles are risk factors. Taking the scientific quality of the articles evaluated into account, a level B recommendation has been given for the proposed procedures in the prevention of dry socket.

  10. Schedule Risk Assessment

    NASA Technical Reports Server (NTRS)

    Smith, Greg

    2003-01-01

    Schedule risk assessments determine the likelihood of finishing on time. Each task in a schedule has a varying degree of probability of being finished on time. A schedule risk assessment quantifies these probabilities by assigning values to each task. This viewgraph presentation contains a flow chart for conducting a schedule risk assessment and profiles several applicable methods of data analysis.
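
    A common way to quantify these task-level probabilities is Monte Carlo simulation: each task duration is sampled from an assumed distribution, and the share of simulated runs that meet the deadline estimates the probability of finishing on time. The sketch below uses triangular distributions; the task names, durations, and deadline are hypothetical.

```python
import random

# Schedule risk assessment by Monte Carlo simulation (sketch).
# Tasks are assumed to run in series; durations are in days and hypothetical.
tasks = [  # (name, optimistic, most likely, pessimistic)
    ("design",      10, 15, 25),
    ("fabrication", 20, 30, 50),
    ("testing",      5, 10, 20),
]
deadline_days = 60

n_trials = 100_000
hits = 0
for _ in range(n_trials):
    total = sum(random.triangular(lo, hi, mode) for _, lo, mode, hi in tasks)
    hits += total <= deadline_days

print(f"P(finish within {deadline_days} days) ~ {hits / n_trials:.2%}")
```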

  11. Disease risk score as a confounder summary method: systematic review and recommendations.

    PubMed

    Tadrous, Mina; Gagne, Joshua J; Stürmer, Til; Cadarette, Suzanne M

    2013-02-01

    To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. Copyright © 2012 John Wiley & Sons, Ltd.
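
    The most commonly reported workflow, deriving the DRS by logistic regression and then using it as a categorical covariate, can be sketched as follows on simulated data. Fitting among the unexposed and collapsing the score into quintiles are illustrative choices for the example, not prescriptions from the review.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Disease risk score (DRS) as a confounder summary (sketch, simulated data).
rng = np.random.default_rng(0)
n = 5000
covariates = rng.normal(size=(n, 4))                 # baseline confounders
exposure = rng.binomial(1, 0.3, size=n)
logit = -2 + covariates @ np.array([0.5, 0.3, -0.4, 0.2]) + 0.4 * exposure
outcome = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit the DRS model among the unexposed only, then score everyone.
drs_model = LogisticRegression().fit(covariates[exposure == 0], outcome[exposure == 0])
drs = drs_model.predict_proba(covariates)[:, 1]

# Collapse the DRS into quintiles for use as a categorical adjustment variable
# in the subsequent exposure-outcome regression.
quintiles = np.digitize(drs, np.quantile(drs, [0.2, 0.4, 0.6, 0.8]))
print(np.bincount(quintiles))                        # subjects per DRS stratum
```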

  12. Monte Carlo-based interval transformation analysis for multi-criteria decision analysis of groundwater management strategies under uncertain naphthalene concentrations and health risks

    NASA Astrophysics Data System (ADS)

    Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong

    2016-08-01

    A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time through the introduction of a Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each planning duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management period.
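
    A much simplified sketch of the underlying idea (not the authors' exact MCITA formulation) is shown below: criteria known only as intervals are sampled by Monte Carlo, a weighted score is aggregated for each management alternative, and the alternative with the best expected score is selected. The alternatives, intervals, and weights are hypothetical.

```python
import random

# Monte Carlo scoring of alternatives whose criteria are given as intervals.
# All criteria are treated as "lower is better"; values are hypothetical.
alternatives = {
    "action A": {"cost": (1.0, 1.4), "concentration": (0.2, 0.5), "risk": (0.1, 0.3)},
    "action B": {"cost": (0.8, 1.2), "concentration": (0.3, 0.7), "risk": (0.2, 0.4)},
    "action C": {"cost": (1.2, 1.6), "concentration": (0.1, 0.3), "risk": (0.1, 0.2)},
}
weights = {"cost": 0.4, "concentration": 0.3, "risk": 0.3}

n_samples = 20_000
scores = {}
for name, criteria in alternatives.items():
    total = 0.0
    for _ in range(n_samples):
        total += sum(weights[c] * random.uniform(*criteria[c]) for c in weights)
    scores[name] = total / n_samples

best = min(scores, key=scores.get)   # lowest expected weighted score is most desirable
print(scores, "->", best)
```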

  13. Review on pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

    PubMed

    Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika

    2017-01-01

    Computer work is associated with musculoskeletal disorders (MSDs), and several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of current pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods were focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review assessed the risk factors covered by each method, together with the reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven to be reliable and were rated as moderate to good. For validity, only four of the seven methods had been tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and the office environment, at office workstations and computer work. Although proper validation of exposure assessment techniques is the most important factor in developing such tools, not all of the existing observational methods have been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.

  14. Changes in Classes of Injury-Related Risks and Consequences of Risk-Level Drinking: a Latent Transition Analysis.

    PubMed

    Cochran, Gerald; Field, Craig; Caetano, Raul

    2015-07-01

    Risk-level drinking, drinking and driving, and alcohol-related violence are risk factors that result in injuries. The current study sought to identify which subgroups of patients experience the most behavioral change following a brief intervention. A secondary analysis of data from a brief alcohol intervention study was conducted. The sample (N = 664) includes at-risk drinkers who experienced an injury and were admitted for care to a Level 1 trauma center. Injury-related items from the Short Inventory of Problems+6 were used to perform a latent transition analysis to describe class transitions participants experienced following discharge. Four classes emerged for the year before and after the current injury. Most individuals transitioned from higher-risk classes into those with lower risk. Some participants maintained risky profiles, and others increased risks and consequences. Drinking and driving remained a persistent problem among the study participants. Although a large portion of intervention recipients improved risks and consequences of alcohol use following discharge, more intensive intervention services may be needed for a subset of patients who showed little or no improvement.

  15. Green Jobs: Definition and Method of Appraisal of Chemical and Biological Risks

    PubMed Central

    Cheneval, Erwan; Busque, Marc-Antoine; Ostiguy, Claude; Lavoie, Jacques; Bourbonnais, Robert; Labrèche, France; Bakhiyi, Bouchra; Zayed, Joseph

    2016-01-01

    In the wake of sustainable development, green jobs are developing rapidly, changing the work environment. However, a green job is not automatically a safe job. The aim of the study was to define green jobs, and to establish a preliminary risk assessment of chemical substances and biological agents for workers in Quebec. An operational definition was developed, along with criteria and sustainable development principles to discriminate green jobs from regular jobs. The potential toxicity or hazard associated with their chemical and biological exposures was assessed, and the workers’ exposure appraised using an expert assessment method. A control banding approach was then used to assess risks for workers in selected green jobs. A double entry model allowed us to set priorities in terms of chemical or biological risk. Among jobs that present the highest risk potential, several are related to waste management. The developed method is flexible and could be adapted to better appraise the risks that workers are facing or to propose control measures. PMID:26718400
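
    The double-entry, control-banding idea described above can be sketched as a simple lookup: a hazard band and an exposure band are combined in a matrix that returns a priority band. The band labels and matrix values below are hypothetical and do not reproduce the authors' appraisal grid.

```python
# Control banding ("double entry") lookup: (hazard band, exposure band) -> priority band.
# Band labels and matrix values are hypothetical, for illustration only.
RISK_MATRIX = {
    ("low", "low"): 1,    ("low", "medium"): 1,    ("low", "high"): 2,
    ("medium", "low"): 1, ("medium", "medium"): 2, ("medium", "high"): 3,
    ("high", "low"): 2,   ("high", "medium"): 3,   ("high", "high"): 4,
}

def risk_band(hazard: str, exposure: str) -> int:
    """Return the priority band for a job given its hazard and exposure bands."""
    return RISK_MATRIX[(hazard, exposure)]

print(risk_band("high", "medium"))   # e.g. a waste-management task -> band 3
```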

  16. Industrial machine systems risk assessment: a critical review of concepts and methods.

    PubMed

    Etherton, John R

    2007-02-01

    Reducing the risk of work-related death and injury to machine operators and maintenance personnel poses a continuing occupational safety challenge. The risk of injury from machinery in U.S. workplaces is high. Between 1992 and 2001, there were, on average, 520 fatalities per year involving machines and, on average, 3.8 cases per 10,000 workers of nonfatal caught-in-running-machine injuries involving lost workdays. A U.S. task group recently developed a technical reference guideline, ANSI B11 TR3, "A Guide to Estimate, Evaluate, & Reduce Risks Associated with Machine Tools," that is intended to bring machine tool risk assessment practice in the United States up to or above the level now required by the international standard, ISO 14121. The ANSI guideline emphasizes identifying tasks and hazards not previously considered, particularly those associated with maintenance; and it further emphasizes teamwork among line workers, engineers, and safety professionals. The value of this critical review of concepts and methods resides in (1) its linking current risk theory to machine system risk assessment and (2) its exploration of how various risk estimation tools translate into risk-informed decisions on industrial machine system design and use. The review was undertaken to set the stage for a field evaluation study on machine risk assessment among users of the ANSI B11 TR3 method.

  17. Critical asset and portfolio risk analysis: an all-hazards framework.

    PubMed

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
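
    The notional product at the core of the framework can be sketched as follows; the asset names and parameter values are hypothetical, and the portfolio figure is only the first-order sum, ignoring the interdependency effects the article models separately.

```python
# Notional all-hazards risk product (sketch):
# risk ~ threat likelihood * vulnerability (P(success | attempt)) * consequence.
# Asset names and parameter values are hypothetical.
assets = [
    {"name": "substation",  "threat": 0.05, "vulnerability": 0.6, "consequence": 80.0},
    {"name": "data center", "threat": 0.10, "vulnerability": 0.3, "consequence": 50.0},
]

for a in assets:
    a["risk"] = a["threat"] * a["vulnerability"] * a["consequence"]

portfolio_risk = sum(a["risk"] for a in assets)   # first-order portfolio estimate
for a in assets:
    print(f'{a["name"]:<12} risk = {a["risk"]:.2f}')
print(f"portfolio risk = {portfolio_risk:.2f}")
```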

  18. A method for minimum risk portfolio optimization under hybrid uncertainty

    NASA Astrophysics Data System (ADS)

    Egorova, Yu E.; Yazenin, A. V.

    2018-03-01

    In this paper, we investigate a minimum risk portfolio model under hybrid uncertainty when the profitability of financial assets is described by fuzzy random variables. According to Feng, the variance of a portfolio is defined as a crisp value. To aggregate fuzzy information, the weakest (drastic) t-norm is used. We construct an equivalent stochastic problem of the minimum risk portfolio model and specify the stochastic penalty method for solving it.
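
    For orientation, the sketch below solves the crisp (classical) analogue of the minimum-risk problem, the closed-form minimum-variance portfolio; it does not reproduce the paper's fuzzy-random treatment or the drastic t-norm aggregation, and the covariance matrix is invented for the example.

```python
import numpy as np

# Classical minimum-variance portfolio (sketch): minimize w' S w subject to
# sum(w) = 1, with closed-form solution w = S^{-1} 1 / (1' S^{-1} 1).
# The covariance matrix S below is hypothetical.
cov = np.array([
    [0.10, 0.02, 0.04],
    [0.02, 0.08, 0.01],
    [0.04, 0.01, 0.12],
])

ones = np.ones(cov.shape[0])
w = np.linalg.solve(cov, ones)
w /= w.sum()                      # weights of the minimum-variance portfolio

print("weights:", np.round(w, 3))
print("portfolio variance:", float(w @ cov @ w))
```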

  19. Rapid Detection Method for the Four Most Common CHEK2 Mutations Based on Melting Profile Analysis.

    PubMed

    Borun, Pawel; Salanowski, Kacper; Godlewski, Dariusz; Walkowiak, Jaroslaw; Plawski, Andrzej

    2015-12-01

    CHEK2 is a tumor suppressor gene, and the mutations affecting the functionality of the protein product increase cancer risk in various organs. The elevated risk, in a significant percentage of cases, is determined by the occurrence of one of the four most common mutations in the CHEK2 gene, including c.470T>C (p.I157T), c.444+1G>A (IVS2+1G>A), c.1100delC, and c.1037+1538_1224+328del5395 (del5395). We have developed and validated a rapid and effective method for their detection based on high-resolution melting analysis and comparative-high-resolution melting, a novel approach enabling simultaneous detection of copy number variations. The analysis is performed in two polymerase chain reactions followed by melting analysis, without any additional reagents or handling other than that used in standard high-resolution melting. Validation of the method was conducted in a group of 103 patients with diagnosed breast cancer, a group of 240 unrelated patients with familial history of cancer associated with the CHEK2 gene mutations, and a 100-person control group. The results of the analyses for all three groups were fully consistent with the results from other methods. The method we have developed improves the identification of CHEK2 mutation carriers, reduces the cost of such analyses, and facilitates their implementation. Along with the increased efficiency, the method maintains accuracy and reliability comparable to other more labor-consuming techniques.

  20. Association between appendectomy and risk of primary sclerosing cholangitis: A systematic review and meta-analysis.

    PubMed

    Wijarnpreecha, Karn; Panjawatanan, Panadeekarn; Mousa, Omar Y; Cheungpasitporn, Wisit; Pungpapong, Surakit; Ungprasert, Patompong

    2018-04-11

    Recent epidemiologic studies have suggested that appendectomy could be a risk factor for primary sclerosing cholangitis (PSC) although the results were inconsistent. This systematic review and meta-analysis was conducted to summarize all available evidence. A comprehensive literature review was conducted using MEDLINE and EMBASE database through January 2018 to identify all studies that reported the risk of PSC among individuals who had appendectomy versus those with no history of appendectomy. Effect estimates from each study were extracted and combined together using the random-effect, generic inverse variance method of DerSimonian and Laird. A total of 6 case-control studies with 2432 participants met the eligibility criteria and were included in the meta-analysis. The risk of PSC in individuals who had appendectomy was significantly higher than those with no history of appendectomy with the pooled odds ratio of 1.37 (95% CI: 1.15-1.63). The statistical heterogeneity was insignificant with an I2 of 0%. A significantly increased risk of PSC among individuals who had a history of appendectomy was found in this study. Copyright © 2018 Elsevier Masson SAS. All rights reserved.