NASA Astrophysics Data System (ADS)
Sun, K.; Cheng, D. B.; He, J. J.; Zhao, Y. L.
2018-02-01
Collapse gully erosion is a type of soil erosion specific to the red soil region of southern China, and early warning and prevention of its occurrence are very important. Based on the idea of risk assessment, this research takes Guangdong province as an example and adopts information acquisition analysis and logistic regression analysis to examine the feasibility of collapse gully erosion risk assessment at the regional scale and to compare the applicability of the different risk assessment methods. The results show that in Guangdong province, the risk of collapse gully erosion occurrence is high in the northeastern and western areas and relatively low in the southwestern and central parts. The comparison of the different risk assessment methods also indicated that the risk distribution patterns obtained from the different methods were basically consistent. However, the accuracy of the risk map from the information acquisition analysis method was slightly better than that from the logistic regression analysis method.
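The logistic-regression side of such an assessment reduces to mapping predictor values to an occurrence probability. A minimal sketch in Python; the coefficients and the factor names (slope, rainfall, soil thickness) are illustrative assumptions, not the study's actual fitted model:

```python
import math

def logistic_risk(features, coefs, intercept):
    """Probability of collapse gully occurrence from a fitted logistic model."""
    z = intercept + sum(c * x for c, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for normalized slope, rainfall, and
# soil-thickness factors (illustration only).
coefs = [0.8, 1.2, -0.5]
p = logistic_risk([0.6, 0.7, 0.3], coefs, intercept=-1.0)
```

Cells whose predicted probability exceeds a chosen threshold would be flagged as high-risk on the regional map.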
Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology, even though combining them reduces the drawbacks each method has when implemented separately. This paper aims to combine FMEA and FTA into a single risk assessment methodology. A case study in a metal company illustrates how the methodology can be implemented: the combined methodology is used to assess the internal risks that occur in the production process, and those risks are then mitigated according to their risk levels.
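One common way to combine the two methods is to rank failure modes by their FMEA Risk Priority Number and then expand the high-priority modes into a fault tree. A minimal sketch with hypothetical ratings and basic-event probabilities (not the case study's data):

```python
def rpn(severity, occurrence, detection):
    # FMEA Risk Priority Number: product of three 1-10 ratings.
    return severity * occurrence * detection

def or_gate(probs):
    # Probability that at least one independent basic event occurs.
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    # Probability that all independent basic events occur.
    q = 1.0
    for p in probs:
        q *= p
    return q

# A high-RPN failure mode expanded into a small fault tree:
# top event = (pump fails OR valve sticks) AND operator misses the alarm.
priority = rpn(severity=8, occurrence=5, detection=3)
top_event_prob = and_gate([or_gate([0.01, 0.02]), 0.1])
```

The RPN decides *which* failure mode deserves a tree; the tree then quantifies *how* that mode can arise.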
Evaluating the Risks of Clinical Research: Direct Comparative Analysis
Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David
2014-01-01
Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed a conceptual and normative analysis, and use of an illustrative example. Results: Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities.
It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks. PMID:25210944
Kwon, Deukwoo; Hoffman, F Owen; Moroz, Brian E; Simon, Steven L
2016-02-10
Most conventional risk analysis methods rely on a single best estimate of exposure per person, which does not allow for adjustment for exposure-related uncertainty. Here, we propose a Bayesian model averaging method to properly quantify the relationship between radiation dose and disease outcomes by accounting for shared and unshared uncertainty in estimated dose. Our Bayesian risk analysis method utilizes multiple realizations of sets (vectors) of doses generated by a two-dimensional Monte Carlo simulation method that properly separates shared and unshared errors in dose estimation. The exposure model used in this work is taken from a study of the risk of thyroid nodules among a cohort of 2376 subjects who were exposed to fallout from nuclear testing in Kazakhstan. We assessed the performance of our method through an extensive series of simulations and comparisons against conventional regression risk analysis methods. When the estimated doses contain relatively small amounts of uncertainty, the Bayesian method using multiple a priori plausible draws of dose vectors gave similar results to the conventional regression-based methods of dose-response analysis. However, when large and complex mixtures of shared and unshared uncertainties are present, the Bayesian method using multiple dose vectors had significantly lower relative bias than conventional regression-based risk analysis methods and better coverage, that is, a markedly increased capability to include the true risk coefficient within the 95% credible interval of the Bayesian-based risk estimate. An evaluation of the dose-response using our method is presented for an epidemiological study of thyroid disease following radiation exposure. Copyright © 2015 John Wiley & Sons, Ltd.
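The two-dimensional Monte Carlo step can be sketched as drawing, for each realization, one shared error that applies to every subject plus an independent unshared error per subject. A toy version with assumed lognormal error magnitudes (illustrative only, not the study's actual exposure model):

```python
import random

def dose_vectors(true_doses, n_realizations, shared_sd, unshared_sd, seed=1):
    """Generate dose-vector realizations with shared (cohort-wide) and
    unshared (per-subject) multiplicative lognormal errors."""
    rng = random.Random(seed)
    vectors = []
    for _ in range(n_realizations):
        shared = rng.lognormvariate(0.0, shared_sd)  # one draw per vector
        vectors.append([d * shared * rng.lognormvariate(0.0, unshared_sd)
                        for d in true_doses])
    return vectors

# Three subjects with hypothetical true doses (mGy).
vecs = dose_vectors([100.0, 250.0, 40.0], n_realizations=500,
                    shared_sd=0.3, unshared_sd=0.2)
```

Each of the 500 vectors is an equally plausible set of cohort doses; a Bayesian dose-response analysis would average over them rather than condition on a single best estimate.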
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to identify the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to identify the procedural steps with the highest impact on method reliability (the critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the best alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Risk analysis for veterinary biologicals released into the environment.
Silva, S V; Samagh, B S; Morley, R S
1995-12-01
All veterinary biologicals licensed in Canada must be shown to be pure, potent, safe and effective. A risk-based approach is used to evaluate the safety of all biologicals, whether produced by conventional methods or by molecular biological techniques. Traditionally, qualitative risk assessment methods have been used for this purpose. More recently, quantitative risk assessment has become available for complex issues. The quantitative risk assessment method uses "scenario tree analysis" to predict the likelihood of various outcomes and their respective impacts. The authors describe the quantitative risk assessment approach which is used within the broader context of risk analysis (i.e. risk assessment, risk management and risk communication) to develop recommendations for the field release of veterinary biologicals. The general regulatory framework for the licensing of veterinary biologicals in Canada is also presented.
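Scenario tree analysis multiplies conditional branch probabilities along each path of the tree to obtain outcome likelihoods. A minimal sketch; the release scenario and all probabilities below are wholly hypothetical:

```python
def scenario_likelihoods(tree):
    """Walk a scenario tree, multiplying conditional branch probabilities
    along each path; returns {outcome: total probability}."""
    out = {}
    def walk(node, p):
        if isinstance(node, dict):
            for (label, branch_p), child in node.items():
                walk(child, p * branch_p)
        else:
            out[node] = out.get(node, 0.0) + p
    walk(tree, 1.0)
    return out

# Hypothetical field-release scenario for a live vaccine organism.
tree = {("escape", 0.01): {("establishes", 0.1): "spread",
                           ("dies out", 0.9): "no effect"},
        ("contained", 0.99): "no effect"}
probs = scenario_likelihoods(tree)
```

The impact assigned to each terminal outcome would then be weighted by its likelihood to rank risk-management options.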
Multiple Interacting Risk Factors: On Methods for Allocating Risk Factor Interactions.
Price, Bertram; MacNicoll, Michael
2015-05-01
A persistent problem in health risk analysis where it is known that a disease may occur as a consequence of multiple risk factors with interactions is allocating the total risk of the disease among the individual risk factors. This problem, referred to here as risk apportionment, arises in various venues, including: (i) public health management, (ii) government programs for compensating injured individuals, and (iii) litigation. Two methods have been described in the risk analysis and epidemiology literature for allocating total risk among individual risk factors. One method uses weights to allocate interactions among the individual risk factors. The other method is based on risk accounting axioms and finding an optimal and unique allocation that satisfies the axioms using a procedure borrowed from game theory. Where relative risk or attributable risk is the risk measure, we find that the game-theory-determined allocation is the same as the allocation where risk factor interactions are apportioned to individual risk factors using equal weights. Therefore, the apportionment problem becomes one of selecting a meaningful set of weights for allocating interactions among the individual risk factors. Equal weights and weights proportional to the risks of the individual risk factors are discussed. © 2015 Society for Risk Analysis.
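The game-theory allocation referred to is the Shapley value: each factor receives its average marginal contribution over all orderings of the factors. A small sketch with hypothetical excess risks for two factors shows the interaction being split equally, matching the equal-weights result noted in the abstract:

```python
import math
from itertools import permutations

def shapley(players, risk):
    """Shapley allocation of total risk; `risk` maps frozensets of
    factors to the risk attributable to that combination."""
    n = len(players)
    alloc = {p: 0.0 for p in players}
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            # Marginal contribution of p given the factors before it.
            alloc[p] += risk[coalition | {p}] - risk[coalition]
            coalition = coalition | {p}
    for p in players:
        alloc[p] /= math.factorial(n)
    return alloc

# Hypothetical excess relative risks: A alone 2.0, B alone 3.0,
# both together 7.0 (so the interaction term is 2.0).
risk = {frozenset(): 0.0, frozenset("A"): 2.0,
        frozenset("B"): 3.0, frozenset("AB"): 7.0}
alloc = shapley(["A", "B"], risk)
```

Here A is allocated 2.0 + 1.0 and B is allocated 3.0 + 1.0, i.e. each individual risk plus half the interaction.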
Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof
2009-04-01
Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
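The Monte Carlo estimation of Customer Minutes Lost can be sketched as simulating yearly failure events and summing event counts times downtimes. A toy model with hypothetical failure rates and downtimes, not the Swedish system's data:

```python
import random

def simulate_cml(n_years, rng):
    """Mean annual Customer Minutes Lost from independent failure modes."""
    # Hypothetical (rate in events/year, downtime in minutes) per customer.
    modes = [(0.5, 120.0),   # quantity failure: no water delivered
             (2.0, 30.0)]    # quality failure: water out of spec
    totals = []
    for _ in range(n_years):
        cml = 0.0
        for rate, downtime in modes:
            # Poisson event count via sequential exponential gaps.
            t, events = rng.expovariate(rate), 0
            while t < 1.0:
                events += 1
                t += rng.expovariate(rate)
            cml += events * downtime
        totals.append(cml)
    return sum(totals) / len(totals)

mean_cml = simulate_cml(20000, random.Random(7))
```

In a full analysis the rates and downtimes would themselves be uncertain (hard data plus expert judgement), and the simulation would propagate those distributions as well.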
Application of the Maximum Entropy Method to Risk Analysis of Mergers and Acquisitions
NASA Astrophysics Data System (ADS)
Xie, Jigang; Song, Wenyun
The maximum entropy (ME) method can be used to analyze the risk of mergers and acquisitions when only pre-acquisition information is available. A practical example, the risk analysis of mergers and acquisitions by Chinese listed firms, is provided to demonstrate the feasibility and practicality of the method.
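The ME principle selects the least-committal distribution consistent with the known pre-acquisition information, for example a known mean. A sketch that solves this one-constraint case by bisection on the Lagrange multiplier; the 1-5 risk-rating scale and target mean are illustrative assumptions:

```python
import math

def maxent_mean(support, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy pmf on `support` subject to a fixed mean:
    p_i proportional to exp(-lam * x_i), lam found by bisection."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid   # mean decreases in lam, so push lam upward
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * x) for x in support]
    z = sum(w)
    return [wi / z for wi in w]

# Risk-rating scale 1..5 with an assumed required mean rating of 2.0.
pmf = maxent_mean([1, 2, 3, 4, 5], 2.0)
```

With no constraint beyond the mean at the scale's midpoint, the result collapses to the uniform distribution, as maximum entropy requires.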
An Emerging New Risk Analysis Science: Foundations and Implications.
Aven, Terje
2018-05-01
To solve real-life problems (such as those related to technology, health, security, or climate change) and make suitable decisions, risk is nearly always a main issue. Different types of sciences often support the work, for example, statistics, the natural sciences, and the social sciences. Risk analysis approaches and methods are also commonly used, but risk analysis is not broadly accepted as a science in itself. A key problem is the lack of explanatory power and the large uncertainties involved when assessing risk. This article presents an emerging new risk analysis science based on novel ideas and theories on risk analysis developed in recent years by the risk analysis community. It builds on a fundamental change in thinking: from the search for accurate predictions and risk estimates to knowledge generation related to concepts, theories, frameworks, approaches, principles, methods, and models to understand, assess, characterize, communicate, and (in a broad sense) manage risk. Examples are used to illustrate the importance of this distinct risk analysis science for solving risk problems, supporting science in general and other disciplines in particular. © 2017 The Authors. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Risk Analysis Methods for Deepwater Port Oil Transfer Systems
DOT National Transportation Integrated Search
1976-06-01
This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...
Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro
2007-01-01
To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... Role of Risk Analysis in Decision-Making AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... documents entitled, ``Using Probabilistic Methods to Enhance the Role of Risk Analysis in Decision- Making... Probabilistic Methods to Enhance the Role of Risk Analysis in Decision-Making, with Case Study Examples'' and...
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
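The RV-NNT comparison reduces to simple arithmetic on event rates and a patient-elicited relative value. A sketch with hypothetical trial figures (not the paper's rheumatoid arthritis data):

```python
def nnt(p_treat, p_control):
    # Number needed to treat for one additional patient to benefit.
    return 1.0 / (p_treat - p_control)

def rv_nnh(p_harm_treat, p_harm_control, relative_value):
    """Number needed to harm, adjusted by the relative value patients
    place on avoiding the harm versus gaining the benefit."""
    return 1.0 / ((p_harm_treat - p_harm_control) * relative_value)

# Hypothetical drug: 40% vs 20% response rate, 5% vs 1% serious
# adverse events, with the harm valued at half the benefit.
benefit_nnt = nnt(0.40, 0.20)
harm_rv_nnh = rv_nnh(0.05, 0.01, 0.5)
favourable = benefit_nnt < harm_rv_nnh
```

When the NNT (here 5) is below the relative-value adjusted NNH (here 50), the analysis indicates patients would accept the treatment's risks for its benefit.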
Uncertainty characterization approaches for risk assessment of DBPs in drinking water: a review.
Chowdhury, Shakhawat; Champagne, Pascale; McLellan, P James
2009-04-01
The management of risk from disinfection by-products (DBPs) in drinking water has become a critical issue over the last three decades. The areas of concern for risk management studies include (i) human health risk from DBPs, (ii) disinfection performance, (iii) technical feasibility (maintenance, management and operation) of treatment and disinfection approaches, and (iv) cost. Human health risk assessment is typically considered to be the most important phase of the risk-based decision-making or risk management studies. The factors associated with health risk assessment and other attributes are generally prone to considerable uncertainty. Probabilistic and non-probabilistic approaches have both been employed to characterize uncertainties associated with risk assessment. The probabilistic approaches include sampling-based methods (typically Monte Carlo simulation and stratified sampling) and asymptotic (approximate) reliability analysis (first- and second-order reliability methods). Non-probabilistic approaches include interval analysis, fuzzy set theory and possibility theory. However, it is generally accepted that no single method is suitable for the entire spectrum of problems encountered in uncertainty analyses for risk assessment. Each method has its own set of advantages and limitations. In this paper, the feasibility and limitations of different uncertainty analysis approaches are outlined for risk management studies of drinking water supply systems. The findings assist in the selection of suitable approaches for uncertainty analysis in risk management studies associated with DBPs and human health risk.
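A sampling-based (Monte Carlo) uncertainty characterization can be sketched as propagating an uncertain DBP concentration through a standard intake-times-slope-factor risk calculation. Every number below (concentration distribution, intake parameters, slope factor) is an illustrative assumption, not a regulatory value:

```python
import math
import random

def mc_cancer_risk(n, rng):
    """Lifetime cancer risk = chronic daily intake * slope factor,
    with lognormal uncertainty in the THM concentration."""
    risks = []
    for _ in range(n):
        conc = rng.lognormvariate(math.log(0.05), 0.4)  # mg/L, assumed
        intake = conc * 2.0 / 70.0      # 2 L/day intake, 70 kg body weight
        risks.append(intake * 0.0061)   # assumed slope factor, (mg/kg-day)^-1
    risks.sort()
    return risks[int(0.95 * n)]         # 95th-percentile risk

risk_95 = mc_cancer_risk(10000, random.Random(3))
```

Reporting an upper percentile rather than a point estimate is what distinguishes the probabilistic approaches reviewed here from single-best-estimate risk assessment.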
Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William
2009-01-01
This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
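A staple of the Bayesian data-assessment methods covered by such guidelines is the conjugate beta-binomial update of a demand failure probability. A minimal sketch; the Jeffreys prior and the failure counts are illustrative assumptions:

```python
def beta_binomial_update(alpha, beta, failures, trials):
    """Conjugate Bayesian update for a failure-on-demand probability:
    Beta(alpha, beta) prior + binomial data -> Beta posterior."""
    return alpha + failures, beta + trials - failures

def beta_mean(alpha, beta):
    # Posterior mean of a Beta(alpha, beta) distribution.
    return alpha / (alpha + beta)

# Jeffreys prior Beta(0.5, 0.5); 2 valve failures observed in 100 demands.
a, b = beta_binomial_update(0.5, 0.5, failures=2, trials=100)
posterior_mean = beta_mean(a, b)
```

The posterior blends prior belief with observed evidence, which is exactly the mechanism such guidelines recommend for sparse reliability data.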
Assessing the validity of prospective hazard analysis methods: a comparison of two techniques
2014-01-01
Background Prospective Hazard Analysis techniques such as Healthcare Failure Modes and Effects Analysis (HFMEA) and Structured What If Technique (SWIFT) have the potential to increase safety by identifying risks before an adverse event occurs. Published accounts of their application in healthcare have identified benefits, but the reliability of some methods has been found to be low. The aim of this study was to examine the validity of SWIFT and HFMEA by comparing their outputs in the process of risk assessment, and comparing the results with risks identified by retrospective methods. Methods The setting was a community-based anticoagulation clinic, in which risk assessment activities had been previously performed and were available. A SWIFT and an HFMEA workshop were conducted consecutively on the same day by experienced experts. Participants were a mixture of pharmacists, administrative staff and software developers. Both methods produced lists of risks scored according to the method’s procedure. Participants’ views about the value of the workshops were elicited with a questionnaire. Results SWIFT identified 61 risks and HFMEA identified 72 risks. For each method, fewer than half of the hazards it identified were also found by the other method. There was also little overlap between the results of the workshops and risks identified by prior root cause analysis, staff interviews or clinical governance board discussions. Participants’ feedback indicated that the workshops were viewed as useful. Conclusions Although there was limited overlap, both methods raised important hazards. Scoping the problem area had a considerable influence on the outputs. The opportunity for teams to discuss their work from a risk perspective is valuable, but these methods cannot be relied upon in isolation to provide a comprehensive description. Multiple methods for identifying hazards should be used and data from different sources should be integrated to give a comprehensive view of risk in a system.
PMID:24467813
Haghighi, Mona; Johnson, Suzanne Bennett; Qian, Xiaoning; Lynch, Kristian F; Vehik, Kendra; Huang, Shuai
2016-08-26
Regression models are extensively used in many epidemiological studies to understand the linkage between specific outcomes of interest and their risk factors. However, regression models in general examine the average effects of the risk factors and ignore subgroups with different risk profiles. As a result, interventions are often geared towards the average member of the population, without consideration of the special health needs of different subgroups within the population. This paper demonstrates the value of using rule-based analysis methods that can identify subgroups with heterogeneous risk profiles in a population without imposing assumptions on the subgroups or method. The rules define the risk pattern of subsets of individuals by not only considering the interactions between the risk factors but also their ranges. We compared the rule-based analysis results with the results from a logistic regression model in The Environmental Determinants of Diabetes in the Young (TEDDY) study. Both methods detected a similar suite of risk factors, but the rule-based analysis was superior at detecting multiple interactions between the risk factors that characterize the subgroups. A further investigation of the particular characteristics of each subgroup may detect the special health needs of the subgroup and lead to tailored interventions.
USDA-ARS?s Scientific Manuscript database
Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis. A new method for probabilistic scenario-based seismic risk analysis has been developed as a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely the information needed to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, originally developed by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing different probabilistic models of earthquake occurrence and the incorporation of advanced physical models into the analysis, (2) its mathematically consistent treatment of uncertainties, and (3) its explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals. The method was applied to evaluate the risk of production-interruption losses of a nuclear power plant during its residual lifetime.
Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land
2006-01-01
We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and to support design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of the severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of dam failure and the resulting loss of life. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The loss of life associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to a reservoir dam in Jiangxi province, and both engineering and non-engineering measures are proposed to reduce the risk. Risk analysis of dam failure is of essential significance for reducing the probability of dam failure and improving the level of dam risk management. PMID:29710824
Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research
NASA Astrophysics Data System (ADS)
Zhang, Minli; Yang, Wenpo
Real estate investment is a high-risk, high-return economic activity; the key to real estate analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry faces enormous risks, and how to evaluate real estate investment risks correctly and effectively has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only theoretically sound but also reliable in application, providing an effective means of real estate investment risk assessment and guidance for investors on risk factors and forecasts.
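Fuzzy comprehensive evaluation typically composes a factor weight vector W with a membership matrix R over a set of evaluation grades. A minimal weighted-average composition, with invented factors, weights, and grades (not the paper's data), might look like:

```python
def fuzzy_comprehensive_evaluation(weights, membership):
    """Weighted-average composition B = W . R: combine factor weights with
    each factor's membership degrees over the evaluation grades."""
    grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(grades)]

# Hypothetical risk factors and grades for a real estate project.
W = [0.5, 0.2, 0.3]                  # market, policy, financial risk weights
R = [[0.1, 0.3, 0.6],                # market risk leans "high"
     [0.6, 0.3, 0.1],                # policy risk leans "low"
     [0.2, 0.5, 0.3]]                # financial risk leans "medium"
B = fuzzy_comprehensive_evaluation(W, R)
verdict = ["low", "medium", "high"][B.index(max(B))]
print([round(b, 2) for b in B], verdict)  # highest membership decides the grade
```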
Kaplan-Meier Survival Analysis Overestimates the Risk of Revision Arthroplasty: A Meta-analysis.
Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter D; Ghali, William A; Marshall, Deborah A
2015-11-01
Although Kaplan-Meier survival analysis is commonly used to estimate the cumulative incidence of revision after joint arthroplasty, it theoretically overestimates the risk of revision in the presence of competing risks (such as death). Because the magnitude of overestimation is not well documented, the potential associated impact on clinical and policy decision-making remains unknown. We performed a meta-analysis to answer the following questions: (1) To what extent does the Kaplan-Meier method overestimate the cumulative incidence of revision after joint replacement compared with alternative competing-risks methods? (2) Is the extent of overestimation influenced by followup time or rate of competing risks? We searched Ovid MEDLINE, EMBASE, BIOSIS Previews, and Web of Science (1946, 1980, 1980, and 1899, respectively, to October 26, 2013) and included article bibliographies for studies comparing estimated cumulative incidence of revision after hip or knee arthroplasty obtained using both Kaplan-Meier and competing-risks methods. We excluded conference abstracts, unpublished studies, or studies using simulated data sets. Two reviewers independently extracted data and evaluated the quality of reporting of the included studies. Among 1160 abstracts identified, six studies were included in our meta-analysis. The principal reason for the steep attrition (1160 to six) was that the initial search was for studies in any clinical area that compared the cumulative incidence estimated using the Kaplan-Meier versus competing-risks methods for any event (not just the cumulative incidence of hip or knee revision); we did this to minimize the likelihood of missing any relevant studies. We calculated risk ratios (RRs) comparing the cumulative incidence estimated using the Kaplan-Meier method with the competing-risks method for each study and used DerSimonian and Laird random effects models to pool these RRs. 
Heterogeneity was explored using stratified meta-analyses and metaregression. The pooled cumulative incidence of revision after hip or knee arthroplasty obtained using the Kaplan-Meier method was 1.55 times higher (95% confidence interval, 1.43-1.68; p < 0.001) than that obtained using the competing-risks method. Longer followup times and higher proportions of competing risks were not associated with increases in the amount of overestimation of revision risk by the Kaplan-Meier method (all p > 0.10). This may be due to the small number of studies that met the inclusion criteria and conservative variance approximation. The Kaplan-Meier method overestimates risk of revision after hip or knee arthroplasty in populations where competing risks (such as death) might preclude the occurrence of the event of interest (revision). Competing-risks methods should be used to more accurately estimate the cumulative incidence of revision when the goal is to plan healthcare services and resource allocation for revisions.
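The gap the meta-analysis measures can be reproduced on a toy dataset: treating deaths as censored (the 1-KM estimate) inflates the apparent revision incidence relative to the competing-risks (Aalen-Johansen) cumulative incidence. This is a generic illustration with invented data, not the included studies':

```python
def one_minus_km(times, events, target=1):
    """Naive 1-KM estimate: competing events (e.g. death) are treated as
    censoring, which overstates the incidence of the target event."""
    surv = 1.0
    for t in sorted(set(times)):
        at_risk = sum(1 for ti in times if ti >= t)
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == target)
        if at_risk:
            surv *= 1 - d / at_risk
    return 1 - surv

def cumulative_incidence(times, events, target=1):
    """Aalen-Johansen estimate: competing events leave the risk set via the
    overall survival term instead of being censored."""
    overall_surv, cif = 1.0, 0.0
    for t in sorted(set(times)):
        at_risk = sum(1 for ti in times if ti >= t)
        d_target = sum(1 for ti, ei in zip(times, events) if ti == t and ei == target)
        d_all = sum(1 for ti, ei in zip(times, events) if ti == t and ei != 0)
        if at_risk:
            cif += overall_surv * d_target / at_risk
            overall_surv *= 1 - d_all / at_risk
    return cif

# Four patients: 1 = revision, 2 = death before revision (0 would be censored).
times = [1, 2, 3, 4]
events = [2, 1, 2, 1]
km = one_minus_km(times, events)
cif = cumulative_incidence(times, events)
print(km, cif)  # 1-KM doubles the apparent revision risk here
```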
NASA Technical Reports Server (NTRS)
Bonine, Lauren
2015-01-01
The presentation provides insight into the schedule risk analysis process used by the Stratospheric Aerosol and Gas Experiment III on the International Space Station project. It highlights the methods for identifying risk inputs, the inclusion of generic risks identified outside the traditional continuous risk management process, and the development of tailored analysis products used to improve risk-informed decision making.
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessing the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models yield reliable estimates. Theoretical problems in the development of robust models built on the methodology of minimax estimates are reviewed, and an algorithm for the situation of asymmetric "contamination" is developed.
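Huber's M-estimation for regression, which the paper builds on, can be sketched as iteratively reweighted least squares with a MAD-based residual scale; large residuals get weight δs/|r| instead of 1, so a single "contaminated" observation barely moves the fit. The data and setup below are illustrative, not the paper's project data:

```python
def huber_fit(x, y, delta=1.345, iters=50):
    """Huber M-estimate of a line y = a + b*x via iteratively reweighted
    least squares (IRLS); the residual scale is re-estimated each pass
    from the median absolute deviation (MAD)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        r = [yi - (a + b * xi) for xi, yi in zip(x, y)]
        s = sorted(abs(ri) for ri in r)[len(r) // 2] / 0.6745 or 1.0
        w = [1.0 if abs(ri) <= delta * s else delta * s / abs(ri) for ri in r]
        sw, sx = sum(w), sum(wi * xi for wi, xi in zip(w, x))
        sy = sum(wi * yi for wi, yi in zip(w, y))
        sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
        sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
        det = sw * sxx - sx * sx
        a, b = (sy * sxx - sx * sxy) / det, (sw * sxy - sx * sy) / det
    return a, b

# Nine points on y = 2x + 1 plus one gross outlier (a "contaminated" value).
x = list(range(10))
y = [2 * xi + 1 for xi in x]
y[9] = 100
a_h, b_h = huber_fit(x, y)                 # robust: slope stays near 2
a_ols, b_ols = huber_fit(x, y, delta=1e9)  # huge delta => ordinary least squares
print(round(b_h, 2), round(b_ols, 2))      # OLS slope is dragged far above 2
```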
A method for scenario-based risk assessment for robust aerospace systems
NASA Astrophysics Data System (ADS)
Thomas, Victoria Katherine
In years past, aircraft conceptual design centered on creating a feasible aircraft that could be built and could fly the required missions. More recently, aircraft viability entered conceptual design, recognizing that the product's potential to be profitable should also be examined early in the design process. While examining an aerospace system's feasibility and viability early in the design process is extremely important, it is also important to examine system risk. In traditional aerospace systems risk analysis, risk is examined from the perspectives of performance, schedule, and cost. Recently, safety and reliability analyses have been brought forward in the design process so that they, too, can be examined during late conceptual and early preliminary design. While these analyses work as designed, existing risk analysis methods and techniques are not designed to examine an aerospace system's external operating environment and the risks present there. A new method has been developed here to examine, early in concept design, the risk associated with not meeting assumptions about the system's external operating environment. The risks are examined in five categories: employment, culture, government and politics, economics, and technology. The risks are examined over a long time period, up to the system's entire life cycle. The method consists of eight steps over three focus areas. The first focus area is Problem Setup. During problem setup, the problem is defined and understood to the best of the decision maker's ability. This area contains four steps, in the following order: Establish the Need, Scenario Development, Identify Solution Alternatives, and Uncertainty and Risk Identification. There is significant iteration among steps two through four. Focus area two is Modeling and Simulation. In this area the solution alternatives and risks are modeled, and a numerical value for risk is calculated. A risk mitigation model is also created.
The four steps involved in completing the modeling and simulation are: Alternative Solution Modeling, Uncertainty Quantification, Risk Assessment, and Risk Mitigation. Focus area three consists of Decision Support. In this area a decision support interface is created that allows for game playing between solution alternatives and risk mitigation, and a multi-attribute decision-making process is implemented to aid in decision making. A demonstration problem inspired by Airbus' mid-1980s decision to break into the widebody long-range market was developed to illustrate the use of this method. The results showed that the method is able to capture more types of risk than previous analysis methods, particularly at the early stages of aircraft design. It was also shown that the method can be used to help create a system that is robust to external environmental factors. The addition of an external environment risk analysis in the early stages of conceptual design can add another dimension to the analysis of feasibility and viability. The ability to take risk into account during the early stages of the design process allows for the elimination of potentially feasible and viable but too-risky alternatives. The use of a scenario-based analysis instead of a traditional probabilistic analysis enables uncertainty to be effectively bounded and examined over a variety of potential futures instead of only a single future. There is also potential for a product to be groomed for a specific future that one believes is likely to happen, or for a product to be steered during design as the future unfolds.
[Survival analysis with competing risks: estimating failure probability].
Llorca, Javier; Delgado-Rodríguez, Miguel
2004-01-01
To show the impact of competing risks of death on survival analysis. We provide an example of survival time without chronic rejection after heart transplantation, where death before rejection acts as a competing risk. Using a computer simulation, we compare the Kaplan-Meier estimator and the multiple decrement model. The Kaplan-Meier method overestimated the probability of rejection. Next, we illustrate the use of the multiple decrement model to analyze secondary end points (in our example: death after rejection). Finally, we discuss Kaplan-Meier assumptions and why they fail in the presence of competing risks. Survival analysis should be adjusted for competing risks of death to avoid overestimation of the risk of rejection produced with the Kaplan-Meier method.
NASA Astrophysics Data System (ADS)
van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein
2016-04-01
It is widely acknowledged that drought management should move from a crisis-based to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are, for example, crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach and the economic valuation of water shortage for various water users were explored in a study for the Dutch Government. First, a historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained in integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data are needed, which are not always readily available.
Furthermore, for some sectors hydrological data was lacking to make a reliable estimate of drought return periods. By 2021, the Netherlands Government aims to agree on the water supply service levels, which should describe water availability and quality that can be delivered with a certain return period. The Netherlands' Ministry of Infrastructure and the Environment, representatives of the regional water boards and Rijkswaterstaat (operating the main water system) as well as several consultants and research institutes are important stakeholders for further development of the method, evaluation of cases and the development of a quantitative risk-informed decision-making tool.
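The stepwise translation from drought events with return periods to a single economic risk metric is commonly summarized as expected annual damage: damage integrated over annual exceedance probability (1/return period). A minimal sketch, with invented damage figures:

```python
def expected_annual_damage(return_periods, damages):
    """Expected annual damage (EAD): trapezoidal integration of damage
    over annual exceedance probability p = 1 / return period."""
    pts = sorted(zip((1.0 / t for t in return_periods), damages))
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical drought damages (million EUR) for 10-, 50- and 100-year events.
ead = expected_annual_damage([10, 50, 100], [5.0, 40.0, 90.0])
print(round(ead, 3))  # ~2.45 million EUR per year
```

A risk-reduction measure can then be valued by the drop in EAD it buys relative to its annualized cost.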
Evaluation of Contamination Inspection and Analysis Methods through Modeling System Performance
NASA Technical Reports Server (NTRS)
Seasly, Elaine; Dever, Jason; Stuban, Steven M. F.
2016-01-01
Contamination is usually identified as a risk on the risk register for sensitive space systems hardware. Despite detailed, time-consuming, and costly contamination control efforts during assembly, integration, and test of space systems, contaminants are still found during visual inspections of hardware. Improved methods are needed to gather information during systems integration to catch potential contamination issues earlier and manage contamination risks better. This research explores evaluation of contamination inspection and analysis methods to determine optical system sensitivity to minimum detectable molecular contamination levels based on IEST-STD-CC1246E non-volatile residue (NVR) cleanliness levels. Potential future degradation of the system is modeled given chosen modules representative of optical elements in an optical system, minimum detectable molecular contamination levels for a chosen inspection and analysis method, and determining the effect of contamination on the system. By modeling system performance based on when molecular contamination is detected during systems integration and at what cleanliness level, the decision maker can perform trades amongst different inspection and analysis methods and determine if a planned method is adequate to meet system requirements and manage contamination risk.
NASA Astrophysics Data System (ADS)
Delice, Yavuz
2015-04-01
Highways, located within cities and between them, are generally prone to many kinds of natural disaster risks. Natural hazards and disasters that may occur from highway project design through construction and operation, and later during maintenance and repair, have to be taken into consideration. Assessing the risks posed by adverse situations is very important for project design, construction, operation, and maintenance and repair costs. Hazard and natural disaster risk analysis depends largely on defining the likelihood of the probable hazards on the highways; however, the assets at risk and the impacts of the events must also be examined and rated in their own right. With the realization of these activities, improvements against natural hazards and disasters will be made using the Failure Mode Effects Analysis (FMEA) method, and their effects will be analyzed in further work. FMEA is a useful method for identifying failure modes and their effects, prioritizing them by failure rate and effect, and finding the most economic and effective solution. Besides guiding measures against the identified risks, this analysis method may also provide public institutions with information about the nature of these risks when required. Thus, the necessary measures will have been taken in advance on urban and intercity highways. Many hazards and natural disasters are taken into account in the risk assessments, the most important of which can be listed as follows:
• Natural disasters: 1. meteorological natural disasters (floods, severe storms, tropical storms, winter storms, avalanches, etc.); 2. geological natural disasters (earthquakes, tsunamis, landslides, subsidence, sinkholes, etc.).
• Human-originated disasters: 1. transport accidents (traffic accidents), accidents originating from road surface defects (icing, signaling malfunctions and related risks), fire or explosion, etc.
In this study, risk analysis of urban and intercity motorways against natural disasters and hazards was performed with the FMEA method, and solutions were proposed against these risks. Keywords: Failure Modes Effects Analysis (FMEA), Pareto Analyses (PA), Highways, Risk Management.
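The Pareto analysis named in the keywords is typically used after FMEA scoring to pick the failure modes that dominate total risk. A sketch with invented highway hazards and invented RPN values (not the study's data):

```python
def pareto_top_risks(rpn_by_mode, share=0.8):
    """Rank failure modes by RPN and return the smallest leading set that
    accounts for `share` of the total risk (the 80/20 Pareto selection)."""
    ranked = sorted(rpn_by_mode.items(), key=lambda kv: -kv[1])
    total = sum(rpn_by_mode.values())
    picked, acc = [], 0.0
    for mode, rpn in ranked:
        picked.append(mode)
        acc += rpn
        if acc >= share * total:
            break
    return picked

# Hypothetical highway hazards with illustrative RPN scores.
rpns = {"flood": 280, "icing": 240, "landslide": 90,
        "signal fault": 60, "subsidence": 30}
top = pareto_top_risks(rpns)
print(top)  # the few modes that carry 80% of the total risk
```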
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
To reinforce the correlation analysis of threat factors in risk assessment, a dynamic safety risk assessment method based on particle filtering is proposed, with threat analysis at its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weights of the threat indicators, and determines information system risk levels by combining them with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: all particles are clustered, and each cluster's centroid serves as its representative in subsequent operations, reducing the amount of computation. Empirical results indicate that the method reasonably captures the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
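The k-means reduction of the particle set can be sketched as follows: cluster the particles, then let each cluster's weighted centroid stand in for its members, carrying their summed weight. The scalar "risk state" particles and weights below are invented, and this is only a sketch of the reduction step, not the authors' full filter:

```python
def kmeans_reduce(particles, weights, k, iters=25):
    """1-D k-means over weighted particles; each cluster collapses to its
    weighted centroid, which carries the summed weight of its members."""
    srt = sorted(particles)
    centroids = [srt[int((i + 0.5) * len(srt) / k)] for i in range(k)]
    clusters = []
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p, w in zip(particles, weights):
            j = min(range(k), key=lambda c: abs(p - centroids[c]))
            clusters[j].append((p, w))
        for j, members in enumerate(clusters):
            tw = sum(w for _, w in members)
            if tw > 0:
                centroids[j] = sum(p * w for p, w in members) / tw
    return centroids, [sum(w for _, w in members) for members in clusters]

# Six weighted particles describing a bimodal risk state, reduced to k = 2
# representatives for the next filtering step.
particles = [0.10, 0.12, 0.90, 0.95, 0.11, 0.88]
weights = [1 / 6] * 6
centroids, cluster_weights = kmeans_reduce(particles, weights, 2)
print([round(c, 2) for c in centroids], [round(w, 2) for w in cluster_weights])
```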
The JPL Cost Risk Analysis Approach that Incorporates Engineering Realism
NASA Technical Reports Server (NTRS)
Harmon, Corey C.; Warfield, Keith R.; Rosenberg, Leigh S.
2006-01-01
This paper discusses the JPL Cost Engineering Group (CEG) cost risk analysis approach that accounts for all three types of cost risk. It will also describe the evaluation of historical cost data upon which this method is based. This investigation is essential in developing a method that is rooted in engineering realism and produces credible, dependable results to aid decision makers.
Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da
2016-12-01
Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
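The PCA-based indicator weighting can be sketched with one common recipe, absolute component loadings weighted by each component's share of explained variance; the paper's exact formula may differ, and the 37-site data below is randomly generated rather than the study's:

```python
import numpy as np

def pca_indicator_weights(X):
    """Indicator weights from PCA on standardized data: sum of absolute
    loadings scaled by eigenvalues, normalized to 1 (one common recipe;
    an assumption, not necessarily the paper's formula)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize indicators
    cov = np.cov(Z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]           # largest variance first
    eigval, eigvec = eigval[order], eigvec[:, order]
    contrib = (np.abs(eigvec) * eigval).sum(axis=1)
    return contrib / contrib.sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(37, 3))   # 37 sites x 3 hypothetical indicators
w = pca_indicator_weights(X)
print(w.round(3), round(float(w.sum()), 6))  # weights sum to 1
```

Class boundaries for each indicator could then come from 1-D k-means on its observed values, as the abstract describes.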
Advanced uncertainty modelling for container port risk analysis.
Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin
2016-08-13
Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance.
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems, and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction.
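The quantitative backbone, a fault tree evaluated before and after a risk-reduction measure combined with a cost-effectiveness ratio, can be sketched as follows. Independent basic events and all probabilities and costs are assumed for illustration:

```python
def or_gate(probs):
    """P(at least one input event occurs), assuming independence."""
    q = 1.0
    for p in probs:
        q *= 1 - p
    return 1 - q

def and_gate(probs):
    """P(all input events occur), assuming independence."""
    q = 1.0
    for p in probs:
        q *= p
    return q

def cost_effectiveness(p_before, p_after, annual_cost):
    """Cost per unit of annual risk reduction (lower is better)."""
    return annual_cost / (p_before - p_after)

# Toy source-to-tap tree: failure if (source contamination AND treatment
# failure) OR a distribution failure. The measure upgrades the treatment step.
p_before = or_gate([and_gate([0.05, 0.20]), 0.01])
p_after = or_gate([and_gate([0.05, 0.05]), 0.01])
cer = cost_effectiveness(p_before, p_after, 120000.0)
print(round(p_before, 6), round(p_after, 6), round(cer))
```

Competing measures can then be ranked by their ratios, subject to the water safety target being met.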
NASA Astrophysics Data System (ADS)
Jing, Wenjun; Zhao, Yan
2018-02-01
Stability is an important part of geotechnical engineering research. Operating experience with underground storage caverns in salt rock around the world shows that cavern stability is the key problem of safe operation. Currently, the combination of theoretical analysis and numerical simulation is the main method adopted for cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, a definition and classification of cavern instability risk are proposed, and the damage mechanism is analyzed from a mechanical standpoint. Then the main evaluating indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of cavern instability risk after 30 years of operation for a proposed storage cavern group in the Huai’an salt mine. This research can provide a useful theoretical basis for the safe operation and management of underground storage caverns in salt rock.
Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya
2016-12-01
To examine the abilities of traditional failure mode and effects analysis (FMEA) and modified healthcare FMEA (m-HFMEA) scoring methods by comparing their degree of congruence in identifying high-risk failures, the authors applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN), the product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed: based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently compared to determine whether the results and rated risks matched. The authors' results showed 85% agreement between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA RPN scores. In the m-HFMEA analysis, risk scores are determined on the basis of the established HFMEA Decision Tree™, or the failure mode is investigated more thoroughly. m-HFMEA is inductive because it requires identifying consequences from causes, and semi-quantitative because it allows the prioritization of high risks and mitigation measures. It is therefore a useful prospective risk analysis tool for radiotherapy.
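The two scoring schemes can be contrasted directly: the RPN multiplies three indices, while the m-HFMEA matrix uses only severity and frequency, so a hard-to-detect failure can rank high under one scheme and low under the other. The scales and category thresholds below are illustrative assumptions, not those of the HFMEA Decision Tree™:

```python
def fmea_rpn(occurrence, severity, detectability):
    """Traditional FMEA risk priority number (each index typically 1-10)."""
    return occurrence * severity * detectability

def m_hfmea_category(severity, frequency):
    """Two-index risk matrix (severity x frequency on illustrative 1-4
    scales) mapped into the four m-HFMEA categories."""
    score = severity * frequency
    if score <= 2:
        return "very low"
    if score <= 6:
        return "low"
    if score <= 9:
        return "high"
    return "very high"

# A rare, moderate failure that is very hard to detect (D = 9): the RPN is
# inflated by detectability, while the matrix ignores it entirely.
rpn = fmea_rpn(2, 3, 9)
cat = m_hfmea_category(3, 2)
print(rpn, cat)  # 54 'low'
```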
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for and applied to an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as has been revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by the systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, current methods are based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to too conservative results, especially if applied for generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphologic, geodetic and seismological investigations or derived from historical references. Scenario-based methods can be expanded for risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better informed risk model that is suitable for risk-informed decision making.
Petromilli Nordi Sasso Garcia, Patrícia; Polli, Gabriela Scatimburgo; Campos, Juliana Alvares Duarte Bonini
2013-01-01
As dentistry is a profession that demands manipulative precision of hand movements, musculoskeletal disorders are among its most common occupational diseases. This study estimated the risk of musculoskeletal disorders developing in dental students using the Ovako Working Analysis System (OWAS) and Rapid Upper Limb Assessment (RULA) methods, and estimated the diagnostic agreement between the two methods. Students (n = 75) enrolled in the final undergraduate year at the Araraquara School of Dentistry--UNESP--were studied. Photographs were taken with a digital camera while the students performed diverse clinical procedures (n = 283) and were assessed using OWAS and RULA. A risk score was attributed following each procedure performed by the student. The prevalence of the risk of musculoskeletal disorders was estimated per point and for a 95% CI. To assess the agreement between the two methods, Kappa statistics with linear weighting were used. The level of significance adopted was 5%. There was a high prevalence of the medium risk score in the dental students evaluated according to the OWAS method (p = 97.88%; 95% CI: 96.20-99.56%), and a high prevalence of the high score (p = 40.6%; 95% CI: 34.9-46.4%) and extremely high risk score (p = 59.4%; 95% CI: 53.6-65.1%) according to the RULA method. Null agreement was verified (k = 0) in the risk diagnosis of the tested methods. The risk of musculoskeletal disorders in dental students estimated by the OWAS method was medium, whereas the same risk by the RULA method was extremely high. There was no diagnostic agreement between the OWAS and RULA methods.
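The linearly weighted kappa used for the agreement analysis can be computed directly. The second call below mirrors the paper's finding: two methods that systematically place every observation in different categories yield k = 0 even though each is internally consistent. The ratings are invented:

```python
def weighted_kappa(r1, r2, categories):
    """Cohen's kappa with linear weights for two raters scoring the same
    items on an ordinal scale 0..categories-1."""
    k, n = categories, len(r1)
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[a][b] += 1.0 / n
    p1 = [sum(row) for row in obs]                                # rater-1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]     # rater-2 marginals
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)

# Perfect agreement -> 1.0; one method always "medium" while the other is
# always "extremely high" -> 0.0, echoing the OWAS/RULA result.
kappa_perfect = weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3)
kappa_null = weighted_kappa([1, 1, 1, 1], [2, 2, 2, 2], 3)
print(kappa_perfect, kappa_null)
```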
78 FR 25440 - Request for Information and Citations on Methods for Cumulative Risk Assessment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-01
... Citations on Methods for Cumulative Risk Assessment AGENCY: Office of the Science Advisor, Environmental... requesting information and citations on approaches and methods for the planning, analysis, assessment, and... approaches to understanding risks to human health and the environment. For example, in Science & Decisions...
The development of a 3D risk analysis method.
I, Yet-Pole; Cheng, Te-Lung
2008-05-01
Much attention has been paid to quantitative risk analysis (QRA) research in recent years because of the increasingly severe disasters in the process industries. Owing to its computational complexity, very few software packages, such as SAFETI, can make the risk presentation meet practical requirements. However, the traditional risk presentation method, like the individual risk contour in SAFETI, is mainly based on the consequence analysis results of dispersion modeling, which usually assumes that the vapor cloud disperses over a constant ground roughness on a flat terrain with no obstructions or concentration fluctuations; this is quite different from the real conditions of a chemical process plant. These models usually over-predict the hazardous regions in order to remain conservative, which also increases the uncertainty of the simulation results. A more rigorous model such as a computational fluid dynamics (CFD) model can overcome these limitations, but it cannot by itself resolve the complexity of risk calculations. In this research, a conceptual three-dimensional (3D) risk calculation method was proposed that combines the results of a series of CFD simulations with post-processing procedures to obtain 3D individual risk iso-surfaces. Such a technique need not be limited to risk analysis at ground level and could be extended to aerial, submarine, or space risk analyses in the near future.
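The final combination step of such a method, weighting each CFD consequence field by its scenario frequency and summing into a 3D individual-risk field, might be sketched as follows. The grid-as-dictionary layout and the numbers are illustrative assumptions, not the paper's implementation.

```python
def individual_risk_field(scenarios):
    """Sum per-scenario CFD fatality probabilities, weighted by annual
    scenario frequency, into a 3D individual-risk field.
    scenarios: list of (annual_frequency, {(x, y, z): p_fatality}) pairs."""
    risk = {}
    for freq, field in scenarios:
        for cell, p_fatality in field.items():
            risk[cell] = risk.get(cell, 0.0) + freq * p_fatality
    return risk

def iso_cells(risk, level):
    """Grid cells at or above a given risk level; the boundary of this set
    approximates an individual-risk iso-surface."""
    return {cell for cell, r in risk.items() if r >= level}
```

Extracting `iso_cells` at a few levels (for example 1e-4, 1e-5, 1e-6 per year) gives the nested iso-surfaces the abstract describes.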
NASA Astrophysics Data System (ADS)
Aprilia, Ayu Rizky; Santoso, Imam; Ekasari, Dhita Murita
2017-05-01
Yogurt is a milk-based product with beneficial effects on health. The yogurt production process is highly susceptible to failure because it involves bacteria and fermentation. For an industry, these risks may cause harm and have a negative impact. For a product to be successful and profitable, the risks that may occur during the production process must be analyzed. Risk analysis can identify risks in detail, help prevent them, and determine how to handle them so that they can be minimized. Therefore, this study analyzes the risks of the production process with a case study in CV.XYZ. The methods used in this research are Fuzzy Failure Mode and Effect Analysis (fuzzy FMEA) and Fault Tree Analysis (FTA). The results show six risks across equipment, raw material, and process variables. These include one critical risk, a lack of aseptic processing, specifically damage to the yogurt starter through contamination by fungus or other bacteria and a lack of equipment sanitation. The quantitative FTA showed that the highest probability is that of the lack of an aseptic process, at 3.902%. The recommendations for improvement include establishing SOPs (Standard Operating Procedures) covering the process, workers, and environment; controlling the yogurt starter; and improving production planning and equipment sanitation using hot-water immersion.
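The quantitative FTA step, propagating basic-event probabilities through OR/AND gates up to a top event, can be sketched as below. The gate structure and probabilities are illustrative, not the study's actual data.

```python
def gate_or(probs):
    """Probability that at least one independent basic event occurs."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def gate_and(probs):
    """Probability that all independent basic events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Illustrative top event "lack of aseptic process" as an OR of two basic
# events: starter contamination and an equipment sanitation lapse.
p_top = gate_or([0.02, 0.02])
```

With these made-up basic-event probabilities the top event comes out near 4%, the same order as the 3.902% reported above.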
Risk assessment in the North Caucasus ski resorts
NASA Astrophysics Data System (ADS)
Komarov, Anton Y.; Seliverstov, Yury G.; Glazovskaya, Tatyana G.; Turchaninova, Alla S.
2016-10-01
Avalanches pose a significant problem in most mountain regions of Russia. The constant growth of economic activity, and therefore of avalanche exposure, in the North Caucasus region creates demand for large-scale avalanche risk assessment methods. Such methods are needed for the determination of appropriate avalanche protection measures as well as for economic assessments. The requirement for natural hazard risk assessments is set by the Federal Law of the Russian Federation (Federal Law 21.12.1994 N 68-FZ, 2016). However, Russian guidelines (SNIP 11-02-96, 2013; SNIP 22-02-2003, 2012) do not clearly specify how avalanche risk assessments should be calculated. We therefore discuss these problems by presenting a new avalanche risk assessment approach, using the example of developing but poorly researched ski resort areas. The suggested method includes formulas to calculate collective and individual avalanche risk. The results of the risk analysis are quantitative values that can be used to determine levels of avalanche risk (appropriate, acceptable and inappropriate) and to suggest measures to decrease the individual risk to an acceptable level or better. The analysis makes it possible to compare quantitative risk values obtained from different regions, analyze them and evaluate the economic feasibility of protection measures.
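The abstract does not reproduce the risk formulas themselves; a common general form for individual and collective avalanche risk, which the suggested method presumably resembles, can be sketched as follows (all parameter values illustrative):

```python
def individual_risk(f_avalanche, hours_exposed_per_year, vulnerability):
    """Annual probability of death for one person: avalanche frequency on the
    path, times the fraction of the year the person is exposed, times the
    probability of death given impact."""
    return f_avalanche * (hours_exposed_per_year / 8760.0) * vulnerability

def collective_risk(paths):
    """Expected fatalities per year summed over avalanche paths.
    paths: list of (frequency, persons_exposed, exposure_fraction, vulnerability)."""
    return sum(f * n * e * v for f, n, e, v in paths)
```

Comparing `individual_risk` against a threshold such as 1e-5 per year is one way the appropriate/acceptable/inappropriate levels mentioned above could be operationalized.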
Li, Pei-Chiun; Ma, Hwong-Wen
2016-01-25
The total quantity of chemical emissions does not take into account their chemical toxicity, and fails to be an accurate indicator of the potential impact on human health. The sources of released contaminants, and therefore, the potential risk, also differ based on geography. Because of the complexity of the risk, there is no integrated method to evaluate the effectiveness of risk reduction. Therefore, this study developed a method to incorporate the spatial variability of emissions into human health risk assessment to evaluate how to effectively reduce risk using risk elasticity analysis. Risk elasticity analysis, the percentage change in risk in response to the percentage change in emissions, was adopted in this study to evaluate the effectiveness and efficiency of risk reduction. The results show that the main industry sectors are different in each area, and that high emission in an area does not correspond to high risk. Decreasing the high emissions of certain sectors in an area does not result in efficient risk reduction in this area. This method can provide more holistic information for risk management, prevent the development of increased risk, and prioritize the risk reduction strategies. Copyright © 2015 Elsevier B.V. All rights reserved.
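Risk elasticity as defined above, the percentage change in risk per percentage change in emissions, can be sketched numerically. The risk function below is an arbitrary stand-in for a full exposure-and-dose model, used only to show the calculation.

```python
def risk_elasticity(risk_fn, emission, delta_frac=0.01):
    """(percent change in risk) / (percent change in emission), evaluated
    numerically around the current emission level."""
    r0 = risk_fn(emission)
    r1 = risk_fn(emission * (1.0 + delta_frac))
    return ((r1 - r0) / r0) / delta_frac
```

An elasticity near zero means cutting that sector's emissions barely moves the area's risk, which is how "high emission in an area does not correspond to high risk" shows up quantitatively.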
Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.
Xin, Cao; Chongshi, Gu
2016-01-01
Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. Describing the stability failure risk jointly by probability and possibility falls short in characterizing the influence of fuzzy factors and in representing the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied to the stability failure risk analysis of gravity dams. The stability of a gravity dam is viewed as a hybrid event reflecting both the fuzziness and the randomness of the failure criterion, design parameters and measured data. A credibility distribution function is introduced as a novel way to represent the uncertainty of the factors influencing gravity dam stability. Combined with Monte Carlo simulation, a corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach to the risk calculation of both the dam foundation and double sliding surfaces is provided. The results show that the present method is feasible for analyzing the stability failure risk of gravity dams. The risk estimate obtained reflects both sorts of uncertainty and is suitable as an index value.
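A minimal sketch of the hybrid simulation idea is below: sample a fuzzy parameter through the inverse of its credibility distribution alongside ordinary random variables, then count failures of a sliding safety factor. The triangular membership, load distributions and failure criterion are illustrative assumptions, not the paper's model.

```python
import random

def tri_credibility_inv(u, a, b, c):
    """Inverse credibility distribution of a triangular fuzzy number (a, b, c)."""
    return a + 2.0 * u * (b - a) if u <= 0.5 else 2.0 * b - c + 2.0 * u * (c - b)

def failure_risk(n=100_000, seed=1):
    """Monte Carlo estimate of P(sliding safety factor < 1)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        f = tri_credibility_inv(rng.random(), 0.55, 0.70, 0.85)  # fuzzy friction coeff.
        w = rng.gauss(100.0, 8.0)   # random vertical load
        p = rng.gauss(60.0, 6.0)    # random horizontal thrust
        if f * w / p < 1.0:         # safety factor below 1 => failure
            failures += 1
    return failures / n
```

The returned fraction plays the role of the failure risk index value; sensitivity to the fuzzy bounds (a, b, c) separates the possibilistic contribution from the probabilistic one.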
NASA Astrophysics Data System (ADS)
Melliana, Armen, Yusrizal, Akmal, Syarifah
2017-11-01
PT Nira Murni Construction is a contractor of PT Chevron Pacific Indonesia engaged in contracting, fabrication, maintenance construction supply, and labor services. The high accident rate in this company is caused by a lack of awareness of workplace safety. An effort is therefore required to reduce the company's accident rate so that financial losses can be minimized. In this study, the Safe T-Score method is used to analyze the accident rate by measuring the frequency level. The analysis continues with risk management methods covering hazard identification, risk measurement, and risk handling. The final analysis uses Job Safety Analysis (JSA) to identify the effects of accidents. From the results of this study, it can be concluded that the JSA method has not been implemented properly. The JSA method therefore needs follow-up in a subsequent study so that it can be properly applied to prevent occupational accidents.
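The Safe T-Score mentioned above is conventionally computed from accident frequency rates per million man-hours; a sketch under that convention, with illustrative figures (a common reading is that |STS| > 2 indicates a significant change):

```python
import math

def frequency_rate(accidents, man_hours):
    """Accidents per million man-hours worked."""
    return accidents * 1_000_000 / man_hours

def safe_t_score(fr_now, fr_past, man_hours_now):
    """Safe T-Score: scaled difference between the current and past
    accident frequency rates."""
    return (fr_now - fr_past) / math.sqrt(fr_past / (man_hours_now / 1_000_000))
```

A score near zero means the current period's accident experience is statistically indistinguishable from the past record.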
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
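The central quantity in decision curve analysis is the net benefit of a model at each threshold probability; a minimal sketch of that calculation follows (this is the standard formula, not the authors' released software):

```python
def net_benefit(y_true, y_prob, threshold):
    """Net benefit at a threshold probability: true positives per patient
    minus false positives per patient, the latter weighted by the odds of
    the threshold."""
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= threshold and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= threshold and y == 0)
    return tp / n - (fp / n) * threshold / (1.0 - threshold)
```

Plotting `net_benefit` over a range of thresholds for the model, for "treat all", and for "treat none" (which has net benefit 0) yields the decision curve.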
Augmenting the Deliberative Method for Ranking Risks.
Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel
2016-01-01
The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.
Baker, Jannah; White, Nicole; Mengersen, Kerrie
2014-11-20
Spatial analysis is increasingly important for identifying modifiable geographic risk factors for disease. However, spatial health data from surveys are often incomplete, ranging from missing data for only a few variables, to missing data for many variables. For spatial analyses of health outcomes, selection of an appropriate imputation method is critical in order to produce the most accurate inferences. We present a cross-validation approach to select between three imputation methods for health survey data with correlated lifestyle covariates, using as a case study, type II diabetes mellitus (DM II) risk across 71 Queensland Local Government Areas (LGAs). We compare the accuracy of mean imputation to imputation using multivariate normal and conditional autoregressive prior distributions. Choice of imputation method depends upon the application and is not necessarily the most complex method. Mean imputation was selected as the most accurate method in this application. Selecting an appropriate imputation method for health survey data, after accounting for spatial correlation and correlation between covariates, allows more complete analysis of geographic risk factors for disease with more confidence in the results to inform public policy decision-making.
Risk-Stratified Imputation in Survival Analysis
Kennedy, Richard E.; Adragni, Kofi P.; Tiwari, Hemant K.; Voeks, Jenifer H.; Brott, Thomas G.; Howard, George
2013-01-01
Background Censoring that is dependent on covariates associated with survival can arise in randomized trials due to changes in recruitment and eligibility criteria to minimize withdrawals, potentially leading to biased treatment effect estimates. Imputation approaches have been proposed to address censoring in survival analysis; and while these approaches may provide unbiased estimates of treatment effects, imputation of a large number of outcomes may over- or underestimate the associated variance based on the imputation pool selected. Purpose We propose an improved method, risk-stratified imputation, as an alternative to address withdrawal related to the risk of events in the context of time-to-event analyses. Methods Our algorithm performs imputation from a pool of replacement subjects with similar values of both treatment and covariate(s) of interest, that is, from a risk-stratified sample. This stratification prior to imputation addresses the requirement of time-to-event analysis that censored observations are representative of all other observations in the risk group with similar exposure variables. We compared our risk-stratified imputation to case deletion and bootstrap imputation in a simulated dataset in which the covariate of interest (study withdrawal) was related to treatment. A motivating example from a recent clinical trial is also presented to demonstrate the utility of our method. Results In our simulations, risk-stratified imputation gives estimates of treatment effect comparable to bootstrap and auxiliary variable imputation while avoiding inaccuracies of the latter two in estimating the associated variance. Similar results were obtained in analysis of clinical trial data. Limitations Risk-stratified imputation has little advantage over other imputation methods when covariates of interest are not related to treatment, although its performance is superior when covariates are related to treatment. 
Risk-stratified imputation is intended for categorical covariates, and may be sensitive to the width of the matching window if continuous covariates are used. Conclusions The use of the risk-stratified imputation should facilitate the analysis of many clinical trials, in which one group has a higher withdrawal rate that is related to treatment. PMID:23818434
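The core of the described algorithm, imputing each withdrawn subject's outcome from a pool of subjects in the same treatment-by-covariate stratum who stayed under observation longer, might be sketched as follows. The record layout and field names are assumptions for illustration, not the authors' implementation.

```python
import random

def risk_stratified_impute(subjects, seed=0):
    """For each withdrawn (censored) subject, impute an outcome drawn from
    subjects in the same treatment-by-covariate stratum who remained under
    observation longer. subjects: list of dicts with keys 'time',
    'event' (1 = event, 0 = censored), 'arm', 'stratum'."""
    rng = random.Random(seed)
    imputed = []
    for s in subjects:
        if s["event"] == 1:
            imputed.append(dict(s))
            continue
        pool = [d for d in subjects
                if d["arm"] == s["arm"] and d["stratum"] == s["stratum"]
                and d["time"] > s["time"]]
        if pool:
            donor = rng.choice(pool)
            imputed.append({**s, "time": donor["time"], "event": donor["event"]})
        else:
            imputed.append(dict(s))  # no eligible donor: leave censored
    return imputed
```

Restricting the donor pool to the same stratum is what keeps censored observations representative of their own risk group, per the requirement quoted above.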
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a C-arm X-ray machine.
NASA Astrophysics Data System (ADS)
Debnath, Ashim Kumar; Chin, Hoong Chor
Navigational safety analysis relying on collision statistics is often hampered by the low number of observations. A promising alternative approach that overcomes this problem is proposed in this paper. By analyzing critical vessel interactions, this approach proactively measures collision risk in port waters. The proposed method is illustrated by the quantitative measurement of collision risks in Singapore port fairways, and validated by examining correlations between the measured risks and those perceived by pilots. This method is an ethically appealing alternative to collision-based analysis for fast, reliable and effective safety assessment, and thus holds great potential for managing collision risks in port waters.
Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this calculation approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
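A toy version of the idea, propagating input uncertainty into a distribution of Pc values instead of a point estimate, can be sketched with a simplified isotropic Pc approximation. Both the input distributions and the approximation are illustrative, not CARA's actual algorithm.

```python
import math
import random

def pc_isotropic(miss, sigma, hbr):
    """Simplified isotropic encounter-plane Pc (valid for hbr << sigma)."""
    return (hbr ** 2 / (2.0 * sigma ** 2)) * math.exp(-miss ** 2 / (2.0 * sigma ** 2))

def pc_distribution(miss, sigma_nominal, hbr_nominal, n=10_000, seed=7):
    """Propagate uncertainty in covariance size and hard-body radius into a
    distribution of Pc values, rather than a single number."""
    rng = random.Random(seed)
    pcs = []
    for _ in range(n):
        sigma = sigma_nominal * math.exp(rng.gauss(0.0, 0.3))   # uncertain covariance scale
        hbr = hbr_nominal * (1.0 + rng.uniform(-0.2, 0.2))      # uncertain hard-body radius
        pcs.append(pc_isotropic(miss, sigma, hbr))
    pcs.sort()
    return pcs[int(0.05 * n)], pcs[n // 2], pcs[int(0.95 * n)]  # 5th/50th/95th percentiles
```

A wide spread between the 5th and 95th percentiles is exactly the situation in which an event's risk posture can be upgraded or downgraded relative to the single-value Pc.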
Vascular Disease, ESRD, and Death: Interpreting Competing Risk Analyses
Coresh, Josef; Segev, Dorry L.; Kucirka, Lauren M.; Tighiouart, Hocine; Sarnak, Mark J.
2012-01-01
Summary Background and objectives Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. Design, setting, participants, & measurements This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989–1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. Results The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20–2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15–2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. Conclusions When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors. PMID:22859747
Vascular disease, ESRD, and death: interpreting competing risk analyses.
Grams, Morgan E; Coresh, Josef; Segev, Dorry L; Kucirka, Lauren M; Tighiouart, Hocine; Sarnak, Mark J
2012-10-01
Vascular disease, a common condition in CKD, is a risk factor for mortality and ESRD. Optimal patient care requires accurate estimation and ordering of these competing risks. This is a prospective cohort study of screened (n=885) and randomized participants (n=837) in the Modification of Diet in Renal Disease study (original study enrollment, 1989-1992), evaluating the association of vascular disease with ESRD and pre-ESRD mortality using standard survival analysis and competing risk regression. The method of analysis resulted in markedly different estimates. Cumulative incidence by standard analysis (censoring at the competing event) implied that, with vascular disease, the 15-year incidence was 66% and 51% for ESRD and pre-ESRD death, respectively. A more accurate representation of absolute risk was estimated with competing risk regression: 15-year incidence was 54% and 29% for ESRD and pre-ESRD death, respectively. For the association of vascular disease with pre-ESRD death, estimates of relative risk by the two methods were similar (standard survival analysis adjusted hazard ratio, 1.63; 95% confidence interval, 1.20-2.20; competing risk regression adjusted subhazard ratio, 1.57; 95% confidence interval, 1.15-2.14). In contrast, the hazard and subhazard ratios differed substantially for other associations, such as GFR and pre-ESRD mortality. When competing events exist, absolute risk is better estimated using competing risk regression, but etiologic associations by this method must be carefully interpreted. The presence of vascular disease in CKD decreases the likelihood of survival to ESRD, independent of age and other risk factors.
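The difference between the two estimators comes down to how the competing event is handled: censoring at it (the standard Kaplan-Meier route) versus accounting for it. A minimal sketch of the competing-risk (Aalen-Johansen type) cumulative incidence estimator, written from the standard definition rather than the study's code:

```python
def cumulative_incidence(times, events, cause=1):
    """Cumulative incidence for one cause in the presence of competing risks.
    events: 0 = censored, 1, 2, ... = cause codes.
    Returns the cumulative incidence at the last observed time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0   # overall event-free survival just before the current time
    cif = 0.0
    i = 0
    while i < len(data):
        t = data[i][0]
        d_cause = d_all = censored = 0
        while i < len(data) and data[i][0] == t:
            ev = data[i][1]
            if ev == 0:
                censored += 1
            else:
                d_all += 1
                if ev == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / n_at_risk   # mass of cause-specific events at t
        surv *= 1.0 - d_all / n_at_risk     # any event removes subjects from risk
        n_at_risk -= d_all + censored
    return cif
```

Because `surv` is depleted by events of every cause, the estimate cannot exceed the event's true absolute risk, whereas 1 minus Kaplan-Meier (censoring at the competing event) can, which is the inflation from 54% to 66% seen above.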
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of the investment plan, providing a detailed examination of the likely and foreseeable impacts of proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of environmental impacts of selected constructions (flood protection structures) using risk analysis methods. The application of the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or for more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process. Through the use of risk analysis methods in the environmental impact assessment process, this objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
Method of assessing a lipid-related health risk based on ion mobility analysis of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-12-14
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Arfi Nabila, Zahra; Azmi, Nora
2018-03-01
Musculoskeletal disorders (MSDs) are an ergonomic risk arising from manual activity, non-neutral posture and repetitive motion. The purpose of this study is to measure this risk and implement ergonomic interventions to reduce the risk of MSDs at the paper pallet assembly work station. Work posture is measured with the Ovako Working Posture Analysis System (OWAS) and the Rapid Entire Body Assessment (REBA) method, while work repetitiveness is measured with the Strain Index (SI) method. Assembly process operators were identified as having the highest risk level, with an OWAS score of 4, a Strain Index of 20.25, and a REBA score of 11. Ergonomic improvements are needed to reduce that level of risk. Proposed improvements were developed using the Quality Function Deployment (QFD) method together with an Axiomatic House of Quality (AHOQ) and a morphological chart. As a result, the risk levels based on the OWAS and REBA scores decreased from 4 and 11 to 1 and 2, respectively. Biomechanical analysis of the operator also shows decreased values for the L4-L5 moment, compression, joint shear, and joint moment strength.
Jahanfar, Ali; Amirmojahedi, Mohsen; Gharabaghi, Bahram; Dubey, Brajesh; McBean, Edward; Kumar, Dinesh
2017-03-01
Rapid population growth of major urban centres in many developing countries has created massive landfills with extraordinary heights and steep side-slopes, which are frequently surrounded by illegal low-income residential settlements developed too close to landfills. These extraordinary landfills are facing high risks of catastrophic failure with potentially large numbers of fatalities. This study presents a novel method for risk assessment of landfill slope failure, using probabilistic analysis of potential failure scenarios and associated fatalities. The conceptual framework of the method includes selecting appropriate statistical distributions for the municipal solid waste (MSW) material shear strength and rheological properties for potential failure scenario analysis. The MSW material properties for a given scenario is then used to analyse the probability of slope failure and the resulting run-out length to calculate the potential risk of fatalities. In comparison with existing methods, which are solely based on the probability of slope failure, this method provides a more accurate estimate of the risk of fatalities associated with a given landfill slope failure. The application of the new risk assessment method is demonstrated with a case study for a landfill located within a heavily populated area of New Delhi, India.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman; Jeffrey C. Joe
2005-09-01
An ongoing issue within human-computer interaction (HCI) is the need for simplified or “discount” methods. The current economic slowdown has necessitated innovative methods that are results driven and cost effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
Failure mode and effects analysis: a comparison of two common risk prioritisation methods.
McElroy, Lisa M; Khorzad, Rebeca; Nannicelli, Anna P; Brown, Alexandra R; Ladner, Daniela P; Holl, Jane L
2016-05-01
Failure mode and effects analysis (FMEA) is a method of risk assessment increasingly used in healthcare over the past decade. The traditional method, however, can require substantial time and training resources. The goal of this study is to compare a simplified scoring method with the traditional scoring method to determine the degree of congruence in identifying high-risk failures. An FMEA of the operating room (OR) to intensive care unit (ICU) handoff was conducted. Failures were scored and ranked using both the traditional risk priority number (RPN) and criticality-based method, and a simplified method, which designates failures as 'high', 'medium' or 'low' risk. The degree of congruence was determined by first identifying those failures determined to be critical by the traditional method (RPN≥300), and then calculating the per cent congruence with those failures designated critical by the simplified methods (high risk). In total, 79 process failures among 37 individual steps in the OR to ICU handoff process were identified. The traditional method yielded Criticality Indices (CIs) ranging from 18 to 72 and RPNs ranging from 80 to 504. The simplified method ranked 11 failures as 'low risk', 30 as medium risk and 22 as high risk. The traditional method yielded 24 failures with an RPN ≥300, of which 22 were identified as high risk by the simplified method (92% agreement). The top 20% of CI (≥60) included 12 failures, of which six were designated as high risk by the simplified method (50% agreement). These results suggest that the simplified method of scoring and ranking failures identified by an FMEA can be a useful tool for healthcare organisations with limited access to FMEA expertise. However, the simplified method does not result in the same degree of discrimination in the ranking of failures offered by the traditional method. Published by the BMJ Publishing Group Limited. 
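The congruence calculation described above, the share of traditionally critical failures (RPN at or above 300) that the simplified method also labels high risk, can be sketched as below; the scores are illustrative, not the study's handoff data.

```python
def rpn(severity, occurrence, detection):
    """Traditional FMEA risk priority number."""
    return severity * occurrence * detection

def congruence(failures, rpn_cutoff=300):
    """Percent of traditionally critical failures (RPN >= cutoff) that the
    simplified method also labels 'high'. failures: list of
    (severity, occurrence, detection, simplified_label) tuples."""
    critical = [f for f in failures if rpn(*f[:3]) >= rpn_cutoff]
    if not critical:
        return 100.0
    agree = sum(1 for f in critical if f[3] == "high")
    return 100.0 * agree / len(critical)
```

Running this over the 79 scored failures with the two cutoffs used above (RPN >= 300 and top 20% of CI) is what produced the 92% and 50% agreement figures.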
Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko
2014-03-01
Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affects the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating a cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
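The bias the authors describe is easy to reproduce with toy numbers. The sketch below (illustrative event times, not the 476-patient SEMS cohort) contrasts the naive 1 − Kaplan-Meier estimate, which censors deaths, with the Aalen-Johansen cumulative incidence, which treats death as a competing event:

```python
def cif(data, cause=1):
    """Aalen-Johansen cumulative incidence of `cause`; other event codes
    (> 0) are treated as competing events, not as censoring."""
    times = sorted({t for t, e in data if e > 0})
    surv, inc, out = 1.0, 0.0, {}
    for t in times:
        n_risk  = sum(1 for ti, _ in data if ti >= t)
        d_cause = sum(1 for ti, e in data if ti == t and e == cause)
        d_any   = sum(1 for ti, e in data if ti == t and e > 0)
        inc  += surv * d_cause / n_risk
        surv *= 1 - d_any / n_risk
        out[t] = inc
    return out

def naive_incidence(data, cause=1):
    """1 - Kaplan-Meier, (wrongly) censoring the competing event."""
    times = sorted({t for t, e in data if e == cause})
    surv, out = 1.0, {}
    for t in times:
        n_risk = sum(1 for ti, _ in data if ti >= t)
        d      = sum(1 for ti, e in data if ti == t and e == cause)
        surv  *= 1 - d / n_risk
        out[t] = 1 - surv
    return out

# 1 = stent dysfunction, 2 = death without dysfunction, 0 = censored
data = [(2, 1), (3, 2), (4, 1), (5, 2), (6, 1), (7, 0)]
```

With these six hypothetical patients the naive estimate at t = 6 is 0.6875 versus a cumulative incidence of 0.50, reproducing in miniature the kind of discrepancy the abstract reports.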
Giri, Veda N.; Coups, Elliot J.; Ruth, Karen; Goplerud, Julia; Raysor, Susan; Kim, Taylor Y.; Bagden, Loretta; Mastalski, Kathleen; Zakrzewski, Debra; Leimkuhler, Suzanne; Watkins-Bruner, Deborah
2009-01-01
Purpose Men with a family history (FH) of prostate cancer (PCA) and African American (AA) men are at higher risk for PCA. Recruitment and retention of these high-risk men into early detection programs has been challenging. We report a comprehensive analysis on recruitment methods, show rates, and participant factors from the Prostate Cancer Risk Assessment Program (PRAP), which is a prospective, longitudinal PCA screening study. Materials and Methods Men 35–69 years are eligible if they have a FH of PCA, are AA, or have a BRCA1/2 mutation. Recruitment methods were analyzed with respect to participant demographics and show to the first PRAP appointment using standard statistical methods. Results Out of 707 men recruited, 64.9% showed to the initial PRAP appointment. More individuals were recruited via radio than from referral or other methods (χ2 = 298.13, p < .0001). Men recruited via radio were more likely to be AA (p<0.001), less educated (p=0.003), not married or partnered (p=0.007), and have no FH of PCA (p<0.001). Men recruited via referrals had higher incomes (p=0.007). Men recruited via referral were more likely to attend their initial PRAP visit than those recruited by radio or other methods (χ2 = 27.08, p < .0001). Conclusions This comprehensive analysis finds that radio leads to higher recruitment of AA men with lower socioeconomic status. However, these are the high-risk men that have lower show rates for PCA screening. Targeted motivational measures need to be studied to improve show rates for PCA risk assessment for these high-risk men. PMID:19758657
[FMEA applied to the radiotherapy patient care process].
Meyrieux, C; Garcia, R; Pourel, N; Mège, A; Bodez, V
2012-10-01
Failure modes and effects analysis (FMEA) is a risk analysis method used at the Radiotherapy Department of Institut Sainte-Catherine as part of a strategy seeking to continuously improve the quality and safety of treatments. The method comprises several steps: definition of the main processes; for each of them, description of every step of prescription, treatment preparation and treatment application; identification of the possible risks, their consequences and their origins; search for existing safety elements which may avoid these risks; and grading of risks to assign a criticality score, resulting in a numerical organisation of the risks. Finally, the impact of proposed corrective actions was estimated by a new grading round. For each process studied, a detailed map of the risks was obtained, facilitating the identification of priority actions to be undertaken. For example, five steps in patient treatment planning showed an unacceptable level of risk, 62 a moderate level of risk and 31 an acceptable level of risk. The FMEA method, used in the industrial domain and applied here to health care, is an effective tool for the management of risks in patient care. However, the time and training requirements necessary to implement this method should not be underestimated. Copyright © 2012 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-11-26
Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
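The quantity underlying a decision curve is the net benefit at a threshold probability pt, computable directly from a data set as the abstract notes. A minimal sketch with invented outcomes and predicted probabilities (the overfit correction, confidence intervals, and censored-data extensions are not shown):

```python
def net_benefit(y_true, y_prob, pt):
    """Net benefit of treating patients whose predicted probability >= pt."""
    n = len(y_true)
    tp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 1)
    fp = sum(1 for y, p in zip(y_true, y_prob) if p >= pt and y == 0)
    # benefit of true positives minus harm of false positives,
    # weighted by the odds of the threshold probability
    return tp / n - fp / n * pt / (1 - pt)

def net_benefit_treat_all(y_true, pt):
    """Reference strategy: treat everyone regardless of the model."""
    prev = sum(y_true) / len(y_true)
    return prev - (1 - prev) * pt / (1 - pt)

# hypothetical outcomes and model predictions
y_true = [1, 1, 0, 0, 1, 0, 0, 0]
y_prob = [0.9, 0.7, 0.6, 0.3, 0.8, 0.2, 0.1, 0.4]
# sweep thresholds to trace the model's decision curve
curve = {pt: net_benefit(y_true, y_prob, pt) for pt in (0.1, 0.3, 0.5)}
```

Plotting `curve` against the treat-all and treat-none (zero net benefit) strategies gives the decision curve itself.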
Haneuse, Sebastien; Lee, Kyu Ha
2016-05-01
Hospital readmission is a key marker of quality of health care. Notwithstanding its widespread use, however, it remains controversial in part because statistical methods used to analyze readmission, primarily logistic regression and related models, may not appropriately account for patients who die before experiencing a readmission event within the time frame of interest. Toward resolving this, we describe and illustrate the semi-competing risks framework, which refers to the general setting where scientific interest lies with some nonterminal event (eg, readmission), the occurrence of which is subject to a terminal event (eg, death). Although several statistical analysis methods have been proposed for semi-competing risks data, we describe in detail the use of illness-death models primarily because of their relation to well-known methods for survival analysis and the availability of software. We also describe and consider in detail several existing approaches that could, in principle, be used to analyze semi-competing risks data, including composite end point and competing risks analyses. Throughout we illustrate the ideas and methods using data on N=49 763 Medicare beneficiaries hospitalized between 2011 and 2013 with a principal discharge diagnosis of heart failure. © 2016 American Heart Association, Inc.
Latent Model Analysis of Substance Use and HIV Risk Behaviors among High-Risk Minority Adults
ERIC Educational Resources Information Center
Wang, Min Qi; Matthew, Resa F.; Chiu, Yu-Wen; Yan, Fang; Bellamy, Nikki D.
2007-01-01
Objectives: This study evaluated substance use and HIV risk profile using a latent model analysis based on ecological theory, inclusive of a risk and protective factor framework, in sexually active minority adults (N=1,056) who participated in a federally funded substance abuse and HIV prevention health initiative from 2002 to 2006. Methods: Data…
Brenn, T; Arnesen, E
1985-01-01
For comparative evaluation, discriminant analysis, logistic regression and Cox's model were used to select risk factors for total and coronary deaths among 6595 men aged 20-49 followed for 9 years. Groups with mortality between 5 and 93 per 1000 were considered. Discriminant analysis selected variable sets only marginally different from the logistic and Cox methods which always selected the same sets. A time-saving option, offered for both the logistic and Cox selection, showed no advantage compared with discriminant analysis. Analysing more than 3800 subjects, the logistic and Cox methods consumed, respectively, 80 and 10 times more computer time than discriminant analysis. When including the same set of variables in non-stepwise analyses, all methods estimated coefficients that in most cases were almost identical. In conclusion, discriminant analysis is advocated for preliminary or stepwise analysis, otherwise Cox's method should be used.
Van der Fels-Klerx, H J; Van Asselt, E D; Raley, M; Poulsen, M; Korsgaard, H; Bredsdorff, L; Nauta, M; D'agostino, M; Coles, D; Marvin, H J P; Frewer, L J
2018-01-22
This study aimed to critically review methods for ranking risks related to food safety and dietary hazards on the basis of their anticipated human health impacts. A literature review was performed to identify and characterize methods for risk ranking from the fields of food, environmental science and socio-economic sciences. The review used a predefined search protocol, and covered the bibliographic databases Scopus, CAB Abstracts, Web of Science, and PubMed over the period 1993-2013. All references deemed relevant, on the basis of predefined evaluation criteria, were included in the review, and each risk ranking method characterized. The methods were then clustered, based on their characteristics, into eleven method categories. These categories included: risk assessment, comparative risk assessment, risk ratio method, scoring method, cost of illness, health adjusted life years (HALY), multi-criteria decision analysis, risk matrix, flow charts/decision trees, stated preference techniques and expert synthesis. Method categories were described by their characteristics, weaknesses and strengths, data resources, and fields of application. It was concluded there is no single best method for risk ranking. The method to be used should be selected on the basis of risk manager/assessor requirements, data availability, and the characteristics of the method. Recommendations for future use and application are provided.
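As a concrete instance of one of the categories above, a weighted scoring method reduces to a few lines; the hazards, criteria, and weights below are invented, not drawn from the review:

```python
# Scoring method sketch: each hazard is scored per criterion on a 1-10
# scale, criterion scores are combined by a weighted sum, and hazards
# are ranked by the resulting total.
criteria_weights = {"severity": 0.5, "exposure": 0.3, "susceptibility": 0.2}
hazards = {
    "aflatoxin":  {"severity": 9, "exposure": 3, "susceptibility": 5},
    "salmonella": {"severity": 6, "exposure": 7, "susceptibility": 6},
    "acrylamide": {"severity": 4, "exposure": 8, "susceptibility": 4},
}
scores = {h: sum(criteria_weights[c] * v for c, v in s.items())
          for h, s in hazards.items()}
ranking = sorted(scores, key=scores.get, reverse=True)  # riskiest first
```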
How to Perform an Ethical Risk Analysis (eRA).
Hansson, Sven Ove
2018-02-26
Ethical analysis is often needed in the preparation of policy decisions on risk. A three-step method is proposed for performing an ethical risk analysis (eRA). In the first step, the people concerned are identified and categorized in terms of the distinct but compatible roles of being risk-exposed, a beneficiary, or a decisionmaker. In the second step, a more detailed classification of roles and role combinations is performed, and ethically problematic role combinations are identified. In the third step, further ethical deliberation takes place, with an emphasis on individual risk-benefit weighing, distributional analysis, rights analysis, and power analysis. Ethical issues pertaining to subsidiary risk roles, such as those of experts and journalists, are also treated in this phase. An eRA should supplement, not replace, a traditional risk analysis that puts emphasis on the probabilities and severities of undesirable events but does not cover ethical issues such as agency, interpersonal relationships, and justice. © 2018 Society for Risk Analysis.
An Empirical Assessment of Defense Contractor Risk 1976-1984.
1986-06-01
A model to evaluate Department of Defense contract pricing, financing, and profit policies. An empirical analysis of the defense contractor risk-return relationship is performed utilizing four methods: mean-variance analysis of rate of return, the Capital Asset Pricing Model, mean-variance analysis of total…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Linking stressors and ecological responses
Gentile, J.H.; Solomon, K.R.; Butcher, J.B.; Harrass, M.; Landis, W.G.; Power, M.; Rattner, B.A.; Warren-Hicks, W.J.; Wenger, R.; Foran, Jeffery A.; Ferenc, Susan A.
1999-01-01
To characterize risk, it is necessary to quantify the linkages and interactions between chemical, physical and biological stressors and endpoints in the conceptual framework for ecological risk assessment (ERA). This can present challenges in a multiple stressor analysis, and it will not always be possible to develop a quantitative stressor-response profile. This review commences with a conceptual representation of the problem of developing a linkage analysis for multiple stressors and responses. The remainder of the review surveys a variety of mathematical and statistical methods (e.g., ranking methods, matrix models, multivariate dose-response for mixtures, indices, visualization, simulation modeling and decision-oriented methods) for accomplishing the linkage analysis for multiple stressors. Describing the relationships between multiple stressors and ecological effects is a critical component of 'effects assessment' in the ecological risk assessment framework.
WE-B-BRC-02: Risk Analysis and Incident Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraass, B.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: (1) Description of risk assessment methodologies used in healthcare and industry; (2) Discussion of radiation oncology-specific risk assessment strategies and issues; (3) Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas are facing many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which China's key source water area, the Danjiangkou Reservoir, the water source of the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have a high risk value in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have a high risk value in terms of agricultural pollution. Overall, the risk values of northern regions close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The results of risk levels indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV) and three at the highest risk level (i.e., level V). Risks of industrial discharge are also higher than those of the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
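The entropy weight step generalizes beyond this case study. A sketch of the computation under the usual convention that rows are pollution sources and columns are benefit-type indicators (illustrative matrix, not the Shiyan data):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: indicators (columns) that vary more across
    sources (rows) carry more information and receive larger weights."""
    m = len(matrix)
    cols = list(zip(*matrix))
    # normalise each indicator column to proportions
    props = [[x / sum(col) for x in col] for col in cols]
    k = 1 / math.log(m)
    entropies = [-k * sum(p * math.log(p) for p in col if p > 0)
                 for col in props]
    divergences = [1 - e for e in entropies]     # information content
    total = sum(divergences)
    return [d / total for d in divergences]

# indicator 1 is identical across sources, indicator 2 discriminates,
# so the uninformative indicator should get (near-)zero weight
w = entropy_weights([[1.0, 2.0], [1.0, 4.0]])
```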
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
Today an overwhelming number of different risk analysis techniques are in use, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points) and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of the failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. The FMEA is used to capture potential failures/risks and their impacts and to prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, and for this reason they often are not. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
Deng, Xinyang; Jiang, Wen
2017-09-12
Failure mode and effect analysis (FMEA) is a useful tool to define, identify, and eliminate potential failures or errors so as to improve the reliability of systems, designs, and products. Risk evaluation is an important issue in FMEA to determine the risk priorities of failure modes. There are some shortcomings in the traditional risk priority number (RPN) approach for risk evaluation in FMEA, and fuzzy risk evaluation has become an important research direction that attracts increasing attention. In this paper, the fuzzy risk evaluation in FMEA is studied from a perspective of multi-sensor information fusion. By considering the non-exclusiveness between the evaluations of fuzzy linguistic variables to failure modes, a novel model called D numbers is used to model the non-exclusive fuzzy evaluations. A D numbers based multi-sensor information fusion method is proposed to establish a new model for fuzzy risk evaluation in FMEA. An illustrative example is provided and examined using the proposed model and other existing method to show the effectiveness of the proposed model.
[Impact of water pollution risk in water transfer project based on fault tree analysis].
Liu, Jian-Chang; Zhang, Wei; Wang, Li-Min; Li, Dai-Qing; Fan, Xiu-Ying; Deng, Hong-Bing
2009-09-15
Methods to assess water pollution risk for a medium-sized water transfer project are gradually being explored. The event-nature-proportion method was developed to evaluate the probability of a single event. Fault tree analysis, built on the single-event calculations, was employed to evaluate the overall water pollution risk for the channel water body. The results indicate that the risk posed to the channel water body by pollutants from towns and villages along the line of the water transfer project is high, with a probability of 0.373, which would increase pollution of the channel water body by 64.53 mg/L COD, 4.57 mg/L NH4(+)-N and 0.066 mg/L volatile phenol, respectively. The measurement of fault probability on the basis of the proportion method proved useful in assessing water pollution risk under substantial uncertainty.
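The fault-tree arithmetic behind such an estimate is simple for independent basic events. Only the 0.373 probability below comes from the abstract; the other basic events and the tree shape are hypothetical:

```python
def and_gate(probs):
    """P(output) of an AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(output) of an OR gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= 1 - q
    return 1 - p

p_town_discharge    = 0.373  # reported probability for towns and villages
p_spill             = 0.05   # hypothetical transport spill
p_treatment_failure = 0.20   # hypothetical treatment plant failure

# top event: channel water polluted by town/village discharge, OR by a
# spill that the treatment step also fails to contain
p_top = or_gate([p_town_discharge,
                 and_gate([p_spill, p_treatment_failure])])
```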
Tutorial: Parallel Computing of Simulation Models for Risk Analysis.
Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D
2016-10-01
Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
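The article's examples are in MATLAB and R; the same embarrassingly parallel pattern can be sketched in Python. The loss model, threshold, and batch sizes here are invented. `ThreadPoolExecutor` is used so the snippet runs anywhere; for CPU-bound simulation, `ProcessPoolExecutor` is the drop-in replacement the tutorial's advice points toward.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def replicate(seed, n=10_000):
    """One independent batch of Monte Carlo replications: estimate the
    probability that a hypothetical two-component loss exceeds a threshold."""
    rng = random.Random(seed)          # per-batch RNG: reproducible, unshared
    hits = sum(1 for _ in range(n)
               if rng.gauss(0, 1) + rng.gauss(0, 1) > 2.5)
    return hits / n

# embarrassingly parallel: batches share no state, so map() distributes
# them across workers and the per-batch estimates are simply averaged
seeds = range(8)
with ThreadPoolExecutor(max_workers=4) as pool:
    estimates = list(pool.map(replicate, seeds))
risk_estimate = sum(estimates) / len(estimates)
```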
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. The traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
A stable systemic risk ranking in China's banking sector: Based on principal component analysis
NASA Astrophysics Data System (ADS)
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings, and apply principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to factor loadings of the first component, PCA combined ranking is mainly based on fundamentals instead of market price data. We clearly find that price-based rankings are not as practical a method as fundamentals-based ones. This PCA combined ranking directly shows systemic risk contributions of each bank for banking supervision purpose and reminds banks to prevent and cope with the financial crisis in advance.
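The PCA-combination step can be sketched generically. The three banks and five risk measures below are invented; the idea, as in the paper, is that first-component loadings turn several rankings into one composite score:

```python
def standardise(X):
    """Z-score each column (risk measure) across rows (banks)."""
    m, n = len(X), len(X[0])
    means = [sum(row[j] for row in X) / m for j in range(n)]
    sds = [(sum((row[j] - means[j]) ** 2 for row in X) / m) ** 0.5
           for j in range(n)]
    return [[(row[j] - means[j]) / sds[j] for j in range(n)] for row in X]

def first_component(Z, iters=200):
    """First principal component loadings: leading eigenvector of Z'Z,
    found by power iteration (pure-Python stand-in for a linalg call)."""
    n = len(Z[0])
    w = [1.0] * n
    for _ in range(iters):
        Zw = [sum(z[j] * w[j] for j in range(n)) for z in Z]
        v = [sum(z[j] * s for z, s in zip(Z, Zw)) for j in range(n)]
        norm = sum(x * x for x in v) ** 0.5
        w = [x / norm for x in v]
    return w if sum(w) > 0 else [-x for x in w]   # fix sign convention

# rows = banks, columns = five hypothetical systemic risk measures
X = [[0.9, 0.8, 0.7, 0.9, 0.6],
     [0.4, 0.5, 0.6, 0.3, 0.5],
     [0.2, 0.1, 0.3, 0.2, 0.1]]
Z = standardise(X)
loadings = first_component(Z)
combined = [sum(z[j] * loadings[j] for j in range(len(loadings))) for z in Z]
ranking = sorted(range(len(combined)), key=lambda i: -combined[i])
```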
Risk based inspection for atmospheric storage tank
NASA Astrophysics Data System (ADS)
Nugroho, Agus; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is an attack that occurs on a metallic material as a result of its reaction with the environment. It causes leakage of atmospheric storage tanks, material loss, environmental pollution and equipment failure, affects the age of process equipment and ultimately causes financial damage. Corrosion risk measurement becomes a vital part of asset management at a plant operating any aging asset. This paper provides six case studies dealing with high-speed diesel atmospheric storage tank parts at a power plant. A summary of the basic principles and procedures of corrosion risk analysis and RBI applicable to the process industries is given prior to the study. A semi-quantitative method based on the API 581 Base Resource Document was employed. The risks associated with corrosion on the equipment, in terms of likelihood and consequences, are discussed. The corrosion risk analysis outcome was used to formulate a Risk Based Inspection (RBI) method that should be a part of atmospheric storage tank operation at the plant. RBI concentrates inspection resources mostly on 'High Risk' and 'Medium Risk' items and less on 'Low Risk' shells. Risk categories of the evaluated equipment are illustrated through the case study outcomes.
2012-01-01
Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e. risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate ratio of thrombosis to bleeding. Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but final results were influenced by the type of surgery, since ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques we demonstrate a method that allows comparing multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
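The beta-distribution Monte Carlo machinery described in the Methods can be sketched compactly. All counts, the acceptability threshold, and the arm labels below are invented placeholders, not the meta-analysis data:

```python
import random

rng = random.Random(42)
N = 20_000

# hypothetical pooled counts: (events, patients) per arm
bleeds     = {"comparator": (30, 1000), "new_drug": (45, 1000)}
thromboses = {"comparator": (80, 1000), "new_drug": (50, 1000)}

def draw(events, n):
    """Posterior draw of an event proportion from Beta(events+1, n-events+1)."""
    return rng.betavariate(events + 1, n - events + 1)

threshold = 1.0   # acceptability: thromboses averted worth one extra bleed
positive = 0
for _ in range(N):
    extra_bleeds = draw(*bleeds["new_drug"]) - draw(*bleeds["comparator"])
    averted = draw(*thromboses["comparator"]) - draw(*thromboses["new_drug"])
    if averted - threshold * extra_bleeds > 0:   # net clinical benefit > 0
        positive += 1
prob_net_benefit = positive / N
```

Sweeping `threshold` over a range and recording `prob_net_benefit` at each value yields the acceptability curve the authors describe.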
Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare
2018-01-01
Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.
Application of multi-criteria decision-making to risk prioritisation in tidal energy developments
NASA Astrophysics Data System (ADS)
Kolios, Athanasios; Read, George; Ioannou, Anastasia
2016-01-01
This paper presents an analytical multi-criteria analysis for the prioritisation of risks in the development of tidal energy projects. After a basic identification of risks throughout the project and of relevant stakeholders in the UK, classified through a political, economic, social, technological, legal and environmental analysis, questionnaires provided scores for each risk and corresponding weights for each of the different sectors. Employing an extended technique for order of preference by similarity to ideal solution (TOPSIS) as well as the weighted sum method on the data obtained, the identified risks are ranked by their criticality, drawing the industry's attention to mitigating those scoring highest. Both methods were modified to take averages at different stages of the analysis in order to observe the effects on the final risk ranking. A sensitivity analysis of the results was also carried out with regard to the weighting factors given to the perceived expertise of participants, with different results obtained depending on whether a linear, squared or square root regression is used. Results of the study show that academics and industry have conflicting opinions with regard to the perception of the most critical risks.
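The TOPSIS scoring used for the criticality ranking can be sketched in miniature. The decision matrix, weights and benefit flags below are illustrative placeholders, not the questionnaire data from the study:

```python
def topsis(matrix, weights, benefit):
    """Minimal TOPSIS sketch: rows are risks, columns are criteria.
    benefit[j] is True if higher scores are better for criterion j."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalise each column, then apply the criterion weights.
    norms = [sum(matrix[i][j] ** 2 for i in range(m)) ** 0.5 for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sum((x - i) ** 2 for x, i in zip(row, ideal)) ** 0.5
        d_neg = sum((x - a) ** 2 for x, a in zip(row, anti)) ** 0.5
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal
    return scores

# Hypothetical scores for three risks over two criteria:
scores = topsis([[7, 4], [5, 8], [3, 2]], weights=[0.6, 0.4],
                benefit=[True, True])
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

Risks with higher closeness scores would be prioritised for mitigation.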
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk of reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based hydrologic forecasts depict the inflows not only through their marginal distributions but also through their persistence via scenarios. This motivates us to analyze reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed that uses the forecast horizon point to divide the future into two stages, the forecast lead time and the unpredicted time. The risk within the forecast lead time is computed by counting the number of failed forecast scenarios, and the risk in the unpredicted time is estimated by reservoir routing with the design floods and the reservoir water levels at the forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risk as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios. China's Three Gorges Reservoir (TGR) is selected as a case study, where parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. Bayesian inference via Markov chain Monte Carlo is used to account for the parameter uncertainty. Two reservoir operation schemes, the actually operated and a scenario-optimization scheme, are evaluated with respect to flood risks and hydropower profits. For the 2010 flood, it is found that improving the hydrologic forecast accuracy does not necessarily decrease the reservoir real-time operation risk, and that most of the risk arises within the forecast lead time. It is therefore valuable for reservoir operational purposes to decrease the variance of ensemble-based hydrologic forecasts while keeping their bias low.
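The lead-time stage of the risk computation, the fraction of ensemble scenarios that exceed a critical value, can be sketched directly. The trajectories and the 165 m critical level below are invented for illustration:

```python
def lead_time_risk(scenarios, critical_level):
    """Stage-1 risk sketch: fraction of ensemble forecast scenarios whose
    peak reservoir water level exceeds the critical value within the lead time."""
    exceed = sum(1 for traj in scenarios if max(traj) > critical_level)
    return exceed / len(scenarios)

# Hypothetical ensemble of water-level trajectories (m) over the lead time:
ensemble = [
    [160.1, 162.3, 164.8],
    [160.1, 161.0, 162.2],
    [160.1, 163.9, 166.5],
    [160.1, 160.8, 161.5],
]
risk = lead_time_risk(ensemble, critical_level=165.0)  # → 0.25
```

The unpredicted-time stage would then route the design floods from each scenario's end-of-horizon water level, which is outside this sketch.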
Sala, Emma; Bonfiglioli, Roberta; Fostinellil, Jacopo; Tomasi, Cesare; Graziosi, Francesca; Violante, Francesco S; Apostoli, Pietro
2014-01-01
Risk assessment for upper-extremity work-related musculoskeletal disorders by applying six ergonomic methods: a ten-year experience. The objective of this research was to verify and validate the multiple-step method suggested by the SIMLII guidelines and to compare the results obtained by use of these methods: Washington State Standard, OCRA, HAL, RULA, OREGE and STRAIN INDEX. 598 workstations, for a total of 1800 analyses by the different methods, were considered, adopting the following multiple-step procedure: preliminary evaluation by the Washington State method and the OCRA checklist in all workstations, RULA or HAL as first-level evaluation, and OREGE or SI as second-level evaluation. The preliminary evaluation was negative (risk absent) in 75% of the examined workstations; an optimal-acceptable condition was found in 58% of analyses using the OCRA checklist, in 92% using HAL, in 100% using RULA, in 64% using OREGE and in 70% using SI. We observed similar evaluations of strain among the methods; the main differences were observed in posture and frequency assessment. The preliminary evaluation by the Washington State method appears to be an adequate instrument for identifying working conditions at risk. All the adopted methods were in good agreement in the two extreme situations, high risk and absent risk, especially in absent-risk conditions. The level of agreement varied on the basis of their rationale and of the role of their different components, so the SIMLII indications about the critical use of biomechanical methods and about the possible use of more than one of them (considering working characteristics) have been confirmed.
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
Risk management of key issues of FPSO
NASA Astrophysics Data System (ADS)
Sun, Liping; Sun, Hai
2012-12-01
Risk analysis of key systems has become a growing topic of late because of the development of offshore structures. Equipment failures of the offloading system and fire accidents were analyzed based on the features of a floating production, storage and offloading (FPSO) unit. Fault tree analysis (FTA) and failure modes and effects analysis (FMEA) methods were examined based on information already researched on modules of Relex Reliability Studio (RRS). Equipment failures were also analyzed qualitatively by establishing a fault tree and a Boolean structure function, given the shortage of failure cases and statistical data, and risk control measures were examined. Failure modes of fire accidents were classified according to the different areas of fire occurrence during the FMEA process, using risk priority number (RPN) methods to evaluate their severity rank. The qualitative FTA gave basic insight into how the failure modes of FPSO offloading arise, and the fire FMEA gave the priorities and suggested processes. The research has practical importance for the security analysis of FPSOs.
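A Boolean structure function of the kind described can be sketched in miniature. The gate layout, event names and probabilities below are hypothetical, not the paper's actual FPSO fault tree:

```python
def offloading_failure(events):
    """Hypothetical Boolean structure function for an offloading-system fault
    tree: top event = pump failure OR (hose leak AND valve stuck)."""
    return events["pump"] or (events["hose_leak"] and events["valve_stuck"])

def top_event_probability(p):
    """Exact top-event probability for the same tree, assuming the basic
    events are statistically independent."""
    and_gate = p["hose_leak"] * p["valve_stuck"]
    return 1 - (1 - p["pump"]) * (1 - and_gate)

# Illustrative basic-event probabilities:
prob = top_event_probability({"pump": 0.01, "hose_leak": 0.05, "valve_stuck": 0.1})
```

The qualitative analysis in the paper works with the structure function itself (minimal cut sets); the probability form becomes usable once failure data are available.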
RAMPART (TM): Risk Assessment Method-Property Analysis and Ranking Tool v.4.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carson, Susan D.; Hunter, Regina L.; Link, Madison D.
RAMPART™, Risk Assessment Method-Property Analysis and Ranking Tool, is a new type of computer software package for the assessment of risk to buildings. RAMPART™ has been developed by Sandia National Laboratories (SNL) for the U.S. General Services Administration (GSA). RAMPART™ has been designed and developed to be a risk-based decision support tool that requires no risk analysis expertise on the part of the user. The RAMPART™ user interface elicits information from the user about the building. The RAMPART™ expert system is a set of rules that embodies GSA corporate knowledge and SNL's risk assessment experience. The RAMPART™ database contains both data entered by the user during a building analysis session and large sets of natural hazard and crime data. RAMPART™ algorithms use these data to assess the risk associated with a given building in the face of certain hazards. Risks arising from five natural hazards (earthquake, hurricane, winter storm, tornado and flood); crime (inside and outside the building); fire and terrorism are calculated. These hazards may cause losses of various kinds. RAMPART™ considers death, injury, loss of mission, loss of property, loss of contents, loss of building use, and first-responder loss. The results of each analysis are presented graphically on the screen and in a written report.
Han, Z Y; Weng, W G
2011-05-15
In this paper, a qualitative and a quantitative risk assessment methods for urban natural gas pipeline network are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline network, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both of the two methods can be applied to practical application, and the choice of the methods depends on the actual basic data of the gas pipelines and the precision requirements of risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)
NASA Technical Reports Server (NTRS)
Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis
2011-01-01
Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. 
That was, however, not to happen. Early in the Apollo program, estimates of the probability for a successful roundtrip human mission to the moon yielded disappointingly low (and suspect) values and NASA became discouraged from further performing quantitative risk analyses until some two decades later when the methods were more refined, rigorous, and repeatable. Instead, NASA decided to rely primarily on the Hazard Analysis (HA) and Failure Modes and Effects Analysis (FMEA) methods for system safety assessment.
NASA Astrophysics Data System (ADS)
Hori, Toshikazu; Mohri, Yoshiyuki; Matsushima, Kenichi; Ariyoshi, Mitsuru
In recent years, the increasing frequency of heavy rainfall events such as unpredictable cloudbursts has made it necessary to improve the safety of the embankments of small earth dams. However, the severe financial condition of the government and local autonomous bodies requires the cost of improving them to be reduced. This study concerns the development of a method for evaluating the life cycle cost of small earth dams considered to pose a risk, in order to improve the safety of the areas downstream of small earth dams at minimal cost. A safety evaluation method based on a combination of runoff analysis, saturated and unsaturated seepage analysis, and slope stability analysis enables the probability of a dam breach, and its life cycle cost with the risk of heavy rainfall taken into account, to be calculated. Moreover, use of the life cycle cost evaluation method will lead to a technique for selecting the optimal improvement method or countermeasures against heavy rainfall.
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimal cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the copula modeling approach eliminates the error in reliability analysis. Furthermore, for the purpose of quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, which generalizes the definitions of probability weight and FRPN and results in a more accurate estimation than that of traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method.
The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural system under failure correlations.
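The core arithmetic of a fuzzy weighted geometric mean RPN can be sketched with triangular fuzzy numbers. The ratings, equal weights and centroid defuzzification below are illustrative assumptions, not the paper's exact formulation:

```python
def fuzzy_wgm_rpn(factors, weights):
    """Sketch of a fuzzy weighted geometric mean RPN: each risk factor is a
    triangular fuzzy number (low, mode, high) and the weighted geometric
    mean is taken component-wise."""
    rpn = [1.0, 1.0, 1.0]
    for (lo, mode, hi), w in zip(factors, weights):
        rpn[0] *= lo ** w
        rpn[1] *= mode ** w
        rpn[2] *= hi ** w
    return tuple(rpn)

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (one common defuzzification)."""
    return sum(tri) / 3.0

# Occurrence, severity, detectability as triangular fuzzy ratings on a 1-10 scale:
rpn = fuzzy_wgm_rpn([(2, 3, 4), (6, 7, 8), (4, 5, 6)],
                    weights=[1 / 3, 1 / 3, 1 / 3])
crisp = defuzzify(rpn)
```

With equal weights of 1/3 this reduces to the cube root of the products, the fuzzy counterpart of the geometric mean of the three classical RPN indices.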
Risk analysis with a fuzzy-logic approach of a complex installation
NASA Astrophysics Data System (ADS)
Peikert, Tim; Garbe, Heyno; Potthast, Stefan
2016-09-01
This paper introduces a procedural method based on fuzzy logic to systematically analyze the risk of an electronic system in an intentional electromagnetic environment (IEME). The method analyzes the susceptibility of a complex electronic installation with respect to intentional electromagnetic interference (IEMI). It combines the advantages of well-known techniques such as fault tree analysis (FTA), electromagnetic topology (EMT) and Bayesian networks (BN) and extends them with an approach to handle uncertainty. This approach uses fuzzy sets, membership functions and fuzzy logic to handle the uncertainty in probability functions and linguistic terms. The linguistic terms add the knowledge of experts on the investigated system or environment to the risk analysis.
Indicators of economic security of the region: a risk-based approach to assessing and rating
NASA Astrophysics Data System (ADS)
Karanina, Elena; Loginov, Dmitri
2017-10-01
The article presents the results of research of theoretical and methodical problems of strategy development for economic security of a particular region, justified by the composition of risk factors. The analysis of those risk factors is performed. The threshold values of indicators of economic security of regions were determined using the methods of socioeconomic statistics. The authors concluded that in modern Russian conditions it is necessary to pay great attention to the analysis of the composition and level of indicators of economic security of the region and, based on the materials of this analysis, to formulate more accurate decisions concerning the strategy of socio-economic development.
Comprehensive risk assessment method of catastrophic accident based on complex network properties
NASA Astrophysics Data System (ADS)
Cui, Zhen; Pang, Jun; Shen, Xiaohong
2017-09-01
On the macro level, the structural properties of the network, together with the electrical characteristics of its micro-level components, determine the risk of cascading failures. Because cascading failure is a dynamically developing process, both the direct risk and the potential risk must be considered. In this paper, the direct and potential risks of failures are considered comprehensively on the basis of uncertain risk analysis theory and connection number theory; the uncertain correlation is quantified by node degree and node clustering coefficient, and a comprehensive risk indicator of failure is established. The proposed method is validated by simulation on an actual power grid: a network is modeled according to the grid, and the rationality of the method is verified.
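The two structural quantities used here, node degree and local clustering coefficient, can be computed directly from an adjacency map. The tiny example grid and the blended risk weight below are illustrative, not taken from the simulated power grid:

```python
def degree_and_clustering(adj):
    """Node degree and local clustering coefficient from an adjacency dict
    mapping each node to the set of its neighbours (undirected graph)."""
    out = {}
    for node, nbrs in adj.items():
        k = len(nbrs)
        # Count edges among this node's neighbours (each pair once).
        links = sum(1 for i in nbrs for j in nbrs if i < j and j in adj[i])
        cc = 2 * links / (k * (k - 1)) if k > 1 else 0.0
        out[node] = (k, cc)
    return out

def node_risk_weight(k, cc, alpha=0.5):
    """Illustrative blend of degree and clustering into a structural risk weight."""
    return alpha * k + (1 - alpha) * cc

# Hypothetical four-bus grid:
grid = {"a": {"b", "c"}, "b": {"a", "c", "d"}, "c": {"a", "b"}, "d": {"b"}}
metrics = degree_and_clustering(grid)
```

A comprehensive indicator would then combine such structural weights with the direct and potential risk terms described in the abstract.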
Sainsbury, A W; Yu-Mei, R; Ågren, E; Vaughan-Higgins, R J; Mcgill, I S; Molenaar, F; Peniche, G; Foster, J
2017-10-01
There are risks from disease in undertaking wild animal reintroduction programmes. Methods of disease risk analysis have been advocated to assess and mitigate these risks, and post-release health and disease surveillance can be used to assess the effectiveness of the disease risk analysis, but results for a reintroduction programme have not to date been recorded. We carried out a disease risk analysis for the reintroduction of pool frogs (Pelophylax lessonae) to England, using information gained from the literature and from diagnostic testing of Swedish pool frogs and native amphibians. Ranavirus and Batrachochytrium dendrobatidis were considered high-risk disease threats for pool frogs at the destination site. Quarantine was used to manage risks from disease due to these two agents at the reintroduction site: the quarantine barrier surrounded the reintroduced pool frogs. Post-release health surveillance was carried out through regular health examinations of amphibians in the field at the reintroduction site and collection and examination of dead amphibians. No significant health or disease problems were detected, but the detection rate of dead amphibians was very low. Methods to detect a higher proportion of dead reintroduced animals and closely related species are required to better assess the effects of reintroduction on health and disease. © 2016 Blackwell Verlag GmbH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, built under an integrated evaluation model framework, is named the LOCA Toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Parkinson, Craig; Foley, Kieran; Whybra, Philip; Hills, Robert; Roberts, Ashley; Marshall, Chris; Staffurth, John; Spezi, Emiliano
2018-04-11
Prognosis in oesophageal cancer (OC) is poor. The 5-year overall survival (OS) rate is approximately 15%. Personalised medicine is hoped to increase the 5- and 10-year OS rates. Quantitative analysis of PET is gaining substantial interest in prognostic research but requires the accurate definition of the metabolic tumour volume. This study compares prognostic models developed in the same patient cohort using individual PET segmentation algorithms and assesses the impact on patient risk stratification. Consecutive patients (n = 427) with biopsy-proven OC were included in final analysis. All patients were staged with PET/CT between September 2010 and July 2016. Nine automatic PET segmentation methods were studied. All tumour contours were subjectively analysed for accuracy, and segmentation methods with < 90% accuracy were excluded. Standardised image features were calculated, and a series of prognostic models were developed using identical clinical data. The proportion of patients changing risk classification group were calculated. Out of nine PET segmentation methods studied, clustering means (KM2), general clustering means (GCM3), adaptive thresholding (AT) and watershed thresholding (WT) methods were included for analysis. Known clinical prognostic factors (age, treatment and staging) were significant in all of the developed prognostic models. AT and KM2 segmentation methods developed identical prognostic models. Patient risk stratification was dependent on the segmentation method used to develop the prognostic model with up to 73 patients (17.1%) changing risk stratification group. Prognostic models incorporating quantitative image features are dependent on the method used to delineate the primary tumour. This has a subsequent effect on risk stratification, with patients changing groups depending on the image segmentation method used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salguero, Laura Marie; Huff, Johnathon; Matta, Anthony R.
Sandia National Laboratories is an organization with a wide range of research and development activities that include nuclear, explosives, and chemical hazards. In addition, Sandia has over 2000 labs and over 40 major test facilities, such as the Thermal Test Complex, the Lightning Test Facility, and the Rocket Sled Track. In order to support safe operations, Sandia has a diverse Environment, Safety, and Health (ES&H) organization that provides expertise to support engineers and scientists in performing work safely. With such a diverse organization to support, the ES&H program continuously seeks opportunities to improve the services provided for Sandia by using various methods as part of their risk management strategy. One of the methods being investigated is using enterprise architecture analysis to mitigate risk inducing characteristics such as normalization of deviance, organizational drift, and problems in information flow. This paper is a case study for how a Department of Defense Architecture Framework (DoDAF) model of the ES&H enterprise, including information technology applications, can be analyzed to understand the level of risk associated with the risk inducing characteristics discussed above. While the analysis is not complete, we provide proposed analysis methods that will be used for future research as the project progresses.
Hansson, Sven Ove; Aven, Terje
2014-07-01
This article discusses to what extent risk analysis is scientific in view of a set of commonly used definitions and criteria. We consider scientific knowledge to be characterized by its subject matter, its success in developing the best available knowledge in its fields of study, and the epistemic norms and values that guide scientific investigations. We proceed to assess the field of risk analysis according to these criteria. For this purpose, we use a model for risk analysis in which science is used as a base for decision making on risks, which covers the five elements evidence, knowledge base, broad risk evaluation, managerial review and judgment, and the decision; and that relates these elements to the domains experts and decisionmakers, and to the domains fact-based or value-based. We conclude that risk analysis is a scientific field of study, when understood as consisting primarily of (i) knowledge about risk-related phenomena, processes, events, etc., and (ii) concepts, theories, frameworks, approaches, principles, methods and models to understand, assess, characterize, communicate, and manage risk, in general and for specific applications (the instrumental part). © 2014 Society for Risk Analysis.
Ecological Risk Assessment with MCDM of Some Invasive Alien Plants in China
NASA Astrophysics Data System (ADS)
Xie, Guowen; Chen, Weiguang; Lin, Meizhen; Zheng, Yanling; Guo, Peiguo; Zheng, Yisheng
Alien plant invasion is an urgent global issue that threatens the sustainable development of ecosystem health. The study of its ecological risk assessment (ERA) can help us prevent and reduce invasion risk more effectively. Based on the theory of ERA and the analytic hierarchy process (AHP) of multi-criteria decision-making (MCDM), and through analyses of the characteristics and processes of alien plant invasion, this paper discusses methodologies for the ERA of alien plant invasion. The assessment procedure consists of risk source analysis, receptor analysis, exposure and hazard assessment, integral assessment, and countermeasures for risk management. The indicator system for risk source assessment, as well as the indices and formulas applied to measure ecological loss and risk, were established, and a method for comprehensively assessing the ecological risk of alien plant invasion was worked out. The ecological risk analysis of 9 representative invasive alien plants in China shows that the ecological risk of Erigeron annuus, Ageratum conyzoides, Alternanthera philoxeroides and Mikania micrantha is high (grades 1-2), that of Oxalis corymbosa and Wedelia chinensis comes next (grade 3), while that of Mirabilis jalapa, Pilea microphylla and Calendula officinalis is lowest (grade 4). Risk strategies are put forward on this basis.
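The AHP weighting step underlying such an indicator system can be sketched with the row geometric-mean approximation of the priority vector. The pairwise comparison values and criterion names below are illustrative assumptions, not the paper's actual judgments:

```python
import math

def ahp_weights(pairwise):
    """AHP priority weights via the row geometric-mean approximation of the
    principal eigenvector of a pairwise comparison matrix."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1 / n) for row in pairwise]  # Python 3.8+
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison (risk source vs. exposure vs. hazard),
# using Saaty's 1-9 scale with reciprocals below the diagonal:
weights = ahp_weights([[1, 3, 5],
                       [1 / 3, 1, 2],
                       [1 / 5, 1 / 2, 1]])
```

The resulting weights would multiply the corresponding indicator scores in the integral assessment.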
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lah, J; Manger, R; Kim, G
Purpose: To examine the ability of traditional failure mode and effects analysis (FMEA) and a light version of Healthcare FMEA (HFMEA), called scenario analysis of FMEA (SAFER), by comparing their outputs in terms of the risks identified and their severity rankings. Methods: We applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For traditional FMEA, decisions on how to improve an operation are based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity and detectability. The SAFER approach utilized two indices, frequency and severity, which were defined by a multidisciplinary team. A criticality matrix was divided into four categories: very low, low, high and very high. For high-risk events, an additional evaluation was performed. Based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. Results: The two methods were independently compared to determine whether the results and rated risks matched. Our results showed an agreement of 67% between the FMEA and SAFER approaches for the 15 riskiest SIG-specific failure modes. The main differences between the two approaches were the distribution of the values and the fact that failure modes (No. 52, 54, 154) with high SAFER scores do not necessarily have high FMEA RPN scores. In our results, there were additional risks identified by both methods with little correspondence. In SAFER, when the risk score is determined, the basis of the established decision tree or the failure mode should be investigated further. Conclusion: The FMEA method takes into account the probability that an error passes without being detected.
SAFER is inductive because it requires the identification of consequences from causes, and semi-quantitative since it allows the prioritization of risks and mitigation measures, and is thus well suited to the clinical parts of radiotherapy.
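The two scoring schemes compared above can be sketched side by side. The SAFER cut-off values in the criticality matrix below are assumptions for illustration, not the study's actual matrix:

```python
def fmea_rpn(occurrence, severity, detectability):
    """Traditional FMEA score: product of three 1-10 indices."""
    return occurrence * severity * detectability

def safer_category(frequency, severity):
    """Illustrative 2-D criticality matrix on 1-4 indices; the cut-offs
    here are assumptions, not the study's actual matrix."""
    score = frequency * severity
    if score <= 2:
        return "very low"
    if score <= 6:
        return "low"
    if score <= 11:
        return "high"
    return "very high"

rpn = fmea_rpn(occurrence=4, severity=8, detectability=3)  # → 96
category = safer_category(frequency=3, severity=4)         # → "very high"
```

The contrast is visible even in this sketch: RPN folds detectability into a single product, while the SAFER category comes from a two-index matrix, so the same failure mode can rank differently under the two schemes.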
49 CFR Appendix D to Part 172 - Rail Risk Analysis Factors
Code of Federal Regulations, 2012 CFR
2012-10-01
... nature of the rail system, each carrier must select and document the analysis method/model used and identify the routes to be analyzed. D. The safety and security risk analysis must consider current data and... curvature; 7. Presence or absence of signals and train control systems along the route (“dark” versus...
Use of labour induction and risk of cesarean delivery: a systematic review and meta-analysis
Mishanina, Ekaterina; Rogozinska, Ewelina; Thatthi, Tej; Uddin-Khan, Rehan; Khan, Khalid S.; Meads, Catherine
2014-01-01
Background: Induction of labour is common, and cesarean delivery is regarded as its major complication. We conducted a systematic review and meta-analysis to investigate whether the risk of cesarean delivery is higher or lower following labour induction compared with expectant management. Methods: We searched 6 electronic databases for relevant articles published through April 2012 to identify randomized controlled trials (RCTs) in which labour induction was compared with placebo or expectant management among women with a viable singleton pregnancy. We assessed risk of bias and obtained data on rates of cesarean delivery. We used regression analysis techniques to explore the effect of patient characteristics, induction methods and study quality on risk of cesarean delivery. Results: We identified 157 eligible RCTs (n = 31 085). Overall, the risk of cesarean delivery was 12% lower with labour induction than with expectant management (pooled relative risk [RR] 0.88, 95% confidence interval [CI] 0.84–0.93; I2 = 0%). The effect was significant in term and post-term gestations but not in preterm gestations. Meta-regression analysis showed that initial cervical score, indication for induction and method of induction did not alter the main result. There was a reduced risk of fetal death (RR 0.50, 95% CI 0.25–0.99; I2 = 0%) and admission to a neonatal intensive care unit (RR 0.86, 95% CI 0.79–0.94), and no impact on maternal death (RR 1.00, 95% CI 0.10–9.57; I2 = 0%) with labour induction. Interpretation: The risk of cesarean delivery was lower among women whose labour was induced than among those managed expectantly in term and post-term gestations. There were benefits for the fetus and no increased risk of maternal death. PMID:24778358
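The pooled relative risk reported above comes from standard meta-analytic machinery. The sketch below shows fixed-effect inverse-variance pooling of log relative risks, with invented study inputs; it is illustrative, not a reconstruction of the review's dataset or its exact model.

```python
# Hedged sketch of fixed-effect inverse-variance pooling of relative risks.
# Study inputs are invented for illustration only.
import math

def pooled_rr(studies):
    """studies: list of (rr, ci_low, ci_high) tuples per study.

    The standard error of log(RR) is recovered from the 95% CI width:
    se = (ln(hi) - ln(lo)) / (2 * 1.96).
    """
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2              # inverse-variance weight
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pooled),
            math.exp(log_rr + 1.96 * se_pooled))

rr, lo, hi = pooled_rr([(0.85, 0.70, 1.03), (0.90, 0.80, 1.01), (0.88, 0.75, 1.03)])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

Pooling can yield a statistically significant overall effect even when individual CIs cross 1, which is the point of combining 157 trials.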
The Use of Object-Oriented Analysis Methods in Surety Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.
1999-05-01
Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.
Byers, Helen; Wallis, Yvonne; van Veen, Elke M; Lalloo, Fiona; Reay, Kim; Smith, Philip; Wallace, Andrew J; Bowers, Naomi; Newman, William G; Evans, D Gareth
2016-11-01
The sensitivity of testing BRCA1 and BRCA2 remains unresolved as the frequency of deep intronic splicing variants has not been defined in high-risk familial breast/ovarian cancer families. This variant category is reported at significant frequency in other tumour predisposition genes, including NF1 and MSH2. We carried out comprehensive whole gene RNA analysis on 45 high-risk breast/ovary and male breast cancer families with no identified pathogenic variant on exonic sequencing and copy number analysis of BRCA1/2. In addition, we undertook variant screening of a 10-gene high/moderate risk breast/ovarian cancer panel by next-generation sequencing. DNA testing identified the causative variant in 50/56 (89%) breast/ovarian/male breast cancer families with Manchester scores of ≥50 with two variants being confirmed to affect splicing on RNA analysis. RNA sequencing of BRCA1/BRCA2 on 45 individuals from high-risk families identified no deep intronic variants and did not suggest loss of RNA expression as a cause of lost sensitivity. Panel testing in 42 samples identified a known RAD51D variant, a high-risk ATM variant in another breast ovary family and a truncating CHEK2 mutation. Current exonic sequencing and copy number analysis variant detection methods of BRCA1/2 have high sensitivity in high-risk breast/ovarian cancer families. Sequence analysis of RNA does not identify any variants undetected by current analysis of BRCA1/2. However, RNA analysis clarified the pathogenicity of variants of unknown significance detected by current methods. The low diagnostic uplift achieved through sequence analysis of the other known breast/ovarian cancer susceptibility genes indicates that further high-risk genes remain to be identified.
NASA Astrophysics Data System (ADS)
Jinguuji, Motoharu; Toprak, Selcuk
2017-12-01
The Hinode area of Itako City in Ibaraki Prefecture, Japan, suffered some of the most severe liquefaction damage of any areas in the Great Eastern Japan Earthquake in 2011. This liquefaction damage has been investigated by Itako City, as well as by universities and research institutes in Japan. The National Institute of Advanced Industrial Science and Technology (AIST) has carried out numerous investigations along the Tone River, and in particular, intensive surveys were done in the Hinode area. We have conducted a risk analysis based on the thickness and depth of the liquefaction layer measured using cone penetration testing (CPT) data and electric resistivity data obtained in the Hinode area. The distribution of the risk estimated from CPT at 143 points, and that obtained from analysis of the resistivity survey data, agreed with the distribution of actual damage. We also carried out conventional risk analysis methods using the liquefaction resistance factor (FL) and liquefaction potential index (PL) methods with CPT data. The results show high PL values over the entire area, but their distribution did not agree well with actual damage in some parts of the study area. Because the analysis of the thickness and depth of the liquefaction layer, using geophysical prospecting methods, can cover a widespread area, this method will be very useful in investigating liquefaction risk, especially for gas and water pipelines.
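The conventional FL/PL analysis mentioned above can be sketched numerically. This is a minimal illustration of an Iwasaki-type liquefaction potential index, assuming the standard depth weighting w(z) = 10 - 0.5z over the top 20 m; the FL profile is invented and the rectangle-rule integration is a simplification.

```python
# Hedged sketch of an Iwasaki-type liquefaction potential index:
#   PL = integral over 0-20 m of F(z) * w(z) dz,
# with F(z) = 1 - FL(z) where FL < 1 (else 0) and w(z) = 10 - 0.5 z.
# The FL profile below is invented for illustration.

def pl_index(fl_profile, dz=1.0):
    """fl_profile: FL values sampled at depths 0, dz, 2*dz, ... (to 20 m)."""
    pl = 0.0
    for i, fl in enumerate(fl_profile):
        z = i * dz
        if z > 20.0:
            break
        f = max(0.0, 1.0 - fl)       # severity of liquefaction at depth z
        w = 10.0 - 0.5 * z           # shallow layers weighted more heavily
        pl += f * w * dz             # rectangle-rule integration
    return pl

# Hypothetical profile: FL < 1 in the upper 5 m (liquefiable), safe below.
profile = [0.6, 0.7, 0.8, 0.9, 0.95] + [1.2] * 16
print(round(pl_index(profile), 2))
```

A thin, shallow liquefiable layer can thus dominate PL, which is consistent with the paper's emphasis on the thickness and depth of the liquefaction layer.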
Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M
1999-07-30
The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.
Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis
ERIC Educational Resources Information Center
Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John
2012-01-01
Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.
[Reliability theory based on quality risk network analysis for Chinese medicine injection].
Li, Zheng; Kang, Li-Yuan; Fan, Xiao-Hui
2014-08-01
A new risk analysis method based upon reliability theory was introduced in this paper for the quality risk management of Chinese medicine injection manufacturing plants. Risk events, including both cause and effect events, were derived in the framework as nodes with a Bayesian network analysis approach, thus transforming the risk analysis results from failure mode and effect analysis (FMEA) into a Bayesian network platform. With its structure and parameters determined, the network can be used to evaluate the system reliability quantitatively with probabilistic analytical approaches. Using network analysis tools such as GeNie and AgenaRisk, we are able to find the nodes that are most critical to system reliability. The importance of each node to the system can be quantitatively evaluated by calculating the effect of the node on the overall risk, and a mitigation plan can be determined accordingly to reduce its influence and improve the system reliability. Using the Shengmai injection manufacturing plant of SZYY Ltd as a use case, we analyzed the quality risk with both static FMEA analysis and dynamic Bayesian network analysis. The potential risk factors for the quality of Shengmai injection manufacturing were identified with the network analysis platform. Quality assurance actions were further defined to reduce the risk and improve the product quality.
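The FMEA-to-Bayesian-network idea can be illustrated with a toy model. The structure, probabilities, and node names below are invented; real analyses in GeNie or AgenaRisk would use elicited conditional probability tables over many more nodes.

```python
# Hedged sketch: two FMEA cause events feed one effect node ("batch quality
# failure") via a conditional probability table; system failure probability
# is computed by exact enumeration. All numbers are invented.
from itertools import product

p_cause = {"sterilization": 0.02, "filling": 0.05}   # P(cause occurs)
# P(failure | sterilization lapse?, filling error?)
cpt = {(True, True): 0.95, (True, False): 0.60,
       (False, True): 0.40, (False, False): 0.01}

def p_failure():
    total = 0.0
    for s, f in product([True, False], repeat=2):
        ps = p_cause["sterilization"] if s else 1 - p_cause["sterilization"]
        pf = p_cause["filling"] if f else 1 - p_cause["filling"]
        total += ps * pf * cpt[(s, f)]
    return total

def node_importance(name):
    """Crude criticality measure: rise in failure probability when one
    cause is forced to occur."""
    saved = p_cause[name]
    p_cause[name] = 1.0
    forced = p_failure()
    p_cause[name] = saved
    return forced - p_failure()

print(round(p_failure(), 5))
print(round(node_importance("sterilization"), 5))
print(round(node_importance("filling"), 5))
```

Ranking nodes by such an importance measure is what lets the analysis target mitigation at the nodes most critical to system reliability.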
Sensitivity Analysis of Launch Vehicle Debris Risk Model
NASA Technical Reports Server (NTRS)
Gee, Ken; Lawrence, Scott L.
2010-01-01
As part of an analysis of the loss of crew risk associated with an ascent abort system for a manned launch vehicle, a model was developed to predict the impact risk of the debris resulting from an explosion of the launch vehicle on the crew module. The model consisted of a debris catalog describing the number, size and imparted velocity of each piece of debris, a method to compute the trajectories of the debris and a method to calculate the impact risk given the abort trajectory of the crew module. The model provided a point estimate of the strike probability as a function of the debris catalog, the time of abort and the delay time between the abort and destruction of the launch vehicle. A study was conducted to determine the sensitivity of the strike probability to the various model input parameters and to develop a response surface model for use in the sensitivity analysis of the overall ascent abort risk model. The results of the sensitivity analysis and the response surface model are presented in this paper.
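The point estimate of strike probability described above can be sketched as a Monte Carlo calculation. Everything below (the miss-distance model, its 300 m scale, the 5 m crew-module radius, the piece and trial counts) is a toy assumption standing in for the paper's physics-based debris trajectory model.

```python
# Hedged Monte Carlo sketch of a debris strike probability point estimate.
# The miss-distance model and all parameters are invented stand-ins.
import math
import random

random.seed(0)

def strike_probability(n_pieces=100, n_trials=4000, crew_radius=5.0, scale=300.0):
    """Estimate P(at least one debris piece passes within crew_radius).

    Each piece's lateral miss distance is Rayleigh-distributed (built from
    two normal components), a toy stand-in for a trajectory simulation.
    """
    hits = 0
    for _ in range(n_trials):
        for _ in range(n_pieces):
            dx, dy = random.gauss(0.0, scale), random.gauss(0.0, scale)
            if math.hypot(dx, dy) < crew_radius:
                hits += 1
                break          # one strike is enough for this trial
    return hits / n_trials

print(strike_probability())
```

Sweeping such an estimator over abort time and destruct-delay inputs is what produces the data points to which a response surface model can then be fitted.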
Safety analysis, risk assessment, and risk acceptance criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamali, K.; Stack, D.W.; Sullivan, L.H.
1997-08-01
This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities.
Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe'. Use of RACs requires quantitative estimates of consequence frequency and magnitude.
A theoretical treatment of technical risk in modern propulsion system design
NASA Astrophysics Data System (ADS)
Roth, Bryce Alexander
2000-09-01
A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for the development of formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method that enables analysis of risk via a consistent, comprehensive treatment of aerothermodynamic and mass properties aspects of vehicle design. The key elements enabling the creation of this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. It is shown that each possesses unique properties making them useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated on the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunity for design improvement is. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. 
This technique is then applied as a means to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.
Protecting Privacy of Shared Epidemiologic Data without Compromising Analysis Potential
Cologne, John; Grant, Eric J.; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Objective. Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. Methods. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Results. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. Conclusions. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs. PMID:22505949
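The rounding-based masking recommended above can be sketched directly: rounding to k significant digits removes relative accuracy (hindering re-identification by exact matching) while preserving analytic utility. The example value and choices of k below are illustrative.

```python
# Hedged sketch of masking by rounding to k significant digits of
# relative accuracy. Values and digit counts are illustrative.
import math

def mask(value: float, sig_digits: int) -> float:
    """Round to a fixed number of significant digits."""
    if value == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(value)))
    factor = 10.0 ** (exponent - sig_digits + 1)
    return round(value / factor) * factor

dose = 1.23456789          # e.g. an individual dose estimate (hypothetical)
print(mask(dose, 2))       # heavy masking: only two significant digits survive
print(mask(dose, 6))       # modest rounding: the value barely changes
```

The contrast between the two calls mirrors the paper's finding: modest rounding does little for security, while removing several digits of relative accuracy meaningfully reduces identification risk.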
An Extreme-Value Approach to Anomaly Vulnerability Identification
NASA Technical Reports Server (NTRS)
Everett, Chris; Maggio, Gaspare; Groen, Frank
2010-01-01
The objective of this paper is to present a method for importance analysis in parametric probabilistic modeling where the result of interest is the identification of potential engineering vulnerabilities associated with postulated anomalies in system behavior. In the context of Accident Precursor Analysis (APA), under which this method has been developed, these vulnerabilities, designated as anomaly vulnerabilities, are conditions that produce high risk in the presence of anomalous system behavior. The method defines a parameter-specific Parameter Vulnerability Importance measure (PVI), which identifies anomaly risk-model parameter values that indicate the potential presence of anomaly vulnerabilities, and allows them to be prioritized for further investigation. This entails analyzing each uncertain risk-model parameter over its credible range of values to determine where it produces the maximum risk. A parameter that produces high system risk for a particular range of values suggests that the system is vulnerable to the modeled anomalous conditions, if indeed the true parameter value lies in that range. Thus, PVI analysis provides a means of identifying and prioritizing anomaly-related engineering issues that at the very least warrant improved understanding to reduce uncertainty, such that true vulnerabilities may be identified and proper corrective actions taken.
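A minimal sketch of the PVI idea, assuming a toy risk model: sweep one uncertain parameter across its credible range with the others held at baseline, and record where the modeled risk peaks. The parameter names and the risk function are invented for illustration.

```python
# Hedged sketch of a Parameter Vulnerability Importance (PVI) scan.
# The risk model and credible ranges are invented stand-ins.

def pvi(risk_fn, name, credible_range, baseline, steps=100):
    """Return (max_risk, argmax value) for one parameter, others at baseline."""
    lo, hi = credible_range
    best_risk, best_val = -1.0, None
    for i in range(steps + 1):
        v = lo + (hi - lo) * i / steps
        params = dict(baseline, **{name: v})
        r = risk_fn(params)
        if r > best_risk:
            best_risk, best_val = r, v
    return best_risk, best_val

# Toy anomaly risk model: risk grows with leak rate, falls with detection.
def risk(p):
    return p["leak_rate"] * (1.0 - p["detect_prob"])

baseline = {"leak_rate": 0.1, "detect_prob": 0.9}
for name, rng in [("leak_rate", (0.0, 0.5)), ("detect_prob", (0.5, 0.99))]:
    max_risk, at = pvi(risk, name, rng, baseline)
    print(name, round(max_risk, 4), round(at, 4))
```

A parameter whose scan produces a high peak risk flags a candidate anomaly vulnerability, which can then be prioritized for investigation as the paper describes.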
An improved method for risk evaluation in failure modes and effects analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Rachieru, N.; Belu, N.; Anghel, D. C.
2015-11-01
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by the multiplication of crisp values of the risk factors, such as the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized to have several deficiencies. In this paper, linguistic variables, expressed in Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights for the risk factors S, O and D. A new risk assessment system based on the fuzzy set theory and fuzzy rule base theory is to be applied to assess and rank risks associated to failure modes that could appear in the functioning of Turn 55 Lathe CNC. Two case studies have been shown to demonstrate the methodology thus developed. It is illustrated a parallel between the results obtained by the traditional method and fuzzy logic for determining the RPNs. The results show that the proposed approach can reduce duplicated RPN numbers and get a more accurate, reasonable risk assessment. As a result, the stability of product and process can be assured.
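The fuzzy alternative to the crisp RPN can be sketched with triangular fuzzy numbers. The vertex-wise product and centroid defuzzification below are common simplifications, and the linguistic-to-number mapping is assumed; the paper's fuzzy rule-base approach is richer than this.

```python
# Hedged sketch of fuzzy risk ranking: S, O, D as triangular fuzzy numbers
# (l, m, u), combined by an approximate fuzzy product and defuzzified by
# centroid. All ratings are illustrative.

def fuzzy_mul(a, b):
    """Vertex-wise product approximation for triangular numbers."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

def centroid(t):
    """Centroid defuzzification of a triangular number."""
    return sum(t) / 3.0

def fuzzy_rpn(s, o, d):
    return centroid(fuzzy_mul(fuzzy_mul(s, o), d))

# Linguistic ratings mapped to triangular numbers on a 1-10 scale (assumed).
high     = (7.0, 8.0, 9.0)
moderate = (4.0, 5.0, 6.0)
low      = (1.0, 2.0, 3.0)

print(round(fuzzy_rpn(high, moderate, low), 2))
print(round(fuzzy_rpn(high, high, moderate), 2))
```

Because the defuzzified scores are continuous rather than products of small integers, ties (duplicated RPNs) become far less likely, which is one of the improvements the paper reports.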
Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice
2009-02-01
To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs. To identify critical control points in our cancer chemotherapy process and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of the process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high-risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when it is used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps, which can have a critical influence on product quality, and led us to improve our process.
Sherman, Recinda L; Henry, Kevin A; Tannenbaum, Stacey L; Feaster, Daniel J; Kobetz, Erin; Lee, David J
2014-03-20
Epidemiologists are gradually incorporating spatial analysis into health-related research as geocoded cases of disease become widely available and health-focused geospatial computer applications are developed. One health-focused application of spatial analysis is cluster detection. Using cluster detection to identify geographic areas with high-risk populations and then screening those populations for disease can improve cancer control. SaTScan is a free cluster-detection software application used by epidemiologists around the world to describe spatial clusters of infectious and chronic disease, as well as disease vectors and risk factors. The objectives of this article are to describe how spatial analysis can be used in cancer control to detect geographic areas in need of colorectal cancer screening intervention, identify issues commonly encountered by SaTScan users, detail how to select the appropriate methods for using SaTScan, and explain how method selection can affect results. As an example, we used various methods to detect areas in Florida where the population is at high risk for late-stage diagnosis of colorectal cancer. We found that much of our analysis was underpowered and that no single method detected all clusters of statistical or public health significance. However, all methods detected 1 area as high risk; this area is potentially a priority area for a screening intervention. Cluster detection can be incorporated into routine public health operations, but the challenge is to identify areas in which the burden of disease can be alleviated through public health intervention. Reliance on SaTScan's default settings does not always produce pertinent results.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e., input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but they also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper contrasts the functional resonance analysis method (FRAM), a modern approach, with fault tree analysis (FTA), a traditional method, for assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. The FRAM network is also executed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for lifting structures in deep offshore operations. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
Research on the Risk Early Warning Method of Material Supplier Performance in Power Industry
NASA Astrophysics Data System (ADS)
Chen, Peng; Zhang, Xi
2018-01-01
The early warning of supplier performance risk is still at an initial stage, and research on early warning mechanisms to identify, analyze and prevent performance risk is scarce. In this paper, a new method for material supplier performance risk in the power industry is proposed. First, a set of risk early warning indexes is established, and the ECM method is used to classify the indexes into different risk grades. Next, the Crock Ford risk quantization model is improved by considering three indicators (the stability of the power system, economic losses, and the successful bid ratio) to form a predictive risk grade, and the short board effect (weakest link) principle is then used to form the ultimate risk grade so as to truly reflect the supplier performance risk. Finally, an empirical analysis of supplier performance is carried out, and countermeasures and prevention strategies for different risks are put forward.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Analysis of Risk Factors for Postoperative Morbidity in Perforated Peptic Ulcer
Kim, Jae-Myung; Jeong, Sang-Ho; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song
2012-01-01
Purpose Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. Materials and Methods In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. Results The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. Conclusions A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer. PMID:22500261
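The univariate step of such an analysis can be sketched with an odds ratio and its 95% confidence interval from a 2x2 table, using the standard log-OR normal approximation. The counts below are hypothetical, not the study's data.

```python
# Hedged sketch of a univariate risk-factor analysis: odds ratio with
# 95% CI from a 2x2 table. Counts are invented for illustration.
import math

def odds_ratio(a, b, c, d):
    """a, b = exposed with/without morbidity; c, d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# e.g. preoperative shock vs postoperative morbidity (hypothetical counts)
or_, lo, hi = odds_ratio(20, 10, 32, 80)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

A CI that excludes 1 marks a candidate risk factor; confirming independence then requires the multivariate (e.g. multiple logistic regression) step the study performed.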
Probabilistic risk analysis and terrorism risk.
Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J
2010-04-01
Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
2016-01-01
Abstract Microarray gene expression data sets are jointly analyzed to increase statistical power. They can either be merged or analyzed by meta-analysis. For a given ensemble of data sets, it cannot be foreseen which of these paradigms, merging or meta-analysis, works better. In this article, three joint analysis methods, Z-score normalization, ComBat, and the inverse normal method (meta-analysis), were selected for survival prognosis and risk assessment of breast cancer patients. The methods were applied to eight microarray gene expression data sets, totaling 1324 patients with two clinical endpoints, overall survival and relapse-free survival. The performance derived from the joint analysis methods was evaluated using Cox regression for survival analysis, with independent validation used for bias estimation. Overall, Z-score normalization had a better performance than ComBat and meta-analysis. A higher area under the receiver operating characteristic curve and hazard ratio were also obtained when independent validation was used for bias estimation. With a lower time and memory complexity, Z-score normalization is a simple method for joint analysis of microarray gene expression data sets. The derived findings suggest further assessment of this method in future survival prediction and cancer classification applications. PMID:26504096
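The per-dataset Z-score step evaluated above can be sketched in a few lines (an illustrative reconstruction, not the authors' code; gene-by-sample matrices as lists of lists are assumed):

```python
import math

def zscore_rows(matrix):
    """Standardize each row (gene) of one data set to mean 0, sd 1."""
    out = []
    for row in matrix:
        m = sum(row) / len(row)
        sd = math.sqrt(sum((x - m) ** 2 for x in row) / (len(row) - 1))
        out.append([(x - m) / sd for x in row])
    return out

def merge_datasets(datasets):
    """Z-score each data set separately, then concatenate samples (columns)
    gene by gene, so study-specific scale differences are removed."""
    normed = [zscore_rows(d) for d in datasets]
    return [sum((d[i] for d in normed), []) for i in range(len(normed[0]))]
```

After merging, each gene's values from every study share a common scale, so a single survival model can be fit on the pooled cohort.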
Ziegler, Ildikó; Borbély-Jakab, Judit; Sugó, Lilla; Kovács, Réka J
2017-01-01
In this case study, the principles of quality risk management were applied to review sampling points and monitoring frequencies in the hormonal tableting unit of a formulation development pilot plant. In the cleanroom area, premises of different functions are located. Therefore a general method was established for risk evaluation based on the Hazard Analysis and Critical Control Points (HACCP) method to evaluate these premises (i.e., production area itself and ancillary clean areas) from the point of view of microbial load and state in order to observe whether the existing monitoring program met the emerged advanced monitoring practice. LAY ABSTRACT: In pharmaceutical production, cleanrooms are needed for the manufacturing of final dosage forms of drugs-intended for human or veterinary use-in order to protect the patient's weakened body from further infections. Cleanrooms are premises with a controlled level of contamination that is specified by the number of particles per cubic meter at a specified particle size or number of microorganisms (i.e. microbial count) per surface area. To ensure a low microbial count over time, microorganisms are detected and counted by environmental monitoring methods regularly. It is reasonable to find the easily infected places by risk analysis to make sure the obtained results really represent the state of the whole room. This paper presents a risk analysis method for the optimization of environmental monitoring and verification of the suitability of the method. © PDA, Inc. 2017.
Analysis of dengue fever risk using a geostatistics model in Bone Regency
NASA Astrophysics Data System (ADS)
Amran, Stang, Mallongi, Anwar
2017-03-01
This research aims to analyze dengue fever risk based on a geostatistics model in Bone Regency. Risk levels of dengue fever are denoted by the parameter of a binomial distribution. The effects of temperature, rainfall, elevation, and larvae abundance are investigated through the geostatistics model. A Bayesian hierarchical method is used in the estimation process. Using dengue fever data from eleven locations, this research shows that temperature and rainfall have a significant effect on dengue fever risk in Bone Regency.
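In the simplest non-hierarchical case, a binomial risk parameter of this kind can be estimated with a conjugate beta-binomial update (a minimal stand-in for the paper's Bayesian hierarchical model; the Beta(1, 1) prior and the counts are my assumptions):

```python
def posterior_mean(cases, population, a=1.0, b=1.0):
    """Posterior mean of a binomial risk p under a Beta(a, b) prior:
    (a + cases) / (a + b + population)."""
    return (a + cases) / (a + b + population)

# Hypothetical location: 3 dengue cases among 10 monitored persons.
risk = posterior_mean(3, 10)
```

The full hierarchical model additionally pools information across locations and regresses the risk on covariates such as temperature and rainfall.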
Fault tree analysis for system modeling in case of intentional EMI
NASA Astrophysics Data System (ADS)
Genender, E.; Mleczko, M.; Döring, O.; Garbe, H.; Potthast, S.
2011-08-01
The complexity of modern systems on the one hand and the rising threat of intentional electromagnetic interference (IEMI) on the other increase the necessity for systematic risk analysis. Most of the problems cannot be treated deterministically, since slight changes in the configuration (source, position, polarization, ...) can dramatically change the outcome of an event. For that purpose, methods known from probabilistic risk analysis can be applied. One of the most common approaches is fault tree analysis (FTA). FTA is used to determine the system failure probability and also the main contributors to failure. In this paper, fault tree analysis is introduced and a possible application of the method is shown using a small computer network as an example. The constraints of this method are explained and conclusions for further research are drawn.
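Assuming statistically independent basic events, the gate arithmetic behind such a fault tree can be sketched as follows (the probabilities and the small-network structure are hypothetical, not taken from the paper):

```python
def p_and(probs):
    """AND gate: the output event occurs only if every input event occurs."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def p_or(probs):
    """OR gate: the output event occurs if at least one input event occurs."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical top event: network down if the switch is upset by the pulse
# OR both redundant links fail.
p_top = p_or([0.05, p_and([0.2, 0.2])])
```

Minimal cut sets (here, {switch} and {link1, link2}) identify the main contributors to the top-event probability.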
WE-B-BRC-01: Current Methodologies in Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rath, F.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
WE-B-BRC-03: Risk in the Context of Medical Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, E.
WE-B-BRC-00: Concepts in Risk-Based Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.
Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen
2014-01-01
Risk classification and survival probability prediction are two major goals in survival data analysis, since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite sample performance of the proposed method under various settings. Applications to glioma tumor data and breast cancer gene expression survival data are shown to illustrate the new methodology in real data analysis.
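One common way to feed right-censored survival data into a weighted classifier of this kind is inverse-probability-of-censoring weighting; the abstract does not spell out the weighting scheme, so the Kaplan-Meier-based sketch below is an assumption on my part (ties and late censoring are handled naively):

```python
def ipc_weights(times, events):
    """Inverse-probability-of-censoring weights from a Kaplan-Meier
    estimate of the censoring survival function S_c(t-).
    events[i] = 1 if the failure was observed, 0 if censored."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk = n
    s_c = 1.0
    surv_before = [1.0] * n
    for i in order:
        surv_before[i] = s_c              # S_c just before this subject's time
        if events[i] == 0:                # censoring counts as the "event" here
            s_c *= (at_risk - 1) / at_risk
        at_risk -= 1
    # observed failures get weight 1 / S_c(T-); censored subjects get 0
    return [events[i] / surv_before[i] for i in range(n)]
```

These weights then enter the support vector machine as per-sample weights, so that the observed failures stand in for the censored subjects they represent.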
Fuzzy risk analysis of a modern γ-ray industrial irradiator.
Castiglia, F; Giardina, M
2011-06-01
Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, with regard to some identified accident scenarios, fuzzy radiological exposure risk, expressed in terms of potential annual death, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by International Commission on Radiological Protection.
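Fuzzifying HEART's multiplicative structure typically means replacing point probabilities with triangular fuzzy numbers (lower, modal, upper) and multiplying them component-wise; the sketch below is illustrative only, and the numbers are hypothetical:

```python
def tfn_mul(a, b):
    """Approximate product of two triangular fuzzy numbers (l, m, u)."""
    return (a[0] * b[0], a[1] * b[1], a[2] * b[2])

# Hypothetical nominal human error probability and one error-promoting
# factor, each expressed as a triangular fuzzy number.
nominal_hep = (0.001, 0.003, 0.010)
epc_factor = (2.0, 3.0, 5.0)
fuzzy_hep = tfn_mul(nominal_hep, epc_factor)
```

The resulting fuzzy human error probability carries the uncertainty of the error-promoting factors through to the fault tree, instead of collapsing it to a single value.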
1982-05-01
Raiffa [83], LaValle [89], and other books on decision analysis. 4.2 Risk Attitudes: Much recent research has focused on the investigation of various risk... Issacs, G.L., Hamer, R., Chen, J., Chuang, D., Woodworth, G., Molenaar, I., Lewis, C., and Libby, D., Manual for the Computer-Assisted Data Analysis (CADA
Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin
We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...
A risk-based approach to robotic mission requirements
NASA Technical Reports Server (NTRS)
Dias, William C.; Bourke, Roger D.
1992-01-01
A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.
Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu
2006-11-01
Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
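The cheapest of the sampling-based screens above, rank (Spearman) correlation, can be written from scratch: rank-transform each input sample and the model output, then correlate the ranks (a minimal sketch without tie handling; real studies would use a statistics library):

```python
def ranks(xs):
    """0-based integer ranks; no tie correction."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def spearman(x, y):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx = sum(rx) / len(rx)
    my = sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Inputs whose rank correlation with the output is near zero are candidates for elimination before the costlier variance-based methods (Sobol, FAST) are applied.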
Cooperberg, Matthew R; Ramakrishna, Naren R; Duff, Steven B; Hughes, Kathleen E; Sadownik, Sara; Smith, Joseph A; Tewari, Ashutosh K
2013-03-01
WHAT'S KNOWN ON THE SUBJECT? AND WHAT DOES THE STUDY ADD?: Multiple treatment alternatives exist for localised prostate cancer, with few high-quality studies directly comparing their comparative effectiveness and costs. The present study is the most comprehensive cost-effectiveness analysis to date for localised prostate cancer, conducted with a lifetime horizon and accounting for survival, health-related quality-of-life, and cost impact of secondary treatments and other downstream events, as well as primary treatment choices. The analysis found minor differences, generally slightly favouring surgical methods, in quality-adjusted life years across treatment options. However, radiation therapy (RT) was consistently more expensive than surgery, and some alternatives, e.g. intensity-modulated RT for low-risk disease, were dominated - that is, both more expensive and less effective than competing alternatives. To characterise the costs and outcomes associated with radical prostatectomy (open, laparoscopic, or robot-assisted) and radiation therapy (RT: dose-escalated three-dimensional conformal RT, intensity-modulated RT, brachytherapy, or combination), using a comprehensive, lifetime decision analytical model. A Markov model was constructed to follow hypothetical men with low-, intermediate-, and high-risk prostate cancer over their lifetimes after primary treatment; probabilities of outcomes were based on an exhaustive literature search yielding 232 unique publications. In each Markov cycle, patients could have remission, recurrence, salvage treatment, metastasis, death from prostate cancer, and death from other causes. Utilities for each health state were determined, and disutilities were applied for complications and toxicities of treatment. Costs were determined from the USA payer perspective, with incorporation of patient costs in a sensitivity analysis. 
Differences in quality-adjusted life years across treatment methods were modest, ranging from 10.3 to 11.3 for low-risk patients, 9.6-10.5 for intermediate-risk patients, and 7.8-9.3 for high-risk patients. There were no statistically significant differences among surgical methods, which tended to be more effective than RT methods, with the exception of combined external beam + brachytherapy for high-risk disease. RT methods were consistently more expensive than surgical methods; costs ranged from $19 901 (robot-assisted prostatectomy for low-risk disease) to $50 276 (combined RT for high-risk disease). These findings were robust to an extensive set of sensitivity analyses. Our analysis found small differences in outcomes and substantial differences in payer and patient costs across treatment alternatives. These findings may inform future policy discussions about strategies to improve efficiency of treatment selection for localised prostate cancer. © 2012 BJU International.
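The cycle-by-cycle mechanics of such a Markov cohort model reduce to a state-occupancy vector multiplied by a transition matrix, with discounted utilities and costs accrued each cycle. The two-state example below is hypothetical, a sketch of the mechanism rather than the paper's 232-source model:

```python
def run_markov(trans, state0, utils, costs, cycles, disc=0.03):
    """Cohort simulation: per cycle, accrue discounted QALYs and costs,
    then advance the state-occupancy vector through the transition matrix."""
    states = state0[:]
    qalys = cost = 0.0
    for t in range(cycles):
        d = 1.0 / (1.0 + disc) ** t
        qalys += d * sum(s * u for s, u in zip(states, utils))
        cost += d * sum(s * c for s, c in zip(states, costs))
        states = [sum(states[i] * trans[i][j] for i in range(len(states)))
                  for j in range(len(states))]
    return qalys, cost

# Hypothetical two-state model: "remission" (utility 0.8, cost 1000/cycle)
# and an absorbing "dead" state, with 5% mortality per cycle.
trans = [[0.95, 0.05], [0.0, 1.0]]
```

Comparing the (QALY, cost) pairs produced for each treatment strategy is what yields the cost-effectiveness rankings reported above.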
Stukel, Thérèse A.; Fisher, Elliott S; Wennberg, David E.; Alter, David A.; Gottlieb, Daniel J.; Vermeulen, Marian J.
2007-01-01
Context Comparisons of outcomes between patients treated and untreated in observational studies may be biased due to differences in patient prognosis between groups, often because of unobserved treatment selection biases. Objective To compare 4 analytic methods for removing the effects of selection bias in observational studies: multivariable model risk adjustment, propensity score risk adjustment, propensity-based matching, and instrumental variable analysis. Design, Setting, and Patients A national cohort of 122 124 patients who were elderly (aged 65–84 years), receiving Medicare, and hospitalized with acute myocardial infarction (AMI) in 1994–1995, and who were eligible for cardiac catheterization. Baseline chart reviews were taken from the Cooperative Cardiovascular Project and linked to Medicare health administrative data to provide a rich set of prognostic variables. Patients were followed up for 7 years through December 31, 2001, to assess the association between long-term survival and cardiac catheterization within 30 days of hospital admission. Main Outcome Measure Risk-adjusted relative mortality rate using each of the analytic methods. Results Patients who received cardiac catheterization (n=73 238) were younger and had lower AMI severity than those who did not. After adjustment for prognostic factors by using standard statistical risk-adjustment methods, cardiac catheterization was associated with a 50% relative decrease in mortality (for multivariable model risk adjustment: adjusted relative risk [RR], 0.51; 95% confidence interval [CI], 0.50–0.52; for propensity score risk adjustment: adjusted RR, 0.54; 95% CI, 0.53–0.55; and for propensity-based matching: adjusted RR, 0.54; 95% CI, 0.52–0.56). Using regional catheterization rate as an instrument, instrumental variable analysis showed a 16% relative decrease in mortality (adjusted RR, 0.84; 95% CI, 0.79–0.90). 
The survival benefits of routine invasive care from randomized clinical trials are between 8% and 21%. Conclusions Estimates of the observational association of cardiac catheterization with long-term AMI mortality are highly sensitive to analytic method. All standard risk-adjustment methods have the same limitations regarding removal of unmeasured treatment selection biases. Compared with standard modeling, instrumental variable analysis may produce less biased estimates of treatment effects, but is more suited to answering policy questions than specific clinical questions. PMID:17227979
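With a binary instrument, such as residing in a high- versus low-catheterization region, the simplest instrumental-variable estimate is the Wald ratio (an illustrative sketch; the study itself used regional rates within a full regression framework):

```python
def wald_iv(y, x, z):
    """Wald IV estimator with binary instrument z:
    (mean outcome difference across z) / (mean treatment difference across z)."""
    y1 = [yi for yi, zi in zip(y, z) if zi == 1]
    y0 = [yi for yi, zi in zip(y, z) if zi == 0]
    x1 = [xi for xi, zi in zip(x, z) if zi == 1]
    x0 = [xi for xi, zi in zip(x, z) if zi == 0]
    mean = lambda v: sum(v) / len(v)
    return (mean(y1) - mean(y0)) / (mean(x1) - mean(x0))
```

Because the instrument shifts treatment without (by assumption) directly affecting outcomes, the ratio removes bias from unmeasured treatment selection that standard risk adjustment cannot touch.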
NASA Astrophysics Data System (ADS)
Dewi Ratih, Iis; Sutijo Supri Ulama, Brodjol; Prastuti, Mike
2018-03-01
Value at Risk (VaR) is one of the statistical methods used to measure market risk by estimating the worst loss over a given time period at a given confidence level. The accuracy of this measuring tool is very important in determining the amount of capital that must be provided by the company to cope with possible losses, because greater risk means greater losses may be faced with a certain probability. For this reason, VaR calculation analysis is of particular concern to researchers and practitioners of the stock market, so that more accurate measurement estimates can be developed. In this research, a risk analysis of four banking sub-sector stocks, Bank Rakyat Indonesia, Bank Mandiri, Bank Central Asia, and Bank Negara Indonesia, is carried out. Stock returns are expected to be influenced by exogenous variables, namely the ICI and the exchange rate. Therefore, in this research, stock risk is estimated using the VaR ARMAX-GARCHX method. Calculating the VaR value with the ARMAX-GARCHX approach using a window of 500 gives more accurate results. Overall, Bank Central Asia is the only bank that had the estimated maximum loss in the 5% quantile.
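The paper fits ARMAX-GARCHX models; as a baseline, what a 5% VaR means can be illustrated with plain historical simulation, the empirical quantile of losses (the return series below is made up):

```python
def historical_var(returns, alpha=0.05):
    """Historical-simulation VaR: the (1 - alpha) empirical loss quantile."""
    losses = sorted(-r for r in returns)
    k = min(int((1 - alpha) * len(losses)), len(losses) - 1)
    return losses[k]

# Hypothetical daily return series for one stock.
sample_returns = [-0.05, -0.02, 0.01, 0.02, 0.03, 0.01, 0.0, -0.01, 0.02, 0.04]
var_5pct = historical_var(sample_returns)
```

A GARCH-type model replaces this static quantile with one that scales with the conditional volatility forecast, which is why the rolling window of 500 observations matters for accuracy.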
External Threat Risk Assessment Algorithm (ExTRAA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, Troy C.
Two risk assessment algorithms and philosophies have been augmented and combined to form a new algorithm, the External Threat Risk Assessment Algorithm (ExTRAA), that allows for effective and statistically sound analysis of external threat sources in relation to individual attack methods. In addition to the attack method use probability and the attack method employment consequence, the concept of defining threat sources is added to the risk assessment process. Sample data are tabulated and depicted in radar plots and bar graphs for algorithm demonstration purposes. The largest success of ExTRAA is its ability to visualize the kind of risk posed in a given situation using the radar plot method.
Risk Evaluation of Business Continuity Management by Using Green Technology
NASA Astrophysics Data System (ADS)
Gang, Chen
IT disasters can be seen as a test of the ability of communities and firms to effectively protect their information and infrastructure, to reduce both human and property loss, and to rapidly recover. In this paper, we use a literature meta-analysis method to identify potential research directions in Green Business Continuity Management (GBCM). The concept and characteristics of GBCM are discussed, and we analyze the connotation and the sources of green technology risk. An assessment index system is established from the perspective of GBCM, and a fuzzy comprehensive assessment method is introduced to assess the risks of green technology in business continuity management.
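In its weighted-average form, fuzzy comprehensive assessment combines a factor weight vector W with a membership matrix R (factors by evaluation grades) into a grade vector B = W · R; the sketch and the numbers below are illustrative, not taken from the paper:

```python
def fuzzy_evaluate(weights, membership):
    """Grade vector B = W . R under the weighted-average operator.
    membership[i][g] = degree to which factor i belongs to grade g."""
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

# Hypothetical: two risk factors, two grades ("low", "high").
grade_vector = fuzzy_evaluate([0.6, 0.4], [[0.7, 0.3], [0.2, 0.8]])
```

The grade with the largest membership in B is then reported as the overall risk level of the assessed system.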
Gavrilyuk, Oxana; Braaten, Tonje; Skeie, Guri; Weiderpass, Elisabete; Dumeaux, Vanessa; Lund, Eiliv
2014-03-25
Coffee and its compounds have been proposed to inhibit endometrial carcinogenesis. Studies in the Norwegian population can be especially interesting due to the high coffee consumption and increasing incidence of endometrial cancer in the country. A total of 97 926 postmenopausal Norwegian women from the population-based prospective Norwegian Women and Cancer (NOWAC) Study, were included in the present analysis. We evaluated the general association between total coffee consumption and endometrial cancer risk as well as the possible impact of brewing method. Multivariate Cox regression analysis was used to estimate risks, and heterogeneity tests were performed to compare brewing methods. During an average of 10.9 years of follow-up, 462 incident endometrial cancer cases were identified. After multivariate adjustment, significant risk reduction was found among participants who drank ≥8 cups/day of coffee with a hazard ratio of 0.52 (95% confidence interval, CI 0.34-0.79). However, we did not observe a significant dose-response relationship. No significant heterogeneity in risk was found when comparing filtered and boiled coffee brewing methods. A reduction in endometrial cancer risk was observed in subgroup analyses among participants who drank ≥8 cups/day and had a body mass index ≥25 kg/m2, and in current smokers. These data suggest that in this population with high coffee consumption, endometrial cancer risk decreases in women consuming ≥8 cups/day, independent of brewing method.
Improvement of the cost-benefit analysis algorithm for high-rise construction projects
NASA Astrophysics Data System (ADS)
Gafurov, Andrey; Skotarenko, Oksana; Plotnikov, Vladimir
2018-03-01
The specific nature of high-rise investment projects, entailing long-term construction, high risks, etc., implies a need to improve the standard algorithm of cost-benefit analysis. An improved algorithm is described in the article. For development of the improved algorithm of cost-benefit analysis for high-rise construction projects, the following methods were used: weighted average cost of capital, dynamic cost-benefit analysis of investment projects, risk mapping, scenario analysis, sensitivity analysis of critical ratios, etc. This comprehensive approach helped to adapt the original algorithm to feasibility objectives in high-rise construction. The authors put together the algorithm of cost-benefit analysis for high-rise construction projects on the basis of risk mapping and sensitivity analysis of critical ratios. The suggested project risk management algorithms greatly expand the standard algorithm of cost-benefit analysis in investment projects, namely: the "Project analysis scenario" flowchart, improving the quality and reliability of forecasting reports in investment projects; the main stages of cash flow adjustment based on risk mapping, for better cost-benefit project analysis given the broad range of risks in high-rise construction; and analysis of dynamic cost-benefit values considering project sensitivity to crucial variables, improving flexibility in the implementation of high-rise projects.
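The weighted-average-cost-of-capital and discounting steps of such an algorithm reduce to two short formulas (a minimal sketch; the capital structure and cash flows below are hypothetical):

```python
def wacc(equity, debt, cost_equity, cost_debt, tax_rate):
    """Weighted average cost of capital, including the debt tax shield."""
    total = equity + debt
    return (equity / total * cost_equity
            + debt / total * cost_debt * (1.0 - tax_rate))

def npv(cash_flows, rate):
    """Net present value of a cash-flow stream, cash_flows[t] at year t."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical high-rise project: discount its cash flows at the WACC.
rate = wacc(equity=60.0, debt=40.0, cost_equity=0.10, cost_debt=0.05,
            tax_rate=0.0)
value = npv([-100.0, 55.0, 65.0], rate)
```

Sensitivity analysis then perturbs the critical ratios (rate, cash flows) and re-evaluates the NPV to see which variables the project's feasibility hinges on.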
Configurations of Common Childhood Psychosocial Risk Factors
ERIC Educational Resources Information Center
Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian
2009-01-01
Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…
School Health Promotion Policies and Adolescent Risk Behaviors in Israel: A Multilevel Analysis
ERIC Educational Resources Information Center
Tesler, Riki; Harel-Fisch, Yossi; Baron-Epel, Orna
2016-01-01
Background: Health promotion policies targeting risk-taking behaviors are being implemented across schools in Israel. This study identified the most effective components of these policies influencing cigarette smoking and alcohol consumption among adolescents. Methods: Logistic hierarchical linear model (HLM) analysis of data for 5279 students in…
Risk Decision Making Model for Reservoir Floodwater resources Utilization
NASA Astrophysics Data System (ADS)
Huang, X.
2017-12-01
Floodwater resources utilization (FRU) can alleviate the shortage of water resources, but it carries risks. In order to utilize floodwater resources safely and efficiently, it is necessary to study the risk of reservoir FRU. In this paper, the risk rate of exceeding the design flood water level and the risk rate of exceeding the safety discharge are estimated. Based on the principle of minimum risk and maximum benefit of FRU, a multi-objective risk decision-making model for FRU is constructed. Probability theory and mathematical statistics methods are used to calculate the risk rate; the C-D production function method and the emergy analysis method are used to calculate the risk benefit; the risk loss is related to the flood inundation area and the loss per unit area; and the multi-objective decision-making problem of the model is solved by the constraint method. Taking the Shilianghe reservoir in Jiangsu Province as an example, the optimal equilibrium solution of FRU for the Shilianghe reservoir is found by using the risk decision-making model, and the validity and applicability of the model are verified.
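The first quantity the abstract estimates, the risk rate of exceeding the design flood water level, can be approximated with a simple Monte Carlo sketch: simulate flood-driven rises in reservoir level and count how often the design level is exceeded. The flood-rise distribution and the levels below are illustrative assumptions, not the paper's calibration for the Shilianghe reservoir.

```python
import random

def exceedance_risk(start_level, design_level, n_sims=100_000, seed=42):
    """Monte Carlo estimate of the risk rate that a flood pushes the
    reservoir above its design flood water level, given a pre-flood
    (floodwater-utilization) storage level. The lognormal flood-rise
    distribution is a hypothetical choice for illustration."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_sims):
        flood_rise = rng.lognormvariate(mu=0.5, sigma=0.6)  # metres, hypothetical
        if start_level + flood_rise > design_level:
            exceed += 1
    return exceed / n_sims

# Raising the pre-flood storage level raises the exceedance risk rate,
# which is the core risk-benefit trade-off of floodwater utilization.
low = exceedance_risk(start_level=20.0, design_level=24.0)
high = exceedance_risk(start_level=21.5, design_level=24.0)
print(low, high)
```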
A concept analysis of forensic risk.
Kettles, A M
2004-08-01
Forensic risk is a term used in relation to many forms of clinical practice, such as assessment, intervention and management. Rarely is the term defined in the literature and as a concept it is multifaceted. Concept analysis is a method for exploring and evaluating the meaning of words. It gives precise definitions, both theoretical and operational, for use in theory, clinical practice and research. A concept analysis provides a logical basis for defining terms through providing defining attributes, case examples (model, contrary, borderline, related), antecedents and consequences and the implications for nursing. Concept analysis helps us to refine and define a concept that derives from practice, research or theory. This paper will use the strategy of concept analysis to find a working definition for the concept of forensic risk. In conclusion, the historical background and literature are reviewed using concept analysis to bring the term into focus and to define it more clearly. Forensic risk is found to derive both from forensic practice and from risk theory. A proposed definition of forensic risk is given.
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To determine whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins' width when the risk analysis was available. In addition, time to complete the planning task and confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
Determining the risk of cardiovascular disease using ion mobility of lipoproteins
Benner, W. Henry; Krauss, Ronald M.; Blanche, Patricia J.
2010-05-11
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.
Reyes Santos, Joost; Haimes, Yacov Y
2004-06-01
The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a 1990 Nobel Prize in Economics. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in a portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility is a major measure of risk owing to its simplicity and validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that on October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of the possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model.
However, under extremely unfavorable market conditions, results indicate that f(4) can be a more valid measure of risk than volatility.
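The extreme-risk measure f(4) described above, the conditional expectation of the lower tail of the return distribution, can be sketched as follows. The simulated normal returns and the 5% partitioning point are illustrative assumptions, not the article's data or its chosen partition.

```python
import random
import statistics

def f4(returns, tail_prob=0.05):
    """PMRM-style extreme-risk metric: the conditional expectation of
    returns at or below the tail_prob quantile of their distribution."""
    cutoff = sorted(returns)[max(0, int(tail_prob * len(returns)) - 1)]
    tail = [r for r in returns if r <= cutoff]
    return statistics.mean(tail)

rng = random.Random(0)
returns = [rng.gauss(0.07, 0.15) for _ in range(10_000)]  # hypothetical annual returns

vol = statistics.stdev(returns)        # the conventional volatility measure
extreme = f4(returns, tail_prob=0.05)  # mean of the worst 5% of outcomes
print(f"volatility={vol:.3f}  f4={extreme:.3f}")
```

Volatility summarises the whole distribution symmetrically, while f(4) looks only at the damaging tail; the two can rank portfolios differently precisely in the "extremely unfavorable" scenarios the abstract highlights.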
Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L
2017-07-01
To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
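The stratification idea can be sketched in miniature: split a synthetic cohort on cutpoints for the three factors the study identified (BMI, age, comorbidity) and compare the extreme strata with an odds ratio. The cutpoints and the data-generating rule below are assumptions for illustration only, not the study's fitted conditional inference tree (a method usually run with R's partykit ctree).

```python
import random

rng = random.Random(7)
cohort = []
for _ in range(5000):
    bmi, age, comorbid = rng.uniform(18, 40), rng.uniform(65, 85), rng.randint(0, 3)
    # Hypothetical risk rule: each factor raises the chance of functional limitation.
    p = min(0.9, 0.02 * (bmi - 18) + 0.015 * (age - 65) + 0.08 * comorbid)
    limited = rng.random() < p
    n_flags = (bmi >= 30) + (age >= 75) + (comorbid >= 2)  # crude strata 0..3
    cohort.append((n_flags, limited))

def odds(group):
    events = sum(lim for _, lim in group)
    return events / (len(group) - events)

low = [row for row in cohort if row[0] == 0]    # no risk flags
high = [row for row in cohort if row[0] == 3]   # all three flags
odds_ratio = odds(high) / odds(low)
print(f"odds ratio, highest vs lowest stratum: {odds_ratio:.1f}")
```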
Analysis of risk factors for postoperative morbidity in perforated peptic ulcer.
Kim, Jae-Myung; Jeong, Sang-Ho; Lee, Young-Joon; Park, Soon-Tae; Choi, Sang-Kyung; Hong, Soon-Chan; Jung, Eun-Jung; Ju, Young-Tae; Jeong, Chi-Young; Ha, Woo-Song
2012-03-01
Emergency operations for perforated peptic ulcer are associated with a high incidence of postoperative complications. While several studies have investigated the impact of perioperative risk factors and underlying diseases on the postoperative morbidity after abdominal surgery, only a few have analyzed their role in perforated peptic ulcer disease. The purpose of this study was to determine any possible associations between postoperative morbidity and comorbid disease or perioperative risk factors in perforated peptic ulcer. In total, 142 consecutive patients, who underwent surgery for perforated peptic ulcer, at a single institution, between January 2005 and October 2010 were included in this study. The clinical data concerning the patient characteristics, operative methods, and complications were collected retrospectively. The postoperative morbidity rate associated with perforated peptic ulcer operations was 36.6% (52/142). Univariate analysis revealed that a long operating time, the open surgical method, age (≥60), sex (female), high American Society of Anesthesiologists (ASA) score and presence of preoperative shock were significant perioperative risk factors for postoperative morbidity. Significant comorbid risk factors included hypertension, diabetes mellitus and pulmonary disease. Multivariate analysis revealed a long operating time, the open surgical method, high ASA score and the presence of preoperative shock were all independent risk factors for the postoperative morbidity in perforated peptic ulcer. A high ASA score, preoperative shock, open surgery and long operating time of more than 150 minutes are high risk factors for morbidity. However, there is no association between postoperative morbidity and comorbid disease in patients with a perforated peptic ulcer.
Risk assessment of vector-borne diseases for public health governance.
Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J
2014-12-01
In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), covering literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to accurately assess model uncertainties. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In current practice, a series of different models and analyses are often applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Ion mobility analysis of lipoproteins
Benner, W Henry [Danville, CA; Krauss, Ronald M [Berkeley, CA; Blanche, Patricia J [Berkeley, CA
2007-08-21
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
Developing and validating risk prediction models in an individual participant data meta-analysis
2014-01-01
Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
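The "internal-external cross-validation" the review recommends can be sketched as: develop the model on all IPD studies but one, test calibration in the omitted study, then rotate so every study is excluded once. The five toy studies with different baseline risks, and the prevalence-only "model", are assumptions for illustration, not the review's data or models.

```python
import random

rng = random.Random(3)
# Five hypothetical IPD studies with different baseline risks (intercepts).
studies = {f"study_{i}": [rng.random() < base for _ in range(400)]
           for i, base in enumerate([0.10, 0.15, 0.20, 0.25, 0.30])}

for held_out, test_outcomes in studies.items():
    # Develop on all other studies: here, simply the pooled event rate.
    train = [o for name, outs in studies.items() if name != held_out
             for o in outs]
    predicted_risk = sum(train) / len(train)
    observed_risk = sum(test_outcomes) / len(test_outcomes)
    # Calibration-in-the-large: compare observed vs predicted risk in the
    # excluded study; large gaps reveal between-study heterogeneity.
    print(f"{held_out}: predicted={predicted_risk:.3f} observed={observed_risk:.3f}")
```

Because the toy studies differ in baseline risk, the pooled model is systematically miscalibrated in the extreme studies, which is exactly the heterogeneity that study-specific intercepts are meant to absorb.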
A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels
NASA Astrophysics Data System (ADS)
Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian
2016-08-01
Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill and blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is set out, and methods for each step in the rockburst risk assessment and management process are given. The main parts include a microseismic method for early warning of the occurrence probability of rockburst risk, an estimation method for assessing the potential consequences of rockburst risk, an evaluation method that uses a new quantitative index considering both occurrence probability and consequences to determine the level of rockburst risk, and a dynamic updating procedure. Specifically, this research briefly describes the referenced microseismic method of rockburst warning, but focuses on the analysis of consequences and the associated risk assessment and management of rockburst. Using the proposed method of risk assessment and management of rockburst, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified with cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total of 11.6 km of D&B tunnels).
The standard framework of Ecological Risk Assessment (ERA) uses organism-level assessment endpoints to qualitatively determine the risk to populations. While organism-level toxicity data provide the pathway by which a species may be affected by a chemical stressor, they neither i...
NASA Technical Reports Server (NTRS)
Smith, Greg
2003-01-01
Schedule risk assessments determine the likelihood of finishing on time. Each task in a schedule has a varying probability of being finished on time. A schedule risk assessment quantifies these probabilities by assigning values to each task. This viewgraph presentation contains a flow chart for conducting a schedule risk assessment and profiles several applicable methods of data analysis.
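One common way to quantify these per-task probabilities, shown here as a hedged sketch rather than the presentation's specific method, is Monte Carlo simulation over three-point task duration estimates. The three tasks and their triangular estimates are illustrative assumptions.

```python
import random

# (optimistic, most likely, pessimistic) durations in days, tasks run in series.
tasks = [(8, 10, 15), (4, 5, 9), (18, 20, 30)]
deadline = 38.0

def p_on_time(tasks, deadline, n_sims=50_000, seed=11):
    """Estimate the probability of finishing by the deadline by sampling
    each task's duration from a triangular distribution and summing."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        total = sum(rng.triangular(low, high, mode)
                    for low, mode, high in tasks)
        if total <= deadline:
            hits += 1
    return hits / n_sims

print(f"P(finish within {deadline:g} days) = {p_on_time(tasks, deadline):.2f}")
```

Summing the "most likely" values (35 days) would suggest the deadline is comfortable; the simulation shows the skewed pessimistic tails make on-time completion much less certain, which is the point of a schedule risk assessment.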
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. The analysis is incorporated early in the development cycle so that mitigation of the identified failure modes can be achieved cost-effectively and efficiently. The FMECA can begin once there is enough detail to define the functions and failure modes of a given system and its interfaces with other systems. The FMECA occurs coincidently with the design process and is an iterative process that allows design changes to overcome deficiencies identified in the analysis. Risk Registers for major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
Risk as an attribute in discrete choice experiments: a systematic review of the literature.
Harrison, Mark; Rigby, Dan; Vass, Caroline; Flynn, Terry; Louviere, Jordan; Payne, Katherine
2014-01-01
Discrete choice experiments (DCEs) are used to elicit the preferences of current and future patients and healthcare professionals about how they value different aspects of healthcare. Risk is an integral part of most healthcare decisions. Despite the use of risk attributes in DCEs consistently being highlighted as an area for further research, current methods of incorporating risk attributes in DCEs have not been reviewed explicitly. This study aimed to systematically identify published healthcare DCEs that incorporated a risk attribute, summarise and appraise the methods used to present and analyse risk attributes, and recommend best practice regarding including, analysing and transparently reporting the methodology supporting risk attributes in future DCEs. The Web of Science, MEDLINE, EMBASE, PsycINFO and Econlit databases were searched on 18 April 2013 for DCEs that included a risk attribute published since 1995, and on 23 April 2013 to identify studies assessing risk communication in the general (non-DCE) health literature. Healthcare-related DCEs with a risk attribute mentioned or suggested in the title/abstract were obtained and retained in the final review if a risk attribute meeting our definition was included. Extracted data were tabulated and critically appraised to summarise the quality of reporting, and the format, presentation and interpretation of the risk attribute were summarised. This review identified 117 healthcare DCEs that incorporated at least one risk attribute. Whilst there was some evidence of good practice in the presentation of risk attributes, little evidence was found that methods and recommendations from other disciplines on effective risk communication and its validation were systematically applied to DCEs.
In general, the reviewed DCE studies did not thoroughly report the methodology supporting the explanation of risk in training materials, the impact of framing risk, or exploring the validity of risk communication. The primary limitation of this review was that the methods underlying presentation, format and analysis of risk attributes could only be appraised to the extent that they were reported. Improvements in reporting and transparency of risk presentation from conception to the analysis of DCEs are needed. To define best practice, further research is needed to test how the process of communicating risk affects the way in which people value risk attributes in DCEs.
Huebner, Thomas; Goernig, Matthias; Schuepbach, Michael; Sanz, Ernst; Pilgram, Roland; Seeck, Andrea; Voss, Andreas
2010-01-01
Background: Electrocardiographic methods still provide the bulk of cardiovascular diagnostics. Cardiac ischemia is associated with typical alterations in cardiac biosignals that have to be measured, analyzed by mathematical algorithms and allegorized for further clinical diagnostics. The fast-growing fields of biomedical engineering and applied sciences are intensely focused on generating new approaches to cardiac biosignal analysis for diagnosis and risk stratification in myocardial ischemia. Objectives: To present and review the state of the art in and new approaches to electrocardiologic methods for non-invasive detection and risk stratification in coronary artery disease (CAD) and myocardial ischemia; secondarily, to explore the future perspectives of these methods. Methods: In follow-up to the Expert Discussion at the 2008 Workshop on "Biosignal Analysis" of the German Society of Biomedical Engineering in Potsdam, Germany, we comprehensively searched the pertinent literature and databases and compiled the results into this review. Then, we categorized the state-of-the-art methods and selected new approaches based on their applications in detection and risk stratification of myocardial ischemia. Finally, we compared the pros and cons of the methods and explored their future potentials for cardiology. Results: Resting ECG, particularly suited for detecting ST-elevation myocardial infarctions, and exercise ECG, for the diagnosis of stable CAD, are state-of-the-art methods. New exercise-free methods for detecting stable CAD include cardiogoniometry (CGM); methods for detecting acute coronary syndrome without ST elevation are Body Surface Potential Mapping, functional imaging and CGM. Heart rate variability and blood pressure variability analyses, microvolt T-wave alternans and signal-averaged ECG mainly serve in detecting and stratifying the risk for lethal arrhythmias in patients with myocardial ischemia or previous myocardial infarctions.
Telemedicine and ambient-assisted living support the electrocardiological monitoring of at-risk patients. Conclusions: There are many promising methods for the exercise-free, non-invasive detection of CAD and myocardial ischemia in the stable and acute phases. In the coming years, these new methods will help enhance state-of-the-art procedures in routine diagnostics. The future can expect that equally novel methods for risk stratification and telemedicine will transition into clinical routine. PMID:21063467
Method Analysis of Microbial-Resistant Gypsum Products
D.A. Betancourt (1), T.R. Dean (1), A. Evans (2), and G. Byfield (2). (1) US Environmental Protection Agency, Office of Research and Development, National Risk Management Research Laboratory, RTP, NC 27711. (2) RTI International, RTP, NC. Several...
Quantitative Method for Analyzing the Allocation of Risks in Transportation Construction
DOT National Transportation Integrated Search
1979-04-01
The report presents a conceptual model of risk that was developed to analyze the impact on owner's cost of alternate allocations of risk among owner and contractor in mass transit construction. A model and analysis procedure are developed, based on d...
Gray, Benjamin J; Bracken, Richard M; Turner, Daniel; Morgan, Kerry; Thomas, Michael; Williams, Sally P; Williams, Meurig; Rice, Sam; Stephens, Jeffrey W
2015-01-01
Background Use of a validated risk-assessment tool to identify individuals at high risk of developing type 2 diabetes is currently recommended. It is under-reported, however, whether a different risk tool alters the predicted risk of an individual. Aim This study explored any differences between commonly used validated risk-assessment tools for type 2 diabetes. Design and setting Cross-sectional analysis of individuals who participated in a workplace-based risk assessment in Carmarthenshire, South Wales. Method Retrospective analysis of 676 individuals (389 females and 287 males) who participated in a workplace-based diabetes risk-assessment initiative. Ten-year risk of type 2 diabetes was predicted using the validated QDiabetes®, Leicester Risk Assessment (LRA), FINDRISC, and Cambridge Risk Score (CRS) algorithms. Results Differences between the risk-assessment tools were apparent following retrospective analysis of individuals. CRS categorised the highest proportion (13.6%) of individuals at ‘high risk’ followed by FINDRISC (6.6%), QDiabetes (6.1%), and, finally, the LRA was the most conservative risk tool (3.1%). Following further analysis by sex, over one-quarter of males were categorised at high risk using CRS (25.4%), whereas a greater percentage of females were categorised as high risk using FINDRISC (7.8%). Conclusion The adoption of a different valid risk-assessment tool can alter the predicted risk of an individual and caution should be used to identify those individuals who really are at high risk of type 2 diabetes. PMID:26541180
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R
2005-01-01
Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). 
Conclusions: Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities. PMID:15805453
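The criticality index used in the study, the product of occurrence likelihood, severity and detection-probability scores, can be sketched as follows. The three failure modes and their old/new scores below are illustrative assumptions chosen only to mirror the kind of before/after comparison reported, not the study's actual ratings.

```python
def criticality(occurrence, severity, detection):
    """FMECA criticality index: product of the three risk scores."""
    return occurrence * severity * detection

# failure mode: (old scores, new scores) as (occurrence, severity, detection)
failure_modes = {
    "re-transcription error":  ((9, 8, 5), (1, 8, 1)),  # removed by e-prescribing
    "labelling mistake":       ((7, 9, 5), (7, 9, 5)),  # unchanged by the redesign
    "microbial contamination": ((2, 9, 7), (2, 9, 7)),
}

old_total = sum(criticality(*old) for old, _ in failure_modes.values())
new_total = sum(criticality(*new) for _, new in failure_modes.values())
print(f"old CI sum={old_total}, new CI sum={new_total}, "
      f"reduction={(1 - new_total / old_total):.0%}")
```

Summing the CIs over all failure modes gives exactly the kind of process-level comparison the abstract reports (3415 vs 1397), and the residual high-CI modes, such as labelling mistakes here, point to where further mitigation is needed.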
Kessels-Habraken, Marieke; De Jonge, Jan; Van der Schaaf, Tjerk; Rutte, Christel
2010-05-01
Hospitals can apply prospective and retrospective methods to reduce the large number of medical errors. Retrospective methods are used to identify errors after they occur and to facilitate learning. Prospective methods aim to determine, assess and minimise risks before incidents happen. This paper questions whether the order of implementation of those two methods influences the resultant impact on incident reporting behaviour. From November 2007 until June 2008, twelve wards of two Dutch general hospitals participated in a quasi-experimental reversed-treatment non-equivalent control group design. The six units of Hospital 1 first conducted a prospective analysis, after which a sophisticated incident reporting and analysis system was implemented. On the six units of Hospital 2 the two methods were implemented in reverse order. Data from the incident reporting and analysis system and from a questionnaire were used to assess between-hospital differences regarding the number of reported incidents, the spectrum of reported incident types, and the profession of reporters. The results show that carrying out a prospective analysis first can improve incident reporting behaviour in terms of a wider spectrum of reported incident types and a larger proportion of incidents reported by doctors. However, the proposed order does not necessarily yield a larger number of reported incidents. This study fills an important gap in safety management research regarding the order of the implementation of prospective and retrospective methods, and contributes to literature on incident reporting. This research also builds on the network theory of social contagion. The results might indicate that health care employees can disseminate their risk perceptions through communication with their direct colleagues. Copyright 2010 Elsevier Ltd. All rights reserved.
Yang, Yu; Lian, Xin-Ying; Jiang, Yong-Hai; Xi, Bei-Dou; He, Xiao-Song
2017-11-01
Agricultural regions are a significant source of groundwater pesticide pollution. To ensure that agricultural regions with a significantly high risk of groundwater pesticide contamination are properly managed, a risk-based ranking method related to groundwater pesticide contamination is needed. In the present paper, a risk-based prioritization method for the classification of groundwater pesticide pollution from agricultural regions was established. The method encompasses 3 phases, including indicator selection, characterization, and classification. In the risk ranking index system employed here, 17 indicators involving the physicochemical properties, environmental behavior characteristics, pesticide application methods, and inherent vulnerability of groundwater in the agricultural region were selected. The boundary of each indicator was determined using K-means cluster analysis based on a survey of a typical agricultural region and the physical and chemical properties of 300 typical pesticides. The total risk characterization was calculated by multiplying the risk value of each indicator, which could effectively avoid the subjectivity of index weight calculation and identify the main factors associated with the risk. The results indicated that the risk for groundwater pesticide contamination from agriculture in a region could be ranked into 4 classes from low to high risk. This method was applied to an agricultural region in Jiangsu Province, China, and it showed that this region had a relatively high risk for groundwater contamination from pesticides, and that the pesticide application method was the primary factor contributing to the relatively high risk. The risk ranking method was determined to be feasible, valid, and able to provide reference data related to the risk management of groundwater pesticide pollution from agricultural regions. Integr Environ Assess Manag 2017;13:1052-1059. © 2017 SETAC.
Przedlacki, J; Buczyńska-Chyl, J; Koźmiński, P; Niemczyk, E; Wojtaszek, E; Gieglis, E; Żebrowski, P; Podgórzak, A; Wściślak, J; Wieliczko, M; Matuszkiewicz-Rowińska, J
2018-05-01
We assessed the FRAX® method in 718 hemodialyzed patients in estimating increased risk of major bone and hip fractures. Over two prospective years, statistical analysis showed that FRAX® enables a better assessment of major bone fracture risk in these patients than any of its components and other risk factors considered in the analysis. Despite the generally increased risk of bone fractures among patients with end-stage renal disease, no prediction models for identifying individuals at particular risk have been developed to date. The goal of this prospective, multicenter observational study was to assess the usefulness of the FRAX® method, in comparison to all its elements considered separately, selected factors associated with renal disease, and the history of falls, in estimating increased risk of low-energy major bone and hip fractures in patients undergoing chronic hemodialysis. The study included a total of 1068 hemodialysis patients, who were followed for 2 years; finally, 718 of them were analyzed. The risk analysis included the Polish version of the FRAX® calculator (without bone mineral density), dialysis vintage, mineral metabolism disorders (serum calcium, phosphate, and parathyroid hormone), and the number of falls during the last year before the study. Over 2 years, 30 low-energy major bone fractures were diagnosed, including 13 hip fractures. The area under the curve (AUC) for FRAX® was 0.76 (95% CI 0.69-0.84) for major fractures and 0.70 (95% CI 0.563-0.832) for hip fractures. The AUC for major bone fractures was significantly higher than for all elements of the FRAX® calculator. In logistic regression analysis, FRAX® was the strongest independent predictor of major bone fracture risk. FRAX® enables a better assessment of major bone fracture risk in ESRD patients undergoing hemodialysis than any of its components and other risk factors considered in the analysis.
NASA Astrophysics Data System (ADS)
Oh, Jung Hun; Kerns, Sarah; Ostrer, Harry; Powell, Simon N.; Rosenstein, Barry; Deasy, Joseph O.
2017-02-01
The biological cause of clinically observed variability of normal tissue damage following radiotherapy is poorly understood. We hypothesized that machine/statistical learning methods using single nucleotide polymorphism (SNP)-based genome-wide association studies (GWAS) would identify groups of patients of differing complication risk, and furthermore could be used to identify key biological sources of variability. We developed a novel learning algorithm, called pre-conditioned random forest regression (PRFR), to construct polygenic risk models using hundreds of SNPs, thereby capturing genomic features that confer small differential risk. Predictive models were trained and validated on a cohort of 368 prostate cancer patients for two post-radiotherapy clinical endpoints: late rectal bleeding and erectile dysfunction. The proposed method results in better predictive performance compared with existing computational methods. Gene ontology enrichment analysis and protein-protein interaction network analysis were used to identify key biological processes and proteins that are plausible in light of other published studies. In conclusion, we confirm that novel machine learning methods can produce large predictive models (hundreds of SNPs), yielding clinically useful risk stratification models, as well as identifying important underlying biological processes in the radiation damage and tissue repair process. The methods are generally applicable to GWAS data and are not specific to radiotherapy endpoints.
Motamedzade, Majid; Ashuri, Mohammad Reza; Golmohammadi, Rostam; Mahjub, Hossein
2011-06-13
During the last decades, numerous observational methods have been developed to assess the risk factors of work-related musculoskeletal disorders (WMSDs). Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods in this field. This study aimed to compare ergonomic risk assessment outputs from QEC and REBA in terms of agreement in the distribution of postural loading scores based on analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied. All jobs were observed by a trained occupational health practitioner. Job information was collected to ensure the completion of the ergonomic risk assessment tools, including QEC and REBA. The results revealed that there was a significant correlation between the final scores (r=0.731) and the action levels (r=0.893) of the two applied methods. Comparison between the action levels and final scores of the two methods showed that there was no significant difference among working departments. Most of the studied postures acquired low and moderate risk levels in the QEC assessment (low risk=20%, moderate risk=50%, high risk=30%) and in the REBA assessment (low risk=15%, moderate risk=60%, high risk=25%). There is a significant correlation between the two methods: they correlate strongly in identifying risky jobs and in determining the potential risk for incidence of WMSDs. Therefore, researchers may apply either method interchangeably for postural risk assessment in appropriate working environments.
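The agreement between the two tools rests on correlating their final scores across jobs. A minimal Pearson-correlation sketch (the scores below are invented for illustration, not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative final scores from two assessment tools on five jobs.
qec_scores = [32, 45, 51, 60, 72]
reba_scores = [4, 6, 7, 9, 11]
r = pearson_r(qec_scores, reba_scores)
```

A value of r near the study's 0.731 would likewise indicate strong but imperfect agreement between the two scoring systems.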
Failure analysis in the identification of synergies between cleaning monitoring methods.
Whiteley, Greg S; Derry, Chris; Glasbey, Trevor
2015-02-01
The 4 monitoring methods used to manage the quality assurance of cleaning outcomes within health care settings are visual inspection, microbial recovery, fluorescent marker assessment, and rapid ATP bioluminometry. These methods each generate different types of information, presenting a challenge to the successful integration of monitoring results. A systematic approach to safety and quality control can be used to interrogate the known qualities of cleaning monitoring methods and provide a prospective management tool for infection control professionals. We investigated the use of failure mode and effects analysis (FMEA) for measuring failure risk arising through each cleaning monitoring method. FMEA uses existing data in a structured risk assessment tool that identifies weaknesses in products or processes. Our FMEA approach used the literature and a small experienced team to construct a series of analyses to investigate the cleaning monitoring methods in a way that minimized identified failure risks. FMEA applied to each of the cleaning monitoring methods revealed failure modes for each. The combined use of cleaning monitoring methods in sequence is preferable to their use in isolation. When these 4 cleaning monitoring methods are used in combination in a logical sequence, the failure modes noted for any 1 can be complemented by the strengths of the alternatives, thereby circumventing the risk of failure of any individual cleaning monitoring method. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng
2013-01-01
New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the bivariate definition of a severe dust storm, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
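The abstract does not name the copula family used; as an illustration only, a Gumbel copula (a common choice for upper-tail-dependent extremes such as wind speed and duration) can convert marginal non-exceedance probabilities into a joint "AND" return period:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1 controls upper-tail dependence
    (theta = 1 reduces to independence, C(u, v) = u * v)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_and_return_period(u, v, theta, mu):
    """Return period of {X > x AND Y > y}, where u = F_X(x), v = F_Y(y)
    and mu is the mean interarrival time between events (years/event)."""
    p_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_exceed

# 79 storms in 19 years -> mean interarrival of roughly 0.24 years/event.
mu = 19.0 / 79.0
T_dep = joint_and_return_period(u=0.9, v=0.9, theta=2.0, mu=mu)
T_ind = joint_and_return_period(u=0.9, v=0.9, theta=1.0, mu=mu)
```

Comparing theta > 1 against theta = 1 shows how positive dependence between wind speed and duration shortens the joint "AND" return period relative to the independence assumption.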
Multiattribute risk analysis in nuclear emergency management.
Hämäläinen, R P; Lindstedt, M R; Sinkko, K
2000-08-01
Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful.
Lacny, Sarah; Wilson, Todd; Clement, Fiona; Roberts, Derek J; Faris, Peter; Ghali, William A; Marshall, Deborah A
2018-01-01
Kaplan-Meier survival analysis overestimates cumulative incidence in competing risks (CRs) settings. The extent of overestimation (or its clinical significance) has been questioned, and CRs methods are infrequently used. This meta-analysis compares the Kaplan-Meier method to the cumulative incidence function (CIF), a CRs method. We searched MEDLINE, EMBASE, BIOSIS Previews, Web of Science (1992-2016), and article bibliographies for studies estimating cumulative incidence using the Kaplan-Meier method and CIF. For studies with sufficient data, we calculated pooled risk ratios (RRs) comparing Kaplan-Meier and CIF estimates using DerSimonian and Laird random effects models. We performed stratified meta-analyses by clinical area, rate of CRs (CRs/events of interest), and follow-up time. Of 2,192 identified abstracts, we included 77 studies in the systematic review and meta-analyzed 55. The pooled RR demonstrated the Kaplan-Meier estimate was 1.41 [95% confidence interval (CI): 1.36, 1.47] times higher than the CIF. Overestimation was highest among studies with high rates of CRs [RR = 2.36 (95% CI: 1.79, 3.12)], studies related to hepatology [RR = 2.60 (95% CI: 2.12, 3.19)], and obstetrics and gynecology [RR = 1.84 (95% CI: 1.52, 2.23)]. The Kaplan-Meier method overestimated the cumulative incidence across 10 clinical areas. Using CRs methods will help ensure that accurate results inform clinical and policy decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
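The mechanism behind the overestimation can be shown on toy data: the Kaplan-Meier complement treats competing events as censoring, while the cumulative incidence function (in its Aalen-Johansen form) discounts each event-of-interest hazard by overall event-free survival. A minimal sketch assuming untied event times (the data are invented, not from the meta-analysis):

```python
def cuminc_km_vs_cif(records):
    """records: list of (time, status); status 1 = event of interest,
    2 = competing event, 0 = censored. Assumes no tied event times.
    Returns (1 - Kaplan-Meier survival, cumulative incidence function)
    evaluated after the last record."""
    s_km = 1.0    # KM survival with competing events treated as censored
    s_all = 1.0   # event-free survival from ALL event types (Aalen-Johansen)
    cif = 0.0
    at_risk = len(records)
    for _time, status in sorted(records):
        if status == 1:
            cif += s_all / at_risk          # hazard discounted by s_all
            s_km *= 1.0 - 1.0 / at_risk
            s_all *= 1.0 - 1.0 / at_risk
        elif status == 2:
            s_all *= 1.0 - 1.0 / at_risk    # competing event: only s_all drops
        at_risk -= 1                        # censoring just shrinks the risk set
    return 1.0 - s_km, cif

toy = [(1, 1), (2, 2), (3, 1), (4, 2), (5, 1),
       (6, 0), (7, 1), (8, 2), (9, 1), (10, 0)]
km_complement, cif = cuminc_km_vs_cif(toy)
```

On this toy cohort the 1 - KM estimate exceeds the CIF by roughly a factor of 1.4, the same order as the pooled ratio reported above.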
NASA Astrophysics Data System (ADS)
Grzyl, Beata; Kristowski, Adam; Miszewska-Urbańska, Emilia
2017-10-01
Each investment plan, including one concerning a building, is exposed to various types of threats. Therefore, in the case of large-scale, atypical and complicated building ventures, the actions that make up the risk management procedure (identification, analysis, measurement, control and supervision of risk) should be taken, so that the risk can be eliminated or limited. While preparing a building venture, an investor does not possess full information about the course of events at each stage of investment completion. Identifying these unknowns, quantifying them and specifying the method of dealing with them allows an investor to increase the effectiveness of the intended plan. The enterprise discussed in this article and analyzed in the context of risk concerns the alteration, revitalization and conversion for office purposes of two buildings located in Gdańsk at 1 and 2 Lastadia Street. These buildings are situated in the area of the historical urban layout of Gdańsk, in the north-eastern part of the Stare Przedmieście District (Old Suburb), about 800 meters south of Długi Targ Street and 200 meters west of the Old Motława River. The investor is "Gdańskie Melioracje Ltd.", a limited liability company which belongs to the Council of Gdańsk. In order to increase the effectiveness of the intended investment venture, while organizing the investment process, the investor commissioned the preparation of an analysis and risk evaluation connected with the above-mentioned intention.
Based on an on-site visit, the opinions of experts who have been involved in the preparation of the investment, studies of the available monographs on the technical condition of the buildings at 1 and 2 Lastadia Street, and their own experience, the authors identified 54 types of relevant risks, which have been systematized into 10 subject groups (among others: investor's risk due to the design process, location of the investment, third-party or investor business activity, force majeure, political, legal, financial, and technical risks). The scope of the study includes the identification, analysis and risk evaluation connected with planning and completion of the alteration, revitalization and conversion of a historic building located at 2 Lastadia Street for office purposes. The risk has been analyzed from the investor's perspective. The authors used the method of Preliminary Hazard Analysis (PHA) and the expert method.
White Paper: A Defect Prioritization Method Based on the Risk Priority Number
2013-11-01
The Failure Modes and Effects Analysis (FMEA) method employs a measurement technique called Risk Priority Number (RPN) to quantify the... [Table 1 – Time Scaling Factors; surviving rows: "Up to an hour" (16-60 min), factor 1.5; "Brief Interrupt" (0-15 min), factor 1.] In the FMEA formulation, RPN is a product of the three categories
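In the RPN formulation referenced above, the three categories multiply. A hedged sketch, in which the 1-10 ordinal scales and the multiplicative use of a time scaling factor are assumptions patterned on, not copied from, the white paper:

```python
# Minimal sketch of RPN-based defect prioritization; scales are assumed.

def risk_priority_number(severity: int, occurrence: int, detection: int) -> int:
    """RPN as the product of the three FMEA categories."""
    return severity * occurrence * detection

def scaled_rpn(rpn: float, time_factor: float) -> float:
    """Hypothetical adaptation: weight the RPN by a time scaling factor
    such as those in the white paper's Table 1."""
    return rpn * time_factor

base = risk_priority_number(severity=7, occurrence=5, detection=4)   # 140
prioritized = scaled_rpn(base, time_factor=1.5)                      # 210.0
```

Defects can then be ranked by the scaled value, so that a defect that interrupts work for up to an hour outranks an otherwise identical brief interrupt.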
ERIC Educational Resources Information Center
Fleary, Sasha A.
2017-01-01
Background: Several studies have used latent class analyses to explore obesogenic behaviors and substance use in adolescents independently. We explored a variety of health risks jointly to identify distinct patterns of risk behaviors among adolescents. Methods: Latent class models were estimated using Youth Risk Behavior Surveillance System…
NASA Astrophysics Data System (ADS)
Kataoka, Norio; Kasama, Kiyonobu; Zen, Kouki; Chen, Guangqi
This paper presents a probabilistic method for assessing the liquefaction risk of cement-treated ground, which is an anti-liquefaction ground improved by cement-mixing. In this study, the liquefaction potential of cement-treated ground is analyzed statistically using Monte Carlo simulation based on the nonlinear earthquake response analysis considering the spatial variability of soil properties. The seismic bearing capacity of partially liquefied ground is analyzed in order to estimate damage costs induced by partial liquefaction. Finally, the annual liquefaction risk is calculated by multiplying the liquefaction potential with the damage costs. The results indicated that the proposed new method makes it possible to evaluate the probability of liquefaction, to estimate the damage costs using the hazard curve and the fragility curve induced by liquefaction, and to derive the liquefaction risk curve.
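A minimal Monte Carlo sketch of the risk calculation described here, i.e. liquefaction probability times damage cost. The normal factor-of-safety model and the cost figure are illustrative assumptions; the study itself uses nonlinear earthquake response analysis with spatially variable soil properties:

```python
import random

def annual_liquefaction_risk(n_trials, sample_factor_of_safety, damage_cost):
    """Estimate P(liquefaction) as the fraction of Monte Carlo trials whose
    sampled factor of safety falls below 1, then multiply by damage cost."""
    failures = sum(1 for _ in range(n_trials)
                   if sample_factor_of_safety() < 1.0)
    p_liq = failures / n_trials
    return p_liq, p_liq * damage_cost

random.seed(42)

def sample_fs():
    # Assumed soil variability: mean factor of safety 1.2, std. dev. 0.3.
    return random.gauss(1.2, 0.3)

p, annual_risk = annual_liquefaction_risk(20_000, sample_fs,
                                          damage_cost=5_000_000)
```

Repeating the estimate over a range of ground motions would trace out the hazard, fragility and risk curves the abstract mentions.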
Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis
NASA Astrophysics Data System (ADS)
Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.
There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological material may not be readily adaptable for nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material because of the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application as well as promote the safe use/handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials, as well as the expected environmental impacts through the product life cycle. The stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials in different risk categories based on our current knowledge of nanomaterials' physico-chemical characteristics, variation in produced material, and best professional judgement. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
Ambler, Gareth; Omar, Rumana Z; Royston, Patrick
2007-06-01
Risk models that aim to predict the future course and outcome of disease processes are increasingly used in health research, and it is important that they are accurate and reliable. Most of these risk models are fitted using routinely collected data in hospitals or general practices. Clinical outcomes such as short-term mortality will be near-complete, but many of the predictors may have missing values. A common approach to dealing with this is to perform a complete-case analysis. However, this may lead to overfitted models and biased estimates if entire patient subgroups are excluded. The aim of this paper is to investigate a number of methods for imputing missing data to evaluate their effect on risk model estimation and the reliability of the predictions. Multiple imputation methods, including hotdecking and multiple imputation by chained equations (MICE), were investigated along with several single imputation methods. A large national cardiac surgery database was used to create simulated yet realistic datasets. The results suggest that complete case analysis may produce unreliable risk predictions and should be avoided. Conditional mean imputation performed well in our scenario, but may not be appropriate if using variable selection methods. MICE was amongst the best performing multiple imputation methods with regards to the quality of the predictions. Additionally, it produced the least biased estimates, with good coverage, and hence is recommended for use in practice.
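The regression-based imputation that MICE cycles over can be sketched in a few lines. This single-variable version is a deliberate simplification: real MICE iterates over several incomplete variables and adds random draws to each prediction to preserve variability, and the data below are made up for illustration:

```python
def impute_conditional_mean(x, y):
    """Fill missing y values (None) with predictions from a simple linear
    regression of y on x fitted to the complete cases -- the building block
    that chained-equations (MICE) imputation cycles over."""
    pairs = [(a, b) for a, b in zip(x, y) if b is not None]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    sxx = sum((a - mx) ** 2 for a, _ in pairs)
    sxy = sum((a - mx) * (b - my) for a, b in pairs)
    slope = sxy / sxx
    intercept = my - slope * mx
    return [b if b is not None else intercept + slope * a
            for a, b in zip(x, y)]

# Illustrative predictor with two missing outcome values.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, None, 8.1, None]
filled = impute_conditional_mean(x, y)
```

Unlike complete-case analysis, all five records remain available for fitting the risk model, which is the practical advantage the abstract highlights.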
From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment
NASA Astrophysics Data System (ADS)
Klose, M.; Damm, B.
2014-12-01
The nature of landslides is complex in many respects, with landslide hazard and impact being dependent on a variety of factors. This obviously requires an integrated assessment for fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of landslide impact in the past. A critical step for assessing landslide risk in integrated perspective is to analyze what types of landslide damage affected people and property in which way and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms as well as nonlinearity between landslide magnitude, damage intensity, and direct costs are some main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). Both approaches are able to complement each other, but yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts or new modeling tools expanding conventional methods are still widely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard or cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. 
Fundamental understanding of landslide risk also requires knowledge about the economic and fiscal relevance of landslide losses, so analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, the combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.
Chronic Disease Risks in Young Adults with Autism Spectrum Disorder: Forewarned Is Forearmed
ERIC Educational Resources Information Center
Tyler, Carl V.; Schramm, Sarah C.; Karafa, Matthew; Tang, Anne S.; Jain, Anil K.
2011-01-01
An emerging, cost-effective method to examine prevalent and future health risks of persons with disabilities is electronic health record (EHR) analysis. As an example, a case-control EHR analysis of adults with autism spectrum disorder receiving primary care through the Cleveland Clinic from 2005 to 2008 identified 108 adults with autism spectrum…
Tang, Jingyuan; Xu, Lingyan; Xu, Haoxiang; Li, Ran; Han, Peng; Yang, Haiwei
2017-01-01
Previous studies have investigated the association between NAT2 polymorphism and the risk of prostate cancer (PCa). However, the findings from these studies remained inconsistent. Hence, we performed a meta-analysis to provide a more reliable conclusion about such associations. In the present meta-analysis, 13 independent case-control studies were included, with a total of 14,469 PCa patients and 10,689 controls. All relevant published studies were searched in the databases PubMed, EMBASE, and Web of Science, through March 1, 2017. We used the pooled odds ratios (ORs) with 95% confidence intervals (CIs) to evaluate the strength of the association between the NAT2*4 allele and susceptibility to PCa. Subgroup analysis was carried out by ethnicity, source of controls, and genotyping method. In addition, we performed trial sequential analysis (TSA) to reduce the risk of type I error and evaluate whether the evidence of the results was firm. Firstly, our results indicated that the NAT2*4 allele was not associated with PCa susceptibility (OR = 1.00, 95% CI = 0.95–1.05; P = 0.100). After excluding two studies for heterogeneity and publication bias, still no significant relationship was detected between the NAT2*4 allele and the increased risk of PCa in the fixed-effect model (OR = 0.99, 95% CI = 0.94–1.04; P = 0.451). Meanwhile, no significant increased risk of PCa was found in the subgroup analyses by ethnicity, source of controls, and genotyping method. Moreover, TSA demonstrated that such association was confirmed in the present study. Therefore, this meta-analysis suggests that there is no significant association between NAT2 polymorphism and the risk of PCa. PMID:28915684
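A sketch of the inverse-variance pooling behind such fixed-effect summary ORs, recovering each study's log-OR standard error from its 95% CI. The two example studies are invented, not those in the meta-analysis:

```python
import math

def pooled_or_fixed(ors, cis):
    """Inverse-variance fixed-effect pooling of odds ratios on the log scale.
    Each CI is a (lower, upper) 95% interval; SE = (ln U - ln L) / (2 * 1.96).
    Returns (pooled OR, (lower, upper) 95% CI)."""
    weights, wlogs = [], []
    for or_, (lo, hi) in zip(ors, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        weights.append(w)
        wlogs.append(w * math.log(or_))
    log_pooled = sum(wlogs) / sum(weights)
    se_pooled = 1.0 / math.sqrt(sum(weights))
    return (math.exp(log_pooled),
            (math.exp(log_pooled - 1.96 * se_pooled),
             math.exp(log_pooled + 1.96 * se_pooled)))

# Two illustrative studies: OR 1.2 (0.9-1.6) and OR 0.9 (0.7-1.15).
pooled, (lo, hi) = pooled_or_fixed([1.2, 0.9], [(0.9, 1.6), (0.7, 1.15)])
```

When the pooled CI spans 1.0, as in the abstract's OR = 1.00 (0.95–1.05), no significant association is concluded.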
Liu, J; Li, Y P; Huang, G H; Zeng, X T; Nie, S
2016-01-01
In this study, an interval-stochastic-based risk analysis (RSRA) method is developed for supporting river water quality management in a rural system under uncertainty (i.e., uncertainties exist in a number of system components as well as their interrelationships). The RSRA method is effective in risk management and policy analysis, particularly when the inputs (such as allowable pollutant discharge and pollutant discharge rate) are expressed as probability distributions and interval values. Moreover, decision-makers' attitudes towards system risk can be reflected using a restricted resource measure by controlling the variability of the recourse cost. The RSRA method is then applied to a real case of water quality management in the Heshui River Basin (a rural area of China), where chemical oxygen demand (COD), total nitrogen (TN), total phosphorus (TP), and soil loss are selected as major indicators to identify the water pollution control strategies. Results reveal that uncertainties and risk attitudes have significant effects on both pollutant discharge and system benefit. A high risk measure level can lead to a reduced system benefit; however, this reduction also corresponds to raised system reliability. Results also disclose that (a) agriculture is the dominant contributor to soil loss, TN, and TP loads, and abatement actions should be mainly carried out for paddy and dry farms; (b) livestock husbandry is the main COD discharger, and abatement measures should be mainly conducted for poultry farms; (c) fishery accounts for a high percentage of TN, TP, and COD discharges but has a low percentage of overall net benefit, and it may be beneficial to cease fishery activities in the basin. The findings can facilitate the local authority in identifying desired pollution control strategies with the tradeoff between socioeconomic development and environmental sustainability.
Creating a spatially-explicit index: a method for assessing the global wildfire-water risk
NASA Astrophysics Data System (ADS)
Robinne, François-Nicolas; Parisien, Marc-André; Flannigan, Mike; Miller, Carol; Bladon, Kevin D.
2017-04-01
The wildfire-water risk (WWR) has been defined as the potential for wildfires to adversely affect water resources that are important for downstream ecosystems and human water needs for adequate water quantity and quality, therefore compromising the security of their water supply. While tools and methods are numerous for watershed-scale risk analysis, the development of a toolbox for the large-scale evaluation of the wildfire risk to water security has only started recently. In order to provide managers and policy-makers with an adequate tool, we implemented a method for the spatial analysis of the global WWR based on the Driving forces-Pressures-States-Impacts-Responses (DPSIR) framework. This framework relies on the cause-and-effect relationships existing between the five categories of the DPSIR chain. As this approach heavily relies on data, we gathered an extensive set of spatial indicators relevant to fire-induced hydrological hazards and water consumption patterns by human and natural communities. When appropriate, we applied a hydrological routing function to our indicators in order to simulate downstream accumulation of potentially harmful material. Each indicator was then assigned a DPSIR category. We collapsed the information in each category using a principal component analysis in order to extract the most relevant pixel-based information provided by each spatial indicator. Finally, we compiled our five categories using an additive indexation process to produce a spatially-explicit index of the WWR. A thorough sensitivity analysis has been performed in order to understand the relationship between the final risk values and the spatial pattern of each category used during the indexation. For comparison purposes, we aggregated index scores by global hydrological regions, or hydrobelts, to get a sense of regional DPSIR specificities. 
This rather simple method does not necessitate the use of complex physical models and provides a scalable and efficient tool for the analysis of global water security issues.
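The collapse-and-index chain described above (per-category principal component extraction followed by additive indexation) can be sketched as follows. The indicator values and the number of categories here are hypothetical; real use would operate on rasterized spatial layers, one value per pixel.

```python
import numpy as np

def first_pc_scores(indicators):
    """Project standardized indicator columns onto their first
    principal component (via SVD) to collapse one DPSIR category
    into a single score per pixel."""
    X = np.asarray(indicators, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    # First right-singular vector of the standardized matrix is
    # the leading principal axis.
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]

def wwr_index(categories):
    """Additive indexation: rescale each category score to [0, 1]
    and sum across categories (five in the DPSIR framework)."""
    total = np.zeros(len(categories[0]))
    for scores in categories:
        s = first_pc_scores(scores)
        s = (s - s.min()) / (s.max() - s.min())
        total += s
    return total

# Hypothetical data: 6 pixels, two categories of 3 indicators each.
rng = np.random.default_rng(0)
cats = [rng.random((6, 3)), rng.random((6, 3))]
index = wwr_index(cats)
```

With five categories, the resulting index ranges from 0 to 5; here, with two illustrative categories, from 0 to 2.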
GIS-Based Multi-Criteria Decision Analysis for Forest Fire Risk Mapping
NASA Astrophysics Data System (ADS)
Akay, A. E.; Erdoğan, A.
2017-11-01
The forested areas along the coastal zone of the Mediterranean region in Turkey are classified as first-degree fire-sensitive areas. Forest fires are a major environmental disaster that affects the sustainability of forest ecosystems. Besides, forest fires result in important economic losses and even threaten human lives. Thus, it is critical to determine the forested areas with fire risks and thereby minimize the damage to forest resources by taking the necessary precautions in these areas. The risk of forest fire can be assessed based on various factors such as forest vegetation structure (tree species, crown closure, tree stage), topographic features (slope and aspect), and climatic parameters (temperature, wind). In this study, a GIS-based Multi-Criteria Decision Analysis (MCDA) method was used to generate a forest fire risk map. The study was implemented in the forested areas within the Yayla Forest Enterprise Chiefs at the Dursunbey Forest Enterprise Directorate, which is classified as a first-degree fire-sensitive area. In the solution process, the "extAhp 2.0" plug-in running the Analytic Hierarchy Process (AHP) method in ArcGIS 10.4.1 was used to categorize the study area into five fire risk classes ranging from extreme risk to low risk. The results indicated that 23.81 % of the area was of extreme risk, while 25.81 % was of high risk. The results also indicated that the most effective criterion was tree species, followed by tree stage; aspect was the least effective criterion for forest fire risk. It was revealed that GIS techniques integrated with MCDA methods are effective tools to quickly estimate forest fire risk at low cost. The integration of these factors into GIS can be very useful for determining forested areas with high fire risk and also for planning forestry management after fire.
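A minimal sketch of the AHP weighting step that tools such as the extAhp plug-in automate: derive criterion weights from a pairwise comparison matrix and check judgment consistency. The pairwise judgments below are hypothetical, not the study's values.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four fire-risk criteria
# (tree species, tree stage, slope, aspect) on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

# Geometric-mean approximation of the principal eigenvector.
g = np.prod(A, axis=1) ** (1 / A.shape[0])
weights = g / g.sum()

# Consistency check: lambda_max from A @ w; RI = 0.90 for n = 4.
n = A.shape[0]
lambda_max = float(np.mean((A @ weights) / weights))
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.90

# CR < 0.10 indicates acceptable consistency of the judgments.
```

The weights would then multiply the reclassified GIS layers to produce the composite risk map.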
[Study on the risk assessment method of regional groundwater pollution].
Yang, Yan; Yu, Yun-Jiang; Wang, Zong-Qing; Li, Ding-Long; Sun, Hong-Wei
2013-02-01
Based on the boundary elements of system risk assessment, a regional groundwater pollution risk assessment index system was preliminarily established, comprising regional groundwater specific vulnerability assessment, assessment of regional pollution source characteristics, and health risk assessment of regional featured pollutants. The three sub-evaluation systems were coupled with the multi-index comprehensive method, the risk was characterized with the spatial analysis tools of ArcMap, and a new method was established to evaluate regional groundwater pollution risk that is suitable for regions with different natural conditions and different types of pollution. Taking Changzhou as an example, the risk of shallow groundwater pollution was studied with the new method. It was found that the vulnerability index of groundwater in Changzhou is high and unevenly distributed; the distribution of pollution sources is concentrated and has a great impact on groundwater pollution risk; and, influenced by the pollutants and pollution sources, health risk values are high in the urban area of Changzhou. The pollution risk of shallow groundwater is high, unevenly distributed, and concentrated north of the Anjia-Xuejia-Zhenglu line, in the city center, and in the southeast, where human activities are more intense and pollution sources are dense.
NASA Astrophysics Data System (ADS)
Hu, Xiaojing; Li, Qiang; Zhang, Hao; Guo, Ziming; Zhao, Kun; Li, Xinpeng
2018-06-01
Based on the Monte Carlo method, an improved risk assessment method is proposed for hybrid AC/DC power systems with VSC stations, considering the operation status of generators, converter stations, AC lines, and DC lines. Using the sequential AC/DC power flow algorithm, node voltages and line active powers are solved, and the operational risk indices for node voltage over-limit and line active power over-limit are then calculated. Finally, an improved two-area IEEE RTS-96 system is taken as a case to analyze and assess its operational risk. The results show that the proposed model and method can intuitively and directly reveal the weak nodes and weak lines of the system, which can provide a useful reference for the dispatching department.
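The sampling core of such a Monte Carlo assessment can be sketched as below. The outage rates are hypothetical, and the violation rule is a deliberately simple stand-in for a real sequential AC/DC power-flow solution, which is far beyond a short example.

```python
import random

def sample_state(outage_probs, rng):
    """Sample an up/down state for each component (generator,
    converter station, AC line, DC line) from its outage rate."""
    return {name: rng.random() >= p for name, p in outage_probs.items()}

def voltage_over_limit(state):
    """Stand-in for a power-flow check: flag a violation when both
    AC lines are out, so the example stays self-contained."""
    return not (state["ac_line_1"] or state["ac_line_2"])

def risk_index(outage_probs, n_samples=20000, seed=1):
    """Estimate the over-limit risk index as the fraction of sampled
    system states that violate the limit."""
    rng = random.Random(seed)
    hits = sum(voltage_over_limit(sample_state(outage_probs, rng))
               for _ in range(n_samples))
    return hits / n_samples

probs = {"generator": 0.05, "vsc_station": 0.02,
         "ac_line_1": 0.1, "ac_line_2": 0.1, "dc_line": 0.03}
risk = risk_index(probs)  # analytically about 0.1 * 0.1 = 0.01 here
```

In the real method each sampled state would feed the sequential AC/DC power flow, and separate indices would be accumulated for voltage and active-power violations.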
Capodaglio, E M; Facioli, M; Bazzini, G
2001-01-01
Pathologies due to repetitive activity of the upper limbs constitute a growing share of work-related musculoskeletal disorders. At the moment, there are no universally accepted and validated methods for describing and assessing the associated work-related risks. Yet the criteria that fundamentally characterize the exposure are rather clear and consistent. This study reports a practical example of the application of some recent risk assessment methods proposed in the literature, combining objective and subjective measures obtained in the field with traditional activity analysis.
A time series modeling approach in risk appraisal of violent and sexual recidivism.
Bani-Yaghoub, Majid; Fedoroff, J Paul; Curry, Susan; Amundsen, David E
2010-10-01
For over half a century, various clinical and actuarial methods have been employed to assess the likelihood of violent recidivism. Yet there is a need for new methods that can improve the accuracy of recidivism predictions. This study proposes a new time series modeling approach that generates high levels of predictive accuracy over short and long periods of time. The proposed approach outperformed two widely used actuarial instruments (i.e., the Violence Risk Appraisal Guide and the Sex Offender Risk Appraisal Guide). Furthermore, analysis of temporal risk variations based on specific time series models can add valuable information into risk assessment and management of violent offenders.
Sun Protection Motivational Stages and Behavior: Skin Cancer Risk Profiles
ERIC Educational Resources Information Center
Pagoto, Sherry L.; McChargue, Dennis E.; Schneider, Kristin; Cook, Jessica Werth
2004-01-01
Objective: To create skin cancer risk profiles that could be used to predict sun protection among Midwest beachgoers. Method: Cluster analysis was used with study participants (N=239), who provided information about sun protection motivation and behavior, perceived risk, burn potential, and tan importance. Participants were clustered according to…
Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig
2012-09-01
Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong," and food handling practices associated with a high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys has been developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results using the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of risks associated with domestic food handling practices. The method distinguished important violations from minor errors that are performed by most people and are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.
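The contrast between binary scoring and risk-based grading can be illustrated with a toy survey. The items and weights below are invented for illustration, not taken from the study's HACCP-derived instrument.

```python
# Hypothetical survey items: each answer is marked correct/incorrect,
# and separately carries a risk weight reflecting infection hazard
# (higher = riskier practice), loosely in the spirit of HACCP grading.
responses = [
    {"item": "eats undercooked chicken",    "correct": False, "risk_weight": 10},
    {"item": "leaves leftovers out >2h",    "correct": False, "risk_weight": 3},
    {"item": "washes hands before cooking", "correct": True,  "risk_weight": 0},
    {"item": "rinses raw chicken",          "correct": False, "risk_weight": 2},
]

# Traditional scoring: every violation counts the same.
violations = sum(not r["correct"] for r in responses)

# Risk-based grading: violations are weighted by associated risk.
risk_score = sum(r["risk_weight"] for r in responses if not r["correct"])
```

Here the single riskiest practice dominates the risk-based score even though it is only one of three "wrong" answers, which is exactly the distinction binary scoring loses.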
Decerns: A framework for multi-criteria decision analysis
Yatsalo, Boris; Didenko, Vladimir; Gritsyuk, Sergey; ...
2015-02-27
A new framework, Decerns, for multicriteria decision analysis (MCDA) of a wide range of practical problems in risk management is introduced. The Decerns framework contains a library of modules that are the basis for two scalable systems: DecernsMCDA for analysis of multicriteria problems, and DecernsSDSS for multicriteria analysis of spatial options. DecernsMCDA includes well-known MCDA methods and original methods for uncertainty treatment based on probabilistic approaches and fuzzy numbers. These MCDA methods are described along with a case study on the analysis of a multicriteria location problem.
Study on Collision of Ship Side Structure by Simplified Plastic Analysis Method
NASA Astrophysics Data System (ADS)
Sun, C. J.; Zhou, J. H.; Wu, W.
2017-10-01
During its lifetime, a ship may encounter collision or grounding and sustain permanent damage after these types of accidents. Crashworthiness assessment has been based on two main methods: simplified plastic analysis and numerical simulation. A simplified plastic analysis method is presented in this paper. Numerical simulations using the non-linear finite-element software LS-DYNA are conducted to validate the method. The results show that the simplified plastic analysis is in good agreement with the finite-element simulation in terms of accuracy, which reveals that the simplified plastic analysis method can quickly and accurately estimate the crashworthiness of the side structure during the collision process and can be used as a reliable risk assessment method.
Steele, Jennifer A; Richter, Carsten H; Echaubard, Pierre; Saenna, Parichat; Stout, Virginia; Sithithaworn, Paiboon; Wilcox, Bruce A
2018-05-17
Cholangiocarcinoma (CCA) is a fatal bile duct cancer associated with infection by the liver fluke, Opisthorchis viverrini, in the lower Mekong region. Numerous public health interventions have focused on reducing exposure to O. viverrini, but incidence of CCA in the region remains high. While this may indicate the inefficacy of public health interventions due to complex social and cultural factors, it may further indicate other risk factors or interactions with the parasite are important in pathogenesis of CCA. This systematic review aims to provide a comprehensive analysis of described risk factors for CCA in addition to O. viverrini to guide future integrative interventions. We searched five international and seven Thai research databases to identify studies relevant to risk factors for CCA in the lower Mekong region. Selected studies were assessed for risk of bias and quality in terms of study design, population, CCA diagnostic methods, and statistical methods. The final 18 included studies reported numerous risk factors which were grouped into behaviors, socioeconomics, diet, genetics, gender, immune response, other infections, and treatment for O. viverrini. Seventeen risk factors were reported by two or more studies and were assessed with random effects models during meta-analysis. This meta-analysis indicates that the combination of alcohol and smoking (OR = 11.1, 95% CI: 5.63-21.92, P < 0.0001) is most significantly associated with increased risk for CCA and is an even greater risk factor than O. viverrini exposure. This analysis also suggests that family history of cancer, consumption of raw cyprinoid fish, consumption of high nitrate foods, and praziquantel treatment are associated with significantly increased risk. These risk factors may have complex relationships with the host, parasite, or pathogenesis of CCA, and many of these risk factors were found to interact with each other in one or more studies. 
Our findings suggest that a complex variety of risk factors in addition to O. viverrini infection should be addressed in future public health interventions to reduce CCA in affected regions. In particular, smoking and alcohol use, dietary patterns, and socioeconomic factors should be considered when developing intervention programs to reduce CCA.
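A random-effects meta-analysis of the kind reported here is commonly computed with the DerSimonian-Laird estimator of between-study variance. The sketch below pools hypothetical log odds ratios; the inputs are illustrative, not the review's data.

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooling of study log odds ratios using the
    DerSimonian-Laird estimate of between-study variance tau^2."""
    w = [1 / v for v in variances]
    ybar = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    # Cochran's Q and the moment estimate of tau^2.
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, log_ors))
    df = len(log_ors) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # Re-weight with tau^2 added to each study's variance.
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)))

# Hypothetical log ORs and variances from three studies of one factor.
or_pooled, ci = dersimonian_laird([0.8, 1.1, 0.5], [0.04, 0.09, 0.06])
```

Pooling on the log scale and exponentiating at the end yields the OR and 95% CI in the form reported in the abstract.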
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents.
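The gate arithmetic for fuzzy fault tree analysis with trapezoidal fuzzy probabilities can be sketched as follows. The basic events and their fuzzy numbers are hypothetical stand-ins for elicited expert data, and a real analysis would also compute cut sets and importance measures.

```python
def fuzzy_and(events):
    """AND gate: the output event needs every input event, so multiply
    trapezoidal fuzzy probabilities (a, b, c, d) component-wise.
    Valid because the product is monotone increasing on [0, 1]."""
    out = (1.0, 1.0, 1.0, 1.0)
    for e in events:
        out = tuple(x * y for x, y in zip(out, e))
    return out

def fuzzy_or(events):
    """OR gate: 1 - prod(1 - p), again applied component-wise
    (also monotone increasing in each input)."""
    out = (0.0, 0.0, 0.0, 0.0)
    for e in events:
        out = tuple(1 - (1 - x) * (1 - y) for x, y in zip(out, e))
    return out

# Hypothetical trapezoidal fuzzy probabilities for three basic events
# of a coal-dust-explosion fault tree (dust cloud, ignition source,
# confined space); an AND gate combines them into the top event.
dust     = (0.10, 0.15, 0.20, 0.25)
ignition = (0.01, 0.02, 0.03, 0.05)
confined = (0.30, 0.40, 0.50, 0.60)
top = fuzzy_and([dust, ignition, confined])
```

Defuzzifying `top` (e.g., by averaging its four parameters) would give a crisp occurrence probability for the explosion.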
Comparison of concepts in easy-to-use methods for MSD risk assessment.
Roman-Liu, Danuta
2014-05-01
This article presents a comparative analysis of easy-to-use methods for assessing musculoskeletal load and the risk for developing musculoskeletal disorders. In all such methods, assessment of load consists in defining input data, the procedure and the system of assessment. This article shows what assessment steps the methods have in common; it also shows how those methods differ in each step. In addition, the methods are grouped according to their characteristic features. The conclusion is that the concepts of assessing risk in different methods can be used to develop solutions leading to a comprehensive method appropriate for all work tasks and all parts of the body. However, studies are necessary to verify the accepted premises and to introduce some standardization that would make consolidation possible. Copyright © 2013 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Protecting privacy of shared epidemiologic data without compromising analysis potential.
Cologne, John; Grant, Eric J; Nakashima, Eiji; Chen, Yun; Funamoto, Sachiyo; Katayama, Hiroaki
2012-01-01
Ensuring privacy of research subjects when epidemiologic data are shared with outside collaborators involves masking (modifying) the data, but overmasking can compromise utility (analysis potential). Methods of statistical disclosure control for protecting privacy may be impractical for individual researchers involved in small-scale collaborations. We investigated a simple approach based on measures of disclosure risk and analytical utility that are straightforward for epidemiologic researchers to derive. The method is illustrated using data from the Japanese Atomic-bomb Survivor population. Masking by modest rounding did not adequately enhance security but rounding to remove several digits of relative accuracy effectively reduced the risk of identification without substantially reducing utility. Grouping or adding random noise led to noticeable bias. When sharing epidemiologic data, it is recommended that masking be performed using rounding. Specific treatment should be determined separately in individual situations after consideration of the disclosure risks and analysis needs.
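Masking by rounding to a fixed number of significant digits, i.e., removing relative accuracy rather than a fixed number of decimals, might look like the sketch below; the variable and error tolerance are illustrative, not the study's.

```python
import math
import random

def mask(value, sig_digits):
    """Round to a fixed number of significant digits, removing
    relative accuracy rather than decimal places."""
    if value == 0:
        return 0.0
    magnitude = math.floor(math.log10(abs(value)))
    factor = 10 ** (magnitude - sig_digits + 1)
    return round(value / factor) * factor

# Hypothetical continuous variable (e.g., an exposure estimate):
# coarse masking keeps the analysis-relevant scale while making
# exact values, and hence record linkage, harder.
random.seed(0)
values = [random.uniform(0.001, 2.5) for _ in range(5)]
masked = [mask(v, 2) for v in values]
```

At 2 significant digits the relative error stays below about 5%, preserving utility, while many distinct records collapse onto the same masked value, reducing disclosure risk.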
NASA Astrophysics Data System (ADS)
Ren, Lixia; He, Li; Lu, Hongwei; Chen, Yizhong
2016-08-01
A new Monte Carlo-based interval transformation analysis (MCITA) is used in this study for multi-criteria decision analysis (MCDA) of naphthalene-contaminated groundwater management strategies. The analysis can be conducted when input data such as total cost, contaminant concentration, and health risk are represented as intervals. Compared to traditional MCDA methods, MCITA-MCDA has the advantages of (1) dealing with the inexactness of input data represented as intervals, (2) mitigating computational time through the Monte Carlo sampling method, and (3) identifying the most desirable management strategies under data uncertainty. A real-world case study is employed to demonstrate the performance of this method. A set of inexact management alternatives is considered for each duration on the basis of four criteria. Results indicated that the most desirable management strategy was action 15 for the 5-year, action 8 for the 10-year, action 12 for the 15-year, and action 2 for the 20-year management period.
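A minimal sketch of the interval-sampling idea behind MCITA: treat each criterion of each alternative as an interval, draw point realizations, and track how often each alternative scores best. The alternatives, weights, and intervals below are hypothetical, and the real method uses four criteria and many more actions.

```python
import random

def mcita_rank(alternatives, weights, n_samples=5000, seed=2):
    """Each criterion of each alternative is an interval [lo, hi];
    sample point values, compute a weighted cost score, and count
    how often each alternative is best (lower score = better)."""
    rng = random.Random(seed)
    wins = {name: 0 for name in alternatives}
    for _ in range(n_samples):
        scores = {
            name: sum(w * rng.uniform(lo, hi)
                      for w, (lo, hi) in zip(weights, crits))
            for name, crits in alternatives.items()
        }
        wins[min(scores, key=scores.get)] += 1
    return {name: n / n_samples for name, n in wins.items()}

# Hypothetical alternatives with interval-valued cost, concentration,
# and health-risk criteria (all "smaller is better").
alts = {
    "action_A": [(1.0, 2.0), (0.2, 0.5), (0.1, 0.3)],
    "action_B": [(1.5, 3.0), (0.4, 0.9), (0.2, 0.6)],
}
win_rates = mcita_rank(alts, weights=[0.5, 0.3, 0.2])
```

The alternative with the highest win rate is the one most often most desirable across realizations of the interval data.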
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks; it represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem?
Users of the RoSA methodology have characterized it as a true bottom-up risk assessment.
Asgari Dastjerdi, Hoori; Khorasani, Elahe; Yarmohammadian, Mohammad Hossein; Ahmadzade, Mahdiye Sadat
2017-01-01
Background: Medical errors are one of the greatest problems in any healthcare system. The best way to prevent such problems is to identify errors and their roots. The Failure Mode and Effects Analysis (FMEA) technique is a prospective risk analysis method. This study is a review of risk analyses using the FMEA technique in different hospital wards and departments. Methods: This paper systematically investigated the available databases. After selecting inclusion and exclusion criteria, the related studies were found. This selection was made in two steps. First, the abstracts and titles were investigated by the researchers; after omitting papers that did not meet the inclusion criteria, 22 papers were finally selected and their texts were thoroughly examined. Results: The examined papers had focused mostly on processes and had been conducted in pediatric wards and radiology departments, and most participants were nursing staff. Many of these papers attempted to express almost all the steps of model implementation; after implementing strategies and interventions, the Risk Priority Number (RPN) was calculated to determine the degree of the technique's effect. However, these papers paid less attention to the identification of risk effects. Conclusions: The study revealed that a small number of studies failed to show effects of the FMEA technique. In general, however, most of the studies recommended this technique and considered it a useful and efficient method for reducing the number of risks and improving service quality.
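The RPN calculation at the heart of FMEA prioritization is simple to sketch. The failure modes and scores below are invented for illustration, not drawn from the reviewed studies.

```python
# Hypothetical failure modes with Severity, Occurrence, and Detection
# scores on the usual 1-10 scales; RPN = S * O * D.
failure_modes = [
    {"mode": "wrong drug dose dispensed", "S": 9, "O": 3, "D": 4},
    {"mode": "mislabeled sample",         "S": 7, "O": 5, "D": 6},
    {"mode": "delayed report",            "S": 4, "O": 6, "D": 2},
]

for fm in failure_modes:
    fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

# Rank by RPN to prioritize mitigation; re-scoring after an
# intervention and recomputing RPN measures its effect.
ranked = sorted(failure_modes, key=lambda fm: fm["RPN"], reverse=True)
```

Note how a moderately severe but frequent, hard-to-detect mode can outrank a more severe one, which is why RPN is recomputed after each intervention.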
ERIC Educational Resources Information Center
Kilcommons, Aoiffe M.; Withers, Paul; Moreno-Lopez, Agueda
2012-01-01
Background: Involving ID service users in risk decision making necessitates consideration of an individual's ability to assess the implications and associated risks and thus make an informed choice. This calls for research on service users' awareness and understanding of risk management (RM). Method: Thirteen people in a residential ID service who…
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step in determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identify the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
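A stripped-down version of the Monte Carlo step: sample uncertain parameters, propagate them through a process model, and estimate the probability that a quality attribute leaves its specification. The response surface, distributions, and specification here are invented, not the paper's model.

```python
import random

def prob_of_failure(n=50000, seed=3):
    """Sample two hypothetical critical material/process parameters
    from distributions reflecting their uncertainty, push them
    through a stand-in process model, and estimate the probability
    that the quality attribute fails its specification
    (here: dissolution must stay above 80%)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        particle_size = rng.gauss(50, 8)    # hypothetical, microns
        compression = rng.gauss(12, 1.5)    # hypothetical, kN
        # Stand-in response surface for the quality attribute.
        dissolution = 115 - 0.4 * particle_size - 0.8 * compression
        if dissolution < 80:
            failures += 1
    return failures / n

p_fail = prob_of_failure()
```

Repeating this over a grid of candidate operating set points, and keeping only those with acceptably small `p_fail`, traces out a probability-based design space boundary.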
2016-02-01
We live in an age that increasingly calls for national or regional management of global risks. This article discusses the contributions that expert elicitation can bring to efforts to manage global risks and identifies challenges faced in conducting expert elicitation at this scale. In doing so it draws on lessons learned from conducting an expert elicitation as part of the World Health Organization's (WHO) initiative to estimate the global burden of foodborne disease, a study commissioned by the Foodborne Disease Epidemiology Reference Group (FERG). Expert elicitation is designed to fill gaps in data and research using structured, transparent methods. Such gaps are a significant challenge for global risk modeling. Experience with the WHO FERG expert elicitation shows that it is feasible to conduct an expert elicitation at a global scale, but that challenges do arise, including: defining an informative yet feasible geographical structure for the elicitation; defining what constitutes expertise in a global setting; structuring international, multidisciplinary expert panels; and managing demands on experts' time in the elicitation. This article was written as part of a workshop, "Methods for Research Synthesis: A Cross-Disciplinary Approach," held at the Harvard Center for Risk Analysis on October 13, 2013. © 2016 Society for Risk Analysis.
[The role of a specialised risk analysis group in the Veterinary Services of a developing country].
Urbina-Amarís, M E
2003-08-01
Since the World Trade Organization (WTO) Agreement on the Application of Sanitary and Phytosanitary Measures was established, risk analysis in trade, and ultimately in Veterinary and Animal Health Services, has become strategically important. Irrespective of how it is conceived (as a discipline, approach, method, or process), all types of risk analysis in trade involve four periods or phases: risk identification, risk assessment, risk management, and risk information or communication. All veterinarians involved in a risk analysis unit must have in-depth knowledge of statistics and the epidemiology of transmissible diseases, as well as a basic knowledge of veterinary science, economics, mathematics, data processing, and social communication, to enable them to work with professionals in these disciplines. Many developing countries do not have enough well-qualified professionals in these areas to support a risk analysis unit. This will need to be rectified by seeking strategic alliances with other public or private sectors that will provide the required support to run the unit properly. Due to the special nature of its risk analysis functions, its role in supporting decision-making, and the criteria of independence and transparency that are so crucial to its operations, the risk analysis unit should be positioned close to the top management of the Veterinary Service. Due to the shortage of personnel in developing countries with the required training and scientific and technical qualifications, countries with organisations responsible for both animal and plant health protection would be advised to set up integrated plant and animal risk analysis units. In addition, these units could take charge of all activities relating to WTO agreements and regional agreements on animal and plant health management.
Unplanned pregnancy: does past experience influence the use of a contraceptive method?
Matteson, Kristen A; Peipert, Jeffrey F; Allsworth, Jenifer; Phipps, Maureen G; Redding, Colleen A
2006-01-01
To investigate whether women between the ages of 14 and 25 years with a past unplanned pregnancy were more likely to use a contraceptive method compared with women without a history of unplanned pregnancy. We analyzed baseline data of 424 nonpregnant women between the ages of 14 and 25 years enrolled in a randomized trial to prevent sexually transmitted diseases and unplanned pregnancy (Project PROTECT). Women at high risk for sexually transmitted diseases or unplanned pregnancy were included. Participants completed a demographic, substance use, and reproductive health questionnaire. We compared women with and without a history of unplanned pregnancy using bivariate analysis and log binomial regression. The prevalence of past unplanned pregnancy in this sample was 43%. Women reporting an unplanned pregnancy were older, had less education, and were more likely to be of nonwhite race or ethnicity. History of an unplanned pregnancy was not associated with use of a contraceptive method (relative risk 1.01, 95% confidence interval 0.87-1.16) in bivariate analysis or when potential confounders were accounted for in the analysis (adjusted relative risk 1.10, 95% confidence interval 0.95-1.28). Several factors were associated with both unplanned pregnancy and overall contraceptive method use in this population. However, a past unplanned pregnancy was not associated with overall contraceptive method use. Future studies are necessary to investigate the complex relationship between unplanned pregnancy and contraceptive method use. II-2.
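A relative risk and its confidence interval of the kind reported above are computed from a 2x2 table; here is a sketch with invented counts, not the study's data.

```python
import math

def relative_risk(a, b, c, d):
    """Relative risk of an outcome for exposed vs. unexposed from a
    2x2 table: a/b = exposed with/without the outcome, c/d =
    unexposed with/without it; 95% CI computed on the log scale."""
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, (lo, hi)

# Hypothetical counts: contraceptive use among women with and
# without a past unplanned pregnancy (illustrative numbers only).
rr, ci = relative_risk(120, 62, 160, 82)
```

When the confidence interval straddles 1, as in the study's result, the data are consistent with no association between exposure and outcome.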
Finding a Method for the Madness: A Comparative Analysis of Strategic Design Methodologies
Donnelly, Amanda
2017-06-01
This thesis develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and…
NASA Astrophysics Data System (ADS)
Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori
This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be influenced by failures and operational mistakes of the DMCS. To avoid such situations, sufficient risk assessment has to be conducted for the DMCS and precautions taken. We propose an operational RM method using FMEA for DMCS. To develop the method, we gathered and compared FMEA results for DMCS and developed a list that contains failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find some failures that had not been found before.
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process and fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system for risk assessment based on the DRASTIC model is established. Due to the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weight of each factor, and fuzzy set theory is adopted to calculate the membership degree of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with linguistic and mechanism uncertainty in groundwater contamination without losing important information.
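The fuzzy comprehensive evaluation step can be sketched as a weight vector composed with a membership matrix. The factors, weights, and membership degrees below are hypothetical, and real weights would come from the AHP pairwise comparisons.

```python
import numpy as np

# Hypothetical AHP-derived weights for four DRASTIC-type factors
# (depth to water, recharge, aquifer media, conductivity).
W = np.array([0.40, 0.25, 0.20, 0.15])

# Fuzzy membership matrix R: each row gives one factor's membership
# degrees in the risk grades (low, medium, high), e.g. evaluated
# from triangular membership functions at the site's value.
R = np.array([
    [0.1, 0.6, 0.3],
    [0.5, 0.4, 0.1],
    [0.2, 0.5, 0.3],
    [0.7, 0.2, 0.1],
])

# Weighted-average fuzzy operator: B = W . R, then defuzzify by
# taking the grade with the maximum membership degree.
B = W @ R
grade = ["low", "medium", "high"][int(np.argmax(B))]
```

The weighted-average operator is one common choice; max-min composition is another, trading smoothness for emphasis on dominant factors.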
Need for, and financial feasibility of, satellite-aided land mobile communications
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Marantz, C. S.; Freibaum, J.
1982-01-01
Questions regarding the role of a mobile-satellite system in augmenting the terrestrial communications system are considered, and a market assessment study is discussed. Aspects of an investment analysis are examined, taking into account a three-phase financial study of four postulated land Mobile Satellite Service (LMSS) systems, project profitability evaluation methods, risk analysis methods, financial projections, and potential investor acceptance standards. It is concluded that a satellite-augmented terrestrial mobile service appears to be economically and technically superior to a service depending exclusively on terrestrial systems. Interest in the Mobile Satellite Service is found to be worldwide, and the ground equipment market is potentially large.
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
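Under the constant cause-specific hazards assumed above, each CIF has a closed form, CIF_k(t) = (a_k / a)(1 - exp(-a t)) with a the all-cause hazard, so CIF ratios are easy to compute. A minimal sketch with hypothetical two-arm, two-event-type rates (not the authors' data) also illustrates why follow-up time matters: a treatment can lower one event's hazard yet raise the competing event's cumulative incidence.

```python
import math

def cif(alpha_k, alpha_all, t):
    """Cumulative incidence of event type k at time t under constant
    cause-specific hazards: CIF_k(t) = (a_k / a) * (1 - exp(-a * t))."""
    return (alpha_k / alpha_all) * (1.0 - math.exp(-alpha_all * t))

# Hypothetical two-arm example, rates per year: treatment halves the
# relapse hazard, death hazard identical in both arms.
trt = {"relapse": 0.10, "death": 0.05}
ctl = {"relapse": 0.20, "death": 0.05}
t = 2.0  # follow-up time in years

ratio = {k: cif(trt[k], sum(trt.values()), t) /
            cif(ctl[k], sum(ctl.values()), t)
         for k in trt}
```

Here the relapse CIF ratio is below 1 while the death CIF ratio exceeds 1 despite equal death hazards, because fewer treated patients relapse before they can die — exactly the kind of information a pooled HR alone would hide.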
Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.
Logue, E E; Wing, S
1986-01-01
Methodological investigation has suggested that age-risk factor interactions should be more evident in age-of-experience life tables than in follow-up time life tables, due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age-of-experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in the absence of age interaction. The identification of the more appropriate life-table analysis should ultimately be guided by the nature of the age or time phenomena of scientific interest.
Ma, Li; Sun, Jing; Yang, Zhaoguang; Wang, Lin
2015-12-01
Heavy metal contamination has attracted widespread attention due to the strong toxicity and persistence of heavy metals. The Ganxi River, located in Chenzhou City, Southern China, has been severely polluted by lead/zinc ore mining activities. This work investigated heavy metal pollution in agricultural soils around the Ganxi River. The total concentrations of heavy metals were determined by inductively coupled plasma-mass spectrometry. The potential risk associated with the heavy metals in soil was assessed by the Nemerow comprehensive index and the potential ecological risk index. By both methods, the study area was rated as very high risk. Multivariate statistical methods, including Pearson's correlation analysis, hierarchical cluster analysis, and principal component analysis, were employed to evaluate the relationships between heavy metals, as well as the correlation between heavy metals and pH, in order to identify the metal sources. Three distinct clusters were observed by hierarchical cluster analysis. In principal component analysis, a total of two components were extracted to explain over 90% of the total variance, both of which were associated with anthropogenic sources.
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is covered by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially with regard to completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with Hazard Analysis. When and how to apply them, and the relation and similarities of these processes with industry standards and system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
Issues in benchmarking human reliability analysis methods : a literature review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lois, Erasmia; Forester, John Alan; Tran, Tuan Q.
There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.
Issues in Benchmarking Human Reliability Analysis Methods: A Literature Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Stacey M. L. Hendrickson; John A. Forester
There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.
Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.
Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M
2010-01-01
The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differ considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk for sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usually applied Poincaré plot analysis, the SPPA retains nonlinear features of the investigated beat-to-beat interval time series. The main features of SPPA are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p<0.005) between low- and high-risk DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.
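The rotation-and-segmentation idea can be roughly sketched as follows. The abstract does not give the exact SPPA segmentation rules, so the 45-degree rotation and the SD-sized 12x12 grid below are assumptions for illustration, and the RR series is synthetic.

```python
import numpy as np

def sppa_probabilities(rr, n_seg=12):
    """Sketch of segmented Poincare plot analysis: rotate the
    (RR_n, RR_n+1) cloud by 45 degrees about its centroid, overlay an
    n_seg x n_seg grid whose cells are sized by the rotated cloud's
    standard deviations, and return row/column occupation probabilities."""
    x, y = np.asarray(rr[:-1], float), np.asarray(rr[1:], float)
    theta = np.pi / 4
    xc, yc = x - x.mean(), y - y.mean()
    xr = xc * np.cos(theta) + yc * np.sin(theta)
    yr = -xc * np.sin(theta) + yc * np.cos(theta)
    # Grid edges at whole multiples of each rotated axis' SD.
    half = n_seg // 2
    ex = np.linspace(-half, half, n_seg + 1) * xr.std()
    ey = np.linspace(-half, half, n_seg + 1) * yr.std()
    counts, _, _ = np.histogram2d(np.clip(xr, ex[0], ex[-1]),
                                  np.clip(yr, ey[0], ey[-1]),
                                  bins=[ex, ey])
    p = counts / counts.sum()
    return p.sum(axis=1), p.sum(axis=0)  # row and column probabilities

rng = np.random.default_rng(42)
rr = 800 + rng.normal(0, 30, 500)  # synthetic RR intervals in ms
row_p, col_p = sppa_probabilities(rr)
```

In the study, such row and column probabilities (not the raw cloud) are the features tested for discrimination between risk groups.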
Kovatchev, Boris P; Clarke, William L; Breton, Marc; Brayman, Kenneth; McCall, Anthony
2005-12-01
Continuous glucose monitors (CGMs) collect detailed blood glucose (BG) time series, which carry significant information about the dynamics of BG fluctuations. In contrast, the methods for analysis of CGM data remain those developed for infrequent BG self-monitoring. As a result, important information about the temporal structure of the data is lost during the translation of raw sensor readings into clinically interpretable statistics and images. The following mathematical methods are introduced into the field of CGM data interpretation: (1) analysis of BG rate of change; (2) risk analysis using previously reported Low/High BG Indices and Poincare (lag) plot of risk associated with temporal BG variability; and (3) spatial aggregation of the process of BG fluctuations and its Markov chain visualization. The clinical application of these methods is illustrated by analysis of data of a patient with Type 1 diabetes mellitus who underwent islet transplantation and with data from clinical trials. Normative data [12,025 reference (YSI device, Yellow Springs Instruments, Yellow Springs, OH) BG determinations] in patients with Type 1 diabetes mellitus who underwent insulin and glucose challenges suggest that the 90%, 95%, and 99% confidence intervals of BG rate of change that could be maximally sustained over 15-30 min are [-2,2], [-3,3], and [-4,4] mg/dL/min, respectively. BG dynamics and risk parameters clearly differentiated the stages of transplantation and the effects of medication. Aspects of treatment were clearly visualized by graphs of BG rate of change and Low/High BG Indices, by a Poincare plot of risk for rapid BG fluctuations, and by a plot of the aggregated Markov process. Advanced analysis and visualization of CGM data allow for evaluation of dynamical characteristics of diabetes and reveal clinical information that is inaccessible via standard statistics, which do not take into account the temporal structure of the data. 
The use of such methods improves the assessment of patients' glycemic control.
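The Low/High BG Indices mentioned above are commonly computed from Kovatchev's symmetrizing transformation of the BG scale; a sketch follows, with the four readings purely illustrative.

```python
import math

def bg_risk(bg_mgdl):
    """Kovatchev symmetrization of the BG scale (mg/dL) and risk value:
    f(BG) = 1.509 * (ln(BG)^1.084 - 5.381); risk = 10 * f(BG)^2.
    f is zero near BG = 112.5 mg/dL, negative below, positive above."""
    f = 1.509 * (math.log(bg_mgdl) ** 1.084 - 5.381)
    return f, 10.0 * f * f

def lbgi_hbgi(readings):
    """Low/High BG Indices: mean risk over readings in the low (f < 0)
    and high (f > 0) halves of the symmetrized scale."""
    low = [r for f, r in map(bg_risk, readings) if f < 0]
    high = [r for f, r in map(bg_risk, readings) if f > 0]
    n = len(readings)
    return sum(low) / n, sum(high) / n

lbgi, hbgi = lbgi_hbgi([55, 90, 130, 250])
```

Applied to a dense CGM trace rather than four values, these indices quantify the risk of hypo- and hyperglycemia separately, which is what allows the stages of transplantation to be differentiated in the study.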
Le Vu, Stéphane; Ratmann, Oliver; Delpech, Valerie; Brown, Alison E; Gill, O Noel; Tostevin, Anna; Fraser, Christophe; Volz, Erik M
2018-06-01
Phylogenetic clustering of HIV sequences from a random sample of patients can reveal epidemiological transmission patterns, but interpretation is hampered by limited theoretical support, and the statistical properties of clustering analysis remain poorly understood. Alternatively, source attribution methods allow fitting of HIV transmission models and thereby quantify aspects of disease transmission. A simulation study was conducted to assess error rates of clustering methods for detecting transmission risk factors. We modeled HIV epidemics among men who have sex with men and generated phylogenies comparable to those that can be obtained from HIV surveillance data in the UK. Clustering and source attribution approaches were applied to evaluate their ability to identify patient attributes as transmission risk factors. We find that commonly used methods show a misleading association between cluster size or odds of clustering and covariates that are correlated with time since infection, regardless of their influence on transmission. Clustering methods usually have higher error rates and lower sensitivity than the source attribution method for identifying transmission risk factors, but neither method provides robust estimates of transmission risk ratios. The source attribution method can alleviate drawbacks of phylogenetic clustering, but formal population genetic modeling may be required to estimate quantitative transmission risk factors. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.
Tijhuis, M J; Pohjola, M V; Gunnlaugsdóttir, H; Kalogeras, N; Leino, O; Luteijn, J M; Magnússon, S H; Odekerken-Schröder, G; Poto, M; Tuomisto, J T; Ueland, O; White, B C; Holm, F; Verhagen, H
2012-01-01
An integrated benefit-risk analysis aims to give guidance in decision situations where benefits do not clearly prevail over risks, and explicit weighing of benefits and risks is thus indicated. The BEPRARIBEAN project aims to advance benefit-risk analysis in the area of food and nutrition by learning from other fields. This paper constitutes the final stage of the project, in which commonalities and differences in benefit-risk analysis are identified between the Food and Nutrition field and other fields, namely Medicines, Food Microbiology, Environmental Health, Economics and Marketing-Finance, and Consumer Perception. From this, ways forward are characterized for benefit-risk analysis in Food and Nutrition. Integrated benefit-risk analysis in Food and Nutrition may advance in the following ways: increased engagement and communication between assessors, managers, and stakeholders; more pragmatic problem-oriented framing of assessment; accepting some risk; pre- and post-market analysis; explicit communication of the assessment purpose, input and output; more human (dose-response) data and more efficient use of human data; segmenting populations based on physiology; explicit consideration of value judgments in assessment; integration of multiple benefits and risks from multiple domains; explicit recognition of the impact of consumer beliefs, opinions, views, perceptions, and attitudes on behaviour; and segmenting populations based on behaviour. The opportunities proposed here do not provide ultimate solutions; rather, they define a collection of issues to be taken into account in developing methods, tools, practices and policies, as well as in refining the regulatory context, for benefit-risk analysis in Food and Nutrition and other fields. These opportunities will now need to be explored further and incorporated into benefit-risk practice and policy. 
If accepted, incorporation of these opportunities will also involve a paradigm shift in Food and Nutrition benefit-risk analysis towards conceiving the analysis as a process of creating shared knowledge among all stakeholders. Copyright © 2011 Elsevier Ltd. All rights reserved.
A utility/cost analysis of breast cancer risk prediction algorithms
NASA Astrophysics Data System (ADS)
Abbey, Craig K.; Wu, Yirong; Burnside, Elizabeth S.; Wunderlich, Adam; Samuelson, Frank W.; Boone, John M.
2016-03-01
Breast cancer risk prediction algorithms are used to identify subpopulations that are at increased risk for developing breast cancer. They can be based on many different sources of data such as demographics, relatives with cancer, gene expression, and various phenotypic features such as breast density. Women who are identified as high risk may undergo a more extensive (and expensive) screening process that includes MRI or ultrasound imaging in addition to the standard full-field digital mammography (FFDM) exam. Given that there are many ways that risk prediction may be accomplished, it is of interest to evaluate them in terms of expected cost, which includes the costs of diagnostic outcomes. In this work we perform an expected-cost analysis of risk prediction algorithms that is based on a published model that includes the costs associated with diagnostic outcomes (true-positive, false-positive, etc.). We assume the existence of a standard screening method and an enhanced screening method with higher scan cost, higher sensitivity, and lower specificity. We then assess the expected cost of using a risk prediction algorithm to determine who gets the enhanced screening method under the strong assumption that risk and diagnostic performance are independent. We find that if risk prediction leads to a high enough positive predictive value, it will be cost-effective regardless of the size of the subpopulation. Furthermore, in terms of the hit-rate and false-alarm rate of the risk prediction algorithm, iso-cost contours are lines with slope determined by properties of the available diagnostic systems for screening.
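The expected-cost comparison can be sketched numerically. All costs and operating points below are hypothetical, not the published model's values; the sketch only illustrates why a high enough positive predictive value (an enriched cancer prevalence among flagged women) can make the costlier enhanced exam worthwhile.

```python
def expected_cost(prev, sens, spec, c_scan, c_fn, c_fp):
    """Per-exam expected cost: scan cost plus expected costs of
    false negatives (missed cancers) and false-positive work-ups."""
    fn = prev * (1.0 - sens)          # missed-cancer rate
    fp = (1.0 - prev) * (1.0 - spec)  # false-alarm rate
    return c_scan + fn * c_fn + fp * c_fp

# Hypothetical figures: $100 FFDM vs a $400 enhanced exam with higher
# sensitivity and lower specificity; $50,000 per missed cancer and
# $500 per false-positive work-up.
standard = expected_cost(0.005, 0.80, 0.90, 100, 50_000, 500)
enhanced = expected_cost(0.005, 0.95, 0.80, 400, 50_000, 500)

# At population prevalence (0.5%) the enhanced exam costs more per exam,
# but in a risk-flagged subgroup with 5% prevalence it costs less.
standard_flagged = expected_cost(0.05, 0.80, 0.90, 100, 50_000, 500)
enhanced_flagged = expected_cost(0.05, 0.95, 0.80, 400, 50_000, 500)
```

With these numbers, the enhanced exam only pays off once risk prediction concentrates enough cancers in the flagged group, mirroring the paper's positive-predictive-value condition.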
Comparison of risk assessment procedures used in OCRA and ULRA methods
Roman-Liu, Danuta; Groborz, Anna; Tokarski, Tomasz
2013-01-01
The aim of this study was to analyse the convergence of two methods by comparing exposure and the assessed risk of developing musculoskeletal disorders at 18 repetitive-task workstations. The already established occupational repetitive actions (OCRA) method and the recently developed upper limb risk assessment (ULRA) method produce correlated results (R = 0.84, p = 0.0001). A discussion of the factors that influence the values of the OCRA index and ULRA's repetitive task indicator shows that both similarities and differences in the results produced by the two methods can arise from the concepts that underlie them. The assessment procedure and the mathematical calculations that the basic parameters are subjected to are crucial to the results of risk assessment. The way the basic parameters are defined influences the assessment of exposure and risk to a lesser degree. The analysis also showed that large differences in load indicator values do not always result in different risk zones. Practitioner Summary: We focused on comparing methods that, even though based on different concepts, serve the same purpose. The results proved that different methods with different assumptions can produce similar assessments of upper limb load; sharp criteria in risk assessment are not the best solution. PMID:24041375
Ye, Fang; Chen, Zhi-Hua; Chen, Jie; Liu, Fang; Zhang, Yong; Fan, Qin-Ying; Wang, Lin
2016-01-01
Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6–12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60% with a range of 3.47%–40.00% in different subgroup characteristics. The CHAID decision tree model has demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by logistic regression model including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified the fourth risk factor, the maternal educational level, with higher overall classification accuracy and larger area below the receiver operating characteristic curve. Conclusions: The infant anemic status in metropolis is complex and should be carefully considered by the basic health care practitioners. CHAID decision tree analysis has demonstrated a better performance in hierarchical analysis of population with great heterogeneity. Risk factors identified by this study might be meaningful in the early detection and prompt treatment of infant anemia in large cities. PMID:27174328
The risk of lung cancer among cooking adults: a meta-analysis of 23 observational studies.
Jia, Peng-Li; Zhang, Chao; Yu, Jia-Jie; Xu, Chang; Tang, Li; Sun, Xin
2018-02-01
Cooking has been regarded as a potential risk factor for lung cancer. We aim to investigate the evidence on cooking oil fume exposure and the risk of lung cancer. Medline and Embase were searched for eligible studies. We conducted a meta-analysis to summarize the evidence from case-control and cohort studies, with subgroup analysis for potential discrepancies. Sensitivity analysis was employed to test robustness. We included 23 observational studies, involving 9411 lung cancer cases. Our meta-analysis found that, for cooking females, the pooled OR of cooking oil fume exposure was 1.98 (95% CI 1.54, 2.54; I² = 79%, n = 15) among non-smoking populations and 2.00 (95% CI 1.46, 2.74; I² = 75%, n = 10) among partly smoking populations. For cooking males, the pooled OR of lung cancer was 1.15 (95% CI 0.71, 1.87; I² = 80%, n = 4). When subgrouped by ventilation condition, the pooled OR for poor ventilation was 1.20 (95% CI 1.10, 1.31; I² = 2%) compared to good ventilation. For different cooking methods, our results suggested that stir frying (OR = 1.89, 95% CI 1.23, 2.90; I² = 66%) was associated with increased risk of lung cancer while deep frying (OR = 1.41, 95% CI 0.87, 2.29; I² = 5%) was not. Sensitivity analysis suggested our results were stable. Cooking oil fume is likely to be a risk factor for lung cancer in females, regardless of smoking status. Poor ventilation may increase the risk of lung cancer. Cooking methods may have different effects on lung cancer, in that deep frying may be healthier than stir frying.
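The pooling behind such ORs is typically inverse-variance weighting with a between-study variance term. Below is a DerSimonian-Laird sketch on hypothetical studies (not the 23 included here), recovering each study's standard error from its 95% CI on the log scale.

```python
import math

def pool_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    Per-study SE is recovered from the 95% CI: (ln hi - ln lo) / (2*1.96)."""
    y = [math.log(o) for o in ors]
    se = [(math.log(h) - math.log(l)) / (2 * 1.96)
          for l, h in zip(ci_los, ci_his)]
    w = [1 / s ** 2 for s in se]                    # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                   # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

# Hypothetical three-study example: OR with 95% CI bounds per study.
or_pooled, i2 = pool_random_effects([1.8, 2.2, 1.5],
                                    [1.2, 1.4, 0.9],
                                    [2.7, 3.5, 2.5])
```

The I² statistic returned alongside the pooled OR is the same heterogeneity measure quoted throughout the abstract.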
2013-01-01
Background Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study is used to illustrate this approach, with three health risks mapped at the street scale for a coastal community in Haiti. Methods Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentration of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, shows the utility of the method. In addition, schools offer potential locations for cholera education interventions. Results Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these “hotspots”. Conclusions Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention. 
The process is rapid and can be repeated in study sites through time to track spatio-temporal dynamics of the communities. Its simplicity should also be used to encourage local participatory collaborations. PMID:23587358
Proposal for a recovery prediction method for patients affected by acute mediastinitis
2012-01-01
Background An attempt to find a method of predicting death risk in patients affected by acute mediastinitis. No such tool for this serious disease is described in the available literature. Methods The study comprised 44 consecutive cases of acute mediastinitis. General anamnesis and biochemical data were included. Factor analysis was used to extract the risk characteristics of the patients. The most valuable results were obtained for 8 parameters, which were selected for further statistical analysis (all collected within a few hours after admission). Three factors reached an Eigenvalue >1. The clinical interpretations of these combined statistical factors are: Factor 1, proteinic status (serum total protein, albumin, and hemoglobin level); Factor 2, inflammatory status (white blood cells, CRP, procalcitonin); and Factor 3, general risk (age, number of coexisting diseases). Threshold values of the prediction factors were estimated by means of statistical analysis (factor analysis, Statgraphics Centurion XVI). Results The final prediction result for a patient is constructed as a simultaneous evaluation of all factor scores. High probability of death should be predicted if the Factor 1 value decreases with a simultaneous increase of Factors 2 and 3. The diagnostic power of the proposed method was revealed to be high [sensitivity = 90%, specificity = 64%]; for Factor 1 [SNC = 87%, SPC = 79%], for Factor 2 [SNC = 87%, SPC = 50%], and for Factor 3 [SNC = 73%, SPC = 71%]. Conclusion The proposed prediction method appears to be a useful early-warning tool in the management of patients affected by acute mediastinitis. PMID:22574625
Haer, Toon; Botzen, W J Wouter; de Moel, Hans; Aerts, Jeroen C J H
2017-10-01
Recent studies have shown that climate change and socioeconomic trends are expected to increase flood risks in many regions. However, in these studies, human behavior is commonly assumed to be constant, which neglects interaction and feedback loops between human and environmental systems. This neglect of human adaptation leads to a misrepresentation of flood risk. This article presents an agent-based model that incorporates human decision making in flood risk analysis. In particular, household investments in loss-reducing measures are examined under three economic decision models: (1) expected utility theory, which is the traditional economic model of rational agents; (2) prospect theory, which takes account of bounded rationality; and (3) a prospect theory model that accounts for changing risk perceptions and social interactions through a process of Bayesian updating. We show that neglecting human behavior in flood risk assessment studies can result in a considerable misestimation of future flood risk, which is, in our case study, an overestimation by a factor of two. Furthermore, we show how behavior models can support flood risk analysis under different behavioral assumptions, illustrating the need to include the dynamic adaptive human behavior of, for instance, households, insurers, and governments. The method presented here provides a solid basis for exploring human behavior and the resulting flood risk with respect to low-probability/high-impact risks. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
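The contrast between decision models (1) and (2) can be sketched for a single one-period household choice. The parameters below are hypothetical, the expected-utility agent is simplified to a risk-neutral expected-value maximizer, and the prospect-theory agent uses the standard Tversky-Kahneman value and probability-weighting functions rather than the article's calibration.

```python
def tk_weight(p, gamma=0.69):
    """Tversky-Kahneman probability weighting: overweights small p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def pt_loss_value(x, lam=2.25, beta=0.88):
    """Prospect-theory value of a loss of size x >= 0 (loss aversion lam)."""
    return -lam * x ** beta

def invests(p, damage, reduced, cost, prospect=True):
    """Does the household buy a loss-reducing measure for `cost` that
    cuts flood damage from `damage` to `reduced`?"""
    if prospect:
        no = tk_weight(p) * pt_loss_value(damage)
        yes = tk_weight(p) * pt_loss_value(reduced) + pt_loss_value(cost)
    else:  # risk-neutral expected value, a simple expected-utility stand-in
        no = -p * damage
        yes = -p * reduced - cost
    return yes > no

# Low-probability/high-impact flood: p = 1/200 per year, $100k damage;
# a $600 measure reduces damage to $20k (one-period illustration).
eut = invests(0.005, 100_000, 20_000, 600, prospect=False)
pt = invests(0.005, 100_000, 20_000, 600, prospect=True)
```

With these numbers the risk-neutral agent declines (expected savings of $400 fall short of the $600 cost), while the prospect-theory agent invests because the 0.5% flood probability is overweighted — the kind of behavioral divergence that drives the misestimation discussed above.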
ERIC Educational Resources Information Center
Brammeier, Monique; Chow, Joan M.; Samuel, Michael C.; Organista, Kurt C.; Miller, Jamie; Bolan, Gail
2008-01-01
Context: The prevalence of sexually transmitted diseases and associated risk behaviors among California farmworkers is not well described. Purpose: To estimate the prevalence of sexually transmitted diseases (STDs) and associated risk behaviors among California farmworkers. Methods: Cross-sectional analysis of population-based survey data from 6…
Analysis of Risk Management in Adapted Physical Education Textbooks
ERIC Educational Resources Information Center
Murphy, Kelle L.; Donovan, Jacqueline B.; Berg, Dominck A.
2016-01-01
Physical education teacher education (PETE) programs vary on how the topics of safe teaching and risk management are addressed. Common practices to cover such issues include requiring textbooks, lesson planning, peer teaching, videotaping, reflecting, and reading case law analyses. We used a mixed methods design to examine how risk management is…
Ely, Scott; Forsberg, Peter; Ouansafi, Ihsane; Rossi, Adriana; Modin, Alvin; Pearse, Roger; Pekle, Karen; Perry, Arthur; Coleman, Morton; Jayabalan, David; Di Liberto, Maurizio; Chen-Kiang, Selina; Niesvizky, Ruben; Mark, Tomer M
2017-12-01
Therapeutic options for multiple myeloma (MM) are growing, yet clinical outcomes remain heterogeneous. Cytogenetic analysis and disease staging are mainstays of risk stratification, but data suggest a complex interplay between numerous abnormalities. Myeloma cell proliferation is a metric shown to predict outcomes, but available methods are not feasible in clinical practice. Multiplex immunohistochemistry (mIHC), using multiple immunostains simultaneously, is universally available for clinical use. We tested mIHC as a method to calculate a plasma cell proliferation index (PCPI). By mIHC, marrow trephine core biopsy samples were costained for CD138, a plasma cell-specific marker, and Ki-67. Myeloma cells (CD138 + ) were counted as proliferating if coexpressing Ki-67. Retrospective analysis was performed on 151 newly diagnosed, treatment-naive patients divided into 2 groups on the basis of myeloma cell proliferation: low (PCPI ≤ 5%, n = 87), and high (PCPI > 5%, n = 64). Median overall survival (OS) was not reached versus 78.9 months (P = .0434) for the low versus high PCPI groups. Multivariate analysis showed that only high-risk cytogenetics (hazard ratio [HR] = 2.02; P = .023), International Staging System (ISS) stage > I (HR = 2.30; P = .014), and PCPI > 5% (HR = 1.70; P = .041) had independent effects on OS. Twenty-three (36%) of the 64 patients with low-risk disease (ISS stage 1, without high-risk cytogenetics) were uniquely reidentified as high risk by PCPI. PCPI is a practical method that predicts OS in newly diagnosed myeloma and facilitates broader use of MM cell proliferation for risk stratification. Copyright © 2017 Elsevier Inc. All rights reserved.
Malchaire, J B
2004-08-01
The first section of the document describes a risk-prevention strategy, called SOBANE, with four levels: screening, observation, analysis and expertise. The aim is to make risk prevention faster, more cost-effective, and better at coordinating the contributions of the workers themselves, their management, the internal and external occupational health (OH) practitioners and the experts. The four levels are: screening, where the risk factors are detected by the workers and their management, and obvious solutions are implemented; observation, where the remaining problems are studied in more detail, one by one, and the reasons and the solutions are discussed in detail; analysis, where, when necessary, an OH practitioner is called upon to carry out appropriate measurements and develop specific solutions; and expertise, where, in rare and very sophisticated cases, the assistance of an expert is called upon to solve a particular problem. The method for the participatory screening of risks (in French: Dépistage Participatif des Risques), Déparis, is proposed for the first, screening level of the SOBANE strategy. The work situation is systematically reviewed and all the aspects conditioning the ease, effectiveness and satisfaction of work are discussed, in search of practical prevention measures. The points to be studied in more detail at level 2, observation, are identified. The method is carried out during a meeting of key workers and technical staff. The method proves to be simple, sparing in time and means, and plays a significant role in the development of a dynamic risk management plan and a culture of dialogue in the company.
Hydrologic risk analysis in the Yangtze River basin through coupling Gaussian mixtures into copulas
NASA Astrophysics Data System (ADS)
Fan, Y. R.; Huang, W. W.; Huang, G. H.; Li, Y. P.; Huang, K.; Li, Z.
2016-02-01
In this study, a bivariate hydrologic risk framework is proposed through coupling Gaussian mixtures into copulas, leading to a coupled GMM-copula method. In the coupled GMM-copula method, the marginal distributions of flood peak, volume and duration are quantified through Gaussian mixture models, and the joint probability distributions of flood peak-volume, peak-duration and volume-duration are established through copulas. The bivariate hydrologic risk is then derived based on the joint return period of flood variable pairs. The proposed method is applied to the risk analysis for the Yichang station on the main stream of the Yangtze River, China. The results indicate that (i) the bivariate risk for flood peak-volume remains constant for flood volumes less than 1.0 × 10⁵ m³/s day, but presents a significant decreasing trend for flood volumes larger than 1.7 × 10⁵ m³/s day; and (ii) the bivariate risk for flood peak-duration does not change significantly for flood durations less than 8 days, and then decreases significantly as the duration becomes larger. The probability density functions (pdfs) of the flood volume and duration conditional on flood peak can also be generated through the fitted copulas. The results indicate that the conditional pdfs of flood volume and duration follow bimodal distributions, with the occurrence frequency of the first mode decreasing and that of the second increasing as the flood peak increases. The conclusions obtained from the bivariate hydrologic analysis can provide decision support for flood control and mitigation.
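The joint return period at the core of this framework can be sketched in a few lines. As an illustration only (not the authors' code), the following assumes a Gumbel copula with an arbitrary dependence parameter theta; the GMM-fitted marginals are replaced here by directly supplied marginal non-exceedance probabilities u and v:

```python
import math

def gumbel_copula(u, v, theta=2.0):
    """Gumbel copula CDF C(u, v); theta >= 1 controls upper-tail dependence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_and_return_period(u, v, theta=2.0, mu=1.0):
    """'AND' return period: both variables exceed their marginal quantiles.

    u, v -- marginal non-exceedance probabilities (e.g. from fitted GMMs)
    mu   -- mean interarrival time of events (1 year for annual maxima)
    """
    prob_both_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / prob_both_exceed

# Both peak and volume at their marginal 100-year (99th percentile) level:
T_and = joint_and_return_period(0.99, 0.99, theta=2.0)
```

With u = v = 0.99 the joint "AND" return period comes out longer than either marginal 100-year event, since requiring both exceedances simultaneously is the stricter condition.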
Mattei, Francesca; Liverani, Silvia; Guida, Florence; Matrat, Mireille; Cenée, Sylvie; Azizi, Lamiae; Menvielle, Gwenn; Sanchez, Marie; Pilorget, Corinne; Lapôtre-Ledoux, Bénédicte; Luce, Danièle; Richardson, Sylvia; Stücker, Isabelle
2016-01-01
Background The association between lung cancer and occupational exposure to organic solvents is still debated. Since different solvents are often used simultaneously, it is difficult to assess the role of individual substances. Objectives The present study is focused on an in-depth investigation of the potential association between lung cancer risk and occupational exposure to a large group of organic solvents, taking into account the well-known risk factors for lung cancer, tobacco smoking and occupational exposure to asbestos. Methods We analysed data from the Investigation of occupational and environmental causes of respiratory cancers (ICARE) study, a large French population-based case–control study set up between 2001 and 2007. A total of 2276 male cases and 2780 male controls were interviewed, and lifelong occupational history was collected. In order to overcome the analytical difficulties created by multiple correlated exposures, we carried out a novel type of analysis based on Bayesian profile regression. Results After analysis with conventional logistic regression methods, none of the 11 solvents examined were associated with lung cancer risk. Through a profile regression approach, we did not observe any significant association between solvent exposure and lung cancer. However, we identified high-risk clusters related to occupations known to be at risk of developing lung cancer, such as painters. Conclusions Organic solvents do not appear to be substantial contributors to the occupational risk of lung cancer for the occupations known to be at risk. PMID:26911986
Product Quality Improvement Using FMEA for Electric Parking Brake (EPB)
NASA Astrophysics Data System (ADS)
Dumitrescu, C. D.; Gruber, G. C.; Tişcă, I. A.
2016-08-01
One of the most frequently used methods to improve product quality is FMEA (Failure Modes and Effects Analysis). The literature describes several FMEA variants, depending on the failure modes considered and the targets pursued; among them are the Process Failure Modes and Effects Analysis and the Failure Mode, Effects and Criticality Analysis (FMECA). Whichever variant the work team adopts, the goal of the method is the same: to optimize product design in research, design processes, the implementation of manufacturing processes, and the operation of the product by its beneficiaries. According to a market survey of parts suppliers to vehicle manufacturers, the FMEA method is used in 75% of cases. One purpose of its application is to detect any remaining errors after research and product development are considered complete; another is to initiate appropriate measures to avoid mistakes. Achieving these two goals leads to widespread application, so that errors are avoided already in the design phase of the product, thereby preventing additional costs from arising in later stages of product manufacturing. The application of the FMEA method uses standardized forms; with their help, the initial assemblies of the product structure are established, in which all components are initially regarded as error-free. The work is an application of the FMEA method to optimize the quality of the components of the electric parking brake (Electric Parking Brake, EPB). This is a component attached to the roller system which replaces the conventional mechanical parking brake in automobiles while ensuring comfort, functionality and durability, and saving space in the passenger compartment.
The paper describes the levels addressed in applying FMEA, the working arrangements at the 4 distinct levels of analysis, and how the Risk Priority Number is determined; it also presents the analysis of the risk factors and the measures the authors imposed to reduce or completely eliminate the risks of this complex product.
NASA Astrophysics Data System (ADS)
Yu, Zhang; Xiaohui, Song; Jianfang, Li; Fei, Gao
2017-05-01
Cable overheating lowers the cable's insulation level, accelerates insulation aging, and can even cause short-circuit faults. Identification and warning of cable overheating risk is therefore necessary for distribution network operators. A cable overheating risk warning method based on impedance parameter estimation is proposed in this paper to improve the safety and reliability of distribution network operation. Firstly, a cable impedance estimation model is established using the least squares method on data from the distribution SCADA system, to improve the impedance parameter estimation accuracy. Secondly, the threshold value of cable impedance is calculated from historical data, and the forecast value of cable impedance is calculated from future forecasting data from the distribution SCADA system. Thirdly, a library of cable overheating risk warning rules is established; the cable impedance forecast value is calculated and the rate of change of the impedance analyzed, and the overheating risk of the cable line is then flagged against the rules library according to the relationship between impedance and line temperature rise. The overheating risk warning method is simulated in the paper. The simulation results show that the method can accurately identify the impedance and forecast the temperature rise of cable lines in a distribution network. The resulting overheating risk warnings can provide a decision basis for operation, maintenance and repair.
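The least-squares step can be illustrated with synthetic data. This is only a sketch of the idea (the paper's model and SCADA data are not reproduced here): the voltage drop along a cable section is assumed proportional to the current, dV ≈ R·I, and R is recovered from noisy samples:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "SCADA" samples (assumed values, for illustration only):
true_R = 0.12                                   # ohm, series resistance
I = rng.uniform(50.0, 300.0, size=200)          # A, measured currents
dV = true_R * I + rng.normal(0.0, 0.5, I.size)  # V, noisy voltage drops

# Least-squares estimate of R from the overdetermined system dV ≈ R * I
R_hat, *_ = np.linalg.lstsq(I.reshape(-1, 1), dV, rcond=None)
R_hat = float(R_hat[0])
```

Tracking R_hat over time against a threshold estimated from historical data is the essence of the warning rule: a rising impedance estimate suggests a rising conductor temperature.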
Time Dependence of Collision Probabilities During Satellite Conjunctions
NASA Technical Reports Server (NTRS)
Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.
2017-01-01
The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (P (sub c)) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D P (sub c)” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D P (sub c)” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, R (sub c). For close-proximity satellites, such as those orbiting in formations or clusters, R (sub c) variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, R (sub c) analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D P (sub c)” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., P (sub c) less than 10 (sup -10)), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., P (sub c) greater than or equal to 10 (sup -5)). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-P (sub c) screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
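The flavor of the 2D approximation can be conveyed with a Monte Carlo sketch: sample the relative position in the encounter plane from a Gaussian and count samples falling inside the combined hard-body radius. All numbers below are invented for illustration and are not CARA values or the CARA algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative encounter-plane geometry (assumed, not CARA data):
mean_miss = np.array([150.0, 80.0])      # m, predicted miss vector
cov = np.array([[200.0 ** 2, 0.0],
                [0.0, 120.0 ** 2]])      # m^2, combined position covariance
hbr = 20.0                               # m, combined hard-body radius

# 2D Pc estimate: fraction of sampled relative positions inside the hard body
samples = rng.multivariate_normal(mean_miss, cov, size=1_000_000)
pc = float(np.mean(np.hypot(samples[:, 0], samples[:, 1]) < hbr))
```

The 3D method described in the abstract additionally integrates over time and accounts for velocity uncertainty, which is what exposes the probability rate R (sub c) that a static encounter-plane calculation cannot provide.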
Advanced space-based InSAR risk analysis of planned and existing transportation infrastructure.
DOT National Transportation Integrated Search
2017-03-21
The purpose of this document is to summarize activities by Stanford University and MDA Geospatial Services Inc. (MDA) to estimate surface deformation and associated risk to transportation infrastructure using SAR Interferometric methods for the …
NASA Astrophysics Data System (ADS)
Berlin, Julian; Bogaard, Thom; Van Westen, Cees; Bakker, Wim; Mostert, Eric; Dopheide, Emile
2014-05-01
Cost-benefit analysis (CBA) is a well-known method widely used for the assessment of investments in both the private and public sectors. In the context of risk mitigation and the evaluation of risk reduction alternatives for natural hazards, it is very important for evaluating the effectiveness of such efforts in terms of avoided monetary losses. However, the current method has some disadvantages related to the spatial distribution of the costs and benefits, the geographical distribution of the avoided damage and losses, and the variation in the areas that benefit in terms of invested money and avoided monetary risk. Decision makers are often interested in how the costs and benefits are distributed among the different administrative units of a large area or region, so that they can compare and analyse the costs and benefits per administrative unit resulting from the implementation of risk reduction projects. In this work we first examine the cost-benefit procedure for natural hazards and how costs are assessed for several structural and non-structural risk reduction alternatives. We also examine the current problems of the method, such as the inclusion of cultural and social considerations that are complex to monetize, the problem of discounting future values using a defined interest rate, and the spatial distribution of costs and benefits. We further examine the additional benefits and the indirect costs associated with implementing the risk reduction alternatives, such as the cost of an ugly landscape (also called negative benefits). In the last part we examine the current tools and software used in natural hazard assessment that support CBA, and we propose design considerations for the implementation of a CBA module for the CHANGES-SDSS Platform, an initiative of the ongoing 7th Framework Programme "CHANGES" of the European Commission.
Keywords: Risk management, Economics of risk mitigation, EU Flood Directive, resilience, prevention, cost benefit analysis, spatial distribution of costs and benefits
Fu, Zhuxuan; Liska, DeAnn; Talan, David; Chung, Mei
2017-12-01
Background: Cranberry (Vaccinium spp.) has been advocated for treatment of urinary tract infection (UTI); however, its efficacy is controversial. Women have a 50% risk of UTI over their lifetime, and ∼20-30% experience a subsequent UTI recurrence. Objective: We conducted this meta-analysis to assess the effect of cranberry on the risk of UTI recurrence in otherwise healthy women. Methods: Literature published before January 2011 was obtained from 2 published systematic reviews, and we conducted updated searches in EMBASE and MEDLINE (through July 2017). We included randomized controlled trials that were conducted in generally healthy nonpregnant women aged ≥18 y with a history of UTI, compared cranberry intervention to a placebo or control, and reported the outcome as the number of participants experiencing a UTI. Two researchers conducted abstract and full-text screenings, data extractions, and risk of bias assessments independently, and discrepancies were resolved by group consensus. Meta-analyses were performed by using Stata SE software (version 13). We employed a fixed-effect model using the Mantel-Haenszel method to estimate the summary risk if the heterogeneity was low to moderate (I² < 50%). Otherwise, we applied a random-effects model using the DerSimonian-Laird method. Results: We identified 7 randomized controlled trials conducted in healthy women at risk of UTI (n = 1498 participants). Results of the meta-analysis showed that cranberry reduced the risk of UTI by 26% (pooled risk ratio: 0.74; 95% CI: 0.55, 0.98; I² = 54%). Risk of bias assessment indicated that 2 studies had high loss to follow-up or selective outcome reporting. Overall, the studies were relatively small, with only 2 having >300 participants. Conclusion: These results suggest that cranberry may be effective in preventing UTI recurrence in generally healthy women; however, larger high-quality studies are needed to confirm these findings.
This trial was registered at crd.york.ac.uk/prospero as CRD42015024439. © 2017 American Society for Nutrition.
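For readers unfamiliar with the pooling step, a fixed-effect Mantel-Haenszel risk ratio can be computed in a few lines. The 2x2 counts below are invented for illustration and are not the trials from this review:

```python
def mantel_haenszel_rr(trials):
    """Fixed-effect Mantel-Haenszel pooled risk ratio.

    trials -- list of (events_trt, n_trt, events_ctl, n_ctl) per study
    """
    num = sum(a * n_c / (n_t + n_c) for a, n_t, c, n_c in trials)
    den = sum(c * n_t / (n_t + n_c) for a, n_t, c, n_c in trials)
    return num / den

# Hypothetical studies: (UTI events on cranberry, N cranberry,
#                        UTI events on control,  N control)
trials = [(10, 100, 15, 100), (8, 120, 14, 118), (20, 300, 28, 295)]
rr = mantel_haenszel_rr(trials)  # < 1 favors the intervention
```

When heterogeneity is higher (I² ≥ 50%, as in the pooled estimate above), a random-effects model such as DerSimonian-Laird weights studies differently, widening the confidence interval.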
NASA Astrophysics Data System (ADS)
Fink, G.; Koch, M.
2010-12-01
An important aspect in water resources and hydrological engineering is the assessment of hydrological risk due to the occurrence of extreme events, e.g. droughts or floods. When dealing with the latter - as is the focus here - the classical methods of flood frequency analysis (FFA) are usually used for the proper dimensioning of a hydraulic structure, for the purpose of bringing the flood risk down to an acceptable level. FFA is based on extreme value statistics theory. Despite the progress of methods in this scientific branch, the development, selection, and fitting of an appropriate distribution function still remains a challenge, particularly when certain underlying assumptions of the theory are not met in real applications. This is, for example, the case when the stationarity condition for a random flood time series is no longer satisfied, as could be the situation when long-term hydrological impacts of future climate change are to be considered. The objective here is to verify the applicability of classical (stationary) FFA to predicted flood time series in the Fulda catchment in central Germany, as they may occur in the wake of climate change during the 21st century. These discharge time series at the outlet of the Fulda basin have been simulated with a distributed hydrological model (SWAT) that is forced by predicted climate variables of a regional climate model for Germany (REMO). From the simulated future daily time series, annual maximum (extreme) values are computed and analyzed for the purpose of risk evaluation. Although the 21st-century estimated extreme flood series of the Fulda river turn out to be only mildly non-stationary, alleviating the need for further action and concern at first sight, the more detailed analysis of the risk, as quantified, for example, by the return period, shows non-negligible differences in the calculated risk levels.
This could be verified by employing a new method, the so-called flood series maximum analysis (FSMA) method, which consists in the stochastic simulation of numerous trajectories of a stochastic process with a given GEV distribution over a certain length of time (larger than a desired return period). The maximum value of each trajectory is then computed, and all of these maxima are used to determine the empirical distribution of the maximum series. Through graphical inversion of this distribution function, the size of the design flood for a given risk (quantile) and given life duration can be inferred. The results of numerous simulations show that for stationary flood series, the new FSMA method results, as expected, in nearly identical risk values to the classical FFA approach. However, once the flood time series becomes slightly non-stationary - for the reasons discussed - and regardless of whether the trend is increasing or decreasing, large differences in the computed risk values for a given design flood occur. Or, in other words, for the same risk, the new FSMA method would lead to different values of the design flood for a hydraulic structure than the classical FFA method. This, in turn, could lead to some cost savings in the realization of a hydraulic project.
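The FSMA idea can be sketched for the stationary case. The sketch below assumes Gumbel-distributed (GEV with zero shape) annual maxima with made-up parameters; for iid maxima the lifetime-maximum quantile also has a closed form that the simulation should reproduce:

```python
import numpy as np

rng = np.random.default_rng(42)

loc, scale = 1000.0, 250.0   # illustrative Gumbel parameters (m^3/s)
life = 100                   # design life of the structure, in years
n_traj = 20_000              # number of simulated trajectories

# Simulate trajectories of `life` annual maxima; keep each trajectory's maximum.
traj_max = rng.gumbel(loc, scale, size=(n_traj, life)).max(axis=1)

# Design flood with a 10% risk of being exceeded during the design life:
design_flood = float(np.quantile(traj_max, 0.90))

# Stationary sanity check: P(max <= q) = F(q)**life for iid annual maxima,
# so the target quantile of the lifetime maximum is available analytically.
analytic = loc - scale * np.log(-np.log(0.90 ** (1.0 / life)))
```

The interesting cases in the abstract are the non-stationary ones, where a trend is added to loc or scale across the years; the simulated maxima then depart from the closed form, which is precisely the FSMA finding.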
Information and problem report usage in system safety engineering division
NASA Technical Reports Server (NTRS)
Morrissey, Stephen J.
1990-01-01
Five basic problems or question areas are examined: (1) evaluate the adequacy of the current problem/performance database; (2) evaluate methods of performing trend analysis; (3) identify methods and sources of data for probabilistic risk assessment; and (4) determine how risk assessment documentation is upgraded and/or updated. The fifth problem was to provide recommendations for each of the above four areas.
A Regional Decision Support Scheme for Pest Risk Analysis in Southeast Asia.
Soliman, T; MacLeod, A; Mumford, J D; Nghiem, T P L; Tan, H T W; Papworth, S K; Corlett, R T; Carrasco, L R
2016-05-01
A key justification to support plant health regulations is the ability of quarantine services to conduct pest risk analyses (PRA). Despite the supranational nature of biological invasions and the close proximity and connectivity of Southeast Asian countries, PRAs are conducted at the national level. Furthermore, some countries have limited experience in the development of PRAs, which may result in inadequate phytosanitary responses that put their plant resources at risk to pests vectored via international trade. We review existing decision support schemes for PRAs and, following international standards for phytosanitary measures, propose new methods that adapt existing practices to suit the unique characteristics of Southeast Asia. Using a formal written expert elicitation survey, a panel of regional scientific experts was asked to identify and rate unique traits of Southeast Asia with respect to PRA. Subsequently, an expert elicitation workshop with plant protection officials was conducted to verify the potential applicability of the developed methods. Rich biodiversity, shortage of trained personnel, social vulnerability, tropical climate, agriculture-dependent economies, high rates of land-use change, and difficulties in implementing risk management options were identified as challenging Southeast Asian traits. The developed methods emphasize local Southeast Asian conditions and could help support authorities responsible for carrying out PRAs within the region. These methods could also facilitate the creation of other PRA schemes in low- and middle-income tropical countries. © 2016 Society for Risk Analysis.
Indulski, J A; Rolecki, R
1994-01-01
In view of the present and proposed amendments to the Labor Code, and bearing in mind the anticipated harmonization of regulations in this area with those of the EEC, the authors emphasize the need for a well-developed methodology for assessing chemical safety in the occupational environment, with special reference to health effects in people exposed to chemicals. Methods for assessing the health risk induced by work involving exposure to chemicals were divided into: methods for assessing technological/processing risk, and methods for assessing health risk related to the toxic effects of chemicals. The need for developing means of risk communication, in order to secure proper risk perception among people exposed to chemicals and among the risk managers responsible for prevention of chemical hazards, was also stressed. It is suggested that a centre for chemical substances be established in order to settle all issues pertaining to human exposure to chemicals. The centre would be responsible, under the provisions of the Chemical Substances Act, for the qualitative and quantitative analysis of the present situation and for the development of guidelines on the assessment of health risk among persons exposed to chemicals.
Van der Bij, Sjoukje; Vermeulen, Roel C H; Portengen, Lützen; Moons, Karel G M; Koffijberg, Hendrik
2016-05-01
Exposure to asbestos fibres increases the risk of mesothelioma and lung cancer. Although the vast majority of mesothelioma cases are caused by asbestos exposure, the number of asbestos-related lung cancers is less clear. This number cannot be determined directly as lung cancer causes are not clinically distinguishable but may be estimated using varying modelling methods. We applied three different modelling methods to the Dutch population supplemented with uncertainty ranges (UR) due to uncertainty in model input values. The first method estimated asbestos-related lung cancer cases directly from observed and predicted mesothelioma cases in an age-period-cohort analysis. The second method used evidence on the fraction of lung cancer cases attributable (population attributable risk (PAR)) to asbestos exposure. The third method incorporated risk estimates and population exposure estimates to perform a life table analysis. The three methods varied substantially in incorporated evidence. Moreover, the estimated number of asbestos-related lung cancer cases in the Netherlands between 2011 and 2030 depended crucially on the actual method applied, as the mesothelioma method predicts 17 500 expected cases (UR 7000-57 000), the PAR method predicts 12 150 cases (UR 6700-19 000), and the life table analysis predicts 6800 cases (UR 6800-33 850). The three different methods described resulted in absolute estimates varying by a factor of ∼2.5. These results show that accurate estimation of the impact of asbestos exposure on the lung cancer burden remains a challenge. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
ERIC Educational Resources Information Center
Whitehouse, Andrew J. O.
2010-01-01
Purpose: Specific language impairment (SLI) is known to aggregate in families. Debate exists on whether the male sex presents an additional risk for SLI. This meta-analysis examined whether there is a sex ratio difference in the risk for impairment among family members of an SLI proband and whether this is mediated by assessment method (direct…
Micek, Agnieszka; Marranzano, Marina; Ray, Sumantra
2017-01-01
Background: A meta-analysis was conducted to summarize the evidence from prospective cohort and case-control studies regarding the association between coffee intake and biliary tract cancer (BTC) and liver cancer risk. Methods: Eligible studies were identified by searches of PubMed and EMBASE databases from the earliest available online indexing year to March 2017. The dose–response relationship was assessed by a restricted cubic spline model and multivariate random-effect meta-regression. A stratified and subgroup analysis by smoking status and hepatitis was performed to identify potential confounding factors. Results: We identified five studies on BTC risk and 13 on liver cancer risk eligible for meta-analysis. A linear dose–response meta-analysis did not show a significant association between coffee consumption and BTC risk. However, there was evidence of inverse correlation between coffee consumption and liver cancer risk. The association was consistent throughout the various potential confounding factors explored including smoking status, hepatitis, etc. Increasing coffee consumption by one cup per day was associated with a 15% reduction in liver cancer risk (RR 0.85; 95% CI 0.82 to 0.88). Conclusions: The findings suggest that increased coffee consumption is associated with decreased risk of liver cancer, but not BTC. PMID:28846640
Ayyub, Bilal M
2014-02-01
The United Nations Office for Disaster Risk Reduction reported that the 2011 natural disasters, including the earthquake and tsunami that struck Japan, resulted in $366 billion in direct damages and 29,782 fatalities worldwide. Storms and floods accounted for up to 70% of the 302 natural disasters worldwide in 2011, with earthquakes producing the greatest number of fatalities. Average annual losses in the United States amount to about $55 billion. Enhancing community and system resilience could lead to massive savings through risk reduction and expeditious recovery. The rational management of such reduction and recovery is facilitated by an appropriate definition of resilience and associated metrics. In this article, a resilience definition is provided that meets a set of requirements with clear relationships to the metrics of the relevant abstract notions of reliability and risk. Those metrics also meet logically consistent requirements drawn from measure theory, and provide a sound basis for the development of effective decision-making tools for multihazard environments. Improving the resiliency of a system to meet target levels requires the examination of system enhancement alternatives in economic terms, within a decision-making framework. Relevant decision analysis methods would typically require the examination of resilience based on its valuation by society at large. The article provides methods for valuation and benefit-cost analysis based on concepts from risk analysis and management. © 2013 Society for Risk Analysis.
Risk analysis by FMEA as an element of analytical validation.
van Leeuwen, J F; Nauta, M J; de Kaste, D; Odekerken-Rombouts, Y M C F; Oldenhof, M T; Vredenbregt, M J; Barends, D M
2009-12-05
We subjected a Near-Infrared (NIR) analytical procedure used for screening drugs for authenticity to a Failure Mode and Effects Analysis (FMEA), including technical risks as well as risks related to human failure. An FMEA team broke down the NIR analytical method into process steps and identified possible failure modes for each step. Each failure mode was ranked on estimated frequency of occurrence (O), probability that the failure would remain undetected later in the process (D), and severity (S), each on a scale of 1-10. Human errors turned out to be the most common cause of failure modes. Failure risks were calculated as Risk Priority Numbers (RPN) = O × D × S. Failure modes with the highest RPN scores were subjected to corrective actions and the FMEA was repeated, showing reductions in RPN scores and resulting in improvement indices up to 5.0. We recommend risk analysis as an addition to the usual analytical validation, as the FMEA enabled us to detect previously unidentified risks.
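The RPN bookkeeping is straightforward to mechanize. The failure modes and O/D/S scores below are hypothetical stand-ins, not the modes identified by the FMEA team in this study:

```python
def rpn(occurrence, detection, severity):
    """Risk Priority Number: RPN = O x D x S, each scored on a 1-10 scale."""
    for score in (occurrence, detection, severity):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores must lie in 1..10")
    return occurrence * detection * severity

# Hypothetical failure modes (O, D, S scores are illustrative):
failure_modes = {
    "wrong sample placement": rpn(6, 7, 8),        # 336
    "spectral library outdated": rpn(3, 8, 6),     # 144
    "operator transcription error": rpn(5, 4, 7),  # 140
}

# Highest RPNs get corrective actions first; the FMEA is then repeated
# and the new RPNs compared against these to compute improvement indices.
ranked = sorted(failure_modes.items(), key=lambda kv: kv[1], reverse=True)
```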
Akata, Kentaro; Yatera, Kazuhiro; Yamasaki, Kei; Kawanami, Toshinori; Naito, Keisuke; Noguchi, Shingo; Fukuda, Kazumasa; Ishimoto, Hiroshi; Taniguchi, Hatsumi; Mukae, Hiroshi
2016-05-11
Aspiration pneumonia is of growing interest in an aging population. Anaerobes are important pathogens; however, the etiology of aspiration pneumonia is not fully understood. In addition, the relationship between patient clinical characteristics and the causative pathogens in pneumonia patients with aspiration risk factors is unclear. To evaluate the relationship between the clinical characteristics of patients with risk factors for aspiration and the bacterial flora in bronchoalveolar lavage fluid (BALF) of pneumonia patients, bacterial floral analysis of the 16S ribosomal RNA gene was applied in addition to cultivation methods in BALF samples. From April 2010 to February 2014, BALF samples were obtained from the affected lesions of pneumonia via bronchoscopy and were evaluated by bacterial floral analysis of the 16S rRNA gene in addition to cultivation methods in patients with community-acquired pneumonia (CAP) and healthcare-associated pneumonia (HCAP). Factors associated with aspiration risks in these patients were analyzed. A total of 177 (CAP 83, HCAP 94) patients were enrolled. According to the results of the bacterial floral analysis, the rate at which oral streptococci were the most frequently detected bacterial phylotypes in BALF was significantly higher in patients with aspiration risks (31.0 %) than in patients without aspiration risks (14.7 %) (P = 0.009). In addition, the percentages of oral streptococci in each BALF sample were significantly higher in patients with aspiration risks (26.6 ± 32.0 %) than in patients without aspiration risks (13.8 ± 25.3 %) (P = 0.002). A multiple linear regression analysis showed that an Eastern Cooperative Oncology Group (ECOG) performance status (PS) of ≥3, the presence of comorbidities, and a history of pneumonia within the previous year were significantly associated with the detection of oral streptococci in BALF.
Bacterial floral analysis of the 16S rRNA gene revealed that oral streptococci were the most frequently detected bacterial phylotypes in BALF samples from CAP and HCAP patients with aspiration risks, especially those with a poor ECOG-PS or a history of pneumonia.
Analysis of Braking Behavior of Train Drivers to Detect Unusual Driving
NASA Astrophysics Data System (ADS)
Marumo, Yoshitaka; Tsunashima, Hitoshi; Kojima, Takashi; Hasegawa, Yasushi
The safety devices for train systems are activated in emergency situations when a risk becomes obvious, and the emergency brake is applied. If such systems are faulty, drivers' operating errors may cause immediate accidents. It is therefore necessary to evaluate potential risks by detecting improper driving behavior before overt risks appear. This study analyzes the driving behavior of train drivers using a train-driving simulator, focusing on braking behavior when approaching a station. Two methods for detecting unusual braking operation are examined by giving drivers mental calculation problems as a mental workload: the first monitors the driver's brake handle operation, and the second measures vehicle deceleration. Both methods make it possible to detect unusual driving.
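The abstract does not specify the detection algorithm; purely as an illustrative sketch (function name, baseline data, and the 2-sigma threshold are all hypothetical), a point-wise comparison of an observed deceleration profile against baseline simulator runs could look like:

```python
import statistics

def detect_unusual_braking(baseline_runs, observed, z_threshold=2.0):
    """Flag samples of an observed deceleration profile (m/s^2) that deviate
    from the point-wise mean of baseline runs by more than z_threshold sigmas.

    baseline_runs: list of equal-length deceleration profiles from normal driving.
    observed: one profile to test, sampled at the same points.
    """
    flags = []
    for i, x in enumerate(observed):
        column = [run[i] for run in baseline_runs]   # same sample point across runs
        mu = statistics.mean(column)
        sigma = statistics.stdev(column)
        flags.append(abs(x - mu) > z_threshold * sigma)
    return flags
```

A late, harsh brake near the stopping point would then show up as a flagged tail of the profile.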
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing dose-effect relationship (RDE) models for the effects of radon exposure on human health has been performed, and a conclusion about the necessity and possibility of improving these models has been made. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, including a method for estimating exposure doses from radon, the improved RDE model, and a proper risk assessment methodology. The methodology is proposed for use in the territory of Russia.
TNF -308 G/A Polymorphism and Risk of Acne Vulgaris: A Meta-Analysis
Yang, Jian-Kang; Wu, Wen-Juan; Qi, Jue; He, Li; Zhang, Ya-Ping
2014-01-01
Background The -308 G/A polymorphism in the tumor necrosis factor (TNF) gene has been implicated in the risk of acne vulgaris, but the results are inconclusive. The present meta-analysis aimed to investigate the overall association between the -308 G/A polymorphism and acne vulgaris risk. Methods We searched in Pubmed, Embase, Web of Science and CNKI for studies evaluating the association between the -308 G/A gene polymorphism and acne vulgaris risk. Data were extracted and statistical analysis was performed using STATA 12.0 software. Results A total of five publications involving 1553 subjects (728 acne vulgaris cases and 825 controls) were included in this meta-analysis. Combined analysis revealed a significant association between this polymorphism and acne vulgaris risk under recessive model (OR = 2.73, 95% CI: 1.37–5.44, p = 0.004 for AA vs. AG + GG). Subgroup analysis by ethnicity showed that the acne vulgaris risk associated with the -308 G/A gene polymorphism was significantly elevated among Caucasians under recessive model (OR = 2.34, 95% CI: 1.13–4.86, p = 0.023). Conclusion This meta-analysis suggests that the -308 G/A polymorphism in the TNF gene contributes to acne vulgaris risk, especially in Caucasian populations. Further studies among different ethnicity populations are needed to validate these findings. PMID:24498378
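The recessive-model odds ratios above follow from standard 2x2-table arithmetic; a minimal sketch (the counts in the test below are hypothetical, not the meta-analysis data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Wald 95% CI from a 2x2 table:
    a = exposed cases (e.g. AA genotype with acne),
    b = exposed controls, c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi
```

For the pooled meta-analytic OR, per-study log-ORs would additionally be combined by inverse-variance weighting.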
Aerosol preparation of intact lipoproteins
Benner, W Henry [Danville, CA; Krauss, Ronald M [Berkeley, CA; Blanche, Patricia J [Berkeley, CA
2012-01-17
A medical diagnostic method and instrumentation system for analyzing noncovalently bonded agglomerated biological particles is described. The method and system comprises: a method of preparation for the biological particles; an electrospray generator; an alpha particle radiation source; a differential mobility analyzer; a particle counter; and data acquisition and analysis means. The medical device is useful for the assessment of human diseases, such as cardiac disease risk and hyperlipidemia, by rapid quantitative analysis of lipoprotein fraction densities. Initially, purification procedures are described to reduce an initial blood sample to an analytical input to the instrument. The measured sizes from the analytical sample are correlated with densities, resulting in a spectrum of lipoprotein densities. The lipoprotein density distribution can then be used to characterize cardiac and other lipid-related health risks.
Risk analysis theory applied to fishing operations: A new approach on the decision-making problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunha, J.C.S.
1994-12-31
In the past, decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator's previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from a field under study. The method leads to more accurate decisions on a daily basis, allowing the operator to verify, for each day of the operation, whether the decision being carried out is the one with the highest probability of leading to the best economic result. An example of the method's application is provided at the end of the paper.
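The paper's exact decision rule is not reproduced here; as a hedged sketch under simple assumptions (one more day of fishing costs the day rate, and on failure the fish must still be abandoned at the sidetrack cost), the daily continue/interrupt comparison reduces to an expected-cost test:

```python
def decide_daily(p_success, day_cost, sidetrack_cost):
    """Compare the expected cost of fishing one more day against
    sidetracking (abandoning the fish) immediately. Sunk costs are excluded.

    p_success: field-history probability of retrieving the fish today.
    """
    exp_cost_continue = day_cost + (1.0 - p_success) * sidetrack_cost
    exp_cost_stop = sidetrack_cost
    return "continue" if exp_cost_continue < exp_cost_stop else "interrupt"
```

The rule continues fishing exactly when the day rate is less than the success probability times the cost a success would avoid.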
NASA Astrophysics Data System (ADS)
Sari, Diana Puspita; Pujotomo, Darminto; Wardani, Nadira Kusuma
2017-11-01
Risk is an uncertain event that can have negative or positive impacts on project objectives. A project is defined as a series of activities and tasks that have a purpose, specifications, and cost limits. The Banyumanik Hospital Development Project is one of the construction projects in Semarang that has experienced several problems. The first is project delay in the building stakeout; the second is delayed material supply; the third is insufficient management attention to health and safety, as evidenced by the unavailability of PPE for the workers. These problems pose risks, making risk management by the contractors of the Banyumanik Hospital Development Project very important in order to reduce the impact borne by the construction services provider. This research aims at risk identification, risk assessment, and risk mitigation. Project risk management begins with the identification of risks based on the project life cycle. The risk assessment was carried out according to AS/NZS 4360:2004 with respect to impacts on cost, time, and quality, and identifies the risks that require mitigation, namely those at the significant and high levels. Four risks require mitigation with Bow-Tie diagrams: work accidents, contract delays, material delays, and design changes. The Bow-Tie diagram method identifies the causes and consequences of a risk together with preventive and recovery actions. The resulting preventive and recovery actions are used as input to the ALARP method, which prioritizes the proposed strategies into the broadly acceptable, tolerable, and unacceptable categories.
Multivariate generalized multifactor dimensionality reduction to detect gene-gene interactions
2013-01-01
Background Recently, one of the greatest challenges in genome-wide association studies is to detect gene-gene and/or gene-environment interactions for common complex human diseases. Ritchie et al. (2001) proposed multifactor dimensionality reduction (MDR) method for interaction analysis. MDR is a combinatorial approach to reduce multi-locus genotypes into high-risk and low-risk groups. Although MDR has been widely used for case-control studies with binary phenotypes, several extensions have been proposed. One of these methods, a generalized MDR (GMDR) proposed by Lou et al. (2007), allows adjusting for covariates and applying to both dichotomous and continuous phenotypes. GMDR uses the residual score of a generalized linear model of phenotypes to assign either high-risk or low-risk group, while MDR uses the ratio of cases to controls. Methods In this study, we propose multivariate GMDR, an extension of GMDR for multivariate phenotypes. Jointly analysing correlated multivariate phenotypes may have more power to detect susceptible genes and gene-gene interactions. We construct generalized estimating equations (GEE) with multivariate phenotypes to extend generalized linear models. Using the score vectors from GEE we discriminate high-risk from low-risk groups. We applied the multivariate GMDR method to the blood pressure data of the 7,546 subjects from the Korean Association Resource study: systolic blood pressure (SBP) and diastolic blood pressure (DBP). We compare the results of multivariate GMDR for SBP and DBP to the results from separate univariate GMDR for SBP and DBP, respectively. We also applied the multivariate GMDR method to the repeatedly measured hypertension status from 5,466 subjects and compared its result with those of univariate GMDR at each time point. 
Results Results from the univariate GMDR and multivariate GMDR in the two-locus model with both blood pressure and hypertension phenotypes indicate best combinations of SNPs whose interaction has significant association with risk for high blood pressure or hypertension. Although the test balanced accuracy (BA) of multivariate analysis was not always greater than that of univariate analysis, the multivariate BAs were more stable with smaller standard deviations. Conclusions In this study, we have developed a multivariate GMDR method using the GEE approach. Multivariate GMDR is useful for correlated multiple phenotypes of interest. PMID:24565370
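A minimal sketch of the GMDR cell-labeling step described above (names and data hypothetical; the full method also involves cross-validation and balanced-accuracy testing): each multi-locus genotype cell is labeled high- or low-risk by the sign of its mean residual score, where MDR would instead use the case/control ratio.

```python
from collections import defaultdict

def gmdr_cells(genotypes, scores):
    """Label each multi-locus genotype cell high- or low-risk by the sign of
    its mean (residual) score from the phenotype regression.

    genotypes: list of tuples, one genotype combination per subject.
    scores: residual scores (or GEE score-vector components), one per subject.
    """
    cell_scores = defaultdict(list)
    for g, s in zip(genotypes, scores):
        cell_scores[g].append(s)
    return {g: ("high" if sum(v) / len(v) > 0 else "low")
            for g, v in cell_scores.items()}
```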
Earthquake Hazard Analysis Methods: A Review
NASA Astrophysics Data System (ADS)
Sari, A. M.; Fakhrurrozi, A.
2018-02-01
Earthquakes are among the natural disasters with the most significant risks and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazards accurately and quickly, since only limited time is available for the right decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and excellence in its use as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.
Nevo, Daniel; Zucker, David M.; Tamimi, Rulla M.; Wang, Molin
2017-01-01
A common paradigm in dealing with heterogeneity across tumors in cancer analysis is to cluster the tumors into subtypes using marker data on the tumor, and then to analyze each of the clusters separately. A more specific target is to investigate the association between risk factors and specific subtypes and to use the results for personalized preventive treatment. This task is usually carried out in two steps–clustering and risk factor assessment. However, two sources of measurement error arise in these problems. The first is the measurement error in the biomarker values. The second is the misclassification error when assigning observations to clusters. We consider the case with a specified set of relevant markers and propose a unified single-likelihood approach for normally distributed biomarkers. As an alternative, we consider a two-step procedure with the tumor type misclassification error taken into account in the second-step risk factor analysis. We describe our method for binary data and also for survival analysis data using a modified version of the Cox model. We present asymptotic theory for the proposed estimators. Simulation results indicate that our methods significantly lower the bias with a small price being paid in terms of variance. We present an analysis of breast cancer data from the Nurses’ Health Study to demonstrate the utility of our method. PMID:27558651
Investment appraisal using quantitative risk analysis.
Johansson, Henrik
2002-07-01
Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
Chen, Liangyong; Ma, Zujun
2015-01-01
The perceived risk of nonremunerated blood donation (NRBD) is one of the most important factors hindering the Chinese public from donating blood. To deeply understand and scientifically measure the public's perceived risk of NRBD, this paper uses qualitative and quantitative methods to explore the construct of perceived risk of NRBD in the Chinese context. First, a preliminary construct of perceived risk of NRBD was developed based on grounded theory. Then, a measurement scale of perceived risk of NRBD was designed. Finally, exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) were adopted for testing and verifying the construct. The results show that the construct of perceived risk of NRBD has three core dimensions, namely trust risk, psychological risk, and health risk, which provides a clear construct and concise scale to better capture the Chinese public's perceived risk of NRBD. Blood collection agencies can strategically make policies for perceived risk reduction to maximize the public's NRBD behavior. PMID:26526570
Congote, L. F.; Hamilton, E. F.; Chow, J. C.; Perry, T. B.
1982-01-01
Three techniques for analysing hemoglobin synthesis in blood samples obtained by fetoscopy were evaluated. Of the fetuses studied, 12 were not at risk of genetic disorders, 10 were at risk of beta-thalassemia, 2 were at risk of sickle cell anemia and 1 was at risk of both diseases. The conventional method of prenatal diagnosis of hemoglobinopathies, involving the separation of globin chains labelled with a radioactive isotope on carboxymethyl cellulose (CMC) columns, was compared with a method involving globin-chain separation by high-pressure liquid chromatography (HPLC) and with direct analysis of labelled hemoglobin tetramers obtained from cell lysates by chromatography on ion-exchange columns. The last method is technically the simplest and can be used for diagnosing beta-thalassemia and sickle cell anemia; however, it gives spuriously high levels of adult hemoglobin in samples containing nonlabelled adult hemoglobin. HPLC is the fastest method for prenatal diagnosis of beta-thalassemia and may prove as reliable as the CMC method. Of the 13 fetuses at risk for hemoglobinopathies, 1 was predicted to be affected, and the diagnosis was confirmed in the abortus. Of the 12 predicted to be unaffected, 1 was aborted spontaneously and was unavailable for confirmatory studies, as were 3 of the infants; however, the diagnosis was confirmed in seven cases, and in one case it awaits confirmation when the infant is 6 months old. Couples at risk of bearing a child with a hemoglobinopathy should be referred for genetic counselling before pregnancy or, at the latest, by the 12th week of gestation so that prenatal diagnosis can be attempted by amniocentesis, a safer procedure, with restriction endonuclease analysis of the amniotic fluid cells. PMID:7139502
A simulation model of IT risk on program trading
NASA Astrophysics Data System (ADS)
Xia, Bingying; Jiang, Wenbao; Luo, Guangxuan
2015-12-01
The biggest difficulty in measuring the IT risk of program trading lies in the lack of loss data. In view of this, the current approach of scholars is to collect reports of IT incidents of all kinds, at home and abroad, from courts, networks, and other public media, and to base quantitative IT risk analysis on the resulting loss database. However, an IT risk loss database established in this way can only fuzzily reflect the real situation and cannot provide a fundamental explanation of it. In this paper, based on a study of the concepts and steps of Monte Carlo (MC) simulation, we use a computer simulation method: MC simulation is applied within the "Program trading simulation system" developed by our team to simulate real program trading, and IT risk loss data are obtained through IT failure experiments; at the end of the article, the effectiveness of the experimental data is verified. In this way, the deficiencies of traditional research methods are better overcome, and the problem of the lack of IT risk data in quantitative research is solved. Empirically, this provides researchers with a simulation-based template for research ideas and processes.
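The team's simulation system is not public; as a generic illustration only (the failure probability and lognormal loss distribution are assumptions, not from the paper), a Monte Carlo estimate of expected daily IT loss might be sketched as:

```python
import random

def simulate_it_loss(n_trials, p_failure, loss_mu, loss_sigma, seed=1):
    """Monte Carlo sketch: on each simulated trading day an IT failure occurs
    with probability p_failure; its monetary loss is drawn from a lognormal
    distribution with underlying normal parameters (loss_mu, loss_sigma).

    Returns the estimated expected loss per day."""
    rng = random.Random(seed)          # seeded for reproducible experiments
    total = 0.0
    for _ in range(n_trials):
        if rng.random() < p_failure:
            total += rng.lognormvariate(loss_mu, loss_sigma)
    return total / n_trials
```

Repeating such runs under different failure scenarios yields the loss database that real-world incident collection cannot supply.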
Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah
2015-09-01
Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information for planning CVD prevention programs, which is a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library (PubMed, CINAHL, PsycINFO, and ProQuest) were searched for studies conducted between 2008 and 2012, in English, and among humans. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity, and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were review of the literature (systematic or traditional), involvement of experts and/or the target population through focus group discussions/interviews, the clinical experience of the authors, and the authors' deductive reasoning. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other good psychometric properties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tortorelli, J.P.
1995-08-01
A workshop was held at the Idaho National Engineering Laboratory, August 16--18, 1994 on the topic of risk assessment on medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.
Environmental justice assessment for transportation : risk analysis
DOT National Transportation Integrated Search
1999-04-01
This paper presents methods of comparing populations and their racial/ethnic compositions using tabulations, histograms, and chi-squared tests for statistical significance of differences found. Two examples of these methods are presented: comparison ...
Accelerated Monte Carlo Simulation for Safety Analysis of the Advanced Airspace Concept
NASA Technical Reports Server (NTRS)
Thipphavong, David
2010-01-01
Safe separation of aircraft is a primary objective of any air traffic control system. An accelerated Monte Carlo approach was developed to assess the level of safety provided by a proposed next-generation air traffic control system. It combines features of fault tree and standard Monte Carlo methods. It runs more than one order of magnitude faster than the standard Monte Carlo method while providing risk estimates that only differ by about 10%. It also preserves component-level model fidelity that is difficult to maintain using the standard fault tree method. This balance of speed and fidelity allows sensitivity analysis to be completed in days instead of weeks or months with the standard Monte Carlo method. Results indicate that risk estimates are sensitive to transponder, pilot visual avoidance, and conflict detection failure probabilities.
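A hedged sketch of the fault-tree/Monte Carlo combination described above (structure and probabilities hypothetical, not the Advanced Airspace Concept model): rather than waiting for rare component failures to appear in a brute-force run, each fault-tree failure combination is simulated conditionally and weighted by its occurrence probability.

```python
import random

def accelerated_collision_risk(p_component_failures, p_conflict_given_failure,
                               n=10_000, seed=0):
    """Stratified Monte Carlo sketch: for each fault-tree failure combination,
    estimate P(unresolved conflict | that failure) by simulation, then weight
    by the failure combination's probability and sum over combinations.

    Here the conditional simulation is reduced to a single Bernoulli draw per
    trial; a real model would simulate trajectories and avoidance logic."""
    rng = random.Random(seed)
    total = 0.0
    for p_fail, p_conflict in zip(p_component_failures, p_conflict_given_failure):
        hits = sum(rng.random() < p_conflict for _ in range(n))
        total += p_fail * hits / n
    return total
```

The speedup comes from never spending trials on the overwhelmingly common no-failure stratum, while component-level fidelity lives inside each conditional simulation.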
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tennenberg, S.D.; Jacobs, M.P.; Solomkin, J.S.
1987-04-01
Two methods for predicting adult respiratory distress syndrome (ARDS) were evaluated prospectively in a group of 81 multitrauma and sepsis patients considered at clinical high risk. A popular ARDS risk-scoring method, employing discriminant analysis equations (weighted risk criteria and oxygenation characteristics), yielded a predictive accuracy of 59% and a false-negative rate of 22%. Pulmonary alveolar-capillary permeability (PACP) was determined with a radioaerosol lung-scan technique in 23 of these 81 patients, representing a statistically similar subgroup. Lung scanning achieved a predictive accuracy of 71% (after excluding patients with unilateral pulmonary contusion) and gave no false-negatives. We propose a combination of clinical risk identification and functional determination of PACP to assess a patient's risk of developing ARDS.
Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin
2015-01-01
Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, with subsequent deterioration of groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach to the environmental risk assessment of regional groundwater, carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, and comprising receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and its impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS was used to interpolate and manipulate the data and develop environmental risk maps of regional groundwater, dividing the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, or 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. This is a new and appropriate method for regional groundwater resource management and land use planning, and a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk. PMID:26020518
Mine safety assessment using gray relational analysis and bow tie model
2018-01-01
Mine safety assessment is a precondition for ensuring orderly and safe production. The main purpose of this study was to prevent mine accidents more effectively by proposing a composite risk analysis model. First, the weights of the assessment indicators were determined by a revised integrated weight method, in which the objective weights were determined by the variation coefficient method and the subjective weights by the Delphi method; a new formula was then adopted to calculate the integrated weights from the subjective and objective weights. Second, after the assessment indicator weights were determined, gray relational analysis was used to evaluate the safety of mine enterprises: mine enterprise safety was ranked according to the gray relational degree, and weak links in mine safety practices were identified on this basis. Third, to validate the revised integrated weight method adopted in the gray relational analysis, the fuzzy evaluation method was applied to the safety assessment of the mine enterprises. Fourth, for the first time, the bow tie model was adopted to identify the causes and consequences of the weak links, allowing corresponding safety measures to be taken to guarantee the mine's safe production. A case study of mine safety assessment is presented to demonstrate the effectiveness and rationality of the proposed composite risk analysis model, which can be applied to other related industries for safety evaluation. PMID:29561875
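A minimal sketch of the gray relational analysis step (indicator values and weights hypothetical; the paper's revised integrated weights are not reproduced): each enterprise's normalized indicators are compared against an ideal reference sequence of 1.0, and the weighted gray relational degree ranks the enterprises.

```python
def gray_relational_degrees(matrix, weights, rho=0.5):
    """Gray relational analysis sketch: rows are mine enterprises, columns are
    safety indicators normalized to [0, 1]; the reference row is the ideal 1.0.
    rho is the conventional distinguishing coefficient."""
    # absolute differences from the ideal reference sequence
    diffs = [[abs(1.0 - x) for x in row] for row in matrix]
    dmin = min(min(row) for row in diffs)
    dmax = max(max(row) for row in diffs)
    degrees = []
    for row in diffs:
        coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in row]
        degrees.append(sum(w * c for w, c in zip(weights, coeffs)))
    return degrees  # higher degree = closer to the ideal, i.e. safer
```

Low per-indicator relational coefficients then point directly at the weak links the bow tie step goes on to analyze.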
Huang, Weiqing; Fan, Hongbo; Qiu, Yongfu; Cheng, Zhiyu; Xu, Pingru; Qian, Yu
2016-05-01
Recently, China has frequently experienced large-scale, severe, and persistent haze pollution due to surging urbanization and industrialization and rapid growth in the number of motor vehicles and in energy consumption. Vehicle emissions due to the consumption of large amounts of fossil fuels are undoubtedly a critical factor in haze pollution. This work focuses on the causation mechanism of haze pollution related to vehicle emissions for the city of Guangzhou, employing the Fault Tree Analysis (FTA) method for the first time. With the establishment of the fault tree system "Haze weather-Vehicle exhausts explosive emission", all of the important risk factors are discussed and identified using this deductive FTA method. Qualitative and quantitative assessments of the fault tree system are carried out based on the structure, probability, and critical importance degree analysis of the risk factors. The study may provide a new, simple, and effective tool/strategy for causation mechanism analysis and risk management of haze pollution in China. Copyright © 2016 Elsevier Ltd. All rights reserved.
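The paper's fault tree and event probabilities are not reproduced; the quantitative assessment rests on standard gate arithmetic for independent basic events, sketched here with hypothetical values:

```python
def and_gate(probs):
    """P(all inputs occur), for independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(at least one input occurs), for independent basic events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical two-level tree: the top event fires if either both
# intermediate conditions hold (AND) or a single severe event occurs.
top_event = or_gate([and_gate([0.3, 0.2]), 0.05])
```

Basic events whose removal most reduces the top-event probability are the critical-importance factors the analysis ranks.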
Prieto, M.L.; Cuéllar-Barboza, A.B.; Bobo, W.V.; Roger, V.L.; Bellivier, F.; Leboyer, M.; West, C.P.; Frye, M.A.
2016-01-01
Objective To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. Method A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946 – May, 2013) was conducted. Case-control and cohort studies of bipolar disorder patients age 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Results Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction: [relative risk (RR): 1.09, 95% CI 0.96–1.24, P = 0.20; I2 = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29–2.35; P = 0.0003; I2 = 83%). Conclusion There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. PMID:24850482
Terán-Hernández, Mónica; Díaz-Barriga, Fernando; Cubillas-Tejeda, Ana Cristina
2016-02-01
Objective To carry out a diagnosis of children's environmental health and an analysis of risk perception in indigenous communities of the Huasteca Sur region of San Luis Potosí, Mexico, in order to design an intervention strategy in line with their needs. Methods The study used mixed methods research, carried out in two phases. It was conducted in three indigenous communities of Tancanhuitz municipality from January 2005 to June 2006. In the adult population, risk perception was analyzed through focus groups, in-depth interviews, and questionnaires. In the child population, analysis of children's drawings was used to study perception. An assessment of health risks was carried out through biological monitoring and environmental monitoring of water and soil. Results The three communities face critical problems that reveal their vulnerability. When the results were triangulated and integrated, it was found that the principal problems relate to exposure to pathogenic microorganisms in water and soil, exposure to indoor wood smoke, exposure to smoke from the burning of refuse, use of insecticides, exposure to lead from the use of glazed ceramics, and alcoholism. Conclusions To ensure that the intervention strategy is adapted to the target population, it is essential to incorporate risk perception analysis and to promote the participation of community members. The proposed intervention strategy to address the detected problems is based on the principles of risk communication, community participation, and interinstitutional linkage.
Insomnia and the risk of depression: a meta-analysis of prospective cohort studies.
Li, Liqing; Wu, Chunmei; Gan, Yong; Qu, Xianguo; Lu, Zuxun
2016-11-05
Observational studies suggest that insomnia might be associated with an increased risk of depression, but results have been inconsistent. This study aimed at conducting a meta-analysis of prospective cohort studies to evaluate the association between insomnia and the risk of depression. Relevant cohort studies were comprehensively searched from the PubMed, Embase, Web of Science, and China National Knowledge Infrastructure databases (up to October 2014) and from the reference lists of retrieved articles. A random-effects model was used to calculate the pooled risk estimates and 95% confidence intervals (CIs). The I2 statistic was used to assess heterogeneity, and potential sources of heterogeneity were assessed with meta-regression. Potential publication bias was explored by using funnel plots, Egger's test, and the Duval and Tweedie trim-and-fill method. Thirty-four cohort studies involving 172,077 participants were included in this meta-analysis, with an average follow-up period of 60.4 months (ranging from 3.5 to 408). Statistical analysis suggested a positive relationship between insomnia and depression: the pooled RR was 2.27 (95% CI: 1.89-2.71), and high heterogeneity was observed (I2 = 92.6%, P < 0.001). Visual inspection of the funnel plot revealed some asymmetry. Egger's test identified evidence of substantial publication bias (P < 0.05), but correction for this bias using the trim-and-fill method did not alter the combined risk estimates. This meta-analysis indicates that insomnia is significantly associated with an increased risk of depression, which has implications for the prevention of depression in non-depressed individuals with insomnia symptoms.
Hou, Yi-Chao; Hu, Qiang; Huang, Jiao; Fang, Jing-Yuan; Xiong, Hua
2017-01-01
Background Existing data evaluating the impact of metformin on colorectal adenoma (CRA) risk in patients with type 2 diabetes (T2D) are limited and controversial. We therefore summarized the currently available studies and assessed the relationship between metformin treatment and the risk of CRA in T2D patients. Methods We systematically searched databases for eligible studies that explored the impact of metformin treatment on the occurrence of CRA in T2D patients from inception to June 2016. Summary odds ratio (OR) estimates with their 95% confidence intervals (CIs) were derived using random-effects, generic inverse variance methods. Sensitivity and subgroup analyses were performed. Results Seven studies involving 7178 participants met the inclusion criteria. The pooled estimate showed that metformin therapy was associated with a 27% decrease in CRA risk (OR, 0.73; 95% CI, 0.58 - 0.90). In subgroup analysis, we detected that metformin exhibits significant chemopreventive effects in the Asian region (OR, 0.68; 95% CI, 0.48 - 0.96). Similar results were identified in both studies with adjusted ORs and high-quality studies (OR, 0.66; 95% CI, 0.50 - 0.86 and OR, 0.70; 95% CI, 0.58 - 0.84, respectively). Of note, an inverse relationship was observed in which metformin therapy was associated with a significant decrease in advanced adenoma risk (OR, 0.52; 95% CI, 0.38 - 0.72). Low heterogeneity was observed, and the results remained robust in multiple sensitivity analyses. Conclusions This meta-analysis indicates that metformin therapy is correlated with a significant decrease in the risk of CRA and advanced adenoma in T2D patients. Further confirmatory studies are warranted. PMID:27903961
Tsunamis: Global Exposure and Local Risk Analysis
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.
2014-12-01
The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic methods (PTHA) are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, which is expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damage and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
Nouri Gharahasanlou, Ali; Mokhtarei, Ashkan; Khodayarei, Aliasqar; Ataei, Mohammad
2014-01-01
Evaluating and analyzing risk in the mining industry is a new approach for improving machinery performance. Reliability, safety, and maintenance management based on risk analysis can enhance the overall availability and utilization of mining technological systems. This study investigates the failure occurrence probability of the crushing and mixing bed hall department at the Azarabadegan Khoy cement plant by using the fault tree analysis (FTA) method. The results of the analysis for a 200 h operating interval show that the probability of failure occurrence for the crushing system, the conveyor systems, and the crushing and mixing bed hall department is 73, 64, and 95 percent, respectively, and the conveyor belt subsystem was found to be the most failure-prone. Finally, maintenance is proposed as a method to control and prevent the occurrence of failures. PMID:26779433
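The gate arithmetic behind such a fault tree can be sketched as follows, assuming independent basic events. The two subsystem probabilities are taken from the abstract; the gate structure itself is an illustrative assumption (the reported 95% for the whole department implies additional basic events beyond the two shown):

```python
def or_gate(probs):
    """Failure probability of an OR gate with independent inputs:
    the gate fails unless every input survives."""
    p_survive = 1.0
    for p in probs:
        p_survive *= (1.0 - p)
    return 1.0 - p_survive

def and_gate(probs):
    """Failure probability of an AND gate with independent inputs:
    the gate fails only if all inputs fail."""
    p_fail = 1.0
    for p in probs:
        p_fail *= p
    return p_fail

# Subsystem failure probabilities over a 200 h interval (from the abstract)
p_crushing, p_conveyor = 0.73, 0.64
p_two_inputs = or_gate([p_crushing, p_conveyor])  # ~0.90 for these two alone
```

With only these two inputs the OR gate gives about 0.90, slightly below the 95% reported for the department, consistent with the full tree containing further basic events.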
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as fishbone diagrams and flowcharts. The aim of this article is to show how HACCP can be used to manage an analytical process, propose how to conduct the necessary steps, and provide the data templates needed to document the process in line with current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. The article shows how hazard analysis and critical control points can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
Age-differentiated Risk Factors of Suicidal Ideation among Young and Middle-aged Korean Adults
Jo, Ahra; Jeon, Minho; Oh, Heeyoung
2017-01-01
Objectives This study aimed to determine the prevalence of suicidal ideation among young and middle-aged adults, and explore the risk factors that affect suicidal ideation. Methods A descriptive study design was used for secondary data analysis. A total sample of 5,214 was drawn from two waves (2012–2013) of the 7th Korea Health Panel (KHP) survey. The KHP data were collected by a well-trained interviewer using the face-to-face method during home visits as well as self-report method. Descriptive statistics of frequency, percentage, chi-square test, and logistic regression analysis were performed using SPSS 22.0. Results The prevalence of suicidal ideation in young and middle-aged adults was 4.4% and 5.6%, respectively. For young adults, suicidal ideation risk was higher among those with low income or heavy drinking habits. In middle-aged adults, low income, poor perceived health status, negative perception of peer-compared health status, and negative social perspective were the major risk factors. Conclusion There is considerable risk of suicidal ideation in adulthood. Opportunities for increased income, avoidance of heavy drinking, and the construction of positive subjective health status and social perspective should be considered in suicide prevention interventions for Korean young and middle-aged adults. PMID:28781943
Sexual Pleasure and Sexual Risk among Women who Use Methamphetamine: A Mixed Methods Study
Lorvick, Jennifer; Bourgois, Philippe; Wenger, Lynn D.; Arreola, Sonya G.; Lutnick, Alexandra; Wechsberg, Wendee M.; Kral, Alex H.
2012-01-01
Background The intersection of drug use, sexual pleasure and sexual risk behavior is rarely explored when it comes to poor women who use drugs. This paper explores the relationship between sexual behavior and methamphetamine use in a community-based sample of women, exploring not only risk, but also desire, pleasure and the challenges of overcoming trauma. Methods Quantitative data were collected using standard epidemiological methods (N=322) for community-based studies. In addition, using purposive sampling, qualitative data were collected among a subset of participants (n=34). Data were integrated for mixed methods analysis. Results While many participants reported sexual risk behavior (unprotected vaginal or anal intercourse) in the quantitative survey, sexual risk was not the central narrative pertaining to sexual behavior and methamphetamine use in qualitative findings. Rather, desire, pleasure and disinhibition arose as central themes. Women described feelings of power and agency related to sexual behavior while high on methamphetamine. Findings were mixed on whether methamphetamine use increased sexual risk behavior. Conclusion The use of mixed methods afforded important insights into the sexual behavior and priorities of methamphetamine-using women. Efforts to reduce sexual risk should recognize and valorize the positive aspects of methamphetamine use for some women, building on positive feelings of power and agency as an approach to harm minimization. PMID:22954501
NASA Technical Reports Server (NTRS)
Hohnloser, S. H.; Klingenheben, T.; Li, Y. G.; Zabel, M.; Peetermans, J.; Cohen, R. J.
1998-01-01
INTRODUCTION: The current standard for arrhythmic risk stratification is electrophysiologic (EP) testing, which, due to its invasive nature, is limited to patients already known to be at high risk. A number of noninvasive tests, such as determination of left ventricular ejection fraction (LVEF) or heart rate variability, have been evaluated as additional risk stratifiers. Microvolt T wave alternans (TWA) is a promising new risk marker. Prospective evaluation of noninvasive risk markers in low- or moderate-risk populations requires studies involving very large numbers of patients, and in such studies, documentation of the occurrence of ventricular tachyarrhythmias is difficult. In the present study, we identified a high-risk population, recipients of an implantable cardioverter defibrillator (ICD), and prospectively compared microvolt TWA with invasive EP testing and other risk markers with respect to their ability to predict recurrence of ventricular tachyarrhythmias as documented by ICD electrograms. METHODS AND RESULTS: Ninety-five patients with a history of ventricular tachyarrhythmias undergoing implantation of an ICD underwent EP testing, assessment of TWA, as well as determination of LVEF, baroreflex sensitivity, signal-averaged ECG, analysis of 24-hour Holter monitoring, and QT dispersion from the 12-lead surface ECG. The endpoint of the study was first appropriate ICD therapy for electrogram-documented ventricular fibrillation or tachycardia during follow-up. Kaplan-Meier survival analysis revealed that TWA (P < 0.006) and LVEF (P < 0.04) were the only significant univariate risk stratifiers. EP testing was not statistically significant (P < 0.2). Multivariate Cox regression analysis revealed that TWA was the only statistically significant independent risk factor. 
CONCLUSIONS: Measurement of microvolt TWA compared favorably with both invasive EP testing and other currently used noninvasive risk assessment methods in predicting recurrence of ventricular tachyarrhythmias in ICD recipients. This study suggests that TWA might also be a powerful tool for risk stratification in low- or moderate-risk patients, and needs to be prospectively evaluated in such populations.
NASA Astrophysics Data System (ADS)
Han, Dan; Mu, Jing
2017-12-01
Based on the characteristics of e-commerce of fresh agricultural products in China, and using the correlation analysis method, a relational model between product knowledge, perceived benefit, perceived risk, and purchase intention is constructed. A logistic model is used to carry out the empirical analysis. The influence factors and the mechanism of online purchase intention are explored. The results show that consumers’ product knowledge, perceived benefit, and perceived risk affect their purchase intention. Product knowledge has a positive effect on perceived benefit, and perceived benefit has a positive effect on purchase intention. Product knowledge has a negative effect on perceived risk, perceived benefit has a negative effect on perceived risk, and perceived risk has a negative effect on purchase intention. Through the empirical analysis, some feasible suggestions for the government and e-commerce enterprises can be provided.
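The logistic model described above can be sketched as follows. The functional form is the standard logistic regression; the coefficient values are hypothetical, chosen only to match the signs the abstract reports (knowledge and perceived benefit raise intention, perceived risk lowers it):

```python
import math

def purchase_intention_prob(knowledge, benefit, risk, coefs):
    """Logistic model of purchase intention:
    P(buy) = 1 / (1 + exp(-(b0 + b1*knowledge + b2*benefit + b3*risk)))."""
    b0, b1, b2, b3 = coefs
    z = b0 + b1 * knowledge + b2 * benefit + b3 * risk
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients reproducing the reported signs only
coefs = (-0.5, 0.8, 1.1, -0.9)
p_low_risk = purchase_intention_prob(0.6, 0.7, 0.2, coefs)
p_high_risk = purchase_intention_prob(0.6, 0.7, 0.8, coefs)  # risk up, P down
```

Raising the perceived-risk input while holding the other predictors fixed lowers the predicted purchase probability, mirroring the negative effect found in the study.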
Biederman, Joseph; Petty, Carter R.; Hammerness, Paul; Woodworth, K. Yvonne; Faraone, Stephen V.
2013-01-01
Objective The main aim of this study was to use familial risk analysis to examine the association between attention-deficit/hyperactivity disorder (ADHD) and nicotine dependence. Methods Subjects were children with (n = 257) and without (n = 229) ADHD of both sexes ascertained from pediatric and psychiatric referral sources and their first-degree relatives (N = 1627). Results Nicotine dependence in probands increased the risk for nicotine dependence in relatives irrespective of ADHD status. There was no evidence of cosegregation or assortative mating between these disorders. Patterns of familial risk analysis suggest that the association between ADHD and nicotine dependence is most consistent with the hypothesis of independent transmission of these disorders. Conclusions These findings may have important implications for the identification of a subgroup of children with ADHD at high risk for nicotine dependence based on parental history of nicotine dependence. PMID:23461889
Defining Human Failure Events for Petroleum Risk Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; Knut Øien
2014-06-01
In this paper, an identification and description of barriers and human failure events (HFEs) for human reliability analysis (HRA) is performed. The barriers, called target systems, are identified from risk significant accident scenarios represented as defined situations of hazard and accident (DSHAs). This report serves as the foundation for further work to develop petroleum HFEs compatible with the SPAR-H method and intended for reuse in future HRAs.
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
2010-06-01
a storytelling narrative of how the risk was mitigated and what worked or did not work. A knowledge-based risk is also a means of transferring...developing a cadre of trained facilitators to assist teams in using Web-based decision-support technology to support team brainstorming...and use “contextualization” (a.k.a. storytelling) as an alternative method to analysis? Storytelling Instead of Analysis There have been some
Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson
2012-01-01
To propose a measure (index) of expected risks to evaluate and follow up the performance analysis of research projects, considering the financial and structural parameters adequate for their development. A ranking of acceptable results regarding research projects with complex variables was used as an index to gauge project performance. To implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publication, fundraising, and patent registry. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analysis to characterize risk, based on a modeling tool that incorporates multiple variables interacting in semi-quantitative environments. This method was tested with data from three different projects in our institution (projects A, B and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) related to development and expected results according to initial or full investment. The results showed that this model contributes significantly to risk analysis and planning, as well as to the definition of the necessary investments, considering contingency actions that benefit the different stakeholders: the investor or donor, the project manager, and the researchers.
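The ulcer index that the abstract uses as its basic model has a standard definition: the root-mean-square percentage drawdown from the running peak of a value series. A minimal sketch, with a hypothetical project-value trajectory rather than the authors' actual data:

```python
import math

def ulcer_index(series):
    """Martin's ulcer index: root-mean-square percentage drawdown
    from the running peak of a value series. Larger values mean
    deeper or longer-lasting dips below the best level seen so far."""
    peak = series[0]
    sq_drawdowns = []
    for v in series:
        peak = max(peak, v)
        dd_pct = (peak - v) / peak * 100.0   # percent below the running peak
        sq_drawdowns.append(dd_pct ** 2)
    return math.sqrt(sum(sq_drawdowns) / len(sq_drawdowns))

# Hypothetical quarterly "project value" trajectory (arbitrary units)
values = [100, 95, 102, 98, 105]
ui = ulcer_index(values)   # small index: shallow, quickly recovered dips
```

A steadily rising series yields an index of zero; deep or prolonged dips inflate it, which is what makes it usable as a project-risk signal in the RoSI pipeline.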
Analysis of interactions among barriers in project risk management
NASA Astrophysics Data System (ADS)
Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita
2018-03-01
In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects have virtually no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, the manifestation of barriers to project risk management is a less explored topic. The success of project management is oftentimes based on the understanding of barriers to effective risk management, application of appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitude, adequate resources, organizational culture, and involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of addressing cultural differences are the high-priority barriers, among many others.
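The MICMAC step mentioned above classifies barriers by two quantities computed from the ISM final reachability matrix: driving power (row sum, how many barriers a barrier influences) and dependence power (column sum, how many barriers influence it). A minimal sketch with a hypothetical 4-barrier matrix, not the paper's actual data:

```python
def micmac_powers(reach):
    """Driving and dependence power from a final reachability matrix.
    reach[i][j] = 1 means barrier i drives (can reach) barrier j;
    the diagonal is 1 by the reflexivity convention of ISM."""
    n = len(reach)
    driving = [sum(row) for row in reach]                              # row sums
    dependence = [sum(reach[i][j] for i in range(n)) for j in range(n)]  # column sums
    return driving, dependence

# Hypothetical reachability matrix for 4 barriers
reach = [
    [1, 1, 1, 1],   # e.g. "lack of top management support": drives everything
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],   # most dependent barrier, drives nothing else
]
driving, dependence = micmac_powers(reach)
```

Plotting each barrier at (dependence, driving) then yields the usual MICMAC quadrants: high driving with low dependence marks the root barriers that the paper prioritizes.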
Jordá Aragón, Carlos; Peñalver Cuesta, Juan Carlos; Mancheño Franch, Nuria; de Aguiar Quevedo, Karol; Vera Sempere, Francisco; Padilla Alarcón, José
2015-09-07
Survival studies of non-small cell lung cancer (NSCLC) are usually based on the Kaplan-Meier method. However, other events not covered by this method may modify the observation of the event of interest. There are cumulative incidence (CI) models that take these competing risks into account, enabling more accurate survival estimates and evaluation of the risk of death from other causes. We aimed to evaluate these models in resected early-stage NSCLC patients. This study included 263 patients with resected NSCLC whose diameter was ≤ 3 cm without node involvement (N0). Demographic, clinical, morphopathological and surgical variables, TNM classification and long-term evolution were analysed. To analyse CI, death by another cause was considered a competing event. For the univariate analysis, Gray's method was used, while Fine and Gray's method was employed for the multivariate analysis. Mortality by NSCLC was 19.4% at 5 years and 14.3% by another cause. Both curves crossed at 6.3 years, and the probability of death by another cause became greater from this point. In the multivariate analysis, cancer mortality was conditioned by visceral pleural invasion (VPI) (P=.001) and vascular invasion (P=.020), while mortality from non-cancer causes was conditioned by age >50 years (P=.034), smoking (P=.009) and a Charlson index ≥ 2 (P=.000). By the cumulative incidence method, VPI and vascular invasion conditioned cancer death in resected NSCLC ≤ 3 cm N0, and the determinants of long-term death from non-tumour causes were also identified. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
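The cumulative incidence idea described above can be sketched with the standard nonparametric (Aalen-Johansen-type) estimator: at each event time, the probability mass added to cause k is the overall event-free survival just before that time multiplied by the fraction of at-risk subjects failing from cause k. This is an illustrative implementation with invented follow-up data, not the study's statistical software:

```python
def cumulative_incidence(times, events, cause, t_eval):
    """Nonparametric cumulative incidence of `cause` at t_eval, treating
    other nonzero event codes as competing risks (0 = censored).
    CIF(t) = sum over event times u <= t of S(u-) * d_cause(u) / n(u)."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0      # overall event-free survival just before current time
    cif = 0.0
    i = 0
    while i < len(data) and data[i][0] <= t_eval:
        t = data[i][0]
        d_cause = sum(1 for u, e in data if u == t and e == cause)
        d_any = sum(1 for u, e in data if u == t and e != 0)
        n_tied = sum(1 for u, e in data if u == t)
        cif += surv * d_cause / n_at_risk
        surv *= 1.0 - d_any / n_at_risk
        n_at_risk -= n_tied
        i += n_tied
    return cif

# Hypothetical follow-up: event 1 = cancer death, 2 = other cause, 0 = censored
times = [2, 3, 5, 7, 8, 9]
events = [1, 2, 0, 1, 2, 0]
cif_cancer = cumulative_incidence(times, events, cause=1, t_eval=10)
cif_other = cumulative_incidence(times, events, cause=2, t_eval=10)
```

Unlike one-minus-Kaplan-Meier per cause, these cause-specific curves never sum to more than the total probability of any event, which is exactly why they suit the crossing-curves comparison the abstract reports.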
Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.
Jaspersen, Johannes G; Montibeller, Gilberto
2015-07-01
Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats. © 2015 Society for Risk Analysis.
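The rank-to-probability step can be illustrated with the rank-order-centroid (ROC) formula from multicriteria decision analysis, a well-known surrogate for weights consistent with a pure ranking. This is a related illustration, not necessarily the maximum-entropy algorithm the authors derive:

```python
def rank_order_centroid(n):
    """Rank-order-centroid values for n mutually exclusive events ranked
    from most to least likely: w_i = (1/n) * sum_{j=i}^{n} 1/j.
    The values are decreasing and sum to 1, so they can serve as
    surrogate probabilities consistent with the ordinal ranking."""
    return [sum(1.0 / j for j in range(i, n + 1)) / n
            for i in range(1, n + 1)]

# An expert ranks three threats without giving any numbers
probs = rank_order_centroid(3)   # decreasing, summing to 1
```

For three ranked events this gives roughly 0.61, 0.28, and 0.11: the expert supplies only an ordering, and the analyst recovers usable point probabilities in seconds.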
Decision modeling for fire incident analysis
Donald G. MacGregor; Armando González-Cabán
2009-01-01
This paper reports on methods for representing and modeling fire incidents based on concepts and models from the decision and risk sciences. A set of modeling techniques are used to characterize key fire management decision processes and provide a basis for incident analysis. The results of these methods can be used to provide insights into the structure of fire...
NASA Astrophysics Data System (ADS)
Guillen, George; Rainey, Gail; Morin, Michelle
2004-04-01
Currently, the Minerals Management Service uses the Oil Spill Risk Analysis model (OSRAM) to predict the movement of potential oil spills greater than 1000 bbl originating from offshore oil and gas facilities. OSRAM generates oil spill trajectories using meteorological and hydrological data input from either actual physical measurements or estimates generated from other hydrological models. OSRAM and many other models produce output matrices of average, maximum and minimum contact probabilities to specific landfall or target segments (columns) from oil spills at specific points (rows). Analysts and managers are often interested in identifying geographic areas or groups of facilities that pose similar risks to specific targets or groups of targets if a spill occurred. Unfortunately, due to the potentially large matrix generated by many spill models, this question is difficult to answer without the use of data reduction and visualization methods. In our study we utilized a multivariate statistical method called cluster analysis to group areas of similar risk based on the potential distribution of landfall target trajectory probabilities. We also utilized ArcView™ GIS to display spill launch point groupings. The combination of GIS and multivariate statistical techniques in the post-processing of trajectory model output is a powerful tool for identifying and delineating areas of similar risk from multiple spill sources. We strongly encourage modelers and statistical and GIS software programmers to closely collaborate to produce a more seamless integration of these technologies and approaches to analyzing data. They are complementary methods that strengthen the overall assessment of spill risks.
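The clustering idea above treats each spill launch point as a row vector of contact probabilities and groups rows with similar profiles. A minimal single-linkage-style sketch using a distance threshold and union-find; the matrix values are hypothetical, and a production analysis would use a full hierarchical clustering package:

```python
import math

def group_similar_rows(matrix, threshold):
    """Group rows of a launch-point x target contact-probability matrix:
    rows closer than `threshold` (Euclidean distance) end up in the same
    group. A tiny single-linkage stand-in for full cluster analysis."""
    n = len(matrix)
    parent = list(range(n))

    def find(i):                       # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(matrix[i], matrix[j]) < threshold:
                parent[find(j)] = find(i)
    return [find(i) for i in range(n)]

# Hypothetical contact probabilities of 4 launch points to 3 shore segments
spill_matrix = [
    [0.80, 0.10, 0.05],
    [0.78, 0.12, 0.06],   # nearly identical risk profile to point 0
    [0.05, 0.60, 0.30],
    [0.04, 0.58, 0.33],   # nearly identical risk profile to point 2
]
labels = group_similar_rows(spill_matrix, threshold=0.1)
```

Each resulting group label can then be joined to the launch-point coordinates and mapped in a GIS, which is the display step the abstract describes.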
Analysis of labour risks in the Spanish industrial aerospace sector.
Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael
2016-01-01
Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and the Services Subsector (SS) were defined and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern, accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis; these findings can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.
A Framework for Flood Risk Analysis and Benefit Assessment of Flood Control Measures in Urban Areas
Li, Chaochao; Cheng, Xiaotao; Li, Na; Du, Xiaohe; Yu, Qian; Kan, Guangyuan
2016-01-01
Flood risk analysis is more complex in urban areas than in rural areas because of their closely packed buildings, different kinds of land uses, and large number of flood control works and drainage systems. The purpose of this paper is to propose a practical framework for flood risk analysis and benefit assessment of flood control measures in urban areas. Based on the concept of the disaster risk triangle (hazard, vulnerability and exposure), a comprehensive analysis method and a general procedure were proposed for urban flood risk analysis. The Urban Flood Simulation Model (UFSM) and the Urban Flood Damage Assessment Model (UFDAM) were integrated to estimate the flood risk in the Pudong flood protection area (Shanghai, China). S-shaped functions were adopted to represent flood return period and damage (R-D) curves. The study results show that flood control works could significantly reduce the flood risk within the 66-year flood return period, reducing it by 15.59%. However, the flood risk was only reduced by 7.06% when the flood return period exceeded 66 years. Hence, it is difficult to meet the increasing demands for flood control solely by relying on structural measures. The R-D function is suitable for describing the changes in flood control capacity. This framework can assess the flood risk reduction due to flood control measures, and provide crucial information for strategy development and planning adaptation. PMID:27527202
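The S-shaped R-D curve and the risk figures above can be sketched as follows. The logistic-in-log-return-period functional form and all parameter values are illustrative assumptions (the paper does not publish its fitted coefficients here); the expected-annual-damage integral over exceedance probability is the standard flood-risk calculation:

```python
import math

def s_shaped_damage(T, d_max, T_mid, k):
    """S-shaped return-period/damage (R-D) curve: damage rises slowly for
    frequent floods, steeply near T_mid, and saturates at d_max.
    The logistic-in-ln(T) form is an assumption for illustration."""
    return d_max / (1.0 + math.exp(-k * (math.log(T) - math.log(T_mid))))

def expected_annual_damage(curve, T_grid):
    """Trapezoidal expected annual damage: integrate damage over the
    annual exceedance probability p = 1/T."""
    pts = sorted(((1.0 / T, curve(T)) for T in T_grid), key=lambda x: x[0])
    ead = 0.0
    for (p0, d0), (p1, d1) in zip(pts, pts[1:]):
        ead += 0.5 * (d0 + d1) * (p1 - p0)
    return ead

# Hypothetical curve: damage saturates at 100 units, midpoint at T = 66 years
curve = lambda T: s_shaped_damage(T, d_max=100.0, T_mid=66.0, k=2.0)
ead = expected_annual_damage(curve, [2, 5, 10, 20, 50, 100, 200])
```

Comparing the integral for with-measure and without-measure curves gives the percentage risk reductions the study reports for return periods below and above 66 years.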
Deep Learning for Classification of Colorectal Polyps on Whole-slide Images
Korbar, Bruno; Olofson, Andrea M.; Miraflor, Allen P.; Nicka, Catherine M.; Suriawinata, Matthew A.; Torresani, Lorenzo; Suriawinata, Arief A.; Hassanpour, Saeed
2017-01-01
Context: Histopathological characterization of colorectal polyps is critical for determining the risk of colorectal cancer and future rates of surveillance for patients. However, this characterization is a challenging task and suffers from significant inter- and intra-observer variability. Aims: We built an automatic image analysis method that can accurately classify different types of colorectal polyps on whole-slide images to help pathologists with this characterization and diagnosis. Setting and Design: Our method is based on deep-learning techniques, which rely on numerous levels of abstraction for data representation and have shown state-of-the-art results for various image analysis tasks. Subjects and Methods: Our method covers five common types of polyps (i.e., hyperplastic, sessile serrated, traditional serrated, tubular, and tubulovillous/villous) that are included in the US Multisociety Task Force guidelines for colorectal cancer risk assessment and surveillance. We developed multiple deep-learning approaches by leveraging a dataset of 2074 crop images, which were annotated by multiple domain expert pathologists as reference standards. Statistical Analysis: We evaluated our method on an independent test set of 239 whole-slide images and measured standard machine-learning evaluation metrics of accuracy, precision, recall, and F1 score and their 95% confidence intervals. Results: Our evaluation shows that our method with residual network architecture achieves the best performance for classification of colorectal polyps on whole-slide images (overall accuracy: 93.0%, 95% confidence interval: 89.0%–95.9%). Conclusions: Our method can reduce the cognitive burden on pathologists and improve their efficacy in histopathological characterization of colorectal polyps and in subsequent risk assessment and follow-up recommendations. PMID:28828201
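The evaluation metrics named in this abstract (accuracy, precision, recall, F1) are computed from predicted versus reference labels. A minimal sketch with hypothetical slide-level labels, not the study's dataset:

```python
def classification_metrics(y_true, y_pred, positive):
    """Accuracy plus per-class precision, recall, and F1, treating
    `positive` as the class of interest."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    accuracy = sum(1 for t, p in zip(y_true, y_pred) if t == p) / len(y_true)
    return accuracy, precision, recall, f1

# Hypothetical labels for two of the five polyp types
y_true = ["tubular", "tubular", "villous", "tubular", "villous"]
y_pred = ["tubular", "villous", "villous", "tubular", "villous"]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred, positive="tubular")
```

Bootstrapping this computation over resampled test slides is one common way to obtain the 95% confidence intervals the study reports alongside its point estimates.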
Verrier, Richard L.; Klingenheben, Thomas; Malik, Marek; El-Sherif, Nabil; Exner, Derek V.; Hohnloser, Stefan H.; Ikeda, Takanori; Martínez, Juan Pablo; Narayan, Sanjiv M.; Nieminen, Tuomo; Rosenbaum, David S.
2014-01-01
This consensus guideline was prepared on behalf of the International Society for Holter and Noninvasive Electrocardiology and is cosponsored by the Japanese Circulation Society, the Computers in Cardiology Working Group on e-Cardiology of the European Society of Cardiology, and the European Cardiac Arrhythmia Society. It discusses the electrocardiographic phenomenon of T-wave alternans (TWA) (i.e., a beat-to-beat alternation in the morphology and amplitude of the ST-segment or T-wave). This statement focuses on its physiological basis and measurement technologies and its clinical utility in stratifying risk for life-threatening ventricular arrhythmias. Signal processing techniques including the frequency-domain Spectral Method and the time-domain Modified Moving Average method have demonstrated the utility of TWA in arrhythmia risk stratification in prospective studies in >12,000 patients. The majority of exercise-based studies using both methods have reported high relative risks for cardiovascular mortality and for sudden cardiac death in patients with preserved as well as depressed left ventricular ejection fraction. Studies with ambulatory electrocardiogram-based TWA analysis with the Modified Moving Average method have yielded significant predictive capacity. However, negative studies with the Spectral Method have also appeared, including 2 interventional studies in patients with implantable defibrillators. Meta-analyses have been performed to gain insights into this issue. Frontiers of TWA research include use in arrhythmia risk stratification of individuals with preserved ejection fraction, improvements in predictivity with quantitative analysis, and utility in guiding medical as well as device-based therapy. Overall, although TWA appears to be a useful marker of risk for arrhythmic and cardiovascular death, there is as yet no definitive evidence that it can guide therapy. PMID:21920259
Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.
2012-01-01
Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. 
The benefits of increased precision should be weighed against those risks. The analysis framework presented here will be useful for other species exhibiting heterogeneity by detection method.
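The N-mixture likelihood the abstract builds on marginalizes a latent site abundance N over a Poisson prior, with each detection method contributing an independent binomial count. A minimal single-site sketch (the counts, detection probabilities, and expected abundance below are invented for illustration, not the grizzly bear data):

```python
import math

def pois(n, lam):
    """Poisson pmf."""
    return math.exp(-lam) * lam ** n / math.factorial(n)

def binom(y, n, p):
    """Binomial pmf, zero when y > n."""
    if y > n:
        return 0.0
    return math.comb(n, y) * p ** y * (1 - p) ** (n - y)

def nmix_lik(counts, probs, lam, n_max=60):
    """Marginal likelihood at one site: sum over latent abundance N of
    Pois(N; lam) * prod over methods m of Binom(y_m; N, p_m)."""
    total = 0.0
    for n in range(n_max + 1):
        term = pois(n, lam)
        for y, p in zip(counts, probs):
            term *= binom(y, n, p)
        total += term
    return total

# illustrative: 3 hair-trap and 5 bear-rub detections, method-specific
# detection probabilities 0.4 and 0.6, expected local abundance 8
lik = nmix_lik(counts=[3, 5], probs=[0.4, 0.6], lam=8.0)
```

Maximizing this quantity over sites, with lam modeled as a function of landscape covariates, is what drives the variable-selection comparisons the abstract describes.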
Risk analysis of heat recovery steam generator with semi-quantitative risk-based inspection API 581
NASA Astrophysics Data System (ADS)
Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and thus the plant, to stop operating, and it can furthermore threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for the risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failures can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative risk assessment following standard API 581 thus places the existing equipment at medium risk, and in fact no critical problems were found in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The evaluation was carried out with the aim of reducing risk by optimizing the risk assessment activities.
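Category pairs such as 4C and 3C come from a probability-consequence risk matrix. The sketch below shows the general idea of mapping a probability category (1-5) and a consequence category (A-E) to a risk level; the level boundaries are assumptions chosen for illustration, not the actual API 581 lookup tables.

```python
# Illustrative 5x5 risk matrix in the spirit of API 581: probability
# categories 1-5 and consequence categories A-E combine into a risk level.
# The score boundaries below are assumptions, not the standard's tables.
def risk_level(prob_cat, cons_cat):
    score = prob_cat + "ABCDE".index(cons_cat)  # combine the two axes
    if score <= 3:
        return "low"
    if score <= 5:
        return "medium"
    if score <= 7:
        return "medium-high"
    return "high"

superheater = risk_level(4, "C")   # 4C, as reported for the HP superheater
economizer = risk_level(3, "C")    # 3C, as reported for the HP economizer
```

With these illustrative boundaries, 4C lands in medium-high and 3C in medium, matching the levels quoted in the abstract.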
Using the weighted area under the net benefit curve for decision curve analysis.
Talluri, Rajesh; Shete, Sanjay
2016-07-18
Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and net reclassification index to evaluate the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given threshold probability or over a range of threshold probabilities. However, when the decision curves for 2 competing models cross in the range of interest, it is difficult to identify the best model as there is no readily available summary measure for evaluating the predictive performance. The key deterrent to using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need for additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared 3 different approaches, the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power compared to the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method.
The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of the decision curve analysis to compare risk prediction models in a clinical scenario.
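The net benefit underlying a decision curve has a standard closed form, and the weighted area simply averages it against an estimated threshold distribution. A minimal sketch, in which the toy predictions and the three-point weight vector stand in for real data and for the estimated distribution of threshold probabilities:

```python
def net_benefit(pred, truth, pt):
    """NB(pt) = TP/n - FP/n * pt/(1 - pt), treating pred >= pt as positive."""
    n = len(truth)
    tp = sum(1 for p, y in zip(pred, truth) if p >= pt and y == 1)
    fp = sum(1 for p, y in zip(pred, truth) if p >= pt and y == 0)
    return tp / n - fp / n * pt / (1 - pt)

def weighted_area(pred, truth, thresholds, weights):
    """Weighted summary: sum_i w_i * NB(pt_i), with weights summing to one."""
    return sum(w * net_benefit(pred, truth, pt)
               for pt, w in zip(thresholds, weights))

pred = [0.9, 0.8, 0.1, 0.2]   # toy risk predictions
truth = [1, 1, 0, 0]          # toy outcomes
wauc = weighted_area(pred, truth, [0.25, 0.5, 0.75], [0.25, 0.5, 0.25])
```

For this perfectly separating toy predictor the net benefit equals the prevalence (0.5) at every threshold, so the weighted area is 0.5 regardless of the weights; with crossing curves from two real models, the weights determine which model wins.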
Convergent evidence from systematic analysis of GWAS revealed genetic basis of esophageal cancer.
Gao, Xue-Xin; Gao, Lei; Wang, Jiu-Qiang; Qu, Su-Su; Qu, Yue; Sun, Hong-Lei; Liu, Si-Dang; Shang, Ying-Li
2016-07-12
Recent genome-wide association studies (GWAS) have identified single nucleotide polymorphisms (SNPs) associated with risk of esophageal cancer (EC). However, investigation of the genetic basis from the perspective of systems biology and integrative genomics remains scarce. In this study, we explored the genetic basis of EC based on GWAS data and implemented a series of bioinformatics methods including functional annotation, expression quantitative trait loci (eQTL) analysis, pathway enrichment analysis, and pathway grouped network analysis. Two hundred and thirteen risk SNPs were identified, of which 44 were found by eQTL analysis to have significantly differential gene expression in esophageal tissues. Through pathway enrichment analysis, 170 risk genes mapped by the risk SNPs were enriched into 38 significant GO terms and 17 significant KEGG pathways, which were grouped into 9 sub-networks by pathway grouped network analysis. The 9 groups of interconnected pathways were mainly involved with muscle cell proliferation, cellular response to interleukin-6, cell adhesion molecules, and ethanol oxidation, which might participate in the development of EC. Our findings provide genetic evidence and new insight for exploring the molecular mechanisms of EC.
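Pathway enrichment of the kind described is typically an over-representation test with a hypergeometric tail probability. A sketch of that test: the 170 risk genes come from the abstract, while the pathway size (50), the overlap (8), and the genome background (20,000 genes) are assumptions for illustration.

```python
import math

def hypergeom_enrich_p(k, K, n, N):
    """P(X >= k) for X ~ Hypergeom(N, K, n): probability of seeing at
    least k risk genes inside a pathway of K genes, when n risk genes
    are drawn from N genes total (classic over-representation test)."""
    total = 0.0
    for x in range(k, min(K, n) + 1):
        total += math.comb(K, x) * math.comb(N - K, n - x) / math.comb(N, n)
    return total

# illustrative: 8 of the 170 risk genes fall in a 50-gene pathway
p = hypergeom_enrich_p(k=8, K=50, n=170, N=20000)
```

Each enriched GO term or KEGG pathway gets such a p-value, which is then corrected for multiple testing before the significant pathways are grouped into sub-networks.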
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially in natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a Bow-tie diagram and Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that failure of the regulator system was the worst-case accident scenario, with human error as the most contributing factor. Thus, in the risk management plan of natural gas stations, priority should be given to the most probable root events and main contributing factors, which were identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
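For independent events, the fault-tree half of a bow-tie reduces to simple OR/AND gate algebra, with safety barriers discounting the consequence on the other side. The probabilities below are invented for illustration, not the study's data.

```python
# Minimal bow-tie style calculation (illustrative numbers): basic causes
# combine through OR/AND gates to the top event; a barrier then gates
# the consequence.
def p_or(*ps):
    """P(at least one of several independent events)."""
    prod = 1.0
    for p in ps:
        prod *= (1 - p)
    return 1 - prod

def p_and(*ps):
    """P(all of several independent events)."""
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

p_regulator_fail = p_or(0.02, 0.01)     # hardware failure OR human error
p_top = p_and(p_regulator_fail, 0.5)    # AND with a demand condition
p_consequence = p_top * (1 - 0.9)       # one barrier, 90% effective
```

A Bayesian network generalizes this picture by allowing dependence between events and by updating these probabilities as evidence about root causes arrives, which is what makes the assessment "dynamic."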
Unmanned aircraft system sense and avoid integrity and continuity
NASA Astrophysics Data System (ADS)
Jamoom, Michael B.
This thesis describes new methods to guarantee safety of sense and avoid (SAA) functions for Unmanned Aircraft Systems (UAS) by evaluating integrity and continuity risks. Previous SAA efforts focused on relative safety metrics, such as risk ratios, comparing the risk of using an SAA system versus not using it. The methods in this thesis evaluate integrity and continuity risks as absolute measures of safety, as is the established practice in commercial aircraft terminal area navigation applications. The main contribution of this thesis is a derivation of a new method, based on a standard intruder relative constant velocity assumption, that uses hazard state estimates and estimate error covariances to establish (1) the integrity risk of the SAA system not detecting imminent loss of "well clear," which is the time and distance required to maintain safe separation from intruder aircraft, and (2) the probability of false alert, the continuity risk. Another contribution is applying these integrity and continuity risk evaluation methods to set quantifiable and certifiable safety requirements on sensors. A sensitivity analysis uses this methodology to evaluate the impact of sensor errors on integrity and continuity risks. The penultimate contribution is an integrity and continuity risk evaluation where the estimation model is refined to address realistic intruder relative linear accelerations, which goes beyond the current constant velocity standard. The final contribution is an integrity and continuity risk evaluation addressing multiple intruders. This evaluation is a new innovation-based method to determine the risk of mis-associating intruder measurements. A mis-association occurs when the SAA system incorrectly associates a measurement to the wrong intruder, causing large errors in the estimated intruder trajectories.
The new methods described in this thesis can help ensure safe encounters between aircraft and enable SAA sensor certification for UAS integration into the National Airspace System.
Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk
Ramirez-Villegas, Juan F.; Lam-Espinosa, Eric; Ramirez-Moreno, David F.; Calvo-Echeverry, Paulo C.; Agredo-Rodriguez, Wilfredo
2011-01-01
Statistical, spectral, multi-resolution and non-linear methods were applied to heart rate variability (HRV) series linked with classification schemes for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features from all the analysis methods were evaluated using standard two-sample Kolmogorov-Smirnov test (KS-test). The results of the statistical procedure provided input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performances with both training and test sets and many combinations of features (with a maximum accuracy of 96.67%). Additionally, there was a strong consideration for breathing frequency as a relevant feature in the HRV analysis. PMID:21386966
NASA Astrophysics Data System (ADS)
Davis, Adam Christopher
This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of the application of aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity between them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department-of-Energy nuclear weapons laboratories. Both linear, no-threshold and dose-bin averaged risks are calculated and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk. 
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.
NASA Astrophysics Data System (ADS)
Haining, Wang; Lei, Wang; Qian, Zhang; Zongqiang, Zheng; Hongyu, Zhou; Chuncheng, Gao
2018-03-01
To address the uncertainty in the comprehensive evaluation of supervision risk in electricity transactions, this paper uses unidentified rational numbers to evaluate the supervision risk, obtaining the possible evaluation results with their corresponding credibilities and thereby quantifying the risk indexes. The model yields the risk degree of each index, which makes it easier for electricity transaction supervisors to identify transaction risks and determine the risk level, assisting decision-making and realizing effective supervision of the risk. The results of the case analysis verify the effectiveness of the model.
Jolly, Kim; Archibald, Cynthia; Liehr, Patricia
2013-10-01
School nurses are well positioned to address risk-taking behaviors for adolescents in their care. The purpose of this mixed-method exploratory study was to explore risk taking in Afro-Caribbean adolescents in South Florida, comparing first- to second-generation adolescents. Quantitative and qualitative data were collected from an immigrant group using the adolescent risk-taking instrument to evaluate risk-taking attitudes, behaviors, and self-described riskiest activities. One-hundred and six adolescents participated; 44% were first generation Afro-Caribbean. Data analysis included analysis of variance, frequencies, and content analysis. There were no differences in risk-taking attitudes; smaller percentages of first generation Afro-Caribbean adolescents reported sexual activity, substance use, and violence. Over one third of the sample, regardless of generational status, reported alcohol use, but did not note alcohol or other health-compromising behaviors as "riskiest" activities. It is important to better understand Afro-Caribbean adolescents' perspectives about risky behaviors, and school-based venues offer the best promise for reaching these adolescents.
Risk Management using Dependency Structure Matrix
NASA Astrophysics Data System (ADS)
Petković, Ivan
2011-09-01
An efficient method based on dependency structure matrix (DSM) analysis is given for ranking risks in a complex system or process whose entities are mutually dependent. The ranking is determined by the entries of the unique positive eigenvector corresponding to the spectral radius of the matrix that models the considered engineering system. For demonstration, the risk problem of NASA's robotic spacecraft is analyzed.
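For a nonnegative dependency matrix, the positive (Perron) eigenvector associated with the spectral radius can be recovered by power iteration, and its entries rank the elements by risk. A sketch with a made-up 3x3 matrix (not NASA's spacecraft DSM):

```python
# Power iteration for the Perron eigenvector of a nonnegative DSM;
# the eigenvector entries give the risk ranking of the elements.
def perron_vector(A, iters=200):
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)                 # normalize so the entries sum to one
        v = [x / s for x in w]
    return v

dsm = [[1.0, 0.5, 0.2],          # illustrative pairwise dependencies
       [0.4, 1.0, 0.7],
       [0.1, 0.3, 1.0]]
scores = perron_vector(dsm)
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

By the Perron-Frobenius theorem this eigenvector is unique and strictly positive for an irreducible nonnegative matrix, which is what makes the resulting ranking well defined.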
Ramírez de Arellano, A; Coca, A; de la Figuera, M; Rubio-Terrés, C; Rubio-Rodríguez, D; Gracia, A; Boldeanu, A; Puig-Gilberte, J; Salas, E
2013-10-01
A clinical–genetic function (Cardio inCode®) was generated using genetic variants associated with coronary heart disease (CHD), but not with classical CHD risk factors, to achieve a more precise estimation of the CHD risk of individuals by incorporating genetics into risk equations [Framingham and REGICOR (Registre Gironí del Cor)]. The objective of this study was to conduct an economic analysis of the CHD risk assessment with Cardio inCode®, which incorporates the patient’s genetic risk into the functions of REGICOR and Framingham, compared with the standard method (using only the functions). A Markov model was developed with seven states of health (low CHD risk, moderate CHD risk, high CHD risk, CHD event, recurrent CHD, chronic CHD, and death). The reclassification of CHD risk derived from genetic information and transition probabilities between states was obtained from a validation study conducted in cohorts of REGICOR (Spain) and Framingham (USA). It was assumed that patients classified as at moderate risk by the standard method were the best candidates to test the risk reclassification with Cardio inCode®. The utilities and costs (€; year 2011 values) of Markov states were obtained from the literature and Spanish sources. The analysis was performed from the perspective of the Spanish National Health System, for a life expectancy of 82 years in Spain. An annual discount rate of 3.5 % for costs and benefits was applied. For a Cardio inCode® price of €400, the cost per QALY gained compared with the standard method [incremental cost-effectiveness ratio (ICER)] would be €12,969 and €21,385 in REGICOR and Framingham cohorts, respectively. The threshold price of Cardio inCode® to reach the ICER threshold generally accepted in Spain (€30,000/QALY) would range between €668 and €836. 
The greatest benefit occurred in the subgroup of patients with moderate–high risk, with a high-risk reclassification of 22.8 % and 12 % of patients and an ICER of €1,652/QALY and €5,884/QALY in the REGICOR and Framingham cohorts, respectively. Sensitivity analyses confirmed the stability of the study results. Cardio inCode® is a cost-effective risk score option in CHD risk assessment compared with the standard method.
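The headline figures combine through the usual ICER formula, with costs and benefits discounted at the stated 3.5% annual rate. In the sketch below, the incremental cost and QALY values are hypothetical numbers chosen so the ratio reproduces the reported €12,969/QALY; they are not taken from the Markov model itself.

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return delta_cost / delta_qaly

def discounted(value, years, rate=0.035):
    """Present value under the stated 3.5% annual discount rate."""
    return value / (1 + rate) ** years

# hypothetical increments: 648.45 euros more total cost, 0.05 more QALYs
regicor_icer = icer(648.45, 0.05)
```

Comparing such a ratio against the €30,000/QALY threshold accepted in Spain is what determines the threshold price of the test reported in the abstract.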
NASA Astrophysics Data System (ADS)
Iskandar, I.
2018-03-01
The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing the lengths of life in many settings and is available in a simple statistical form; its characteristic feature is a constant hazard rate, and it is the special case of the Weibull distribution with shape parameter one. In this paper we introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian approach and present the analytic methods. The cases are limited to models with independent causes of failure, and a non-informative prior distribution is used in the analysis. We describe the likelihood function, the posterior distribution, and point and interval estimation of the hazard function and reliability. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
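With independent exponential causes, the net and crude probabilities named in the abstract have simple closed forms. A sketch under those assumptions; the Gamma-posterior line is the standard conjugate result for a noninformative prior, stated here as this sketch's assumption rather than the paper's exact derivation.

```python
import math

# Independent exponential competing risks: closed forms for the
# quantities named in the abstract (illustrative rates below).
def net_prob(lam_j, t):
    """Net probability: failure by time t if only cause j were present."""
    return 1 - math.exp(-lam_j * t)

def crude_prob(lams, j, t):
    """Crude probability: failure from cause j by time t with all causes
    present: (lam_j / lam_tot) * (1 - exp(-lam_tot * t))."""
    tot = sum(lams)
    return lams[j] / tot * (1 - math.exp(-tot * t))

def posterior_mean_rate(d_j, total_time):
    """Posterior mean rate for cause j under a noninformative prior,
    taking the posterior as Gamma(d_j, T) (shape, rate)."""
    return d_j / total_time

lams = [0.1, 0.3]   # illustrative cause-specific failure rates
```

Note that the crude probabilities over all causes sum to the overall failure probability, while the net probability for a cause always exceeds its crude counterpart here because the competing cause can no longer pre-empt the failure.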
NASA Astrophysics Data System (ADS)
Al-Akad, S.; Akensous, Y.; Hakdaoui, M.
2017-11-01
This article summarizes the application of remote sensing and GIS to the study of urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (the Landsat and DEM data were atmospherically and radiometrically corrected, and geometric and topographic distortions were rectified) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is obtained by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indexes and a method to estimate the flood-hazard areas. Five factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology, and elevation. The multi-criteria analysis makes it possible to assess vulnerability to flooding and to map the areas of the city of Al Mukalla at risk. The main objective of this research is to provide a simple and rapid method to reduce and manage the risks caused by floods in Yemen, taking the city of Al Mukalla as an example.
Risk Costs for New Dams: Economic Analysis and Effects of Monitoring
NASA Astrophysics Data System (ADS)
Paté-Cornell, M. Elisabeth; Tagaras, George
1986-01-01
This paper presents new developments and illustrations of the introduction of risk and costs in cost-benefit analysis for new dams. The emphasis is on a method of evaluation of the risk costs based on the structure of the local economy. Costs to agricultural property as well as residential, commercial, industrial, and public property are studied in detail. Of particular interest is the case of sequential dam failure and the evaluation of the risk costs attributable to a new dam upstream from an existing one. Three real cases are presented as illustrations of the method: the Auburn Dam, the Dickey-Lincoln School Project, and the Teton Dam, which failed in 1976. This last case provides a calibration tool for the estimation of loss ratios. For these three projects, the risk-modified benefit-cost ratios are computed to assess the effect of the risk on the economic performance of the project. The role of a warning system provided by systematic monitoring of the dam is analyzed: by reducing the risk costs, the warning system attenuates their effect on the benefit-cost ratio. The precursors, however, can be missed or misinterpreted: monitoring does not guarantee that the risks to human life can be reduced to zero. This study shows, in particular, that it is critical to consider the risk costs in the decision to build a new dam when the flood area is large and densely populated.
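The risk-modified benefit-cost ratio folds expected failure losses into the denominator, and a warning system enters as a partial (never total) reduction of the life-loss component. All figures in this sketch are hypothetical, not drawn from the Auburn, Dickey-Lincoln, or Teton analyses.

```python
# Risk-modified benefit-cost ratio (hypothetical annualized figures):
# expected risk cost = failure probability * conditional losses, with a
# warning system reducing the life-loss term by its effectiveness.
def bc_ratio(benefits, costs, p_fail, property_loss, life_loss, warn_eff=0.0):
    risk_cost = p_fail * (property_loss + (1 - warn_eff) * life_loss)
    return benefits / (costs + risk_cost)

base = bc_ratio(120.0, 100.0, 1e-4, 5000.0, 20000.0)                 # no monitoring
with_warning = bc_ratio(120.0, 100.0, 1e-4, 5000.0, 20000.0,
                        warn_eff=0.8)                                # 80% effective warning
```

The comparison shows the paper's qualitative point: monitoring raises the ratio back toward its risk-free value but cannot reach it, since precursors can be missed or misinterpreted.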
Robey, Thomas E; Edwards, Kelly; Murphy, Mary K
2014-02-01
This qualitative study aimed to characterize the barriers to informed discussions between patients and emergency physicians (EPs) about radiation risk from computed tomography (CT) and to identify future interventions to improve patient understanding of CT radiation risk. This study used a focus group approach to collect concepts about radiation risk exposure from a national sample of EPs and a local sample of emergency department (ED) patients. A directed content analysis used an a priori medical ethics framework to explore themes from the focus groups while a subsequent normative ethics analysis compared these results with existing perceptions about discussing CT radiation risk. Focus groups (three each for a total of 19 EPs and 27 patients) identified concepts consistent with core medical ethics principles: patients emphasized autonomy and nonmaleficence more than physicians, while physicians emphasized beneficence. Subjects' knowledge of radiation dose and risk were equivalent to previously published reports. When asked about whether they should talk about radiation with patients, 74% of EPs reported that radiation exposure should be discussed, but the study EPs self-reported doing so with only an average of 24% of patients. Patients reported wanting to hear about radiation from their physicians the next time they need CT scans and thought that a written handout would work better than any other method. When presented with options for how to discuss risk with patients, EPs reported needing easy access to risk information and preferred discussion over other communications approaches, but had mixed support of distributing patient handouts. The normative view that radiation from diagnostic CT should be discussed in the ED is shared by patients and physicians, but is challenged by the lack of a structured method to communicate CT radiation risk to ED patients. 
Our analysis identifies promising interest among physicians and patients to use information guides and electronic order prompts as potential informational tools to overcome this barrier. © 2014 by the Society for Academic Emergency Medicine.
Syazwan, AI; Rafee, B Mohd; Juahir, Hafizan; Azman, AZF; Nizar, AM; Izwyn, Z; Syahidatussyakirah, K; Muhaimin, AA; Yunos, MA Syafiq; Anita, AR; Hanafiah, J Muhamad; Shaharuddin, MS; Ibthisham, A Mohd; Hasmadi, I Mohd; Azhar, MN Mohamad; Azizan, HS; Zulfadhli, I; Othman, J; Rozalini, M; Kamarul, FT
2012-01-01
Purpose To analyze and characterize a multidisciplinary, integrated indoor air quality checklist for evaluating the health risk of building occupants in a nonindustrial workplace setting. Design A cross-sectional study based on a participatory occupational health program conducted by the National Institute of Occupational Safety and Health (Malaysia) and Universiti Putra Malaysia. Method A modified version of the indoor environmental checklist published by the Department of Occupational Health and Safety, based on the literature and discussion with occupational health and safety professionals, was used in the evaluation process. Summated scores were given according to the cluster analysis and principal component analysis in the characterization of risk. Environmetric techniques were used to classify the risk of variables in the checklist. Possible sources of the item pollutants were also evaluated using a semiquantitative approach. Result Hierarchical agglomerative cluster analysis resulted in the grouping of factorial components into three clusters (high complaint, moderate-high complaint, moderate complaint), which were further analyzed by discriminant analysis. From this, 15 major variables that influence indoor air quality were determined. Principal component analysis of each cluster revealed that the main factors influencing the high complaint group were fungal-related problems, chemical indoor dispersion, detergent, renovation, thermal comfort, and location of fresh air intake. The moderate-high complaint group showed significantly high loading on ventilation, air filters, and smoking-related activities. The moderate complaint group showed high loading on dampness, odor, and thermal comfort. Conclusion This semiquantitative assessment, which graded risk from low to high based on the intensity of the problem, shows promising and reliable results.
It should be used as an important tool in the preliminary assessment of indoor air quality and as a categorizing method for further IAQ investigations and complaints procedures. PMID:23055779
76 FR 28102 - Notice of Issuance of Regulatory Guide
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-13
..., Probabilistic Risk Assessment Branch, Division of Risk Analysis, Office of Nuclear Regulatory Research, U.S... approaches and methods (whether quantitative or qualitative, deterministic or probabilistic), data, and... uses in evaluating specific problems or postulated accidents, and data that the staff needs in its...
Private participation in infrastructure: A risk analysis of long-term contracts in power sector
NASA Astrophysics Data System (ADS)
Ceran, Nisangul
The objective of this dissertation is to assess whether private participation in the energy sector through long-term contracting, such as Build-Operate-Transfer (BOT) investments, is an efficient way of promoting efficiency in the economy. To this end, the theoretical literature on the issue is discussed, the experience of several developing countries is examined, and a BOT project undertaken by the Enron company in Turkey is studied in depth as a case study. Different risk analysis techniques, including sensitivity analysis and probabilistic risk analysis with the Monte Carlo simulation (MCS) method, have been applied to assess the financial feasibility and risks of the case-study project and to shed light on the level of rent-seeking in BOT agreements. Although data on rent-seeking and corruption are difficult to obtain, the analysis of the case-study investment using the sensitivity and MCS methods provided some information that can be used in assessing the level of rent-seeking in BOT projects. The risk analysis made it possible to test the sustainability of long-term BOT contracts through analysis of the project's financial feasibility with and without government guarantees. Testing the sustainability of the project under different scenarios helps in understanding the potential costs and contingent liabilities for the government and the project's impact on a country's overall economy. The results of the risk analysis made by the MCS method for the case-study BOT project strongly suggest that the project does not serve the interests of society and transfers a substantial amount of public money to private companies, implying severe governance problems.
It is found that not only the government but also the private sector may be reluctant about full privatization of infrastructure due to several factors, such as the involvement of large sunk costs, the very long time period before returns are received, political and macroeconomic uncertainties, and an insufficient institutional and regulatory environment. It is concluded that BOT-type infrastructure projects are not an efficient way of promoting private sector participation in infrastructure. They tend to serve the interest of rent-seekers rather than the interest of society. Since concession contracts and Treasury guarantees shift the commercial risk to the government, the private sector has no incentive to be efficient. The concession agreements distort market conditions by preventing free competition in the market.
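The probabilistic financial-feasibility test described in this abstract can be sketched as a simple Monte Carlo net-present-value simulation. All figures below (capacity cost, revenue ranges, discount-rate range, 20-year concession) are hypothetical stand-ins, not values from the dissertation:

```python
import random

def simulate_npv(n_trials=10_000, seed=42):
    """Monte Carlo sketch of a BOT project's NPV under uncertain inputs.

    All parameter ranges are illustrative assumptions, not data from
    the Enron case study.
    """
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_trials):
        discount = rng.uniform(0.08, 0.12)      # uncertain discount rate
        capex = rng.uniform(800, 1200)          # construction cost, $M
        annual_revenue = rng.uniform(180, 260)  # contracted offtake, $M/yr
        annual_opex = rng.uniform(60, 100)      # operating cost, $M/yr
        npv = -capex + sum(
            (annual_revenue - annual_opex) / (1 + discount) ** t
            for t in range(1, 21)               # 20-year concession
        )
        npvs.append(npv)
    npvs.sort()
    return {
        "mean": sum(npvs) / n_trials,
        "p_loss": sum(v < 0 for v in npvs) / n_trials,  # P(NPV < 0)
        "p5": npvs[int(0.05 * n_trials)],               # 5th percentile
    }
```

Re-running the simulation with and without a guaranteed revenue floor is one way to expose how much risk a government guarantee transfers away from the private party.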
ERIC Educational Resources Information Center
Gibson-Young, Linda; Martinasek, Mary P.; Clutter, Michiko; Forrest, Jamie
2014-01-01
Background: Adolescents with asthma are at risk for psychological and behavioral problems. The aim of this study was to determine whether high school students with asthma are at increased risk for bullying in school and cyberspace, and to explore the role of depressive symptoms in moderating this association. Methods: A secondary data analysis was…
Takeuchi, Yoshinori; Shinozaki, Tomohiro; Matsuyama, Yutaka
2018-01-08
Despite the frequent use of self-controlled methods in pharmacoepidemiological studies, the factors that may bias the estimates from these methods have not been adequately compared in real-world settings. Here, we comparatively examined the impact of a time-varying confounder and its interactions with time-invariant confounders, time trends in exposures and events, restrictions, and misspecification of risk period durations on the estimators from three self-controlled methods. This study analyzed self-controlled case series (SCCS), case-crossover (CCO) design, and sequence symmetry analysis (SSA) using simulated and actual electronic medical records datasets. We evaluated the performance of the three self-controlled methods in simulated cohorts for the following scenarios: 1) time-invariant confounding with interactions between the confounders, 2) time-invariant and time-varying confounding without interactions, 3) time-invariant and time-varying confounding with interactions among the confounders, 4) time trends in exposures and events, 5) restricted follow-up time based on event occurrence, and 6) patient restriction based on event history. The sensitivity of the estimators to misspecified risk period durations was also evaluated. As a case study, we applied these methods to evaluate the risk of macrolides on liver injury using electronic medical records. In the simulation analysis, time-varying confounding produced bias in the SCCS and CCO design estimates, which was aggravated in the presence of interactions between the time-invariant and time-varying confounders. The SCCS estimates were biased by time trends in both exposures and events. Erroneously short risk periods introduced bias to the CCO design estimate, whereas erroneously long risk periods introduced bias to the estimates of all three methods. Restricting the follow-up time led to severe bias in the SSA estimates. The SCCS estimates were sensitive to patient restriction.
The case study showed that although macrolide use was significantly associated with increased liver injury occurrence in all methods, the value of the estimates varied. The estimations of the three self-controlled methods depended on various underlying assumptions, and the violation of these assumptions may cause non-negligible bias in the resulting estimates. Pharmacoepidemiologists should select the appropriate self-controlled method based on how well the relevant key assumptions are satisfied with respect to the available data.
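The core SCCS comparison of event rates within each person can be illustrated with a crude pooled incidence-rate-ratio calculation. This is a didactic sketch only: a real SCCS analysis fits a conditional Poisson model with age and season adjustment, and the input structure here is a hypothetical one:

```python
def sccs_irr(cases):
    """Crude self-controlled case series estimate: the incidence rate
    ratio of events in the exposed (risk) period vs. the unexposed
    (baseline) period, pooled over cases.

    `cases` is a list of dicts with event counts and person-time for
    each period (an assumed, illustrative data layout).
    """
    events_risk = sum(c["events_risk"] for c in cases)
    events_base = sum(c["events_base"] for c in cases)
    time_risk = sum(c["time_risk"] for c in cases)
    time_base = sum(c["time_base"] for c in cases)
    rate_risk = events_risk / time_risk   # events per unit person-time
    rate_base = events_base / time_base
    return rate_risk / rate_base
```

Because each person serves as their own control, time-invariant confounders cancel out of this ratio; the biases examined in the abstract arise precisely when confounders vary over time.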
Recent developments of the NESSUS probabilistic structural analysis computer program
NASA Technical Reports Server (NTRS)
Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.
1992-01-01
The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
Spatial analysis on human brucellosis incidence in mainland China: 2004–2010
Zhang, Junhui; Yin, Fei; Zhang, Tao; Yang, Chao; Zhang, Xingyu; Feng, Zijian; Li, Xiaosong
2014-01-01
Objectives China has experienced a sharply increasing rate of human brucellosis in recent years. Effective spatial monitoring of human brucellosis incidence is very important for successful implementation of control and prevention programmes. The purpose of this paper is to apply exploratory spatial data analysis (ESDA) methods and the empirical Bayes (EB) smoothing technique to monitor county-level incidence rates for human brucellosis in mainland China from 2004 to 2010 by examining spatial patterns. Methods ESDA methods were used to characterise spatial patterns of EB smoothed incidence rates for human brucellosis based on county-level data obtained from the China Information System for Disease Control and Prevention (CISDCP) in mainland China from 2004 to 2010. Results EB smoothed incidence rates for human brucellosis were spatially dependent during 2004–2010. The local Moran test identified significantly high-risk clusters of human brucellosis (all p values <0.01), which persisted during the 7-year study period. High-risk counties were centred in the Inner Mongolia Autonomous Region and other Northern provinces (ie, Hebei, Shanxi, Jilin and Heilongjiang provinces) around the border with the Inner Mongolia Autonomous Region where animal husbandry was highly developed. The number of high-risk counties increased from 25 in 2004 to 54 in 2010. Conclusions ESDA methods and the EB smoothing technique can assist public health officials in identifying high-risk areas. Allocating more resources to high-risk areas is an effective way to reduce human brucellosis incidence. PMID:24713215
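The EB smoothing step applied to the county rates can be sketched with the global (Marshall-style) empirical Bayes estimator, which shrinks raw county rates toward the overall mean, with small-population counties shrunk most. This is a sketch of the general technique, not the authors' code:

```python
def eb_smooth(events, pop):
    """Global empirical Bayes smoothing of area incidence rates.

    `events` and `pop` are per-area case counts and populations
    (illustrative inputs). Returns shrunken rates.
    """
    n = len(events)
    total_e, total_p = sum(events), sum(pop)
    m = total_e / total_p                        # overall mean rate
    raw = [e / p for e, p in zip(events, pop)]
    # weighted between-area variance, minus the Poisson noise component
    var = sum(p * (r - m) ** 2 for r, p in zip(raw, pop)) / total_p
    a = max(var - m / (total_p / n), 0.0)        # prior (true-rate) variance
    smoothed = []
    for r, p in zip(raw, pop):
        w = a / (a + m / p)                      # shrinkage weight in [0, 1)
        smoothed.append(w * r + (1 - w) * m)
    return smoothed
```

Smoothed rates like these are then the inputs to the local Moran cluster tests the abstract describes; libraries such as PySAL provide production implementations of both steps.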
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong
2017-07-11
To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address complexities of industrial water resources management (IWRM) systems, (b) facilitate reflections of multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) manage robust actions for industrial productions in consideration of water supply capacity and wastewater discharging control. The developed method was then demonstrated in a water-stressed city (i.e., the City of Dalian), northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-used steel ships and machine-made paper & paperboard would decrease significantly, (b) violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent, compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than the benefit under scenario 3, and (d) the production of rolling contact bearing, rail vehicles, and commercial vehicles would be promoted.
Linear regression analysis: part 14 of a series on evaluation of scientific publications.
Schneider, Astrid; Hommel, Gerhard; Blettner, Maria
2010-11-01
Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
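The univariable model the article introduces can be made concrete with a minimal least-squares fit. This sketch computes the intercept, slope, and coefficient of determination from scratch; it is an illustration of the method reviewed, not code from the article:

```python
def ols_fit(x, y):
    """Simple (univariable) linear regression by ordinary least squares.

    Returns (intercept, slope, R^2) for the model y = a + b*x.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((yi - (intercept + slope * xi)) ** 2
                 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return intercept, slope, 1 - ss_res / ss_tot  # R^2
```

A risk score in the multivariable setting generalizes this to several predictors, but the interpretation pitfalls the article discusses (confounding, extrapolation, over-interpreting R^2) already apply in this one-predictor case.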
Improving the use of crop models for risk assessment and climate change adaptation.
Challinor, Andrew J; Müller, Christoph; Asseng, Senthold; Deva, Chetan; Nicklin, Kathryn Jane; Wallach, Daniel; Vanuytrecht, Eline; Whitfield, Stephen; Ramirez-Villegas, Julian; Koehler, Ann-Kristin
2018-01-01
Crop models are used for an increasingly broad range of applications, with a commensurate proliferation of methods. Careful framing of research questions and development of targeted and appropriate methods are therefore increasingly important. In conjunction with the other authors in this special issue, we have developed a set of criteria for use of crop models in assessments of impacts, adaptation and risk. Our analysis drew on the other papers in this special issue, and on our experience in the UK Climate Change Risk Assessment 2017 and the MACSUR, AgMIP and ISIMIP projects. The criteria were used to assess how improvements could be made to the framing of climate change risks, and to outline the good practice and new developments that are needed to improve risk assessment. Key areas of good practice include: i. the development, running and documentation of crop models, with attention given to issues of spatial scale and complexity; ii. the methods used to form crop-climate ensembles, which can be based on model skill and/or spread; iii. the methods used to assess adaptation, which need broadening to account for technological development and to reflect the full range of options available. The analysis highlights the limitations of focussing only on projections of future impacts and adaptation options using pre-determined time slices. Whilst this long-standing approach may remain an essential component of risk assessments, we identify three further key components: 1. working with stakeholders to identify the timing of risks: what are the key vulnerabilities of food systems and what does crop-climate modelling tell us about when those systems are at risk? 2. use of multiple methods that critically assess the use of climate model output and avoid any presumption that analyses should begin and end with gridded output. 3. increasing transparency and inter-comparability in risk assessments.
Whilst studies frequently produce ranges that quantify uncertainty, the assumptions underlying these ranges are not always clear. We suggest that the contingency of results upon assumptions is made explicit via a common uncertainty reporting format; and/or that studies are assessed against a set of criteria, such as those presented in this paper.
Kawakubo, Kazumichi; Kawakami, Hiroshi; Toyokawa, Yoshihide; Otani, Koichi; Kuwatani, Masaki; Abe, Yoko; Kawahata, Shuhei; Kubo, Kimitoshi; Kubota, Yoshimasa; Sakamoto, Naoya
2015-01-01
Endoscopic double self-expandable metallic stent (SEMS) placement by the partial stent-in-stent (PSIS) method has been reported to be useful for the management of unresectable hilar malignant biliary obstruction. However, it is technically challenging, and the optimal SEMS for the procedure remains unknown. The aim of this study was to identify the risk factors for technical failure of endoscopic double SEMS placement for unresectable malignant hilar biliary obstruction (MHBO). Between December 2009 and May 2013, 50 consecutive patients with MHBO underwent endoscopic double SEMS placement by the PSIS method. We retrospectively evaluated the rate of successful double SEMS placement and identified the risk factors for technical failure. The technical success rate for double SEMS placement was 82.0% (95% confidence interval [CI]: 69.2-90.2). On univariate analysis, the rate of technical failure was high in patients with metastatic disease and unilateral placement. Multivariate analysis revealed that metastatic disease was a significant risk factor for technical failure (odds ratio: 9.63, 95% CI: 1.11-105.5). The subgroup analysis after double guidewire insertion showed that the rate of technical success was higher in the laser-cut type SEMS with a large mesh and thick delivery system than in the braided type SEMS with a small mesh and thick delivery system. Metastatic disease was a significant risk factor for technical failure of double SEMS placement for unresectable MHBO. The laser-cut type SEMS with a large mesh and thin delivery system might be preferable for the PSIS procedure. © 2014 Japanese Society of Hepato-Biliary-Pancreatic Surgery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheong, S-K; Kim, J
Purpose: The aim of the study is the application of Failure Modes and Effects Analysis (FMEA) to assess the risks for patients undergoing Low Dose Rate (LDR) prostate brachytherapy treatment. Methods: FMEA was applied to identify all the sub-processes involved in the stages of patient identification, source handling, treatment preparation, treatment delivery, and post-treatment. These processes characterize the radiation treatment associated with LDR prostate brachytherapy. The potential failure modes, together with their causes and effects, were identified and ranked in order of their importance. Three indexes were assigned to each failure mode: the occurrence rating (O), the severity rating (S), and the detection rating (D). A ten-point scale was used to score each category, with ten indicating the most frequent, most severe, and least detectable failure modes, respectively. The risk priority number (RPN) was calculated as the product of the three attributes: RPN = O × S × D. The analysis was carried out by a working group (WG) at UPMC. Results: A total of 56 failure modes were identified, including 32 modes before the treatment, 13 modes during the treatment, and 11 modes after the treatment. In addition to the protocols already adopted in clinical practice, prioritized risk management will be implemented for the high-risk procedures on the basis of RPN score. Conclusion: The effectiveness of the FMEA method was established. The FMEA methodology provides a structured and detailed assessment method for the risk analysis of the LDR prostate brachytherapy procedure and can be applied to other radiation treatment modes.
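The RPN scoring and ranking step is mechanically simple and can be sketched directly. The failure-mode names and ratings below are hypothetical examples, not entries from the study's 56 identified modes:

```python
def rank_failure_modes(modes):
    """Rank FMEA failure modes by risk priority number RPN = O * S * D.

    Each mode is a dict with occurrence (O), severity (S) and
    detection (D) ratings on a 1-10 scale, as in the abstract.
    Returns the modes sorted from highest to lowest RPN.
    """
    scored = [{**m, "rpn": m["O"] * m["S"] * m["D"]} for m in modes]
    return sorted(scored, key=lambda m: m["rpn"], reverse=True)
```

In practice a working group would set an RPN threshold above which a sub-process gets additional controls, which is the prioritized risk management the abstract refers to.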
Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing
2017-03-01
Product quality relies not only on testing methods, but also on design and development, production control, and logistics management across all aspects of product manufacturing. Quality comes from the level of process control. It is therefore very important to accurately identify the factors that may induce quality risk in the production process and to take corresponding quality control measures. This article systematically analyzes the sources of quality risk in all aspects of the production process of traditional Chinese medicine preparations, discusses ways and methods of quality risk identification for traditional Chinese medicine preparations, and provides references for perfecting whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Smallwood, Jeremy; Swenson, David E.
2011-06-01
Evaluation of electrostatic performance of footwear and flooring in combination is necessary in applications such as electrostatic discharge (ESD) control in electronics manufacture, evaluation of equipment for avoidance of factory process electrostatic ignition risks, and avoidance of electrostatic shocks to personnel in working environments. Typical standards use a walking test in which the voltage produced on a subject is evaluated by identification and measurement of the magnitude of the 5 highest "peaks" and "valleys" of the recorded voltage waveform. This method does not lend itself to effective analysis of the risk that the voltage will exceed a hazard threshold. This paper shows the advantages of voltage probability analysis and recommends that the method be adopted for use in future standards.
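The basic idea of voltage probability analysis, as opposed to reporting only the 5 highest peaks, can be sketched as an exceedance-probability estimate over the whole sampled waveform. This is an illustration of the general concept, not the paper's exact estimator:

```python
def exceedance_probability(voltages, threshold):
    """Fraction of sampled body-voltage magnitudes exceeding a hazard
    threshold, estimated over the entire walking-test record.

    `voltages` is the sampled waveform in volts; `threshold` is the
    hazard level of interest (both illustrative inputs).
    """
    exceed = sum(abs(v) > threshold for v in voltages)
    return exceed / len(voltages)
```

Evaluating this at several candidate thresholds yields an empirical complementary distribution of the body voltage, which supports risk statements such as "the voltage exceeds 500 V for x% of the walking cycle".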
Problems With Risk Reclassification Methods for Evaluating Prediction Models
Pepe, Margaret S.
2011-01-01
For comparing the performance of a baseline risk prediction model with one that includes an additional predictor, a risk reclassification analysis strategy has been proposed. The first step is to cross-classify risks calculated according to the 2 models for all study subjects. Summary measures including the percentage of reclassification and the percentage of correct reclassification are calculated, along with 2 reclassification calibration statistics. The author shows that interpretations of the proposed summary measures and P values are problematic. The author's recommendation is to display the reclassification table, because it shows interesting information, but to use alternative methods for summarizing and comparing model performance. The Net Reclassification Index has been suggested as one alternative method. The author argues for reporting components of the Net Reclassification Index because they are more clinically relevant than is the single numerical summary measure. PMID:21555714
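The event and non-event components of the Net Reclassification Index that the author recommends reporting separately can be computed as follows. This sketch uses the common "any upward or downward movement in predicted risk" definition; the risk vectors and outcome flags are illustrative inputs:

```python
def nri_components(risk_old, risk_new, events):
    """Event and non-event components of the Net Reclassification Index.

    `risk_old` and `risk_new` are predicted probabilities from the
    baseline and expanded models; `events` flags observed outcomes.
    NRI_events rewards upward movement for cases; NRI_nonevents
    rewards downward movement for non-cases.
    """
    up_e = down_e = up_ne = down_ne = n_e = n_ne = 0
    for old, new, ev in zip(risk_old, risk_new, events):
        if ev:
            n_e += 1
            up_e += new > old
            down_e += new < old
        else:
            n_ne += 1
            up_ne += new > old
            down_ne += new < old
    nri_events = (up_e - down_e) / n_e
    nri_nonevents = (down_ne - up_ne) / n_ne
    return nri_events, nri_nonevents
```

Reporting the two components side by side, as the author argues, makes clear whether an apparent overall improvement comes from better classification of cases, of non-cases, or both.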
NASA Astrophysics Data System (ADS)
Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter
2010-01-01
The paper deals with the analysis of the theory of corporate social responsibility, risk management, and the exact method of the analytic hierarchy process, which is used in decision-making processes. Chapters 2 and 3 focus on presenting experience with the application of the method in formulating stakeholders' strategic goals within Corporate Social Responsibility (CSR), and simultaneously its utilization in minimizing environmental risks. The major benefit of this paper is the application of the Analytic Hierarchy Process (AHP).
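The core AHP computation, deriving priority weights from a pairwise comparison matrix, can be sketched with the row geometric mean method, a standard approximation to the principal-eigenvector solution. The matrix below is a made-up example; the paper's actual comparison matrices are not reproduced here:

```python
import math

def ahp_priorities(pairwise):
    """Approximate AHP priority weights from a reciprocal pairwise
    comparison matrix using the row geometric mean method.

    pairwise[i][j] states how much more important criterion i is
    than criterion j (Saaty's 1-9 scale in practice).
    """
    n = len(pairwise)
    gm = [math.prod(row) ** (1 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]  # normalized priority vector
```

A full AHP study would also compute a consistency ratio to check that the expert judgments in the matrix do not contradict each other.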
Malicki, Julian; Bly, Ritva; Bulot, Mireille; Godet, Jean-Luc; Jahnen, Andreas; Krengli, Marco; Maingon, Philippe; Prieto Martin, Carlos; Przybylska, Kamila; Skrobała, Agnieszka; Valero, Marc; Jarvinen, Hannu
2017-04-01
To describe the current status of implementation of European directives for risk management in radiotherapy and to assess variability in risk management in the following areas: 1) in-country regulatory framework; 2) proactive risk assessment; 3) reactive analysis of events; and 4) reporting and learning systems. The original data were collected as part of the ACCIRAD project through two online surveys. Risk assessment criteria are closely associated with quality assurance programs. Only 9/32 responding countries (28%) with national regulations reported clear "requirements" for proactive risk assessment and/or reactive risk analysis, with wide variability in assessment methods. Reporting of adverse error events is mandatory in most (70%) but not all surveyed countries. Most European countries have taken steps to implement European directives designed to reduce the probability and magnitude of accidents in radiotherapy. Variability between countries is substantial in terms of legal frameworks, tools used to conduct proactive risk assessment and reactive analysis of events, and in the reporting and learning systems utilized. These findings underscore the need for greater harmonisation in common terminology, classification and reporting practices across Europe to improve patient safety and to enable more reliable inter-country comparisons. Copyright © 2017 Elsevier B.V. All rights reserved.
Decision Support Methods and Tools
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.
2006-01-01
This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at the NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.
2015-06-18
Engineering Effectiveness Survey. CMU/SEI-2012-SR-009. Carnegie Mellon University. November 2012. Field, Andy. Discovering Statistics Using SPSS, 3rd... enough into the survey to begin answering questions on risk practices. All of the data statistical analysis will be performed using SPSS. Prior to... probabilistically using distributions for likelihood and impact. Statistical methods like Monte Carlo can more comprehensively evaluate the cost and
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practices, but currently it is still the industry standard to use deterministic safety margin approaches to dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
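The recommended Monte Carlo-on-load-resistance step can be sketched as follows: failure occurs whenever the sampled resistance (strength) falls below the sampled load (stress). The normal distributions and their parameters are illustrative assumptions, not values from the paper:

```python
import random

def failure_probability(n=100_000, seed=1):
    """Monte Carlo estimate of P(failure) = P(resistance < load)
    for a load-resistance interference model.

    Both distributions are hypothetical normals chosen for
    illustration only.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        load = rng.gauss(100.0, 15.0)        # applied stress
        resistance = rng.gauss(160.0, 20.0)  # component strength
        failures += resistance < load
    return failures / n
```

For two independent normals, the exact answer here is Phi(-(160 - 100) / sqrt(15^2 + 20^2)) = Phi(-2.4), roughly 0.8%, which the simulation should approach; the Monte Carlo formulation is what generalizes to the non-normal, correlated cases that matter in practice.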
Schwandt, Hilary M; Skinner, Joanna; Hebert, Luciana E; Saad, Abdulmumin
2015-12-01
Research shows that side effects are often the most common reason for contraceptive non-use in Nigeria; however, research to date has not explored the underlying factors that influence risk and benefit perceptions associated with specific contraceptive methods in Nigeria. A qualitative study design using focus group discussions was used to explore social attitudes and beliefs about family planning methods in Ibadan and Kaduna, Nigeria. A total of 26 focus group discussions were held in 2010 with men and women of reproductive age, disaggregated by city, sex, age, marital status, neighborhood socioeconomic status, and--for women only--family planning experience. A discussion guide was used that included specific questions about the perceived risks and benefits associated with the use of six different family planning methods. A thematic content analytic approach guided the analysis. Participants identified a spectrum of risks encompassing perceived threats to health (both real and fictitious) and social concerns, as well as benefits associated with each method. By exploring Nigerian perspectives on the risks and benefits associated with specific family planning methods, programs aiming to increase contraceptive use in Nigeria can be better equipped to highlight recognized benefits, address specific concerns, and work to dispel misperceptions associated with each family planning method.
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
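Once minimal cut sets are in hand, a standard quantification step bounds the network failure probability by the sum of cut-set probabilities (the rare-event approximation, which avoids the combinatorial expansion the patent eliminates). This sketch covers only that quantification step, with made-up link names and failure probabilities; the cut-set search algorithm itself is not reproduced:

```python
def network_failure_bound(cut_sets, link_fail_prob):
    """Rare-event upper bound on network failure probability from
    minimal cut sets.

    A cut set fails only when every link in it fails (links assumed
    independent); the union of cut-set failures is bounded above by
    the sum of their probabilities.
    """
    total = 0.0
    for cut in cut_sets:
        p_cut = 1.0
        for link in cut:
            p_cut *= link_fail_prob[link]
        total += p_cut
    return min(total, 1.0)  # probability cannot exceed 1
```

The bound is tight when link failure probabilities are small, which is the regime of interest for highly reliable networks like the public switched telephone network.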
NASA Astrophysics Data System (ADS)
Gordon, K.; Houser, T.; Kopp, R. E., III; Hsiang, S. M.; Larsen, K.; Jina, A.; Delgado, M.; Muir-Wood, R.; Rasmussen, D.; Rising, J.; Mastrandrea, M.; Wilson, P. S.
2014-12-01
The United States faces a range of economic risks from global climate change - from increased flooding and storm damage, to climate-driven changes in crop yields and labor productivity, to heat-related strains on energy and public health systems. The Risky Business Project commissioned a groundbreaking new analysis of these and other climate risks by region of the country and sector of the economy. Linking state-of-the-art climate models with econometric research on human responses to climate variability and cutting-edge private sector risk assessment tools, the American Climate Prospectus (ACP) offers decision-makers a data-driven assessment of the specific risks they face. We describe the challenge, methods, findings, and policy implications of the national risk analysis, with particular focus on methodological innovations and novel insights.
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety. © 2016 Society for Risk Analysis.
Chen, Keping; Blong, Russell; Jacobson, Carol
2003-04-01
This paper develops a GIS-based integrated approach to risk assessment in natural hazards, with reference to bushfires. The challenges for undertaking this approach have three components: data integration, risk assessment tasks, and risk decision-making. First, data integration in GIS is a fundamental step for subsequent risk assessment tasks and risk decision-making. A series of spatial data integration issues within GIS such as geographical scales and data models are addressed. Particularly, the integration of both physical environmental data and socioeconomic data is examined with an example linking remotely sensed data and areal census data in GIS. Second, specific risk assessment tasks, such as hazard behavior simulation and vulnerability assessment, should be undertaken in order to understand complex hazard risks and provide support for risk decision-making. For risk assessment tasks involving heterogeneous data sources, the selection of spatial analysis units is important. Third, risk decision-making concerns spatial preferences and/or patterns, and a multicriteria evaluation (MCE)-GIS typology for risk decision-making is presented that incorporates three perspectives: spatial data types, data models, and methods development. Both conventional MCE methods and artificial intelligence-based methods with GIS are identified to facilitate spatial risk decision-making in a rational and interpretable way. Finally, the paper concludes that the integrated approach can be used to assist risk management of natural hazards, in theory and in practice.
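The simplest conventional MCE operator mentioned for risk decision-making is a weighted linear combination of standardized criterion layers, where each cell's risk score is the weight-sum of its criterion scores. This sketch represents raster layers as flat score lists; the criteria and weights are hypothetical:

```python
def weighted_overlay(criteria, weights):
    """Weighted linear combination over co-registered criterion layers.

    `criteria` is a list of equal-length lists of standardized scores
    in [0, 1] (e.g., fuel load, slope, proximity to assets);
    `weights` are the decision-maker's criterion weights.
    Returns the per-cell composite risk score.
    """
    return [
        sum(w * layer[i] for w, layer in zip(weights, criteria))
        for i in range(len(criteria[0]))
    ]
```

In a GIS workflow each list would be a raster band on a common grid, and the output raster would feed the spatial-preference and pattern analysis the paper describes.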
Nevo, Daniel; Nishihara, Reiko; Ogino, Shuji; Wang, Molin
2017-08-04
In the analysis of time-to-event data with multiple causes using a competing risks Cox model, often the cause of failure is unknown for some of the cases. The probability of a missing cause is typically assumed to be independent of the cause given the time of the event and covariates measured before the event occurred. In practice, however, the underlying missing-at-random assumption does not necessarily hold. Motivated by colorectal cancer molecular pathological epidemiology analysis, we develop a method to conduct valid analysis when additional auxiliary variables are available for cases only. We consider a weaker missing-at-random assumption, with missing pattern depending on the observed quantities, which include the auxiliary covariates. We use an informative likelihood approach that will yield consistent estimates even when the underlying model for missing cause of failure is misspecified. The superiority of our method over naive methods in finite samples is demonstrated by simulation study results. We illustrate the use of our method in an analysis of colorectal cancer data from the Nurses' Health Study cohort, where, apparently, the traditional missing-at-random assumption fails to hold.
Cui, Xueliang; Chen, Hui; Rui, Yunfeng; Niu, Yang; Li, He
2018-01-01
Objectives Two-stage open reduction and internal fixation (ORIF) and limited internal fixation combined with external fixation (LIFEF) are two widely used methods to treat Pilon injury. However, which method is superior to the other remains controversial. This meta-analysis was performed to quantitatively compare two-stage ORIF and LIFEF and clarify which method is better with respect to postoperative complications in the treatment of tibial Pilon fractures. Methods We conducted a meta-analysis to quantitatively compare the postoperative complications between two-stage ORIF and LIFEF. Eight studies involving 360 fractures in 359 patients were included in the meta-analysis. Results The two-stage ORIF group had a significantly lower risk of superficial infection, nonunion, and bone healing problems than the LIFEF group. However, no significant differences in deep infection, delayed union, malunion, arthritis symptoms, or chronic osteomyelitis were found between the two groups. Conclusion Two-stage ORIF was associated with a lower risk of postoperative complications with respect to superficial infection, nonunion, and bone healing problems than LIFEF for tibial Pilon fractures. Level of evidence 2.
Coffee consumption and risk of fractures: a meta-analysis
Liu, Huifang; Yao, Ke; Zhang, Wenjie; Zhou, Jun; Wu, Taixiang
2012-01-01
Introduction Recent studies have indicated a higher risk of fractures among coffee drinkers. To quantitatively assess the association between coffee consumption and the risk of fractures, we conducted this meta-analysis. Material and methods We searched MEDLINE and EMBASE for prospective studies reporting the risk of fractures with coffee consumption. Quality of included studies was assessed with the Newcastle-Ottawa scale. We conducted a meta-analysis and a cumulative meta-analysis of relative risk (RR) for an increment of one cup of coffee per day, and explored the potential dose-response relationship. Sensitivity analysis was performed where statistical heterogeneity existed. Results We included 10 prospective studies covering 214,059 participants and 9,597 cases. There was an overall 3.5% higher fracture risk for an increment of one cup of coffee per day (RR = 1.035, 95% CI: 1.019-1.052). Pooled RRs were 1.049 (95% CI: 1.022-1.077) for women and 0.910 (95% CI: 0.873-0.949) for men. Among women, the RR was 1.055 (95% CI: 0.999-1.114) for younger participants and 1.047 (95% CI: 1.016-1.080) for older ones. Cumulative meta-analysis indicated that risk estimates reached a stabilization level (RR = 1.035, 95% CI: 1.019-1.052), and it revealed a positive dose-response relationship between coffee consumption and risk of fractures both for men and women combined and for women specifically. Conclusions This meta-analysis suggests that coffee intake increases the overall risk of fractures, especially for women. However, current data are insufficient to reach a convincing conclusion, and further research needs to be conducted. PMID:23185185
Nevo, Daniel; Zucker, David M; Tamimi, Rulla M; Wang, Molin
2016-12-30
A common paradigm in dealing with heterogeneity across tumors in cancer analysis is to cluster the tumors into subtypes using marker data on the tumor, and then to analyze each of the clusters separately. A more specific target is to investigate the association between risk factors and specific subtypes and to use the results for personalized preventive treatment. This task is usually carried out in two steps-clustering and risk factor assessment. However, two sources of measurement error arise in these problems. The first is the measurement error in the biomarker values. The second is the misclassification error when assigning observations to clusters. We consider the case with a specified set of relevant markers and propose a unified single-likelihood approach for normally distributed biomarkers. As an alternative, we consider a two-step procedure with the tumor type misclassification error taken into account in the second-step risk factor analysis. We describe our method for binary data and also for survival analysis data using a modified version of the Cox model. We present asymptotic theory for the proposed estimators. Simulation results indicate that our methods significantly lower the bias with a small price being paid in terms of variance. We present an analysis of breast cancer data from the Nurses' Health Study to demonstrate the utility of our method. Copyright © 2016 John Wiley & Sons, Ltd.
Occupational risk assessment in the construction industry in Iran.
Seifi Azad Mard, Hamid Reza; Estiri, Ali; Hadadi, Parinaz; Seifi Azad Mard, Mahshid
2017-12-01
Occupational accidents in the construction industry are more common than in other fields, and these accidents are more severe than the global average in developing countries, especially in Iran. Studies which trace the sources of these accidents and suggest solutions for them are therefore valuable. In this study a combination of the failure mode and effects analysis method and fuzzy theory is used as a semi-qualitative-quantitative method for analyzing risks and failure modes. The main causes of occupational accidents in this field were identified and analyzed based on three factors: severity, detection, and occurrence. Based on whether the risks were high or low priority, modifying actions were suggested to reduce the occupational risks. Finally, the results showed that these actions reduced high-priority risks by 40%.
Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R E
2005-04-01
Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). 
Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities.
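The criticality calculus described in the abstract (likelihood of occurrence, severity of the potential effect, and detection probability combined into a criticality index per failure mode) can be sketched in a few lines. The 1-10 scoring scales and the two hypothetical failure modes below are illustrative assumptions, not values from the study:

```python
# Minimal FMECA criticality sketch. Scales (1-10) and the example failure
# modes are illustrative assumptions, not the study's actual scores.

def criticality_index(occurrence: int, severity: int, detection: int) -> int:
    """Criticality index (CI) for one failure mode: the product of the
    likelihood of occurrence, severity, and detection-probability scores."""
    return occurrence * severity * detection

# Hypothetical scores for two failure modes (e.g. re-transcription error,
# labelling mistake) under the old and the re-engineered process.
old = [criticality_index(9, 8, 7), criticality_index(5, 9, 7)]
new = [criticality_index(2, 8, 1), criticality_index(5, 9, 7)]

reduction = 1 - sum(new) / sum(old)
print(f"total CI old={sum(old)}, new={sum(new)}, reduction={reduction:.0%}")
```

Summing the CIs over all failure modes, as the study does, gives a single number per process whose before/after ratio quantifies the overall risk reduction.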
The Importance of Human Reliability Analysis in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.
2010-01-01
HRA is a method used to describe, qualitatively and quantitatively, the occurrence of human failures in the operation of complex systems that affect availability and reliability. Modeling human actions with their corresponding failures in a PRA (Probabilistic Risk Assessment) provides a more complete picture of the risk and risk contributions. A high quality HRA can provide valuable information on potential areas for improvement, including training, procedures, equipment design, and the need for automation.
Multidimensional Risk Analysis: MRISK
NASA Technical Reports Server (NTRS)
McCollum, Raymond; Brown, Douglas; O'Shea, Sarah Beth; Reith, William; Rabulan, Jennifer; Melrose, Graeme
2015-01-01
Multidimensional Risk (MRISK) calculates the combined multidimensional score using Mahalanobis distance. MRISK accounts for covariance between consequence dimensions, which de-conflicts the interdependencies of consequence dimensions, providing a clearer depiction of risks. Additionally, in the event the dimensions are not correlated, Mahalanobis distance reduces to Euclidean distance normalized by the variance and, therefore, represents the most flexible and optimal method to combine dimensions. MRISK is currently being used in NASA's Environmentally Responsible Aviation (ERA) project to assess risk and prioritize scarce resources.
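A minimal sketch of how such a combined score might be computed is given below. The two consequence dimensions, baseline mean, and covariance matrix are illustrative assumptions, not values from the MRISK tool; the sketch also checks the property stated in the abstract, that with uncorrelated dimensions the score reduces to variance-normalized Euclidean distance:

```python
import numpy as np

def mrisk_score(x, mean, cov):
    """Mahalanobis distance of a risk's consequence vector x from a baseline
    mean, accounting for covariance between consequence dimensions."""
    diff = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

# Two consequence dimensions (e.g. cost, schedule) with an assumed covariance.
mean = [0.0, 0.0]
cov = np.array([[1.0, 0.6],
                [0.6, 2.0]])
x = np.array([2.0, 1.0])
print(mrisk_score(x, mean, cov))

# With the correlations zeroed out, the Mahalanobis score equals Euclidean
# distance after dividing each dimension by its standard deviation.
diag = np.diag(np.diag(cov))
assert np.isclose(mrisk_score(x, mean, diag),
                  np.sqrt(((x / np.sqrt(np.diag(cov))) ** 2).sum()))
```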
A review and critique of some models used in competing risk analysis.
Gail, M
1975-03-01
We have introduced a notation which allows one to define competing risk models easily and to examine underlying assumptions. We have treated the actuarial model for competing risk in detail, comparing it with other models and giving useful variance formulae both for the case when times of death are available and for the case when they are not. The generality of these methods is illustrated by an example treating two dependent competing risks.
A risk analysis for production processes with disposable bioreactors.
Merseburger, Tobias; Pahl, Ina; Müller, Daniel; Tanner, Markus
2014-01-01
Quality management systems are, as a rule, tightly defined systems that conserve existing processes and therefore guarantee compliance with quality standards. But maintaining quality also includes introducing new enhanced production methods and making use of the latest findings of bioscience. The advances in biotechnology and single-use manufacturing methods for producing new drugs especially impose new challenges on quality management, as quality standards have not yet been set. New methods to ensure patient safety have to be established, as it is insufficient to rely only on current rules. A concept of qualification, validation, and manufacturing procedures based on risk management needs to be established and realized in pharmaceutical production. The chapter starts with an introduction to the regulatory background of the manufacture of medicinal products. It then continues with key methods of risk management. Hazards associated with the production of medicinal products with single-use equipment are described with a focus on bioreactors, storage containers, and connecting devices. The hazards are subsequently evaluated and criteria for risk evaluation are presented. This chapter concludes with aspects of industrial application of quality risk management.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Sztoyka, Yulia; Kopachevsky, Ivan; Artemenko, Igor; Yuschenko, Maxim
2017-10-01
A multi-model approach for remote sensing data processing and interpretation is described. The problem of satellite data utilization in a multi-modeling approach for socio-ecological risk assessment is formally defined. A method for utilizing observation, measurement, and modeling data in the framework of the multi-model approach is described. A methodology and models for risk assessment in the framework of a decision support approach are defined and described. A method of water quality assessment using satellite observation data, based on analysis of the spectral reflectance of aquifers, is described. Spectral signatures of freshwater bodies and offshores are analyzed. Correlations between spectral reflectance, pollution, and selected water quality parameters are analyzed and quantified. Data from the MODIS, MISR, AIRS, and Landsat sensors received in 2002-2014 were utilized and verified by in-field spectrometry and lab measurements. A fuzzy-logic-based approach for decision support in the field of water quality degradation risk is discussed. The decision on water quality category is made by a fuzzy algorithm using a limited set of uncertain parameters. Data from satellite observations, field measurements, and modeling are utilized in the framework of the proposed approach. It is shown that this algorithm allows estimation of the water quality degradation rate and pollution risks. Problems of constructing spatial and temporal distributions of the calculated parameters, as well as the problem of data regularization, are discussed. Using the proposed approach, maps of surface water pollution risk from point and diffuse sources are calculated and discussed.
Cyber security risk assessment for SCADA and DCS networks.
Ralston, P A S; Graham, J H; Hieb, J L
2007-10-01
The growing dependence of critical infrastructures and industrial automation on interconnected physical and cyber-based control systems has resulted in a growing and previously unforeseen cyber security threat to supervisory control and data acquisition (SCADA) and distributed control systems (DCSs). It is critical that engineers and managers understand these issues and know how to locate the information they need. This paper provides a broad overview of cyber security and risk assessment for SCADA and DCS, introduces the main industry organizations and government groups working in this area, and gives a comprehensive review of the literature to date. Major concepts related to the risk assessment methods are introduced with references cited for more detail. Included are risk assessment methods such as HHM, IIM, and RFRM which have been applied successfully to SCADA systems with many interdependencies and have highlighted the need for quantifiable metrics. Presented in broad terms is probabilistic risk analysis (PRA), which includes methods such as FTA, ETA, and FMEA. The paper concludes with a general discussion of two recent methods (one based on compromise graphs and one on augmented vulnerability trees) that quantitatively determine the probability of an attack, the impact of the attack, and the reduction in risk associated with a particular countermeasure.
Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams
2012-03-01
The CB.Hep-1 monoclonal antibody (mAb) is used in the manufacture of a recombinant Hepatitis B vaccine, which is included in a worldwide vaccination program against Hepatitis B disease. The use of this mAb as an immunoligand has been incorporated into one of the most efficient steps of the active pharmaceutical ingredient purification process. In this regard, Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. As a main conclusion, FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The analysis of the severity and occurrence of risks evidenced that very highly severe risks ranged from 31.0% to 38.7% of all risks, and the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Numbers, in descending order, were: transgenic plants (2636), ascites (2577), transgenic animals (2046), and hollow fiber bioreactors (1654), which also corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
Woo, Hae Dong; Kim, Jeongseon
2012-01-01
Good biomarkers for early detection of cancer lead to better prognosis. However, harvesting tumor tissue is invasive and cannot be routinely performed. Global DNA methylation of peripheral blood leukocyte DNA was evaluated as a biomarker for cancer risk. We performed a meta-analysis to estimate overall cancer risk according to global DNA hypomethylation levels among studies with various cancer types and analytical methods used to measure DNA methylation. Studies were systemically searched via PubMed with no language limitation up to July 2011. Summary estimates were calculated using a fixed effects model. Subgroup analyses by experimental method used to determine DNA methylation level were performed due to heterogeneity within the selected studies (p<0.001, I(2): 80%). Heterogeneity was not found in the %5-mC subgroup (p = 0.393, I(2): 0%) or in LINE-1 studies using the same target sequence (p = 0.097, I(2): 49%), whereas considerable variance remained in LINE-1 overall (p<0.001, I(2): 80%) and in bladder cancer studies (p = 0.016, I(2): 76%). These results suggest that the experimental methods used to quantify global DNA methylation levels are important factors in the association study between hypomethylation levels and cancer risk. Overall, cancer risks of the group with the lowest DNA methylation levels were significantly higher compared to the group with the highest methylation levels [OR (95% CI): 1.48 (1.28-1.70)]. Global DNA hypomethylation in peripheral blood leukocytes may be a suitable biomarker for cancer risk. However, the association between global DNA methylation and cancer risk may differ based on the experimental method and the region of DNA targeted for measuring global hypomethylation levels, as well as the cancer type. Therefore, it is important to select a precise and accurate surrogate marker for global DNA methylation levels in association studies between global DNA methylation levels in peripheral leukocytes and cancer risk.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Shi, Nianmin; Lu, Qiang; Zhang, Jiao; Li, Li; Zhang, Junnan; Zhang, Fanglei; Dong, Yanhong; Zhang, Xinyue; Zhang, Zheng; Gao, Wenhui
2017-06-03
This study aims to prevent persistent infection, reduce the incidence of cervical cancer, and improve women's health by establishing the theoretical basis of the risk factors for persistent infection of asymptomatic women with high-risk human papilloma virus (HPV) strains. The information collected includes the persistent infection rate and the high-risk HPV types most prevalent among asymptomatic women in Linfen, Shanxi Province, a high-risk area for cervical cancer. Based on the method of cluster sampling, locations were chosen from an industrial county and an agricultural county of Linfen, Shanxi Province, namely the Xiangfen and Quwo counties. The convenience sampling (CS) method enabled the identification of women who are sexually active but have no symptoms of cervical abnormality, for analyzing risk factors of HPV-DNA detection and performing a retrospective questionnaire survey in these 2 counties. First, cervical exfoliated cell samples were collected for the thin-layer liquid-based cytology test (TCT) and simultaneous testing of high-risk type HPV DNA; samples with positive results were then retested to identify the infecting HPV types. Testing over a 6-month period was used to derive the 6-month persistent infection rate. The retrospective survey questionnaire addressed the basic situation of the research subjects, menstrual history, marital status, pregnancy history, sexual habits, and other aspects. The questionnaire was divided into a case group and a comparison group based on the high-risk HPV-DNA testing result, according to whether or not there was persistent infection. Statistical analysis employed EpiData 3.1 software for data entry and SPSS 17.0 for statistical analysis; statistical charts, chi-square analysis, single-factor analysis, and multivariate logistic regression analysis were used to analyze the protective factors and risk factors of high-risk HPV infection.
Risk factors were predicted using a classification tree. A total of 3000 women participated in the study. The high-risk type HPV infection rate was 8.7%, and the persistent infection rate was 7.5%. The persistent infection rates for the 2 age groups (ages 18-26 and 27-30) were 6.9% and 8.7%. The corresponding persistent infection rates in Xiangfen county were 7.4% and 7.4%, and those in Quwo county were 7.8% and 11.6%; there was no significant difference between each pair of groups. Single risk-factor analysis showed that first-time sex at age under 20, high school/technical secondary school education or above, multiple sexual partners, having more than 2 sexual partners in the past 6 months, oral sex, and colitis are risk factors for high-risk type HPV infection. Multivariate analysis showed that the number of sexual partners, smoking, and oral sex had an effect on HPV infection. The risk of HPV infection was 5.0-fold higher with smoking and 6.1-fold higher with oral sex. Having more than 2 sexual partners increases the risk of HPV infection. By the prediction model analysis, the probability of HPV infection conveyed by oral sex was 14.8%; if the number of sexual partners was zero or more than 2, without oral sex, the probability of HPV infection was 12.1%; if there was one sexual partner who smokes, without oral sex, the probability of infection was 18.6%; if there was one sexual partner who does not smoke, without oral sex, the probability of infection was 3.6%. The persistent infection rate of asymptomatic women for high-risk type HPV is lower than that of women of all ages. High-risk type HPV infection risk factors include the number of sexual partners, oral sex, and smoking. Thus, young women may be able to reduce the risk of infection with high-risk type HPV by reducing the number of sexual partners, forming correct sexual life habits, and avoiding smoking.
Quantitative influence of risk factors on blood glucose level.
Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu
2014-01-01
The aim of this study is to quantitatively analyze the influence of risk factors on the blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and confirming the intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age. According to the minimum error principle, nine BP models are trained respectively. The quantitative values of the influence of different risk factors on the blood glucose change can be obtained by sensitivity calculation. The experimental results indicate that weight is the leading cause of blood glucose change (0.2449), followed by cholesterol, age, and triglyceride. The total ratio of these four factors reaches 77% of the nine screened risk factors. The sensitivity sequences can provide a judgment method for individual intervention. This method can be applied to quantitative analysis of risk factors for other diseases, and can potentially be used by clinical practitioners to identify high-risk populations for type 2 diabetes as well as other diseases.
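Perturbation-based sensitivity of a trained network, of the kind the study describes, can be sketched as follows. The tiny fixed-weight network and the random cohort below are illustrative stand-ins for the study's trained BP models, not a reproduction of them:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny feed-forward network with fixed illustrative weights
# (a stand-in for one of the study's trained BP models).
W1, b1 = rng.normal(size=(10, 6)), rng.normal(size=6)
W2, b2 = rng.normal(size=(6, 1)), rng.normal(size=1)

def predict(x):
    h = np.tanh(x @ W1 + b1)       # hidden layer
    return (h @ W2 + b2).ravel()   # predicted blood glucose (arbitrary units)

def sensitivities(X, eps=1e-3):
    """Mean absolute output change per unit perturbation of each input risk
    factor, normalized so the sensitivities sum to 1."""
    base = predict(X)
    s = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += eps            # nudge one risk factor at a time
        s[j] = np.mean(np.abs(predict(Xp) - base)) / eps
    return s / s.sum()

X = rng.normal(size=(100, 10))     # 100 subjects x 10 screened risk factors
print(sensitivities(X))
```

Ranking the normalized sensitivities then yields the kind of ordering the abstract reports (weight first, then cholesterol, age, triglyceride).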
The risk of kidney stones following bariatric surgery: a systematic review and meta-analysis.
Thongprayoon, Charat; Cheungpasitporn, Wisit; Vijayvargiya, Priya; Anthanont, Pimjai; Erickson, Stephen B
2016-01-01
With the rising prevalence of morbid obesity, the number of bariatric surgeries performed each year has been increasing worldwide. The objective of this meta-analysis was to assess the risk of kidney stones following bariatric surgery. A literature search was performed using MEDLINE, EMBASE, and the Cochrane Database of Systematic Reviews from inception through July 2015. Only studies reporting relative risks, odds ratios, or hazard ratios (HRs) comparing the risk of kidney stones in patients who underwent bariatric surgery versus no surgery were included. Pooled risk ratios (RRs) and 95% confidence intervals (CIs) were calculated using a random-effect, generic inverse variance method. Four studies (one randomized controlled trial and three cohort studies) with 11,348 patients were included in the analysis to assess the risk of kidney stones following bariatric surgery. The pooled RR of kidney stones in patients undergoing bariatric surgery was 1.22 (95% CI, 0.63-2.35). The subgroup analysis by type of bariatric surgery demonstrated an increased risk of kidney stones in patients following Roux-en-Y gastric bypass (RYGB), with a pooled RR of 1.73 (95% CI, 1.30-2.30), and a decreased risk of kidney stones in patients following restrictive procedures, including laparoscopic banding or sleeve gastrectomy, with a pooled RR of 0.37 (95% CI, 0.16-0.85). Our meta-analysis demonstrates an association between RYGB and increased risk of kidney stones. Restrictive bariatric surgery, on the other hand, may decrease kidney stone risk. Future studies with long-term follow-up data are needed to confirm this potential benefit of restrictive bariatric surgery.
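The random-effect, generic inverse variance pooling the abstract names can be sketched as follows, using the common DerSimonian-Laird estimate of between-study variance. The per-study risk ratios and confidence intervals below are hypothetical inputs, not the four studies actually included:

```python
import math

def pool_random_effects(rrs, cis):
    """DerSimonian-Laird random-effects pooling on the log scale
    (generic inverse variance method). Returns (pooled RR, 95% CI)."""
    logs = [math.log(r) for r in rrs]
    # Standard errors recovered from each 95% CI: (ln(hi) - ln(lo)) / (2*1.96)
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / se ** 2 for se in ses]                    # fixed-effect weights
    fixed = sum(wi * li for wi, li in zip(w, logs)) / sum(w)
    q = sum(wi * (li - fixed) ** 2 for wi, li in zip(w, logs))
    df = len(rrs) - 1
    tau2 = max(0.0, (q - df) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
    wr = [1 / (se ** 2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * li for wi, li in zip(wr, logs)) / sum(wr)
    se_p = math.sqrt(1 / sum(wr))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_p),
            math.exp(pooled + 1.96 * se_p))

# Hypothetical per-study RRs and 95% CIs (illustration only).
rr, lo, hi = pool_random_effects(
    [1.8, 1.5, 0.9, 1.2],
    [(1.2, 2.7), (0.9, 2.5), (0.5, 1.6), (0.8, 1.8)])
print(rr, lo, hi)
```

A wide pooled CI crossing 1, as in the overall result above and in the abstract's overall estimate, is what motivates the subgroup analysis by surgery type.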
Topography- and nightlight-based national flood risk assessment in Canada
NASA Astrophysics Data System (ADS)
Elshorbagy, Amin; Bharath, Raja; Lakhanpal, Anchit; Ceola, Serena; Montanari, Alberto; Lindenschmidt, Karl-Erich
2017-04-01
In Canada, flood analysis and water resource management, in general, are tasks conducted at the provincial level; therefore, unified national-scale approaches to water-related problems are uncommon. In this study, a national-scale flood risk assessment approach is proposed and developed. The study focuses on using global and national datasets available with various resolutions to create flood risk maps. First, a flood hazard map of Canada is developed using topography-based parameters derived from digital elevation models, namely, elevation above nearest drainage (EAND) and distance from nearest drainage (DFND). This flood hazard mapping method is tested on a smaller area around the city of Calgary, Alberta, against a flood inundation map produced by the city using hydraulic modelling. Second, a flood exposure map of Canada is developed using a land-use map and the satellite-based nightlight luminosity data as two exposure parameters. Third, an economic flood risk map is produced, and subsequently overlaid with population density information to produce a socioeconomic flood risk map for Canada. All three maps of hazard, exposure, and risk are classified into five classes, ranging from very low to severe. A simple way to include flood protection measures in hazard estimation is also demonstrated using the example of the city of Winnipeg, Manitoba. This could be done for the entire country if information on flood protection across Canada were available. The evaluation of the flood hazard map shows that the topography-based method adopted in this study is both practical and reliable for large-scale analysis. Sensitivity analysis regarding the resolution of the digital elevation model is needed to identify the resolution that is fine enough for reliable hazard mapping, but coarse enough for computational tractability. 
The nightlight data are found to be useful for exposure and risk mapping in Canada; however, uncertainty analysis should be conducted to investigate the effect of the overglow phenomenon on flood risk mapping.
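A toy version of combining the two topographic parameters into five hazard classes might look like the following. The weights, decay scales, and class breakpoints are illustrative assumptions for the sketch, not the paper's calibrated method:

```python
import numpy as np

def hazard_class(eand, dfnd, weights=(0.7, 0.3)):
    """Toy flood hazard score from elevation above nearest drainage (EAND, m)
    and distance from nearest drainage (DFND, m): lower values of either
    parameter mean higher hazard. All constants are illustrative assumptions."""
    # Map each parameter to a [0, 1] hazard contribution (lower/closer -> ~1).
    h = (weights[0] * np.exp(-np.asarray(eand) / 10.0)
         + weights[1] * np.exp(-np.asarray(dfnd) / 1000.0))
    # Bin into five classes, 1 (very low) to 5 (severe).
    return np.digitize(h, [0.2, 0.4, 0.6, 0.8]) + 1

# Three cells: near and low, mid-slope, and high/far from drainage.
print(hazard_class([1.0, 8.0, 30.0], [50.0, 500.0, 5000.0]))
```

Overlaying such a hazard layer with exposure layers (land use, nightlight luminosity, population density), as the paper describes, then yields the economic and socioeconomic risk maps.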
Krass, I; Mitchell, B; Clarke, P; Brillant, M; Dienaar, R; Hughes, J; Lau, P; Peterson, G; Stewart, K; Taylor, S; Wilkinson, J; Armour, C
2007-03-01
To compare the efficacy and cost-effectiveness of two methods of screening for undiagnosed type 2 diabetes in Australian community pharmacy. A random sample of 30 pharmacies was allocated into two groups: (i) tick test only (TTO); or (ii) sequential screening (SS) method. Both methods used the same initial risk assessment for type 2 diabetes. Subjects with one or more risk factors in the TTO group were offered a referral to their general practitioner (GP). Under the SS method, patients with risk factors were offered a capillary blood glucose test and those identified as being at risk were referred to a GP. The effectiveness and cost-effectiveness of these approaches were assessed. A total of 1286 people were screened over a period of 3 months. The rate of diagnosis of diabetes was significantly higher for SS compared with the TTO method (1.7% versus 0.2%; p=0.008). The SS method resulted in fewer referrals to the GP and a higher uptake of referrals than the TTO method, and so was the more cost-effective screening method. SS is the superior method from a cost and efficacy perspective. It should be considered as the preferred option for screening by community based pharmacists in Australia.
Engineering risk reduction in satellite programs
NASA Technical Reports Server (NTRS)
Dean, E. S., Jr.
1979-01-01
Methods developed in planning and executing system safety engineering programs for Lockheed satellite integration contracts are presented. These procedures establish the applicable safety design criteria, document design compliance and assess the residual risks where non-compliant design is proposed, and provide for hazard analysis of system-level test, handling, and launch preparations. Operations hazard analysis identifies product protection and product liability hazards prior to the preparation of operational procedures and provides safety requirements for inclusion in them. The method developed for documenting all residual hazards for the attention of program management assures an acceptable minimum level of risk prior to program deployment. The results are significant for persons responsible for managing or engineering the deployment and production of complex, high-cost equipment under current product liability law and cost/time constraints, who have a responsibility to minimize the possibility of an accident and should have documentation to provide a defense in a product liability suit.
Hydrologic Drought Decision Support System (HyDroDSS)
Granato, Gregory E.
2014-01-01
The hydrologic drought decision support system (HyDroDSS) was developed by the U.S. Geological Survey (USGS) in cooperation with the Rhode Island Water Resources Board (RIWRB) for use in the analysis of hydrologic variables that may indicate the risk for streamflows to be below user-defined flow targets at a designated site of interest, which is defined herein as a data-collection site on a stream that may be adversely affected by pumping. Hydrologic drought is defined for this study as a period of lower than normal streamflows caused by precipitation deficits and (or) water withdrawals. The HyDroDSS is designed to provide water managers with risk-based information for balancing water-supply needs and aquatic-habitat protection goals to mitigate potential effects of hydrologic drought. This report describes the theory and methods for retrospective streamflow-depletion analysis, rank correlation analysis, and drought-projection analysis. All three methods are designed to inform decisions made by drought steering committees and decisionmakers on the basis of quantitative risk assessment. All three methods use estimates of unaltered streamflow, which is the measured or modeled flow without major withdrawals or discharges, to approximate a natural low-flow regime. Retrospective streamflow-depletion analysis can be used by water-resource managers to evaluate relations between withdrawal plans and the potential effects of withdrawal plans on streams at one or more sites of interest in an area. Retrospective streamflow-depletion analysis indicates the historical risk of being below user-defined flow targets if different pumping plans were implemented for the period of record. Retrospective streamflow-depletion analysis also indicates the risk for creating hydrologic drought conditions caused by use of a pumping plan.
Retrospective streamflow-depletion analysis is done by calculating the net streamflow depletions from withdrawals and discharges and applying these depletions to a simulated record of unaltered streamflow. Rank correlation analysis in the HyDroDSS indicates the persistence of hydrologic measurements from month to month for the prediction of developing hydrologic drought conditions and quantitatively indicates which hydrologic variables may be used to indicate the onset of hydrologic drought conditions. Rank correlation analysis also indicates the potential use of each variable for estimating the monthly minimum unaltered flow at a site of interest for use in the drought-projection analysis. Rank correlation analysis in the HyDroDSS is done by calculating Spearman’s rho for paired samples and the 95-percent confidence limits of this rho value. Rank correlation analysis can be done by using precipitation, groundwater levels, measured streamflows, and estimated unaltered streamflows. Serial correlation analysis, which indicates relations between current and future values, can be done for a single site. Cross correlation analysis, which indicates relations among current values at one site and current and future values at a second site, also can be done. Drought-projection analysis in the HyDroDSS indicates the risk for being in a hydrologic drought condition during the current month and the five following months with and without pumping. Drought-projection analysis also indicates the potential effectiveness of water-conservation methods for mitigating the effect of withdrawals in the coming months on the basis of the amount of depletion caused by different pumping plans and on the risk of unaltered flows being below streamflow targets. Drought-projection analysis in the HyDroDSS is done with Monte Carlo methods by using the position analysis method. 
In this method the initial value of estimated unaltered streamflows is calculated by correlation to a measured hydrologic variable (monthly precipitation, groundwater levels, or streamflows from an index station identified with the rank correlation analysis). Then a pseudorandom number generator is used to create 251 six-month-long flow traces by using a bootstrap method. Serial correlation of the estimated unaltered monthly minimum streamflows determined from the rank correlation analysis is preserved within each flow trace. The sample of unaltered streamflows indicates the risk of being below flow targets in the coming months under simulated natural conditions (without historic withdrawals). The streamflow-depletion algorithms are then used to estimate risks of flow being below targets if selected pumping plans are used. This report also describes the implementation of the HyDroDSS. The HyDroDSS was developed as a Microsoft Access® database application to facilitate storage, handling, and use of hydrologic datasets with a simple graphical user interface. The program is implemented in the database by using the Visual Basic for Applications® (VBA) programming language. Program source code for the analytical techniques is provided in the HyDroDSS and in electronic text files accompanying this report. Program source code for the graphical user interface and for data-handling code, which is specific to Microsoft Access® and the HyDroDSS, is provided in the database. An installation package with a run-time version of the software is available with this report for potential users who do not have a compatible copy of Microsoft Access®. Administrative rights are needed to install this version of the HyDroDSS. A case study, to demonstrate the use of HyDroDSS and interpretation of results for a site of interest, is detailed for the USGS streamgage on the Hunt River (station 01117000) near East Greenwich in central Rhode Island. 
The Hunt River streamgage was used because it has a long record of streamflow and is in a well-studied basin with a substantial amount of hydrologic and water-use data including groundwater pumping for municipal water supply.
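The rank correlation step of the HyDroDSS can be illustrated in Python (the HyDroDSS itself is implemented in VBA within Microsoft Access). This sketch computes Spearman's rho between a synthetic monthly indicator (precipitation) and monthly minimum unaltered streamflow, with approximate 95-percent confidence limits from the Fisher z-transform; the data and the large-sample normal approximation are assumptions for illustration:

```python
import math
import numpy as np

def spearman_rho(x, y):
    """Spearman's rho as the Pearson correlation of the ranks
    (assumes no ties, which holds for continuous data)."""
    def ranks(v):
        order = np.argsort(v)
        r = np.empty(len(v))
        r[order] = np.arange(1, len(v) + 1)
        return r
    rx, ry = ranks(np.asarray(x)), ranks(np.asarray(y))
    return float(np.corrcoef(rx, ry)[0, 1])

def rho_conf_limits(rho, n, z_crit=1.96):
    """Approximate 95-percent confidence limits via the Fisher
    z-transform (large-sample approximation)."""
    z = math.atanh(rho)
    se = 1.0 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# Five years of synthetic, correlated monthly data.
rng = np.random.default_rng(1)
precip = rng.gamma(3.0, 30.0, size=60)            # monthly precipitation
flow = 0.8 * precip + rng.normal(0, 15, size=60)  # monthly minimum flow
rho = spearman_rho(precip, flow)
lo, hi = rho_conf_limits(rho, len(precip))
print(f"rho = {rho:.2f}, 95% CL = ({lo:.2f}, {hi:.2f})")
```

The same machinery applies to serial correlation (a variable paired with its own lagged values) and cross correlation (one site paired with lagged values at a second site).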
Insomnia and risk of dementia in older adults: Systematic review and meta-analysis.
de Almondes, Katie Moraes; Costa, Mônica Vieira; Malloy-Diniz, Leandro Fernandes; Diniz, Breno Satler
2016-06-01
There is cross-sectional evidence of an association between sleep disorders and cognitive impairment in older adults. However, there is no consensus, on the basis of longitudinal study data, regarding the increased risk of developing dementia related to insomnia. We conducted a systematic review and meta-analysis to evaluate the risk of incident all-cause dementia in individuals with insomnia in population-based prospective cohort studies. Five studies of 5,242 retrieved references were included in the meta-analysis. We used the generic inverse variance method with a random effects model to calculate the pooled risk of dementia in older adults with insomnia. We assessed heterogeneity in the meta-analysis by means of the Q-test and the I² index. Study quality was assessed with the Newcastle-Ottawa Scale. The results showed that insomnia was associated with a significant risk of all-cause dementia (RR = 1.53, 95% CI 1.07-2.18, z = 2.36, p = 0.02). There was evidence of significant heterogeneity in the analysis (Q = 2.4, p < 0.001, I² = 82%). Insomnia is associated with an increased risk for dementia. These results provide evidence that future studies should investigate dementia prevention among elderly individuals through screening and proper management of insomnia.
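A minimal sketch of the generic inverse variance method with a random effects (DerSimonian-Laird) model, including the Cochran Q and I² heterogeneity statistics; the five study estimates are synthetic, not the values pooled in the review:

```python
import math

# Each study contributes a relative risk with its 95% confidence limits.
studies = [  # (RR, lower 95% CL, upper 95% CL) -- synthetic values
    (1.3, 0.9, 1.9), (1.8, 1.1, 2.9), (1.2, 0.8, 1.8),
    (2.1, 1.3, 3.4), (1.5, 1.0, 2.3),
]

# Work on the log scale; the standard error is recovered from CI width.
log_rr = [math.log(rr) for rr, _, _ in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for _, lo, hi in studies]
w = [1 / s**2 for s in se]                       # fixed-effect weights

# Cochran's Q and I^2 heterogeneity statistics.
pooled_fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
q = sum(wi * (yi - pooled_fixed) ** 2 for wi, yi in zip(w, log_rr))
df = len(studies) - 1
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# DerSimonian-Laird between-study variance and random-effects pooling.
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
rr = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se_pooled),
      math.exp(pooled + 1.96 * se_pooled))
print(f"pooled RR = {rr:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), "
      f"I2 = {i2:.0f}%")
```

When tau² is zero the random-effects weights reduce to the fixed-effect weights, so the model degrades gracefully for homogeneous study sets.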
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
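Probability/consequence cells such as 4C can be mapped to qualitative risk levels with a simple lookup. The scoring rule below is an illustrative simplification that reproduces the levels quoted in the abstract; it is not the normative API 581 risk matrix:

```python
def risk_level(prob_cat: int, cons_cat: str) -> str:
    """Classify a semi-quantitative risk cell, e.g. (4, 'C').

    Probability categories run 1-5, consequence categories A-E.
    The additive score and its cut points are assumptions chosen to
    match the abstract's 4C -> medium-high and 3C -> medium results.
    """
    score = prob_cat + ("ABCDE".index(cons_cat.upper()) + 1)  # 2 .. 10
    if score <= 4:
        return "low"
    if score <= 6:
        return "medium"
    if score <= 8:
        return "medium-high"
    return "high"

equipment = {"HP superheater": (4, "C"),
             "HP evaporator": (4, "C"),
             "HP economizer": (3, "C")}
for name, (p, c) in equipment.items():
    print(f"{name}: {p}{c} -> {risk_level(p, c)}")
```

A real RBI implementation would take the cell-to-level mapping directly from the standard's published matrix rather than from a score formula.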
Regulatory Science in Professional Education.
Akiyama, Hiroshi
2017-01-01
In the field of pharmaceutical sciences, the subject of regulatory science (RS) includes pharmaceuticals, food, and living environments. For pharmaceuticals, public acceptance requires weighing efficacy against safety, and in that balance more importance is given to efficacy in curing disease. For food, however, safety is the most important consideration for public acceptance, because food should be essentially free of risk. To ensure food safety, first, any hazard, that is, an agent in food or a condition of food with the potential to cause adverse health effects, should be identified and characterized; the risk that it will affect public health is then scientifically analyzed. This process is called risk assessment. Second, risk management should be conducted to reduce any risk with the potential to affect public health found in a risk assessment. Furthermore, risk communication, which is the interactive exchange of information and opinions concerning risk and risk management among risk assessors, risk managers, consumers, and other interested parties, should be conducted. Food safety is thus ensured through risk analysis, which consists of the three components of risk assessment, risk management, and risk communication. RS in the field of food safety supports risk analysis through, for example, scientific research and the development of test methods to evaluate food quality, efficacy, and safety. RS is also applied in the field of living environments, because the safety of environmental chemical substances is ensured through risk analysis similar to that conducted for food.
Decomposition-Based Failure Mode Identification Method for Risk-Free Design of Large Systems
NASA Technical Reports Server (NTRS)
Tumer, Irem Y.; Stone, Robert B.; Roberts, Rory A.; Clancy, Daniel (Technical Monitor)
2002-01-01
When designing products, it is crucial to assure failure- and risk-free operation in the intended operating environment. Failures are typically studied and eliminated as much as possible during the early stages of design. The few failures that go undetected result in unacceptable damage and losses in high-risk applications where public safety is of concern. Published NASA and NTSB accident reports point to a variety of components identified as sources of failures in the reported cases. In previous work, data from these reports were processed and placed in matrix form for all the system components and failure modes encountered, and then manipulated using matrix methods to determine similarities between the different components and failure modes. In this paper, these matrices are represented in the form of a linear combination of failure modes, mathematically formed using Principal Components Analysis (PCA) decomposition. The PCA decomposition results in a low-dimensionality representation of all failure modes and components of interest, represented in a transformed coordinate system. Such a representation opens the way for efficient pattern analysis and prediction of the failure modes with the highest potential risks to the final product, rather than making decisions based on the large space of component and failure mode data. The mathematics of the proposed method are explained first using a simple example problem. The method is then applied to component failure data gathered from helicopter accident reports to demonstrate its potential.
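The PCA decomposition of a component-by-failure-mode matrix can be sketched with a singular value decomposition; the 6x4 incidence matrix below is synthetic, standing in for the matrices built from the accident reports:

```python
import numpy as np

# Rows are components, columns are failure modes; entries count how
# often a component exhibited a mode in the (synthetic) report data.
counts = np.array([
    [5, 0, 2, 0],
    [4, 1, 2, 0],
    [0, 6, 0, 1],
    [0, 5, 1, 1],
    [1, 0, 0, 7],
    [0, 1, 0, 6],
], dtype=float)

# Center each failure-mode column, then PCA via SVD.
centered = counts - counts.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)

# Two-dimensional coordinates for each component in the transformed
# (principal component) coordinate system.
scores = u[:, :2] * s[:2]
explained = (s**2 / np.sum(s**2))[:2]
print("variance explained by 2 PCs:", round(float(explained.sum()), 2))
```

Components with similar failure behaviour land close together in `scores`, which is what enables pattern analysis in the reduced space instead of over the full component/failure-mode data.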
Bourdel, Nicolas; Chauvet, Pauline; Tognazza, Enrica; Pereira, Bruno; Botchorishvili, Revaz; Canis, Michel
2016-01-01
Our objective was to identify the most accurate method of endometrial sampling for the diagnosis of complex atypical hyperplasia (CAH), and the related risk of underestimation of endometrial cancer. We conducted a systematic literature search in PubMed and EMBASE (January 1999-September 2013) to identify all registered articles on this subject. Studies were selected with a 2-step method. First, titles and abstracts were analyzed by 2 reviewers, and 69 relevant articles were selected for full reading. Then, the full articles were evaluated to determine whether full inclusion criteria were met. We selected 27 studies, taking into consideration the comparison between histology of endometrial hyperplasia obtained by diagnostic tests of interest (uterine curettage, hysteroscopically guided biopsy, or hysteroscopic endometrial resection) and subsequent results of hysterectomy. Analysis of the studies reviewed focused on 1106 patients with a preoperative diagnosis of atypical endometrial hyperplasia. The mean risk of finding endometrial cancer at hysterectomy after atypical endometrial hyperplasia diagnosed by uterine curettage was 32.7% (95% confidence interval [CI], 26.2-39.9), with a risk of 45.3% (95% CI, 32.8-58.5) after hysteroscopically guided biopsy and 5.8% (95% CI, 0.8-31.7) after hysteroscopic resection. In total, the risk of underestimation of endometrial cancer reaches a very high rate in patients with CAH using the classic method of evaluation (i.e., uterine curettage or hysteroscopically guided biopsy). This rate of underdiagnosed endometrial cancer leads to the risk of inappropriate surgical procedures (31.7% of tubal conservation in the data available and no abdominal exploration in 24.6% of the cases). Hysteroscopic resection seems to reduce the risk of underdiagnosed endometrial cancer.
Lifshits, A M
1979-01-01
General characteristics of multivariate statistical analysis (MSA) are given. Methodological premises and criteria for the selection of an adequate MSA method applicable to pathoanatomic investigations of the epidemiology of multicausal diseases are presented. The experience of using MSA with computers and standard computing programs in studies of coronary artery atherosclerosis on the materials of 2060 autopsies is described. The combined use of four MSA methods (sequential, correlational, regressional, and discriminant) made it possible to quantify the contribution of each of the eight examined risk factors to the development of atherosclerosis. The most important factors were found to be age, arterial hypertension, and heredity. Occupational hypodynamia and increased fatness were more important in men, whereas diabetes mellitus was more important in women. The registration of this combination of risk factors by MSA methods provides a more reliable prognosis of the likelihood of coronary heart disease with a fatal outcome than prognosis based on the degree of coronary atherosclerosis alone.
Zoller, Thomas; Fèvre, Eric M; Welburn, Susan C; Odiit, Martin; Coleman, Paul G
2008-01-01
Background Sleeping sickness (HAT) caused by T.b. rhodesiense is a major veterinary and human public health problem in Uganda. Previous studies have investigated spatial risk factors for T.b. rhodesiense at large geographic scales, but none have properly investigated such risk factors at small scales, i.e. within affected villages. In the present work, we use a case-control methodology to analyse both behavioural and spatial risk factors for HAT in an endemic area. Methods The present study investigates behavioural and occupational risk factors for infection with HAT within villages using a questionnaire-based case-control study conducted in 17 villages endemic for HAT in SE Uganda, and spatial risk factors in 4 high risk villages. For the spatial analysis, the location of homesteads with one or more cases of HAT up to three years prior to the beginning of the study was compared to all non-case homesteads. Analysing spatial associations with respect to irregularly shaped geographical objects required the development of a new approach to geographical analysis in combination with a logistic regression model. Results The study was able to identify, among other behavioural risk factors, having a family member with a history of HAT (p = 0.001) as well as proximity of a homestead to a nearby wetland area (p < 0.001) as strong risk factors for infection. The novel method of analysing complex spatial interactions used in the study can be applied to a range of other diseases. Conclusion Spatial risk factors for HAT are maintained across geographical scales; this consistency is useful in the design of decision support tools for intervention and prevention of the disease. Familial aggregation of cases was confirmed for T. b. rhodesiense HAT in the study and probably results from shared behavioural and spatial risk factors among members of a household. PMID:18590541
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bacvarov, D.C.
1981-01-01
A new method for probabilistic risk assessment of transmission line insulation flashovers caused by lightning strokes is presented. The approach of applying the finite element method to probabilistic risk assessment is demonstrated to be very powerful, for two reasons. First, the finite element method is inherently suitable for the analysis of three-dimensional spaces in which the parameters, such as trivariate probability densities of the lightning currents, are non-uniformly distributed. Second, the finite element method permits non-uniform discretization of the three-dimensional probability spaces, thus yielding high accuracy in critical regions, such as the area of low-probability events, while at the same time maintaining coarse discretization in the non-critical areas to keep the number of grid points and the size of the problem at a manageably low level. The finite element probabilistic risk assessment method presented here is based on a new multidimensional search algorithm. It utilizes an efficient iterative technique for finite element interpolation of the transmission line insulation flashover criteria computed with an electromagnetic transients program. Compared to other available methods, the new finite element probabilistic risk assessment method is significantly more accurate and approximately two orders of magnitude more computationally efficient. The method is especially suited for accurate assessment of rare, very low probability events.
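The benefit of non-uniform discretization can be illustrated in one dimension: the flashover risk is the integral of the lightning-current probability density over currents exceeding a critical flashover current, and the grid is refined in that low-probability tail. The lognormal density and the 120 kA threshold below are illustrative assumptions, not values from the study:

```python
import numpy as np
from math import exp, log, pi, sqrt

def lognormal_pdf(i, mu=log(30.0), sigma=0.7):
    """Illustrative lognormal density of lightning peak current i (kA)."""
    return exp(-((log(i) - mu) ** 2) / (2 * sigma**2)) \
        / (i * sigma * sqrt(2 * pi))

i_crit = 120.0  # kA, assumed critical flashover current

# Non-uniform grid: coarse in the bulk of the distribution, fine in the
# rare-event tail beyond the critical current.
grid = np.concatenate([np.linspace(1.0, i_crit, 40),
                       np.linspace(i_crit, 400.0, 400)])
pdf = np.array([lognormal_pdf(i) for i in grid])
tail = np.where(grid >= i_crit, pdf, 0.0)

# Trapezoidal integration over the non-uniform grid.
risk = float(np.sum(0.5 * (tail[1:] + tail[:-1]) * np.diff(grid)))
print(f"P(I >= {i_crit:.0f} kA) = {risk:.4f}")
```

The same idea carries over to the trivariate densities and finite element interpolation of the paper: spend grid points where the flashover criterion changes, not where the density is smooth and large.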
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), the Critical Items List (CIL), and the Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
AbdelRahman, Samir E; Zhang, Mingyuan; Bray, Bruce E; Kawamoto, Kensaku
2014-05-27
The aim of this study was to propose an analytical approach to develop high-performing predictive models for congestive heart failure (CHF) readmission using an operational dataset with incomplete records and changing data over time. Our analytical approach involves three steps: pre-processing, systematic model development, and risk factor analysis. For pre-processing, variables that were absent in >50% of records were removed. Moreover, the dataset was divided into a validation dataset and derivation datasets which were separated into three temporal subsets based on changes to the data over time. For systematic model development, using the different temporal datasets and the remaining explanatory variables, the models were developed by combining the use of various (i) statistical analyses to explore the relationships between the validation and the derivation datasets; (ii) adjustment methods for handling missing values; (iii) classifiers; (iv) feature selection methods; and (v) discretization methods. We then selected the best derivation dataset and the models with the highest predictive performance. For risk factor analysis, factors in the highest-performing predictive models were analyzed and ranked using (i) statistical analyses of the best derivation dataset, (ii) feature rankers, and (iii) a newly developed algorithm to categorize risk factors as being strong, regular, or weak. The analysis dataset consisted of 2,787 CHF hospitalizations at University of Utah Health Care from January 2003 to June 2013. In this study, we used the complete-case analysis and mean-based imputation adjustment methods; the wrapper subset feature selection method; and four ranking strategies based on information gain, gain ratio, symmetrical uncertainty, and wrapper subset feature evaluators.
The best-performing models resulted from the use of a complete-case analysis derivation dataset combined with the Class-Attribute Contingency Coefficient discretization method and a voting classifier which averaged the results of multi-nominal logistic regression and voting feature intervals classifiers. Of 42 final model risk factors, discharge disposition, discretized age, and indicators of anemia were the most significant. This model achieved a c-statistic of 86.8%. The proposed three-step analytical approach enhanced predictive model performance for CHF readmissions. It could potentially be leveraged to improve predictive model performance in other areas of clinical medicine.
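The reported c-statistic can be computed directly as the concordance probability (equivalently, the area under the ROC curve): the chance that a randomly chosen readmitted patient receives a higher predicted risk than a randomly chosen non-readmitted one. A minimal sketch with synthetic scores:

```python
def c_statistic(scores, labels):
    """Concordance of predicted risks with binary outcomes.

    Counts, over all (positive, negative) pairs, how often the positive
    case scores higher; ties count as one half.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Synthetic predicted risks and readmission outcomes (1 = readmitted).
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,    1,   0,   0,   0,   0]
print(f"c-statistic = {c_statistic(scores, labels):.2f}")  # → 0.88
```

The pairwise definition is O(n²) but transparent; production code would use a rank-based formula or a library routine for large cohorts.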
Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R
2013-01-01
The increasing demand of health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management through proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of a HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimate of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach allows quantitative assessment of the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful to understand the HCO risk structure in terms of frequency, severity, expected and unexpected loss related to adverse events.
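The loss-distribution estimate can be sketched as a frequency/severity Monte Carlo simulation: yearly claim counts drawn from a Poisson model, claim severities from a lognormal, and the expected loss and upper percentiles (the unexpected loss) read off the simulated annual totals. All parameter values are illustrative, not fitted to the Lodi claims data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_years = 20_000  # simulated years of claims experience

# Frequency: number of compensation claims per year.
claims_per_year = rng.poisson(lam=25, size=n_years)

# Severity: each claim's compensation from an assumed lognormal fit;
# summing per year gives the aggregate annual loss distribution.
losses = np.array([
    rng.lognormal(mean=9.0, sigma=1.2, size=k).sum()
    for k in claims_per_year
])

expected_loss = float(losses.mean())
p95, p99 = np.percentile(losses, [95, 99])
print(f"expected annual loss = {expected_loss:,.0f}")
print(f"95th / 99th percentile = {p95:,.0f} / {p99:,.0f}")
```

Stratifying the simulation by department or hospital, as the Bayesian hierarchical model in the paper does, amounts to giving each stratum its own frequency and severity parameters.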
An Updated Meta-Analysis of Risk of Multiple Sclerosis following Infectious Mononucleosis
Handel, Adam E.; Williamson, Alexander J.; Disanto, Giulio; Handunnetthi, Lahiru; Giovannoni, Gavin; Ramagopalan, Sreeram V.
2010-01-01
Background Multiple sclerosis (MS) appears to develop in genetically susceptible individuals as a result of environmental exposures. Epstein-Barr virus (EBV) infection is an almost universal finding among individuals with MS. Symptomatic EBV infection as manifested by infectious mononucleosis (IM) has been shown in a previous meta-analysis to be associated with the risk of MS, however a number of much larger studies have since been published. Methods/Principal Findings We performed a Medline search to identify articles published since the original meta-analysis investigating MS risk following IM. A total of 18 articles were included in this study, including 19390 MS patients and 16007 controls. We calculated the relative risk of MS following IM using a generic inverse variance with random effects model. This showed that the risk of MS was strongly associated with IM (relative risk (RR) 2.17; 95% confidence interval 1.97–2.39; p < 10^-54). Discussion Our results establish firmly that a history of infectious mononucleosis significantly increases the risk of multiple sclerosis. Future work should focus on the mechanism of this association and interaction with other risk factors. PMID:20824132
Environmental mediation: A method for protecting environmental sciences and scientists
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigerstad, T.J.; Berdt Romilly, G. de; MacKeigan, P.
1995-12-31
The primary role for scientific analysis of environmental and human risks has been to support decisions that have arisen out of a regulatory decision-making model called "Command and Control" or "Decide and Defend". A project or a policy is proposed and permission for its implementation is sought. Gaining permission sometimes requires a number of technical documents: Environmental Impact Statements, Public Health Risk Evaluations, policy analysis documents. Usually, little of this analysis is used to make any real decisions, a fact that has led to enormous frustration and an atmosphere of distrust of government, industry, and consulting scientists. There have been a number of responses by governmental and industrial managers, some scientists, and even the legal system to mitigate the frustration and distrust. One response has been to develop methods of packaging information using language considered more "understandable" to the public: Ecosystem Health, Social Risk Assessment, Economic Risk Management, Enviro-hazard Communication, Risk Focus Analysis, etc. A second is to develop more sophisticated persuasion techniques, a potential misuse of Risk Communication. A third is proposing to change the practice of science itself, e.g., "post-normal science" and "popular epidemiology". A fourth has been to challenge the definition of "expert" in legal proceedings. None of these approaches appears to address the underlying issue: lack of trust and credibility. Addressing this issue requires an understanding of the nature of environmental disputes and the development of an atmosphere of trust and credibility. The authors propose Environmental Mediation as a response to the dilemma faced by professional environmental scientists, engineers, and managers, one that protects the professionals and their disciplines.
Quantifying the Metrics That Characterize Safety Culture of Three Engineered Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Julie; Ernesti, Mary; Tokuhiro, Akira
2002-07-01
With potential energy shortages and increasing electricity demand, the nuclear energy option is being reconsidered in the United States. Public opinion will have a considerable voice in policy decisions that will 'road-map' the future of nuclear energy in this country. This report is an extension of the last author's work on the 'safety culture' associated with three engineered systems (automobiles, commercial airplanes, and nuclear power plants) in Japan and the United States. Safety culture, in brief, is defined as a specifically developed culture based on societal and individual interpretations of the balance of real, perceived, and imagined risks versus the benefits drawn from utilizing a given engineered system. The method of analysis is a modified scale analysis, with two fundamental eigen-metrics, time- (t) and number-scales (N), that describe both engineered systems and human factors. The scale analysis approach is appropriate because human perception of risk, perception of benefit, and level of (technological) acceptance are inherently subjective, therefore 'fuzzy' and rarely quantifiable in exact magnitude. Perception of risk, expressed in terms of the psychometric factors 'dread risk' and 'unknown risk', contains both time- and number-scale elements. Various engineered-system accidents with fatalities, reported by the mass media, are characterized by t and N and are presented in this work using the scale analysis method. We contend that the level of acceptance implies a perception of benefit at least two orders of magnitude larger than the perception of risk. The 'amplification' influence of the mass media is also deduced to be 100- to 1000-fold the actual number of fatalities/serious injuries in a nuclear-related accident. (authors)
Problems of Mathematical Finance by Stochastic Control Methods
NASA Astrophysics Data System (ADS)
Stettner, Łukasz
The purpose of this paper is to present the main ideas of mathematical finance using stochastic control methods. There is an interplay between stochastic control and mathematics of finance. On the one hand, stochastic control is a powerful tool for studying financial problems. On the other hand, financial applications have stimulated development in several research subareas of stochastic control in the last two decades. We start with pricing of financial derivatives and modeling of asset prices, studying the conditions for the absence of arbitrage. Then we consider pricing of defaultable contingent claims. Investments in bonds lead us to term structure modeling problems. Special attention is devoted to the historical static portfolio analysis called Markowitz theory. We also briefly sketch dynamic portfolio problems using viscosity solutions to the Hamilton-Jacobi-Bellman equation, the martingale-convex analysis method, or the stochastic maximum principle together with the backward stochastic differential equation. Finally, long-time portfolio analysis for both risk-neutral and risk-sensitive functionals is introduced.
[Economic effects of integrated RIS-PACS solution in the university environment].
Kröger, M; Nissen-Meyer, S; Wetekam, V; Reiser, M
1999-04-01
The goal of the current article is to demonstrate how qualitative and monetary effects resulting from an integrated RIS/PACS installation can be evaluated. First of all, the system concept of a RIS/PACS solution for a university hospital is defined and described. Based on this example, a generic method for the evaluation of qualitative and monetary effects as well as associated risks is depicted and demonstrated. To this end, qualitative analyses, investment calculations and risk analysis are employed. The sample analysis of a RIS/PACS solution specially designed for a university hospital demonstrates positive qualitative and monetary effects of the system. Under ideal conditions, the payoff time of the investments is reached after 4 years of the system's assumed 8-year effective life. Furthermore, under conservative assumptions, the risk analysis shows a probability of 0% for realising a negative net present value at the end of the payoff time period. It should be pointed out that the positive result of this sample analysis will not necessarily apply to other clinics or hospitals. However, the same methods may be used for the individual evaluation of the qualitative and monetary effects of a RIS/PACS installation in any clinic.
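The investment-calculation step described above combines a payback period with a net-present-value check. A minimal sketch follows; the cash-flow figures are hypothetical and not taken from the study.

```python
from itertools import accumulate

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the initial outlay at t = 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def payback_year(cashflows):
    """First year in which cumulative (undiscounted) cashflow turns non-negative."""
    for t, cum in enumerate(accumulate(cashflows)):
        if cum >= 0:
            return t
    return None  # never pays back within the horizon

# Hypothetical figures, not the study's data: one up-front investment,
# then equal annual savings over an assumed 8-year effective life.
flows = [-400_000] + [100_000] * 8
print(payback_year(flows))   # payback reached in year 4
print(npv(0.05, flows))      # positive NPV at a 5% discount rate
```

A risk analysis like the one in the article would repeat the NPV calculation over sampled cash-flow scenarios and report the fraction with a negative result.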
Xie, Kun; Ozbay, Kaan; Kurkcu, Abdullah; Yang, Hong
2017-08-01
This study aims to explore the potential of using big data in advancing the pedestrian risk analysis including the investigation of contributing factors and the hotspot identification. Massive amounts of data of Manhattan from a variety of sources were collected, integrated, and processed, including taxi trips, subway turnstile counts, traffic volumes, road network, land use, sociodemographic, and social media data. The whole study area was uniformly split into grid cells as the basic geographical units of analysis. The cell-structured framework makes it easy to incorporate rich and diversified data into risk analysis. The cost of each crash, weighted by injury severity, was assigned to the cells based on the relative distance to the crash site using a kernel density function. A tobit model was developed to relate grid-cell-specific contributing factors to crash costs that are left-censored at zero. The potential for safety improvement (PSI) that could be obtained by using the actual crash cost minus the cost of "similar" sites estimated by the tobit model was used as a measure to identify and rank pedestrian crash hotspots. The proposed hotspot identification method takes into account two important factors that are generally ignored, i.e., injury severity and effects of exposure indicators. Big data, on the one hand, enable more precise estimation of the effects of risk factors by providing richer data for modeling, and on the other hand, enable large-scale hotspot identification with higher resolution than conventional methods based on census tracts or traffic analysis zones. © 2017 Society for Risk Analysis.
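The cost-assignment step above distributes each crash's severity-weighted cost over nearby grid cells with a kernel density function. The sketch below assumes a Gaussian kernel and an invented cell layout; the paper's exact kernel and bandwidth are not specified here.

```python
import math

def kernel_weights(crash_xy, cell_centers, bandwidth):
    """Gaussian kernel weights by distance from the crash site to each cell
    center, normalized so the full crash cost is distributed over the cells."""
    raw = []
    for cx, cy in cell_centers:
        d2 = (cx - crash_xy[0]) ** 2 + (cy - crash_xy[1]) ** 2
        raw.append(math.exp(-d2 / (2 * bandwidth ** 2)))
    total = sum(raw)
    return [w / total for w in raw]

def assign_cost(crash_xy, cost, cell_centers, bandwidth=1.0):
    """Severity-weighted crash cost allocated to each grid cell."""
    return [cost * w for w in kernel_weights(crash_xy, cell_centers, bandwidth)]

# Hypothetical 3-cell strip: a crash near the first cell gets most of the cost.
cells = [(0, 0), (1, 0), (2, 0)]
costs = assign_cost((0.1, 0.0), 100.0, cells)
```

The resulting per-cell costs would then be the left-censored dependent variable of the tobit model, and the PSI measure is the actual cell cost minus the model's prediction for "similar" cells.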
Construction risk assessment of deep foundation pit in metro station based on G-COWA method
NASA Astrophysics Data System (ADS)
You, Weibao; Wang, Jianbo; Zhang, Wei; Liu, Fangmeng; Yang, Diying
2018-05-01
In order to get an accurate understanding of the construction safety of deep foundation pits in metro stations and reduce the probability and loss of risk occurrence, a risk assessment method based on G-COWA is proposed. Firstly, relying on specific engineering examples and the construction characteristics of deep foundation pits, an evaluation index system based on the five factors of “human, management, technology, material and environment” is established. Secondly, the C-OWA operator is introduced to determine the evaluation index weights and weaken the negative influence of expert subjective preference. The gray cluster analysis and fuzzy comprehensive evaluation method are combined to construct the construction risk assessment model of the deep foundation pit, which can effectively handle the uncertainties involved. Finally, the model is applied to the actual deep foundation pit project of Qingdao Metro North Station; its construction risk rating is determined to be “medium”, which shows the model to be feasible and reasonable. Corresponding control measures are then put forward, providing a useful reference.
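One common formulation of the C-OWA operator sorts the expert scores in descending order and weights position j by the binomial coefficient C(n-1, j) / 2^(n-1); the sketch below assumes that formulation (an assumption about the paper's exact operator) and uses hypothetical expert ratings.

```python
from math import comb

def c_owa(scores):
    """Aggregate expert scores with a C-OWA operator: sort descending and
    weight position j by C(n-1, j) / 2**(n-1). The weights sum to 1 and
    down-weight the extreme (highest and lowest) opinions."""
    ordered = sorted(scores, reverse=True)
    n = len(ordered)
    weights = [comb(n - 1, j) / 2 ** (n - 1) for j in range(n)]
    return sum(w * s for w, s in zip(weights, ordered))

# Five hypothetical expert ratings of one evaluation index on a 0-10 scale.
print(c_owa([9, 7, 8, 6, 8]))  # -> 7.6875
```

Index weights are then typically obtained by normalizing the aggregated scores across all indices, before feeding them into the gray cluster and fuzzy comprehensive evaluation.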
Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.
Damnjanovic, Ivan; Aslan, Zafer; Mander, John
2010-12-01
In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to effectively restore the operations within the acceptable timescales. Traditionally, the insurance sector provides the coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators including spread and bond rating. The developed framework is based on a four-step structural loss model and transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
Senarathna, S.M.D.K. Ganga; Ranganathan, Shalini S.; Buckley, Nick; Soysa, S.S.S.B.D. Preethi; Fernandopulle, B. M. Rohini
2012-01-01
Objectives: Acute paracetamol poisoning is an emerging problem in Sri Lanka. Management guidelines recommend using the ingested dose and serum paracetamol concentrations to assess the risk. Our aim was to determine the usefulness of the patient's history of an ingested dose of >150 mg/kg and the paracetamol concentration obtained by a simple colorimetric method to assess risk in patients with acute paracetamol poisoning. Materials and Methods: Serum paracetamol concentrations were determined in 100 patients with a history of paracetamol overdose using High Performance Liquid Chromatography (HPLC) (reference method). The results were compared to those obtained with a colorimetric method. The utility of risk assessment by the reported ingested dose and by colorimetric analysis was compared. Results: The area under the receiver operating characteristic curve for the history of ingested dose was 0.578 and there was no dose cut-off providing useful risk categorization. Both analytical methods had less than 5% intra- and inter-batch variation and were accurate on spiked samples. The time from blood collection to result was six times shorter and the cost ten times lower for colorimetry (30 minutes, US$2) than for HPLC (180 minutes, US$20). The correlation coefficient between the paracetamol levels by the two methods was 0.85. The agreement on clinical risk categorization on the standard nomogram was also good (Kappa = 0.62, sensitivity 81%, specificity 89%). Conclusions: History of dose ingested alone greatly over-estimated the number of patients who need antidotes and was a poor predictor of risk. Paracetamol concentrations by colorimetry are rapid and inexpensive. Their use would greatly improve the assessment of risk and greatly reduce unnecessary expenditure on antidotes. PMID:23087506
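The agreement statistics quoted above (sensitivity, specificity, Cohen's kappa) all derive from a 2x2 risk-category table, as sketched below; the counts are hypothetical, not the study's raw data.

```python
def diagnostics(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2 table comparing
    a test method's risk categorization against the reference method."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                     # true positives among reference-positives
    spec = tn / (tn + fp)                     # true negatives among reference-negatives
    po = (tp + tn) / n                        # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

# Hypothetical counts for 100 patients (colorimetry vs. HPLC nomogram category).
sens, spec, kappa = diagnostics(tp=34, fp=6, fn=8, tn=52)
```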
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability.
The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
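The Monte Carlo step of such an MCDA model can be sketched minimally as follows. The criteria, uncertainty ranges, and weights are invented for illustration, and the paper's utility curves are simplified here to uniform draws on a common 0-1 scale.

```python
import random

def mc_mcda_median(criteria, weights, n_draws=10_000, seed=42):
    """Monte Carlo weighted-sum MCDA: each criterion is an uncertain utility
    given as a (low, high) range on a 0-1 scale, sampled uniformly; returns
    the median of the weighted-sum scores across draws."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    totals = sorted(
        sum(w * rng.uniform(lo, hi) for w, (lo, hi) in zip(weights, criteria))
        for _ in range(n_draws)
    )
    return totals[len(totals) // 2]

# Hypothetical CCS storage-site option with environmental, social, and
# economic criteria (names and numbers are illustrative only).
site = [(0.4, 0.8), (0.3, 0.6), (0.5, 0.9)]
w = [0.5, 0.2, 0.3]
median_score = mc_mcda_median(site, w)
```

Comparing the full score distributions of two options (rather than point estimates) is what lets decisionmakers see the risk tradeoffs the article describes; a sensitivity analysis would repeat this while perturbing the weights.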
Risk Assessment for Stonecutting Enterprises
NASA Astrophysics Data System (ADS)
Aleksandrova, A. J.; Timofeeva, S. S.
2017-04-01
Working conditions at enterprises and artisanal workshops for the processing of jewelry and ornamental stones were considered. The main stages of the technological process for processing of stone raw materials were shown; dangerous processes in the extraction of stone and its processing were identified. The characteristic of harmful and dangerous production factors affecting stonecutters is given. It was revealed that the most dangerous are the increased levels of noise and vibration, as well as chemical reagents. The results of a special assessment of the working conditions of stone-cutting plant workers are studied. Professions with high professional risk were identified; an analysis of occupational risks and occupational injuries was carried out. Risk assessment was performed by several methods; professions with high and medium risk indicators were identified from the results of the evaluation. The application of risk assessment methods made it possible to justify rational measures that reduce risks to the lowest possible level. The quantitative indicators of risk obtained for workers of the stone-cutting enterprises are the result of this work.
Physical activity level and fall risk among community-dwelling older adults.
Low, Sok Teng; Balaraman, Thirumalaya
2017-07-01
[Purpose] To find the physical activity level and fall risk among community-dwelling Malaysian older adults and determine the correlation between them. [Subjects and Methods] A cross-sectional study was conducted in which the physical activity level was evaluated using the Rapid Assessment of Physical Activity questionnaire and fall risk with the Fall Risk Assessment Tool. The 132 community-dwelling Malaysian older adults were recruited using the convenience sampling method. [Results] The majority of the participants were under the category of under-active regular light-activities and most of them reported low fall risk. The statistical analysis using Fisher's exact test did not show a significant correlation between physical activity level and fall risk. [Conclusion] The majority of community-dwelling Malaysian older adults perform some form of physical activity and are in the low fall risk category. However, this study did not find any significant correlation between physical activity level and fall risk among community-dwelling older adults in Malaysia.
[Does clinical risk management require a structured conflict management?].
Neumann, Stefan
2015-01-01
A key element of clinical risk management is the analysis of errors causing near misses or patient damage. After analyzing the causes and circumstances, measures for process improvement have to be taken. Process management, human resource development and other established methods are used. If an interpersonal conflict is a contributory factor to the error, there is usually no structured conflict management available which includes selection criteria for various methods of conflict processing. The European University Viadrina in Frankfurt (Oder) has created a process model for introducing a structured conflict management system which is suitable for hospitals and could fill the gap in the methodological spectrum of clinical risk management. There is initial evidence that a structured conflict management reduces staff fluctuation and hidden conflict costs. This article should be understood as an impulse for discussion on to what extent the range of methods of clinical risk management should be complemented by conflict management.
New Methods for the Analysis of Heartbeat Behavior in Risk Stratification
Glass, Leon; Lerma, Claudia; Shrier, Alvin
2011-01-01
Developing better methods for risk stratification for tachyarrhythmic sudden cardiac death remains a major challenge for physicians and scientists. Since the transition from sinus rhythm to ventricular tachycardia/fibrillation happens by different mechanisms in different people, it is unrealistic to think that a single measure will be adequate to provide a good index for risk stratification. We analyze the dynamical properties of ventricular premature complexes over 24 h in an effort to understand the underlying mechanisms of ventricular arrhythmias and to better understand the arrhythmias that occur in individual patients. Two-dimensional density plots, called heartprints, correlate characteristic features of the dynamics of premature ventricular complexes and the sinus rate. Heartprints show distinctive characteristics in individual patients. Based on a better understanding of the nature of transitions from sinus rhythm to sudden cardiac death and the mechanisms of arrhythmia prior to cardiac arrest, it should be possible to develop better methods for risk stratification. PMID:22144963
Zhang, Jie; Chen, Yuewen; Shao, Yong; Wu, Qi; Guan, Ming; Zhang, Wei; Wan, Jun; Yu, Bo
2012-01-01
Background. TNFα-induced protein 3 (TNFAIP3) interacting with protein 1 (TNIP1) acts as a negative regulator of NF-κB and plays an important role in maintaining the homeostasis of the immune system. A recent genome-wide association study (GWAS) showed that a polymorphism of TNIP1 was associated with the disease risk of SLE in Caucasians. In this study, we investigated whether the association of TNIP1 with SLE was replicated in the Chinese population. Methods. The association of TNIP1 SNP rs7708392 (G/C) was determined by high resolution melting (HRM) analysis with an unlabeled probe in 285 SLE patients and 336 healthy controls. Results. A new SNP, rs79937737, located 5 bp upstream of rs7708392, was discovered during the HRM analysis. No association of rs7708392 or rs79937737 with the disease risk of SLE was found. Furthermore, rs7708392 and rs79937737 were in weak linkage disequilibrium (LD). Haplotype analysis of the two SNPs also showed no association with SLE in the Chinese population. Conclusions. High resolution melting analysis with unlabeled probes proves to be a powerful and efficient genotyping method for identifying and screening SNPs. No association of rs7708392 or rs79937737 with the disease risk of SLE was observed in the Chinese population. PMID:22852072
Wu, Lei; Sun, Dali
2017-03-22
Previous systematic reviews and meta-analyses have evaluated the association of dairy consumption and the risk of cardiovascular disease (CVD). However, the findings were inconsistent. No quantitative analysis has specifically assessed the effect of yogurt intake on the incident risk of CVD. We searched the PubMed and the Embase databases from inception to 10 January 2017. A generic inverse-variance method was used to pool the fully-adjusted relative risks (RRs) and the corresponding 95% confidence intervals (CIs) with a random-effects model. A generalized least squares trend estimation model was used to calculate the specific slopes in the dose-response analysis. The present systematic review and meta-analysis identified nine prospective cohort articles involving a total of 291,236 participants. Compared with the lowest category, the highest category of yogurt consumption was not significantly related to the incident risk of CVD; the RR (95% CI) was 1.01 (0.95, 1.08) with evidence of significant heterogeneity (I² = 52%). However, intake of ≥200 g/day yogurt was significantly associated with a lower risk of CVD in the subgroup analysis. There was a trend that a higher level of yogurt consumption was associated with a lower incident risk of CVD in the dose-response analysis. A daily dose of ≥200 g yogurt intake might be associated with a lower incident risk of CVD. Further cohort studies and randomized controlled trials are still needed to establish and confirm the observed association in populations with different characteristics.
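The generic inverse-variance, random-effects pooling used here (and in several of the meta-analyses below) can be sketched as follows. The DerSimonian-Laird estimator of between-study variance is assumed, and the input estimates are hypothetical, not the cohorts actually pooled in the review.

```python
import math

def pool_random_effects(rrs, cis):
    """Random-effects pooling of relative risks by the generic inverse-variance
    method. Standard errors are recovered from the 95% CIs on the log scale;
    between-study variance tau^2 uses the DerSimonian-Laird estimator."""
    logs = [math.log(rr) for rr in rrs]
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / se ** 2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, logs)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, logs))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(rrs) - 1)) / c)
    w_re = [1 / (se ** 2 + tau2) for se in ses]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, logs)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se_pooled),
                              math.exp(pooled + 1.96 * se_pooled))

# Three hypothetical cohort estimates: RRs with their 95% CIs.
rr, ci = pool_random_effects([0.95, 1.10, 1.02],
                             [(0.85, 1.06), (0.98, 1.23), (0.90, 1.16)])
```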
The effects of napping on the risk of hypertension: a systematic review and meta-analysis.
Cheungpasitporn, Wisit; Thongprayoon, Charat; Srivali, Narat; Vijayvargiya, Priya; Andersen, Carl A; Kittanamongkolchai, Wonngarm; Sathick, Insara J Jaffer; Caples, Sean M; Erickson, Stephen B
2016-11-01
The risk of hypertension in adults who regularly take a nap is controversial. The objective of this meta-analysis was to assess the associations between napping and hypertension. A literature search was performed using MEDLINE, EMbase and The Cochrane Database of Systematic Reviews from inception through October 2015. Studies that reported relative risks, odds ratios or hazard ratios comparing the risk of hypertension in individuals who regularly take naps were included. Pooled risk ratios (RR) and 95% confidence intervals (CI) were calculated using a random-effect, generic inverse variance method. Nine observational studies with 112,267 individuals were included in the analysis to assess the risk of hypertension in nappers. The pooled RR of hypertension in nappers was 1.13 with 95% CI (0.98 to 1.30). When the meta-analysis was limited to studies assessing the risk of hypertension in daytime nappers, the pooled RR of hypertension was 1.19 with 95% CI (1.06 to 1.35). The data on the association between nighttime napping in individuals who work night shifts and hypertension were limited; only one observational study reported a reduced risk of hypertension in nighttime nappers, with an odds ratio of 0.79 with 95% CI (0.63 to 1.00). Our meta-analysis demonstrates a significant association between daytime napping and hypertension. Future study is needed to assess the potential benefits of hypertension screening for daytime nappers. © 2016 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
A Western Dietary Pattern Increases Prostate Cancer Risk: A Systematic Review and Meta-Analysis.
Fabiani, Roberto; Minelli, Liliana; Bertarelli, Gaia; Bacci, Silvia
2016-10-12
Dietary patterns were recently applied to examine the relationship between eating habits and prostate cancer (PC) risk. While the associations between PC risk with the glycemic index and Mediterranean score have been reviewed, no meta-analysis is currently available on dietary patterns defined by "a posteriori" methods. A literature search was carried out (PubMed, Web of Science) to identify studies reporting the relationship between dietary patterns and PC risk. Relevant dietary patterns were selected and the risks estimated were calculated by a random-effect model. Multivariable-adjusted odds ratios (ORs), for a first-percentile increase in dietary pattern score, were combined by a dose-response meta-analysis. Twelve observational studies were included in the meta-analysis which identified a "Healthy pattern" and a "Western pattern". The Healthy pattern was not related to PC risk (OR = 0.96; 95% confidence interval (CI): 0.88-1.04) while the Western pattern significantly increased it (OR = 1.34; 95% CI: 1.08-1.65). In addition, the "Carbohydrate pattern", which was analyzed in four articles, was positively associated with a higher PC risk (OR = 1.64; 95% CI: 1.35-2.00). A significant linear trend between the Western ( p = 0.011) pattern, the Carbohydrate ( p = 0.005) pattern, and the increment of PC risk was observed. The small number of studies included in the meta-analysis suggests that further investigation is necessary to support these findings.
The East London glaucoma prediction score: web-based validation of glaucoma risk screening tool
Stephen, Cook; Benjamin, Longo-Mbenza
2013-01-01
AIM It is difficult for optometrists and general practitioners to know which patients are at risk. The East London glaucoma prediction score (ELGPS) is a web-based risk calculator that has been developed to determine glaucoma risk at the time of screening. Multiple risk factors that are available in a low-tech environment are assessed to provide a risk assessment. This is extremely useful in settings where access to specialist care is difficult. Use of the calculator is educational. It is a free web-based service. Data capture is user specific. METHOD The scoring system is a web-based questionnaire that captures and subsequently calculates the relative risk for the presence of glaucoma at the time of screening. Three categories of patient are described: unlikely to have glaucoma; glaucoma suspect; and glaucoma. A case review methodology of patients with a known diagnosis is employed to validate the calculator risk assessment. RESULTS Data from the records of 400 patients with an established diagnosis have been captured and used to validate the screening tool. The website reports that the calculated diagnosis correlates with the actual diagnosis 82% of the time. Biostatistical analysis showed: sensitivity = 88%; positive predictive value = 97%; specificity = 75%. CONCLUSION Analysis of the first 400 patients validates the web-based screening tool as a good method of screening for the at-risk population. The validation is ongoing. The web-based format will allow more widespread recruitment across different geographic, population and personnel variables. PMID:23550097
Markov chains and semi-Markov models in time-to-event analysis.
Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J
2013-10-25
A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
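A minimal illustration of a discrete-time Markov chain with an absorbing state and a competing risk: the three-state illness-death model below is a generic textbook example, not drawn from the article.

```python
def step(dist, P):
    """One transition of a discrete-time Markov chain: new distribution
    is the row vector dist multiplied by the transition matrix P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state illness-death model: healthy, ill, dead.
# Each row sums to 1; death competes with illness as an outcome,
# and the dead state is absorbing (informative censoring is avoided
# because death is modeled explicitly rather than censored).
P = [
    [0.90, 0.07, 0.03],  # healthy -> healthy / ill / dead
    [0.00, 0.85, 0.15],  # ill     -> ill / dead (no recovery in this sketch)
    [0.00, 0.00, 1.00],  # dead    -> dead
]
dist = [1.0, 0.0, 0.0]   # everyone starts healthy
for _ in range(10):      # ten follow-up intervals
    dist = step(dist, P)
# dist now holds the state-occupation probabilities at interval 10.
```

A semi-Markov extension would let the transition probabilities depend on the time already spent in the current state, which is how non-constant survival probabilities are accommodated.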
NASA Technical Reports Server (NTRS)
Carreno, Victor
2006-01-01
This document describes a method to demonstrate that a UAS, operating in the NAS, can avoid collisions with an equivalent level of safety compared to a manned aircraft. The method is based on the calculation of a collision probability for a UAS, the calculation of a collision probability for a baseline manned aircraft, and the calculation of a risk ratio given by: Risk Ratio = P(collision_UAS)/P(collision_manned). A UAS will achieve an equivalent level of safety for collision risk if the Risk Ratio is less than or equal to one. Calculation of the probability of collision for UAS and manned aircraft is accomplished through event/fault trees.
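The risk-ratio criterion, with basic-event probabilities combined through simple fault-tree gates, can be sketched as follows. All probabilities and event names are invented for illustration, and independence of basic events is assumed.

```python
def p_or(*ps):
    """Top-event probability for an OR gate over independent basic events:
    1 minus the probability that none of them occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """AND gate over independent basic events: all must occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Hypothetical fault tree for the UAS: collision requires either
# (command-link loss AND avoidance failure) OR some other failure path.
p_uas = p_or(p_and(1e-3, 1e-2), 5e-6)
p_manned = 2e-5                       # hypothetical baseline manned aircraft
risk_ratio = p_uas / p_manned
# Equivalent level of safety is claimed when risk_ratio <= 1.
```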
Health Risk Behavior in Foster Youth
Gramkowski, Bridget; Kools, Susan; Paul, Steven; Boyer, Cherrie; Monasterio, Erica; Robbins, Nancy
2010-01-01
Problem: Adolescent health problems are predominantly caused by risk behavior. Foster adolescents have disproportionately poor health; therefore, identification of risk behavior is critical. Method: A secondary analysis of data from a larger study investigated the health risk behavior of 56 foster youth using the CHIP-AE. Findings: Foster youth had some increased risk behavior. Younger adolescents and those in kinship care had less risky behavior. Youth had more risk behavior when they were in group homes or had a history of parental death, physical or emotional abuse, or attempted suicide. Conclusions: These results point to areas of strength and vulnerability in foster youth. PMID:19490278
MacBride-Stewart, Sean; Marwick, Charis; Houston, Neil; Watt, Iain; Patton, Andrea; Guthrie, Bruce
2017-01-01
Background It is uncertain whether improvements in primary care high-risk prescribing seen in research trials can be realised in the real-world setting. Aim To evaluate the impact of a 1-year system-wide phase IV prescribing safety improvement initiative, which included education, feedback, support to identify patients to review, and small financial incentives. Design and setting An interrupted time series analysis of targeted high-risk prescribing in all 56 general practices in NHS Forth Valley, Scotland, was performed. In 2013–2014, this focused on high-risk non-steroidal anti-inflammatory drugs (NSAIDs) in older people and NSAIDs with oral anticoagulants; in 2014–2015, it focused on antipsychotics in older people. Method The primary analysis used segmented regression analysis to estimate impact at the end of the intervention, and 12 months later. The secondary analysis used difference-in-difference methods to compare Forth Valley changes with those in NHS Greater Glasgow and Clyde (GGC). Results In the primary analysis, downward trends for all three NSAID measures that existed before the intervention steepened significantly following its implementation. At the end of the intervention period, 1221 fewer patients than expected were prescribed a high-risk NSAID. In contrast, antipsychotic prescribing in older people increased slowly over time, with no intervention-associated change. In the secondary analysis, reductions at the end of the intervention period in all three NSAID measures were statistically significantly greater in NHS Forth Valley than in NHS GGC, but only significantly greater for two of these measures 12 months after the intervention finished. Conclusion There were substantial and sustained reductions in the high-risk prescribing of NSAIDs, although with some waning of effect 12 months after the intervention ceased. The same intervention had no effect on antipsychotic prescribing in older people. PMID:28347986
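Segmented regression for an interrupted time series fits an intercept, a pre-intervention trend, a level change, and a trend change at the intervention date. A minimal sketch using simulated monthly data with assumed effect sizes (not the NHS Forth Valley figures):

```python
import numpy as np

# Simulated monthly high-risk prescribing rates: a pre-existing downward
# trend that drops in level and steepens after an intervention at month 24.
# All coefficients below are invented for illustration.
rng = np.random.default_rng(0)
t = np.arange(48.0)
post = (t >= 24).astype(float)            # indicator: after intervention
t_post = np.where(post == 1, t - 24, 0.0)  # months since intervention
true_rate = 30.0 - 0.10 * t - 2.0 * post - 0.25 * t_post
y = true_rate + rng.normal(0, 0.3, t.size)

# Design matrix: [intercept, baseline trend, level change, trend change]
X = np.column_stack([np.ones_like(t), t, post, t_post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, trend_change = beta[2], beta[3]
```

The fitted `level_change` and `trend_change` estimate the immediate drop and the steepening of the trend attributable to the intervention, which is the quantity the primary analysis reports.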
Yamamoto, Dorothy J.; Woo, Choong-Wan; Wager, Tor D.; Regner, Michael F.; Tanabe, Jody
2015-01-01
Background Alterations in frontal and striatal function are hypothesized to underlie risky decision-making in drug users, but how these regions interact to affect behavior is incompletely understood. We used mediation analysis to investigate how prefrontal cortex and ventral striatum together influence risk avoidance in abstinent drug users. Method Thirty-seven abstinent substance-dependent individuals (SDI) and 43 controls underwent fMRI while performing a decision-making task involving risk and reward. Analyses of a priori regions-of-interest tested whether activity in dorsolateral prefrontal cortex (DLPFC) and ventral striatum (VST) explained group differences in risk avoidance. Whole-brain analysis was conducted to identify brain regions influencing the negative VST-risk avoidance relationship. Results Right DLPFC (RDLPFC) positively mediated the group-risk avoidance relationship (p < 0.05); RDLPFC activity was higher in SDI and predicted higher risk avoidance across groups, controlling for SDI vs. controls. Conversely, VST activity negatively influenced risk avoidance (p < 0.05); it was higher in SDI, and predicted lower risk avoidance. Whole-brain analysis revealed that, across group, RDLPFC and left temporal-parietal junction positively (p ≤ 0.001) while right thalamus and left middle frontal gyrus negatively (p < 0.005) mediated the VST activity-risk avoidance relationship. Conclusion RDLPFC activity mediated less risky decision-making while VST mediated more risky decision-making across drug users and controls. These results suggest a dual pathway underlying decision-making, which, if imbalanced, may adversely influence choices involving risk. Modeling contributions of multiple brain systems to behavior through mediation analysis could lead to a better understanding of mechanisms of behavior and suggest neuromodulatory treatments for addiction. PMID:25736619
Haloacetic acids in drinking water and risk for stillbirth
King, W; Dodds, L; Allen, A; Armson, B; Fell, D; Nimrod, C
2005-01-01
Aims: To investigate the effects of haloacetic acid (HAA) compounds in drinking water on stillbirth risk. Methods: A population based case-control study was conducted in Nova Scotia and Eastern Ontario, Canada. Estimates of daily exposure to total and specific HAAs were based on household water samples and questionnaire information on water consumption at home and work. Results: The analysis included 112 stillbirth cases and 398 live birth controls. In analysis without adjustment for total THM exposure, a relative risk greater than 2 was observed for an intermediate exposure category for total HAA and dichloroacetic acid measures. After adjustment for total THM exposure, the risk estimates for intermediate exposure categories were diminished, the relative risk associated with the highest category was in the direction of a protective effect, and all confidence intervals included the null value. Conclusions: No association was observed between HAA exposures and stillbirth risk after controlling for THM exposures. PMID:15657195
Ergonomic assessment for the task of repairing computers in a manufacturing company: A case study.
Maldonado-Macías, Aidé; Realyvásquez, Arturo; Hernández, Juan Luis; García-Alcaraz, Jorge
2015-01-01
Manufacturing industry workers who repair computers may be exposed to ergonomic risk factors. This project analyzes the tasks involved in the computer repair process to (1) find the risk level for musculoskeletal disorders (MSDs) and (2) propose ergonomic interventions to address any ergonomic issues. Work procedures and main body postures were video recorded and analyzed using task analysis, the Rapid Entire Body Assessment (REBA) postural method, and biomechanical analysis. High risk for MSDs was found on every subtask using REBA. Although biomechanical analysis found an acceptable mass center displacement during tasks, a hazardous level of compression on the lower back during computer transportation was detected. This assessment found ergonomic risks mainly in the trunk, arm/forearm, and legs; the neck and hand/wrist were also compromised. Opportunities for ergonomic analyses and interventions in the design and execution of computer repair tasks are discussed.
[Sources and potential risk of heavy metals in roadside soils of Xi' an City].
Chen, Jing-hui; Lu, Xin-wei; Zhai, Meng
2011-07-01
Based on X-ray fluorescence spectroscopic measurement of heavy metal concentrations in roadside soil samples from Xi'an City, and using principal component analysis, cluster analysis, and correlation analysis, this paper investigated the possible sources of heavy metals in the roadside soils of the city. In addition, the potential ecological risk index was used to assess the ecological risk of the heavy metals. In the roadside soils, the mean concentrations of Co, Cr, Cu, Mn, Ni, Pb, and Zn were higher than the Shaanxi soil background values. The As, Mn, and Ni in the roadside soils mainly came from natural and traffic sources, the Cu, Pb, and Zn mainly came from traffic sources, and the Co and Cr mainly came from industrial sources. These heavy metals in the roadside soils indicated a medium pollution level and posed a medium potential ecological risk.
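The potential ecological risk index combines a contamination factor per metal (concentration over background) with a toxic-response weight, then sums the per-metal risk factors into an overall index. A sketch under stated assumptions: the concentrations and background values below are illustrative, not the measured Xi'an data, and the toxic-response factors follow commonly used values:

```python
# Toxic-response factors commonly used with the potential ecological risk
# index (illustrative; verify against the specific study's choices).
TOXIC_RESPONSE = {"Pb": 5, "Cu": 5, "Ni": 5, "Zn": 1, "Cr": 2}

def ecological_risk(conc, background):
    """Per-metal risk factors Er_i = Tr_i * (C_i / C_background_i)
    and the overall risk index RI = sum(Er_i)."""
    er = {m: TOXIC_RESPONSE[m] * conc[m] / background[m] for m in conc}
    return er, sum(er.values())

# Hypothetical concentrations (mg/kg) and regional background values.
conc = {"Pb": 45.0, "Cu": 38.0, "Zn": 110.0, "Cr": 80.0}
background = {"Pb": 21.4, "Cu": 21.4, "Zn": 69.4, "Cr": 62.5}
er, ri = ecological_risk(conc, background)
```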
Case Study on Project Risk Management Planning Based on Soft System Methodology
NASA Astrophysics Data System (ADS)
Lifang, Xie; Jun, Li
This paper analyzed the soft-system characteristics of construction projects and the applicability of Soft System Methodology (SSM) to risk analysis after a brief review of SSM. Taking a hydropower project as an example, it constructed a general framework for project risk management planning (PRMP) and established a risk management planning (RMP) system from the perspective of coordinating stakeholder interests. Through the practice of SSM, the paper provides ideas and methods for constructing a win-win RMP for construction projects.
Campbell, M J
1983-01-01
I describe methods of analysing possible aetiological factors in a follow-up survey, all of which can be carried out using the statistical package GENSTAT. A high haemoglobin level carried a significantly increased risk of ischaemic heart disease, and a low one an increased risk of cancer. Smoking was also an important factor. The increased risk was reasonably constant over time. Sugar intake and Quetelet's index did not significantly affect the relative risk.
Association among Dietary Flavonoids, Flavonoid Subclasses and Ovarian Cancer Risk: A Meta-Analysis
You, Ruxu; Yang, Yu; Liao, Jing; Chen, Dongsheng; Yu, Lixiu
2016-01-01
Background Previous studies have indicated that intake of dietary flavonoids or flavonoid subclasses is associated with the ovarian cancer risk, but presented controversial results. Therefore, we conducted a meta-analysis to derive a more precise estimation of these associations. Methods We performed a search in PubMed, Google Scholar and ISI Web of Science from their inception to April 25, 2015 to select studies on the association among dietary flavonoids, flavonoid subclasses and ovarian cancer risk. The information was extracted by two independent authors. We assessed the heterogeneity, sensitivity, publication bias and quality of the articles. A random-effects model was used to calculate the pooled risk estimates. Results Five cohort studies and seven case-control studies were included in the final meta-analysis. We observed that intake of dietary flavonoids was associated with decreased ovarian cancer risk (pooled RR = 0.82, 95% CI = 0.68–0.98). In a subgroup analysis by flavonoid subtypes, the ovarian cancer risk was also decreased for isoflavones (RR = 0.67, 95% CI = 0.50–0.92) and flavonols (RR = 0.68, 95% CI = 0.58–0.80). There was no compelling evidence that consumption of flavones (RR = 0.86, 95% CI = 0.71–1.03) decreased ovarian cancer risk, which partly explained the sources of heterogeneity. The sensitivity analysis indicated stable results, and no publication bias was observed based on the results of Funnel plot analysis and Egger’s test (p = 0.26). Conclusions This meta-analysis suggested that consumption of dietary flavonoids and the subtypes isoflavones and flavonols has a protective effect against ovarian cancer, whereas no such effect was evident for flavones. Nevertheless, further investigations on a larger population covering more flavonoid subclasses are warranted. PMID:26960146
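Random-effects pooling of relative risks is commonly done with the DerSimonian-Laird estimator: fixed-effect weights give a between-study variance estimate, which then inflates each study's variance before re-weighting. A minimal sketch on the log-RR scale, with made-up study estimates rather than the twelve included studies:

```python
import math

def dersimonian_laird(log_rr, se):
    """Pool log relative risks under a DerSimonian-Laird random-effects model.
    Returns (pooled RR, lower 95% CI, upper 95% CI) on the RR scale."""
    w = [1 / s**2 for s in se]                           # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)         # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]               # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Illustrative study-level log-RRs and standard errors (invented numbers).
rr, lo, hi = dersimonian_laird([-0.25, -0.10, -0.35, 0.05],
                               [0.12, 0.15, 0.10, 0.10])
```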
2014-01-01
Background Physical activity has been inversely associated with risk of several cancers. We performed a systematic review and meta-analysis to evaluate the association between physical activity and risk of esophageal cancer (esophageal adenocarcinoma [EAC] and/or esophageal squamous cell carcinoma [ESCC]). Methods We conducted a comprehensive search of bibliographic databases and conference proceedings from inception through February 2013 for observational studies that examined associations between recreational and/or occupational physical activity and esophageal cancer risk. Summary adjusted odds ratio (OR) estimates with 95% confidence intervals (CI) were estimated using the random-effects model. Results The analysis included 9 studies (4 cohort, 5 case–control) reporting 1,871 cases of esophageal cancer among 1,381,844 patients. Meta-analysis demonstrated that the risk of esophageal cancer was 29% lower among the most physically active compared to the least physically active subjects (OR, 0.71; 95% CI, 0.57-0.89), with moderate heterogeneity (I2 = 47%). On histology-specific analysis, physical activity was associated with a 32% decreased risk of EAC (4 studies, 503 cases of EAC; OR, 0.68; 95% CI, 0.55-0.85) with minimal heterogeneity (I2 = 0%). There were only 3 studies reporting the association between physical activity and risk of ESCC with conflicting results, and the meta-analysis demonstrated a null association (OR, 1.10; 95% CI, 0.21-5.64). The results were consistent across study design, geographic location and study quality, with a non-significant trend towards a dose–response relationship. Conclusions Meta-analysis of published observational studies indicates that physical activity may be associated with reduced risk of esophageal adenocarcinoma. Lifestyle interventions focusing on increasing physical activity may decrease the global burden of EAC. PMID:24886123
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factors in attaining a high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
Train integrity detection risk analysis based on PRISM
NASA Astrophysics Data System (ADS)
Wen, Yuan
2018-04-01
GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method of train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
Interleukin-10 gene polymorphisms and hepatocellular carcinoma susceptibility: A meta-analysis
Wei, Yong-Gang; Liu, Fei; Li, Bo; Chen, Xi; Ma, Yu; Yan, Lv-Nan; Wen, Tian-Fu; Xu, Ming-Qing; Wang, Wen-Tao; Yang, Jia-Yin
2011-01-01
AIM: To assess the association between Interleukin-10 (IL-10) gene IL-10-1082 (G/A), IL-10-592(C/A), IL-10-819 (T/C) polymorphisms and hepatocellular carcinoma (HCC) susceptibility. METHODS: Two investigators independently searched the Medline, Embase, China National Knowledge Infrastructure, and Chinese Biomedicine Database. Summary odds ratios (ORs) and 95% confidence intervals (95% CIs) for IL-10 polymorphisms and HCC were calculated in a fixed-effects model (the Mantel-Haenszel method) and a random-effects model (the DerSimonian and Laird method) when appropriate. RESULTS: This meta-analysis included seven eligible studies, which included 1012 HCC cases and 2308 controls. Overall, IL-10-1082 G/A polymorphism was not associated with the risk of HCC (AA vs AG + GG, OR = 1.11, 95% CI = 0.90-1.37). When stratifying for ethnicity, the results were similar (Asian, OR = 1.12, 95% CI = 0.87-1.44; non-Asian, OR = 1.10, 95% CI = 0.75-1.60). In the overall analysis, the IL-10 polymorphism at position -592 (C/A) was identified as a genetic risk factor for HCC among Asians; patients carrying the IL-10-592*C allele had an increased risk of HCC (OR = 1.29, 95% CI = 1.12-1.49). No association was observed between the IL-10-819 T/C polymorphism and HCC susceptibility (TT vs TC + CC, OR = 1.02, 95% CI = 0.79-1.32). CONCLUSION: This meta-analysis suggests that IL-10-592 A/C polymorphism may be associated with HCC among Asians. IL-10-1082 G/A and IL-10-819 T/C polymorphisms were not detected to be related to the risk for HCC. PMID:22025883
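The fixed-effects pooling named in the Methods is the Mantel-Haenszel estimator, which pools odds ratios directly from stratified 2x2 tables. A sketch with invented counts (not the seven included studies):

```python
# Mantel-Haenszel pooled odds ratio from stratified 2x2 tables.
# Each table is (a, b, c, d) = (exposed cases, exposed controls,
# unexposed cases, unexposed controls); counts below are illustrative.
def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

tables = [
    (30, 25, 70, 75),
    (45, 40, 55, 60),
    (20, 15, 80, 85),
]
or_mh = mantel_haenszel_or(tables)
```

Pooling within strata before dividing keeps each study's pairing intact, which is why this estimator is preferred over collapsing all tables into one.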
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities' which addressed fire risk for at power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods.
This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Liu, Renzhi; Liu, Jing; Zhang, Zhijiao; Borthwick, Alistair; Zhang, Ke
2015-12-02
Over the past half century, a surprising number of major pollution incidents occurred due to tailings dam failures. Most previous studies of such incidents comprised forensic analyses of environmental impacts after a tailings dam failure, with few considering the combined pollution risk before incidents occur at a watershed-scale. We therefore propose Watershed-scale Tailings-pond Pollution Risk Analysis (WTPRA), designed for multiple mine tailings ponds, stemming from previous watershed-scale accidental pollution risk assessments. Transferred and combined risk is embedded using risk rankings of multiple routes of the "source-pathway-target" in the WTPRA. The previous approach is modified using multi-criteria analysis, dam failure models, and instantaneous water quality models, which are modified for application to multiple tailings ponds. The study area covers the basin of Gutanting Reservoir (the largest backup drinking water source for Beijing) in Zhangjiakou City, where many mine tailings ponds are located. The resultant map shows that risk is higher downstream of Gutanting Reservoir and in its two tributary basins (i.e., Qingshui River and Longyang River). Conversely, risk is lower in the midstream and upstream reaches. The analysis also indicates that the most hazardous mine tailings ponds are located in Chongli and Xuanhua, and that Guanting Reservoir is the most vulnerable receptor. Sensitivity and uncertainty analyses are performed to validate the robustness of the WTPRA method.
Kwon, Younghoon; Koene, Ryan J.; Kwon, Osung; Kealhofer, Jessica V.; Adabag, Selcuk; Duval, Sue
2017-01-01
Background Patients with heart failure and reduced ejection fraction are at increased risk of malignant ventricular arrhythmias. Implantable cardioverter-defibrillator (ICD) is recommended to prevent sudden cardiac death in some of these patients. Sleep-disordered breathing (SDB) is highly prevalent in this population and may impact arrhythmogenicity. We performed a systematic review and meta-analysis of prospective studies that assessed the impact of SDB on ICD therapy. Methods and Results Relevant prospective studies were identified in the Ovid MEDLINE, EMBASE, and Google Scholar databases. Weighted risk ratios of the association between SDB and appropriate ICD therapies were estimated using random effects meta-analysis. Nine prospective cohort studies (n=1274) were included in this analysis. SDB was present in 52% of the participants. SDB was associated with a 55% higher risk of appropriate ICD therapies (45% versus 28%; risk ratio, 1.55; 95% confidence interval, 1.32–1.83). In a subgroup analysis based on the subtypes of SDB, the risk was higher in both central (risk ratio, 1.50; 95% confidence interval, 1.11–2.02) and obstructive (risk ratio, 1.43; 95% confidence interval, 1.01–2.03) sleep apnea. Conclusions SDB is associated with an increased risk of appropriate ICD therapy in patients with heart failure and reduced ejection fraction. PMID:28213507
Jaberidoost, Mona; Olfat, Laya; Hosseini, Alireza; Kebriaeezadeh, Abbas; Abdollahi, Mohammad; Alaeddini, Mahdi; Dinarvand, Rassoul
2015-01-01
Pharmaceutical supply chain is a significant component of the health system in supplying medicines, particularly in countries where main drugs are provided by local pharmaceutical companies. No previous studies exist assessing risks and disruptions in pharmaceutical companies while assessing the pharmaceutical supply chain. Any risks affecting the pharmaceutical companies could disrupt the supply of medicines and health system efficiency. The goal of this study was to assess risks in the pharmaceutical industry in Iran, considering process priority and the hazard and probability of risks. The study was carried out in 4 phases: risk identification through literature review, risk identification in Iranian pharmaceutical companies through interviews with experts, risk analysis through a questionnaire and consultation with experts using the group analytic hierarchy process (AHP) method and a rating scale (RS), and risk evaluation using the simple additive weighting (SAW) method. In total, 86 main risks were identified in the pharmaceutical supply chain from the perspective of pharmaceutical companies, classified into 11 classes. The majority of risks described in this study were related to the financial and economic category, and financial management was found to be the most important factor for consideration. Although the pharmaceutical industry and supply chain were affected by the political conditions in Iran during the study period, half of the total risks in the pharmaceutical supply chain were found to be internal risks that companies could address internally. Likewise, the political situation and related risks forced companies to focus more on financial and supply management, resulting in less attention to quality management.
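The AHP-then-SAW pipeline named in the study can be sketched in a few lines: AHP derives criterion weights from a pairwise comparison matrix (here via the common geometric-mean approximation rather than the full eigenvector method), then SAW scores each risk as a weighted sum of normalized ratings. The three criteria, the comparison matrix, and the ratings below are hypothetical placeholders:

```python
import numpy as np

# Hypothetical pairwise comparison of 3 criteria (e.g. financial, supply,
# quality importance); entry [i, j] says how much criterion i outweighs j.
pairwise = np.array([
    [1.0,   3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# AHP weights via the geometric-mean (row) approximation.
geo = pairwise.prod(axis=1) ** (1 / pairwise.shape[1])
weights = geo / geo.sum()

# SAW: normalize ratings to each criterion's maximum, then weighted-sum.
ratings = np.array([
    [9.0, 4.0, 6.0],   # risk A scored against the 3 criteria
    [5.0, 8.0, 3.0],   # risk B
])
normalized = ratings / ratings.max(axis=0)
scores = normalized @ weights
ranking = np.argsort(scores)[::-1]   # highest-priority risk first
```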
Hegde, Shalika; Hoban, Elizabeth; Nevill, Annemarie
2012-11-01
Reproductive health research and policies in Cambodia focus on safe motherhood programs particularly for married women, ignoring comprehensive fertility regulation programs for unmarried migrant women of reproductive age. Maternal mortality risks arising due to unsafe abortion methods practiced by unmarried Cambodian women, across the Thai-Cambodia border, can be considered as a public health emergency. Since Thailand has restrictive abortion laws, Cambodian migrant women who have irregular migration status in Thailand experimented with unsafe abortion methods that allowed them to terminate their pregnancies surreptitiously. Unmarried migrant women choose abortion as a preferred birth control method seeking repeat "unsafe" abortions instead of preventing conception. Drawing on the data collected through surveys, in-depth interviews, and document analysis in Chup Commune (pseudonym), Phnom Penh, and Bangkok, the authors describe the public health dimensions of maternal mortality risks faced by unmarried Cambodian migrant women due to various unsafe abortion methods employed as birth control methods.
Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors
NASA Astrophysics Data System (ADS)
Gheorghiu, A.-D.; Ozunu, A.
2012-04-01
The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step. 
The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all installations and sections in the detailed risk assessment, which can be time and resource consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn related to the overall risk characteristics of the site. The proposed methodology can as such be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of Critical Infrastructure, mainly the sub-sectors oil and gas. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
An evaluation of the treatment of risk and uncertainties in the IPCC reports on climate change.
Aven, Terje; Renn, Ortwin
2015-04-01
Few global threats rival global climate change in scale and potential consequence. The principal international authority assessing climate risk is the Intergovernmental Panel on Climate Change (IPCC). Through repeated assessments the IPCC has devoted considerable effort and interdisciplinary competence to articulating a common characterization of climate risk and uncertainties. We have reviewed the assessment and its foundation for the Fifth Assessment Reports published in 2013 and 2014, in particular the guidance note for lead authors of the fifth IPCC assessment report on consistent treatment of uncertainties. Our analysis shows that the work carried out by the IPCC falls short of providing a theoretically and conceptually convincing foundation on the treatment of risk and uncertainties. The main reasons for our assessment are: (i) the concept of risk is given a too narrow definition (a function of consequences and probability/likelihood); and (ii) the reports lack precision in delineating their concepts and methods. The goal of this article is to contribute to improving the handling of uncertainty and risk in future IPCC studies, thereby obtaining a more theoretically substantiated characterization as well as enhanced scientific quality for risk analysis in this area. Several suggestions for how to improve the risk and uncertainty treatment are provided. © 2014 Society for Risk Analysis.
A simple prognostic model for overall survival in metastatic renal cell carcinoma
Assi, Hazem I.; Patenaude, Francois; Toumishey, Ethan; Ross, Laura; Abdelsalam, Mahmoud; Reiman, Tony
2016-01-01
Introduction: The primary purpose of this study was to develop a simpler prognostic model to predict overall survival for patients treated for metastatic renal cell carcinoma (mRCC) by examining variables shown in the literature to be associated with survival. Methods: We conducted a retrospective analysis of patients treated for mRCC at two Canadian centres. All patients who started first-line treatment were included in the analysis. A multivariate Cox proportional hazards regression model was constructed using a stepwise procedure. Patients were assigned to risk groups depending on how many of the three risk factors from the final multivariate model they had. Results: There were three risk factors in the final multivariate model: hemoglobin, prior nephrectomy, and time from diagnosis to treatment. Patients in the high-risk group (two or three risk factors) had a median survival of 5.9 months, while those in the intermediate-risk group (one risk factor) had a median survival of 16.2 months, and those in the low-risk group (no risk factors) had a median survival of 50.6 months. Conclusions: In multivariate analysis, shorter survival times were associated with hemoglobin below the lower limit of normal, absence of prior nephrectomy, and initiation of treatment within one year of diagnosis. PMID:27217858
Voss, Andreas; Fischer, Claudia; Schroeder, Rico; Figulla, Hans R; Goernig, Matthias
2012-07-01
The objectives of this study were to introduce a new type of heart-rate variability analysis improving risk stratification in patients with idiopathic dilated cardiomyopathy (DCM) and to provide additional information about impaired heart beat generation in these patients. Beat-to-beat intervals (BBI) of 30-min ECGs recorded from 91 DCM patients and 21 healthy subjects were analyzed applying the lagged segmented Poincaré plot analysis (LSPPA) method. LSPPA includes the Poincaré plot reconstruction with lags of 1-100, rotating the cloud of points, its normalized segmentation adapted to their standard deviations, and finally, a frequency-dependent clustering. The lags were combined into eight different clusters representing specific frequency bands within 0.012-1.153 Hz. Statistical differences between low- and high-risk DCM could be found within the clusters II-VIII (e.g., cluster IV: 0.033-0.038 Hz; p = 0.0002; sensitivity = 85.7 %; specificity = 71.4 %). The multivariate statistics led to a sensitivity of 92.9 %, specificity of 85.7 % and an area under the curve of 92.1 % discriminating these patient groups. We introduced the LSPPA method to investigate time correlations in BBI time series. We found that LSPPA contributes considerably to risk stratification in DCM and yields the highest discriminant power in the low and very low-frequency bands.
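The LSPPA preprocessing described above starts from a lagged Poincaré plot: each beat-to-beat interval is paired with the interval `lag` beats later, and the resulting cloud is rotated by 45° so its long axis aligns with the x-axis before segmentation. A minimal sketch of that first step, under the assumption of a simple rigid rotation (the clustering and segmentation are omitted):

```python
import math

def lagged_poincare(bbi, lag):
    """Build lag-m Poincare points (BBI_i, BBI_{i+lag}) and rotate the
    cloud of points by 45 degrees about the origin, as in the LSPPA
    preprocessing sketched from the abstract (illustrative only)."""
    pts = [(bbi[i], bbi[i + lag]) for i in range(len(bbi) - lag)]
    c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
    return [(c * x + s * y, -s * x + c * y) for x, y in pts]

# six intervals (ms) with lag 1 give five rotated points
pts = lagged_poincare([800, 810, 790, 805, 815, 795], 1)
```

After rotation, points on the identity line (equal successive intervals) land on the x-axis, which is what makes the standard-deviation-based segmentation meaningful.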
Multivariate Analysis and Machine Learning in Cerebral Palsy Research
Zhang, Jing
2017-01-01
Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP. PMID:29312134
Multivariate Analysis and Machine Learning in Cerebral Palsy Research.
Zhang, Jing
2017-01-01
Cerebral palsy (CP), a common pediatric movement disorder, causes the most severe physical disability in children. Early diagnosis in high-risk infants is critical for early intervention and possible early recovery. In recent years, multivariate analytic and machine learning (ML) approaches have been increasingly used in CP research. This paper aims to identify such multivariate studies and provide an overview of this relatively young field. Studies reviewed in this paper have demonstrated that multivariate analytic methods are useful in identification of risk factors, detection of CP, movement assessment for CP prediction, and outcome assessment, and ML approaches have made it possible to automatically identify movement impairments in high-risk infants. In addition, outcome predictors for surgical treatments have been identified by multivariate outcome studies. To make the multivariate and ML approaches useful in clinical settings, further research with large samples is needed to verify and improve these multivariate methods in risk factor identification, CP detection, movement assessment, and outcome evaluation or prediction. As multivariate analysis, ML and data processing technologies advance in the era of Big Data of this century, it is expected that multivariate analysis and ML will play a bigger role in improving the diagnosis and treatment of CP to reduce mortality and morbidity rates, and enhance patient care for children with CP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.
A sampling and analysis method for the determination of asbestos in air is presented in Part 1 of this report, under separate cover. This method is designed specifically to provide results suitable for supporting risk assessments at Superfund sites, although it is applicable t...
ERIC Educational Resources Information Center
Cappella, Elise; Hwang, Sophia H. J.; Kieffer, Michael J.; Yates, Miranda
2018-01-01
Given the potential of afterschool programs to support youth in urban, low-income communities, we examined the role of afterschool classroom ecology in the academic outcomes of Latino and African American youth with and without social-behavioral risk. Using multireporter methods and multilevel analysis, we find that positive classroom ecology…
ERIC Educational Resources Information Center
Hicks, Brian M.; Dirago, Ana C.; Iacono, William G.; McGue, Matt
2009-01-01
Background: Behavior genetic methods can help to elucidate gene-environment (G-E) interplay in the development of internalizing (INT) disorders (i.e., major depression and anxiety disorders). To date, however, no study has conducted a comprehensive analysis examining multiple environmental risk factors with the purpose of delineating general…
Use of the Analysis of the Volatile Faecal Metabolome in Screening for Colorectal Cancer
2015-01-01
Diagnosis of colorectal cancer is an invasive and expensive colonoscopy, which is usually carried out after a positive screening test. Unfortunately, existing screening tests lack specificity and sensitivity, hence many unnecessary colonoscopies are performed. Here we report on a potential new screening test for colorectal cancer based on the analysis of volatile organic compounds (VOCs) in the headspace of faecal samples. Faecal samples were obtained from subjects who had a positive faecal occult blood sample (FOBT). Subjects subsequently had colonoscopies performed to classify them into low risk (non-cancer) and high risk (colorectal cancer) groups. Volatile organic compounds were analysed by selected ion flow tube mass spectrometry (SIFT-MS) and then data were analysed using both univariate and multivariate statistical methods. Ions most likely from hydrogen sulphide, dimethyl sulphide and dimethyl disulphide are statistically significantly higher in samples from high risk rather than low risk subjects. Results using multivariate methods show that the test gives a correct classification of 75% with 78% specificity and 72% sensitivity on FOBT positive samples, offering a potentially effective alternative to FOBT. PMID:26086914
Work-related musculoskeletal complaints: some ergonomics challenges upon the start of a new century.
Westgaard, R H
2000-12-01
Three themes likely to be important within health-related ergonomics in the coming years are discussed. The first two themes concern methods for risk analysis of low-level biomechanical and psychosocial exposures. The third theme is approaches to successful implementation of ergonomics interventions. Evidence on the assessment of low-level biomechanical and psychosocial exposures by instrumented measurements is discussed. It is concluded that, despite recent advances in our understanding of exposure-effect associations under these exposure conditions, we must at present rely on more subjective methods, employed in a collaboration between expert and worker. This approach to risk analysis identifies in most cases critical exposures in a work situation. The focus should then be on the successful implementation of measures against those exposures, as identification alone does not solve problems. The aim of improved health for the workers further requires that the full complement of risk factors be considered, including work, leisure time and person-based risk factors. Finally, the need to put ergonomics intervention initiatives in an organisational context is emphasised, and examples of approaches used by Norwegian companies are presented.
METAL SPECIATION IN SOIL, SEDIMENT, AND WATER SYSTEMS VIA SYNCHROTRON RADIATION RESEARCH
Metal contaminated environmental systems (soils, sediments, and water) have challenged researchers for many years. Traditional methods of analysis have employed extraction methods to determine total metal content and define risk based on the premise that as metal concentration in...
On the Concept and Definition of Terrorism Risk.
Aven, Terje; Guikema, Seth
2015-12-01
In this article, we provide some reflections on how to define and understand the concept of terrorism risk in a professional risk assessment context. As a basis for this discussion we introduce a set of criteria that we believe should apply to any conceptualization of terrorism risk. These criteria are based on both criteria used in other areas of risk analysis and our experience with terrorism risk analysis. That is, these criteria offer our perspective. We show that several of the suggested perspectives and definitions have weaknesses in relation to these criteria. A main problem identified is the idea that terrorism risk can be conceptualized as a function of probability and consequence, not as a function of the interactions between adaptive individuals and organizations. We argue that perspectives based solely on probability and consequence should be used cautiously or not at all because they fail to reflect the essential features of the concept of terrorism risk, the threats and attacks, their consequences, and the uncertainties, all in the context of adaptation by the adversaries. These three elements should in our view constitute the main pillars of the terrorism risk concept. From this concept we can develop methods for assessing the risk by identifying a set of threats, attacks, and consequence measures associated with the possible outcome scenarios together with a description of the uncertainties and interactions between the adversaries. © 2015 Society for Risk Analysis.
Relative risk analysis of several manufactured nanomaterials: an insurance industry context.
Robichaud, Christine Ogilvie; Tanzil, Dicksen; Weilenmann, Ulrich; Wiesner, Mark R
2005-11-15
A relative risk assessment is presented for the industrial fabrication of several nanomaterials. The production processes for five nanomaterials were selected for this analysis, based on their current or near-term potential for large-scale production and commercialization: single-walled carbon nanotubes, bucky balls (C60), one variety of quantum dots, alumoxane nanoparticles, and nano-titanium dioxide. The assessment focused on the activities surrounding the fabrication of nanomaterials, exclusive of any impacts or risks associated with the nanomaterials themselves. A representative synthesis method was selected for each nanomaterial based on its potential for scaleup. A list of input materials, output materials, and waste streams for each step of fabrication was developed and entered into a database that included key process characteristics such as temperature and pressure. The physical-chemical properties and quantities of the inventoried materials were used to assess relative risk based on factors such as volatility, carcinogenicity, flammability, toxicity, and persistence. These factors were first used to qualitatively rank risk, then combined using an actuarial protocol developed by the insurance industry for the purpose of calculating insurance premiums for chemical manufacturers. This protocol ranks three categories of risk relative to a 100-point scale (where 100 represents maximum risk): incident risk, normal operations risk, and latent contamination risk. Results from this analysis determined that the relative environmental risk from manufacturing each of these five materials was comparatively low in relation to other common industrial manufacturing processes.
Sun, Zhen; Kong, Xin-Juan; Jing, Xue; Deng, Run-Jun; Tian, Zi-Bin
2015-01-01
Background The nutritional risk screening (NRS 2002) has been applied increasingly in patients who underwent abdominal surgery for nutritional risk assessment. However, the usefulness of the NRS 2002 for predicting postoperative outcomes is controversial. This meta-analysis examined whether a preoperative evaluation of nutritional risk by NRS 2002 predicted postoperative outcomes in patients undergoing abdominal surgery. Methods A systematic literature search for published papers was conducted using the following online databases: MEDLINE, EMBASE, the Cochrane library, EBSCO, CRD databases, Cinahl, PsycInfo and BIOSIS previews. The pooled odds ratio (OR) or weighted mean difference (WMD) was calculated using a random-effects model or a fixed-effects model. Results Eleven studies with a total of 3527 patients were included. Postoperative overall complications were more frequent in patients at nutritional risk versus patients without nutritional risk (pooled OR 3.13 [2.51, 3.90], p<0.00001). The pooled OR of mortality for the nutritional risk group versus the non-nutritional risk group was 3.61 [1.38, 9.47] (p = 0.009). Furthermore, the postoperative hospital stay was significantly longer in the preoperative nutritional risk group than in the nutritionally normal group (WMD 5.58 [4.21, 6.95], p<0.00001). Conclusions The present study has demonstrated that patients at preoperative nutritional risk have increased complication rates, higher mortality and prolonged hospital stay after surgery. However, NRS 2002 needs to be validated in larger samples of patients undergoing abdominal surgery against a better reference method. PMID:26172830
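The pooled odds ratios in the meta-analysis above come from combining per-study effect sizes. A minimal fixed-effect (inverse-variance) sketch is shown below; the study reports both random- and fixed-effects models, and this generic pooling of log odds ratios is an illustration, not the exact model used.

```python
import math

def pooled_or_fixed(ors, ses):
    """Fixed-effect inverse-variance pooling of odds ratios.
    ors: per-study odds ratios; ses: standard errors of log(OR).
    Each study is weighted by 1/variance of its log odds ratio."""
    logs = [math.log(o) for o in ors]
    weights = [1.0 / se ** 2 for se in ses]
    pooled_log = sum(w * l for w, l in zip(weights, logs)) / sum(weights)
    return math.exp(pooled_log)

# three hypothetical studies with ORs near the reported pooled value
print(round(pooled_or_fixed([3.0, 3.2, 3.1], [0.2, 0.3, 0.25]), 2))
```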
Li, Guowei; Cook, Deborah J; Levine, Mitchell A H; Guyatt, Gordon; Crowther, Mark; Heels-Ansdell, Diane; Holbrook, Anne; Lamontagne, Francois; Walter, Stephen D; Ferguson, Niall D; Finfer, Simon; Arabi, Yaseen M; Bellomo, Rinaldo; Cooper, D Jamie; Thabane, Lehana
2015-09-01
Failure to recognize the presence of competing risk or to account for it may result in misleading conclusions. We aimed to perform a competing risk analysis to assess the efficacy of the low molecular weight heparin dalteparin versus unfractionated heparin (UFH) in venous thromboembolism (VTE) in medical-surgical critically ill patients, taking death as a competing risk.This was a secondary analysis of a prospective randomized study of the Prophylaxis for Thromboembolism in Critical Care Trial (PROTECT) database. A total of 3746 medical-surgical critically ill patients from 67 intensive care units (ICUs) in 6 countries receiving either subcutaneous UFH 5000 IU twice daily (n = 1873) or dalteparin 5000 IU once daily plus once-daily placebo (n = 1873) were included for analysis.A total of 205 incident proximal leg deep vein thromboses (PLDVT) were reported during follow-up, among which 96 were in the dalteparin group and 109 were in the UFH group. No significant treatment effect of dalteparin on PLDVT compared with UFH was observed in either the competing risk analysis or standard survival analysis (also known as cause-specific analysis) using multivariable models adjusted for APACHE II score, history of VTE, need for vasopressors, and end-stage renal disease: sub-hazard ratio (SHR) = 0.92, 95% confidence interval (CI): 0.70-1.21, P-value = 0.56 for the competing risk analysis; hazard ratio (HR) = 0.92, 95% CI: 0.68-1.23, P-value = 0.57 for cause-specific analysis. Dalteparin was associated with a significant reduction in risk of pulmonary embolism (PE): SHR = 0.54, 95% CI: 0.31-0.94, P-value = 0.02 for the competing risk analysis; HR = 0.51, 95% CI: 0.30-0.88, P-value = 0.01 for the cause-specific analysis. 
Two additional sensitivity analyses using the treatment variable as a time-dependent covariate and using as-treated and per-protocol approaches demonstrated similar findings. This competing risk analysis yields no significant treatment effect on PLDVT but a superior effect of dalteparin on PE compared with UFH in medical-surgical critically ill patients. The findings from the competing risk method are in accordance with results from the cause-specific analysis. clinicaltrials.gov Identifier: NCT00182143.
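The competing risk analysis above treats death as an event that precludes observing a thrombosis, which is what distinguishes the sub-hazard (cumulative incidence) framework from the cause-specific one. A minimal nonparametric sketch of a cumulative incidence function in the presence of competing events (an Aalen-Johansen-style estimator, not the PROTECT analysis itself):

```python
def cumulative_incidence(times, events, cause, horizon):
    """Nonparametric cumulative incidence of one cause with competing
    events. events: 0 = censored, 1 = event of interest, 2 = competing
    event (e.g. death). Illustrative sketch only."""
    data = sorted(zip(times, events))
    surv = 1.0  # probability of being event-free just before time t
    cif = 0.0
    i = 0
    while i < len(data) and data[i][0] <= horizon:
        t = data[i][0]
        n_t = sum(1 for tt, _ in data if tt >= t)            # at risk
        d_cause = sum(1 for tt, e in data if tt == t and e == cause)
        d_all = sum(1 for tt, e in data if tt == t and e != 0)
        cif += surv * d_cause / n_t
        surv *= 1.0 - d_all / n_t
        while i < len(data) and data[i][0] == t:             # skip ties
            i += 1
    return cif
```

Coding death as a competing event (rather than censoring it) keeps the estimated incidence of thrombosis from being inflated, which is the motivation for the competing risk method discussed above.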
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A
2016-03-01
Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Preventing Child Abuse: A Meta-Analysis of Parent Training Programs
ERIC Educational Resources Information Center
Lundahl, Brad W.; Nimer, Janelle; Parsons, Bruce
2006-01-01
Objective: A meta-analysis was conducted to evaluate the ability of parent training programs to reduce parents' risk of abusing a child. Method: A total of 23 studies were submitted to a meta-analysis. Outcomes of interest included parents' attitudes toward abuse, emotional adjustment, child-rearing skills, and actual abuse. Conclusions:…
Pretest probability estimation in the evaluation of patients with possible deep vein thrombosis.
Vinson, David R; Patel, Jason P; Irving, Cedric S
2011-07-01
An estimation of pretest probability is integral to the proper interpretation of a negative compression ultrasound in the diagnostic assessment of lower-extremity deep vein thrombosis. We sought to determine the rate, method, and predictors of pretest probability estimation in such patients. This cross-sectional study of outpatients was conducted in a suburban community hospital in 2006. Estimation of pretest probability was done by enzyme-linked immunosorbent assay d-dimer, Wells criteria, and unstructured clinical impression. Using logistic regression analysis, we measured predictors of documented risk assessment. A cohort analysis was undertaken to compare 3-month thromboembolic outcomes between risk groups. Among 524 cases, 289 (55.2%) underwent pretest probability estimation using the following methods: enzyme-linked immunosorbent assay d-dimer (228; 43.5%), clinical impression (106; 20.2%), and Wells criteria (24; 4.6%), with 69 (13.2%) patients undergoing a combination of at least two methods. Patient factors were not predictive of pretest probability estimation, but the specialty of the clinician was predictive; emergency physicians (P < .0001) and specialty clinicians (P = .001) were less likely than primary care clinicians to perform risk assessment. Thromboembolic events within 3 months were experienced by 0 of 52 patients in the explicitly low-risk group, 4 (1.8%) of 219 in the explicitly moderate- to high-risk group, and 1 (0.4%) of 226 in the group that did not undergo explicit risk assessment. Negative ultrasounds in the workup of deep vein thrombosis are commonly interpreted in isolation apart from pretest probability estimations. Risk assessments varied by physician specialties. Opportunities exist for improvement in the diagnostic evaluation of these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
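The Wells criteria mentioned above stratify pretest probability by summing item scores and applying fixed cutoffs. A minimal sketch of the final stratification step, using the commonly cited three-level cutoffs (low ≤ 0, moderate 1-2, high ≥ 3); the individual scored items are omitted, so consult the published criteria before any clinical use.

```python
def wells_risk_category(score: int) -> str:
    """Map a summed Wells DVT score to a pretest probability category
    using the commonly cited three-level cutoffs (illustrative only)."""
    if score <= 0:
        return "low"
    if score <= 2:
        return "moderate"
    return "high"

print(wells_risk_category(2))  # moderate
```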
Wienke, B R; O'Leary, T R
2008-05-01
Linking model and data, we detail the LANL diving reduced gradient bubble model (RGBM), dynamical principles, and correlation with data in the LANL Data Bank. Table, profile, and meter risks are obtained from likelihood analysis and quoted for air, nitrox, helitrox no-decompression time limits, repetitive dive tables, and selected mixed gas and repetitive profiles. Application analyses include the EXPLORER decompression meter algorithm, NAUI tables, University of Wisconsin Seafood Diver tables, comparative NAUI, PADI, Oceanic NDLs and repetitive dives, comparative nitrogen and helium mixed gas risks, USS Perry deep rebreather (RB) exploration dive, world record open circuit (OC) dive, and Woodville Karst Plain Project (WKPP) extreme cave exploration profiles. The algorithm has seen extensive and utilitarian application in mixed gas diving, both in recreational and technical sectors, and forms the basis for released tables and decompression meters used by scientific, commercial, and research divers. The LANL Data Bank is described, and the methods used to deduce risk are detailed. Risk functions for dissolved gas and bubbles are summarized. Parameters that can be used to estimate profile risk are tallied. To fit data, a modified Levenberg-Marquardt routine is employed with L2 error norm. Appendices sketch the numerical methods, and list reports from field testing for (real) mixed gas diving. A Monte Carlo-like sampling scheme for fast numerical analysis of the data is also detailed, as a coupled variance reduction technique and additional check on the canonical approach to estimating diving risk. The method suggests alternatives to the canonical approach. This work represents a first-time correlation effort linking a dynamical bubble model with deep stop data. Supercomputing resources are requisite to connect model and data in application.
Tourkmani, Abdo Karim; Sánchez-Huerta, Valeria; De Wit, Guillermo; Martínez, Jaime D.; Mingo, David; Mahillo-Fernández, Ignacio; Jiménez-Alfaro, Ignacio
2017-01-01
AIM To analyze the relationship between the score obtained in the Risk Score System (RSS) proposed by Hicks et al with penetrating keratoplasty (PKP) graft failure at 1y postoperatively, and among each factor in the RSS with the risk of PKP graft failure, using univariate and multivariate analysis. METHODS The retrospective cohort study had 152 PKPs from 152 patients. Eighteen cases were excluded from our study due to primary failure (10 cases), incomplete medical notes (5 cases) and follow-up less than 1y (3 cases). We included 134 PKPs from 134 patients stratified by preoperative risk score. The Spearman coefficient was calculated for the relationship between the score obtained and risk of failure at 1y. Univariate and multivariate analyses were performed for the impact of every single risk factor included in the RSS on graft failure at 1y. RESULTS The Spearman coefficient showed a statistically significant correlation between the score in the RSS and graft failure (P<0.05). Multivariate logistic regression analysis showed no statistically significant relationship (P>0.05) between diagnosis and lens status with graft failure. The relationship between the other risk factors studied and graft failure was significant (P<0.05), although the results for previous grafts and graft failure were unreliable. None of our patients had previous blood transfusion, thus, it had no impact. CONCLUSION After the application of multivariate analysis techniques, some risk factors do not show the expected impact on graft failure at 1y. PMID:28393027
Managing risks and hazards in industrial operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almaula, S.C.
1996-12-31
The main objective of this paper is to demonstrate that it makes good business sense to identify risks and hazards of an operation and take appropriate steps to manage them effectively. Developing and implementing an effective risk and hazard management plan also contributes to other industry requirements and standards. Development of a risk management system, key elements of a risk management plan, and hazards and risk analysis methods are outlined. Comparing potential risk to the cost of prevention is also discussed. It is estimated that the cost of developing and preparing the first risk management plan varies between $50,000 and $200,000. 3 refs., 2 figs., 1 tab.
Transient risk factors for acute traumatic hand injuries: a case‐crossover study in Hong Kong
Chow, C Y; Lee, H; Lau, J; Yu, I T S
2007-01-01
Objectives To identify the remediable transient risk factors of occupational hand injuries in Hong Kong in order to guide the development of prevention strategies. Methods The case‐crossover study design was adopted. Study subjects were workers with acute hand injuries presenting to the government Occupational Medicine Unit for compensation claims within 90 days from the date of injury. Detailed information on exposures to specific transient factors during the 60 minutes prior to the occurrence of the injury, during the same time interval on the day prior to the injury, as well as the usual exposure during the past work‐month was obtained through telephone interviews. Both matched‐pair interval approach and usual frequency approach were adopted to assess the associations between transient exposures in the workplace and the short‐term risk of sustaining a hand injury. Results A total of 196 injured workers were interviewed. The results of the matched‐pair interval analysis matched well with the results obtained using the usual frequency analysis. Seven significant transient risk factors were identified: using malfunctioning equipment/materials, using a different work method, performing an unusual work task, working overtime, feeling ill, being distracted and rushing, with odds ratios ranging from 10.5 to 26.0 in the matched‐pair interval analysis and relative risks ranging between 8.0 and 28.3 with the usual frequency analysis. Wearing gloves was found to have an insignificant protective effect on the occurrence of hand injury in both analyses. Conclusions Using the case‐crossover study design for acute occupational hand injuries, seven transient risk factors that were mostly modifiable were identified. It is suggested that workers and their employers should increase their awareness of these risk factors, and efforts should be made to avoid exposures to these factors by means of engineering and administrative controls supplemented by safety education and training. 
PMID:16973734
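In the matched-pair interval approach used in the case-crossover study above, each worker serves as his or her own control, and only pairs discordant for exposure between the hazard and control periods are informative. A minimal sketch of that McNemar-style odds ratio; the counts in the example are hypothetical, chosen to reproduce the low end of the reported 10.5-26.0 range.

```python
def matched_pair_or(exposed_hazard_only: int, exposed_control_only: int) -> float:
    """Matched-pair odds ratio for a case-crossover design: the ratio of
    discordant pairs exposed only in the hazard period to those exposed
    only in the control period. Concordant pairs are uninformative."""
    if exposed_control_only == 0:
        raise ValueError("OR undefined: no pairs exposed only in control period")
    return exposed_hazard_only / exposed_control_only

# hypothetical: 21 workers exposed only before injury vs 2 only in the
# comparison interval
print(matched_pair_or(21, 2))  # 10.5
```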
Goovaerts, Pierre
2006-01-01
Boundary analysis of cancer maps may highlight areas where causative exposures change through geographic space, the presence of local populations with distinct cancer incidences, or the impact of different cancer control methods. Too often, such analysis ignores the spatial pattern of incidence or mortality rates and overlooks the fact that rates computed from sparsely populated geographic entities can be very unreliable. This paper proposes a new methodology that accounts for the uncertainty and spatial correlation of rate data in the detection of significant edges between adjacent entities or polygons. Poisson kriging is first used to estimate the risk value and the associated standard error within each polygon, accounting for the population size and the risk semivariogram computed from raw rates. The boundary statistic is then defined as half the absolute difference between kriged risks. Its reference distribution, under the null hypothesis of no boundary, is derived through the generation of multiple realizations of the spatial distribution of cancer risk values. This paper presents three types of neutral models generated using methods of increasing complexity: the common random shuffle of estimated risk values, a spatial re-ordering of these risks, or p-field simulation that accounts for the population size within each polygon. The approach is illustrated using age-adjusted pancreatic cancer mortality rates for white females in 295 US counties of the Northeast (1970–1994). Simulation studies demonstrate that Poisson kriging yields more accurate estimates of the cancer risk and how its value changes between polygons (i.e. boundary statistic), relatively to the use of raw rates or local empirical Bayes smoother. When used in conjunction with spatial neutral models generated by p-field simulation, the boundary analysis based on Poisson kriging estimates minimizes the proportion of type I errors (i.e. 
edges wrongly declared significant) while the frequency of these errors is predicted well by the p-value of the statistical test. PMID:19023455
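The boundary statistic and the simplest of the three neutral models (the random shuffle of estimated risk values) can be sketched as follows. This is an illustration only: the risk values and polygon indices are assumed, and real use would substitute Poisson-kriged estimates:

```python
import random

def boundary_stat(risk_i, risk_j):
    """Half the absolute difference between the kriged risk
    estimates of two adjacent polygons."""
    return abs(risk_i - risk_j) / 2.0

def shuffle_pvalue(risks, i, j, n_sims=999, seed=7):
    """Reference distribution under the no-boundary null via the
    simplest neutral model: randomly shuffling the estimated risk
    values across polygons and recomputing the statistic."""
    rng = random.Random(seed)
    observed = boundary_stat(risks[i], risks[j])
    hits = 0
    for _ in range(n_sims):
        shuffled = risks[:]
        rng.shuffle(shuffled)
        if boundary_stat(shuffled[i], shuffled[j]) >= observed:
            hits += 1
    return (hits + 1) / (n_sims + 1)  # add-one rule avoids p = 0
```

The more elaborate neutral models described in the abstract (spatial re-ordering, p-field simulation) replace the shuffle with realizations that preserve spatial correlation and population size.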
Curtis, Andrew; Blackburn, Jason K; Widmer, Jocelyn M; Morris, J Glenn
2013-04-15
Fine-scale and longitudinal geospatial analysis of health risks in challenging urban areas is often limited by the lack of other spatial layers even if case data are available. Underlying population counts, residential context, and associated causative factors such as standing water or trash locations are often missing unless collected through logistically difficult, and often expensive, surveys. The lack of spatial context also hinders the interpretation of results and the design of intervention strategies structured around analytical insights. This paper offers a ubiquitous spatial data collection approach using spatial video that can be used to improve analysis and involve participatory collaborations. A case study will be used to illustrate this approach, with three health risks mapped at the street scale for a coastal community in Haiti. Spatial video was used to collect street- and building-scale information, including standing water, trash accumulation, presence of dogs, cohort-specific population characteristics, and other cultural phenomena. These data were digitized into Google Earth and then coded and analyzed in a GIS using kernel density and spatial filtering approaches. The concentrations of these risks around area schools, which are sometimes sources of diarrheal disease infection because of the high concentration of children and variable sanitary practices, will show the utility of the method. In addition, schools offer potential locations for cholera education interventions. Previously unavailable fine-scale health risk data vary in concentration across the town, with some schools being proximate to greater concentrations of the mapped risks. The spatial video is also used to validate coded data and location-specific risks within these "hotspots". Spatial video is a tool that can be used in any environment to improve local area health analysis and intervention.
The process is rapid and can be repeated in study sites through time to track the spatio-temporal dynamics of the communities. Its simplicity should also encourage local participatory collaborations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossahebi, S; Feigenberg, S; Nichols, E
Purpose: GammaPod™, the first stereotactic radiotherapy device for early stage breast cancer treatment, has been recently installed and commissioned at our institution. A multidisciplinary working group applied the failure mode and effects analysis (FMEA) approach to perform a risk analysis. Methods: FMEA was applied to the GammaPod™ treatment process by: 1) generating process maps for each stage of treatment; 2) identifying potential failure modes and outlining their causes and effects; 3) scoring the potential failure modes using the risk priority number (RPN) system based on the product of severity, frequency of occurrence, and detectability (each ranging 1-10). An RPN higher than 150 was set as the threshold of concern. For these high-risk failure modes, potential quality assurance procedures and risk control techniques have been proposed. A new set of severity, occurrence, and detectability values was re-assessed in the presence of the suggested mitigation strategies. Results: In the single-day image-and-treat workflow, 19, 22, and 27 sub-processes were identified for the simulation, treatment planning, and delivery stages, respectively. During the simulation stage, 38 potential failure modes were found and scored, in terms of RPN, in the range of 9-392. Thirty-four potential failure modes were analyzed in treatment planning, with a score range of 16-200. For the treatment delivery stage, 47 potential failure modes were found with an RPN score range of 16-392. The most critical failure modes consisted of breast-cup pressure loss and incorrect target localization due to patient upper-body alignment inaccuracies. The final RPN scores of these failure modes, re-assessed with the recommended actions in place, were below 150. Conclusion: The FMEA risk analysis technique was applied to the treatment process of GammaPod™, a new stereotactic radiotherapy technology.
Application of systematic risk analysis methods is projected to lead to improved quality of GammaPod™ treatments. Ying Niu and Cedric Yu are affiliated with Xcision Medical Systems.
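The RPN scoring used above is simply the product of the three 1-10 scores, compared against the 150 threshold. A small sketch; the example score combinations in the usage line are illustrative, since the abstract reports only the resulting RPN extremes of 9 and 392:

```python
def rpn(severity, occurrence, detectability):
    """Risk priority number: the product of severity, occurrence,
    and detectability scores, each on a 1-10 scale."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("each score must lie in 1..10")
    return severity * occurrence * detectability

RPN_THRESHOLD = 150  # failure modes above this require mitigation

def needs_mitigation(severity, occurrence, detectability):
    return rpn(severity, occurrence, detectability) > RPN_THRESHOLD
```

For instance, scores of 8, 7 and 7 give an RPN of 392 (matching the maximum reported), while 3, 3 and 1 give 9 (the minimum).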
Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim
2017-10-01
Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.
Measurement of Bone: Diagnosis of SCI-Induced Osteoporosis and Fracture Risk Prediction
Morse, Leslie R.
2015-01-01
Background: Spinal cord injury (SCI) is associated with a rapid loss of bone mass, resulting in severe osteoporosis and a 5- to 23-fold increase in fracture risk. Despite the seriousness of fractures in SCI, there are multiple barriers to osteoporosis diagnosis and wide variations in treatment practices for SCI-induced osteoporosis. Methods: We review the biological and structural changes that are known to occur in bone after SCI in the context of promoting future research to prevent or reduce risk of fracture in this population. We also review the most commonly used methods for assessing bone after SCI and discuss the strengths, limitations, and clinical applications of each method. Conclusions: Although dual-energy x-ray absorptiometry assessments of bone mineral density may be used clinically to detect changes in bone after SCI, 3-dimensional methods such as quantitative CT analysis are recommended for research applications and are explained in detail. PMID:26689691
NASA Astrophysics Data System (ADS)
Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan
2018-03-01
GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and are accompanied by considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, the spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.
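The core of such an approach, OWA aggregation inside a Monte Carlo loop over perturbed weights, can be sketched as follows. This is a simplified illustration, not the authors' implementation: the criterion scores, weight values and uniform jitter scheme are all assumed for the example.

```python
import random

def owa(scores, order_weights):
    """Ordered weighted average: the weights apply to criterion
    scores sorted from highest to lowest, which is how OWA encodes
    the analyst's risk attitude (emphasis on best vs worst values)."""
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(order_weights, ranked))

def monte_carlo_owa(scores, base_weights, n_sims=1000, jitter=0.1, seed=1):
    """Propagate weight uncertainty: perturb the base weights,
    renormalise them, aggregate, and return the mean susceptibility
    score over all simulations."""
    rng = random.Random(seed)
    results = []
    for _ in range(n_sims):
        w = [max(1e-9, b + rng.uniform(-jitter, jitter)) for b in base_weights]
        total = sum(w)
        results.append(owa(scores, [x / total for x in w]))
    return sum(results) / len(results)
```

Because the renormalised weights form a convex combination, each simulated score stays between the minimum and maximum criterion score, which makes the ensemble mean easy to sanity-check.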
Holmskov, Mathilde; Storebø, Ole Jakob; Moreira-Maia, Carlos R.; Ramstad, Erica; Magnusson, Frederik Løgstrup; Krogh, Helle B.; Groth, Camilla; Gillies, Donna; Zwi, Morris; Skoog, Maria; Gluud, Christian; Simonsen, Erik
2017-01-01
Objectives To study in more depth the relationship between the type, dose, or duration of methylphenidate offered to children and adolescents with attention deficit hyperactivity disorder and their risks of gastrointestinal adverse events, based on our Cochrane systematic review. Methods and findings We used data from our review including 185 randomised clinical trials. Randomised parallel-group trials and cross-over trials reporting gastrointestinal adverse events associated with methylphenidate were included. Data were extracted and quality assessed according to Cochrane guidelines. Data were summarised as risk ratios (RR) with 95% confidence intervals (CI) using the inverse variance method. Bias risks were assessed according to domains. Trial Sequential Analysis (TSA) was used to control random errors. Eighteen parallel-group trials and 43 cross-over trials reported gastrointestinal adverse events. All trials were at high risk of bias. In parallel-group trials, methylphenidate increased the risks of decreased appetite (RR 3.66, 95% CI 2.56 to 5.23) and weight loss (RR 3.89, 95% CI 1.43 to 10.59). In cross-over trials, methylphenidate increased the risk of abdominal pain (RR 1.61, 95% CI 1.27 to 2.04). We found no significant differences in the risks according to type, dose, or duration of administration. The required information size was achieved in three out of four outcomes. Conclusion Methylphenidate increases the risks of decreased appetite, weight loss, and abdominal pain in children and adolescents with attention deficit hyperactivity disorder. No differences in the risks of gastrointestinal adverse events according to type, dose, or duration of administration were found. PMID:28617801
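Inverse-variance pooling of risk ratios, as named in the methods above, works on the log scale with each trial weighted by the reciprocal of its squared standard error, which can be recovered from a reported 95% CI. A fixed-effect sketch with made-up trial results:

```python
import math

def pooled_rr(rrs, cis):
    """Fixed-effect inverse-variance pooling of risk ratios.
    Each study's weight is 1/SE^2, with SE recovered from the
    reported 95% CI on the log scale: SE = (ln hi - ln lo) / (2 * 1.96)."""
    num = den = 0.0
    for rr, (lo, hi) in zip(rrs, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        weight = 1.0 / se ** 2
        num += weight * math.log(rr)
        den += weight
    return math.exp(num / den)
```

With two hypothetical trials of equal precision (RR 2.0 and 4.0), the pooled estimate is their geometric mean, about 2.83.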
Clinical evaluation incorporating a personal genome
Ashley, Euan A.; Butte, Atul J.; Wheeler, Matthew T.; Chen, Rong; Klein, Teri E.; Dewey, Frederick E.; Dudley, Joel T.; Ormond, Kelly E.; Pavlovic, Aleksandra; Hudgins, Louanne; Gong, Li; Hodges, Laura M.; Berlin, Dorit S.; Thorn, Caroline F.; Sangkuhl, Katrin; Hebert, Joan M.; Woon, Mark; Sagreiya, Hersh; Whaley, Ryan; Morgan, Alexander A.; Pushkarev, Dmitry; Neff, Norma F; Knowles, Joshua W.; Chou, Mike; Thakuria, Joseph; Rosenbaum, Abraham; Zaranek, Alexander Wait; Church, George; Greely, Henry T.; Quake, Stephen R.; Altman, Russ B.
2010-01-01
Background The cost of genomic information has fallen steeply but the path to clinical translation of risk estimates for common variants found in genome wide association studies remains unclear. Since the speed and cost of sequencing complete genomes is rapidly declining, more comprehensive means of analyzing these data in concert with rare variants for genetic risk assessment and individualisation of therapy are required. Here, we present the first integrated analysis of a complete human genome in a clinical context. Methods An individual with a family history of vascular disease and early sudden death was evaluated. Clinical assessment included risk prediction for coronary artery disease, screening for causes of sudden cardiac death, and genetic counselling. Genetic analysis included the development of novel methods for the integration of whole genome sequence data including 2.6 million single nucleotide polymorphisms and 752 copy number variations. The algorithm focused on predicting genetic risk of genes associated with known Mendelian disease, recognised drug responses, and pathogenicity for novel variants. In addition, since integration of risk ratios derived from case control studies is challenging, we estimated posterior probabilities from age and sex appropriate prior probability and likelihood ratios derived for each genotype. In addition, we developed a visualisation approach to account for gene-environment interactions and conditionally dependent risks. Findings We found increased genetic risk for myocardial infarction, type II diabetes and certain cancers. Rare variants in LPA are consistent with the family history of coronary artery disease. Pharmacogenomic analysis suggested a positive response to lipid lowering therapy, likely clopidogrel resistance, and a low initial dosing requirement for warfarin. Many variants of uncertain significance were reported. 
Interpretation Although challenges remain, our results suggest that whole genome sequencing can yield useful and clinically relevant information for individual patients, especially for those with a strong family history of significant disease. PMID:20435227
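Estimating a post-test probability from an age- and sex-appropriate prior and a genotype likelihood ratio, as described above, is a Bayes update on the odds scale. A minimal sketch; the numbers in the usage note are illustrative, not from the study:

```python
def posttest_probability(pretest_prob, likelihood_ratio):
    """Bayes update on the odds scale: convert the pretest
    probability to odds, multiply by the genotype likelihood
    ratio, and convert back to a probability."""
    if not 0.0 < pretest_prob < 1.0:
        raise ValueError("pretest probability must lie strictly between 0 and 1")
    odds = pretest_prob / (1.0 - pretest_prob)
    post_odds = odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

For example, a 10% pretest probability combined with a genotype likelihood ratio of 2 yields a post-test probability of about 18%.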
Senarathna, S M D K Ganga; Ranganathan, Shalini S; Buckley, Nick; Soysa, S S S B D Preethi; Fernandopulle, B M Rohini
2012-01-01
Acute paracetamol poisoning is an emerging problem in Sri Lanka. Management guidelines recommend using the ingested dose and serum paracetamol concentrations to assess the risk. Our aim was to determine the usefulness of the patient's history of an ingested dose of >150 mg/kg and of the paracetamol concentration obtained by a simple colorimetric method to assess risk in patients with acute paracetamol poisoning. Serum paracetamol concentrations were determined in 100 patients with a history of paracetamol overdose using high-performance liquid chromatography (HPLC), the reference method. The results were compared to those obtained with a colorimetric method. The utility of risk assessment by reported ingested dose and by colorimetric analysis was compared. The area under the receiver operating characteristic curve for the history of ingested dose was 0.578, and there was no dose cut-off providing useful risk categorization. Both analytical methods had less than 5% intra- and inter-batch variation and were accurate on spiked samples. The time from blood collection to result was six times shorter and the cost ten times lower for colorimetry (30 minutes, US$2) than for HPLC (180 minutes, US$20). The correlation coefficient between the paracetamol levels by the two methods was 0.85. The agreement on clinical risk categorization on the standard nomogram was also good (Kappa = 0.62, sensitivity 81%, specificity 89%). History of the ingested dose alone greatly over-estimated the number of patients who need antidotes, and it was a poor predictor of risk. Paracetamol concentrations by colorimetry are rapid and inexpensive. Their use would greatly improve the assessment of risk and greatly reduce unnecessary expenditure on antidotes.
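The agreement statistic reported above (Kappa = 0.62) is Cohen's kappa: observed agreement corrected for the agreement expected by chance. A sketch for a 2x2 risk-categorization table; the counts in the test are invented, not the study's data:

```python
def cohens_kappa(a, b, c, d):
    """Cohen's kappa for two binary classifications laid out as a
    2x2 table: a and d are the concordant counts, b and c the
    discordant counts. Returns (p_observed - p_expected) / (1 - p_expected)."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)
```

Perfect agreement gives kappa = 1; agreement no better than chance gives 0.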
Cecchini, Massimo; Bedini, Roberto; Mosetti, Davide; Marino, Sonia; Stasi, Serenella
2018-06-01
In recent years, interest in health and safety in the workplace has increased. Agriculture is one of the human work activities with the highest risk indexes. Studies on the risk perception of agricultural workers often refer to specific risk factors (especially pesticides), but risk perception plays an important role in preventing every kind of accident and occupational disease. The aim of this research is to test a new method for understanding the relation between risk perception among farmers and the main risk factors to which they are exposed. A secondary aim is to investigate the influence of training on risk perception in agriculture. Data were collected using a questionnaire designed to investigate risk perception; the questionnaire was given to a sample of 119 agricultural workers in central Italy. Through principal components analysis it was possible to highlight and verify the latent dimensions underlying the collected data in comparison with attitude scales. Results show that the highest percentage of strongly negative attitudes is found among the people who have worked for more years, while farmers who have worked for fewer years show a markedly positive attitude. The analysis of the questionnaires through the synthetic index method (Rizzi index) showed that the agricultural workers involved, in particular the elderly workers, have a negative attitude towards safety; workers are hostile to safety measures if they have not attended special training courses.
Daniulaityte, Raminta; Falck, Russel; Carlson, Robert G.
2012-01-01
Background There has been a rise in the illicit use of pharmaceutical opioids ("pain pills") in the United States. Conducted with young adult non-medical users of pharmaceutical opioids, this study uses qualitative methods and cultural consensus analysis to describe risk perceptions associated with pharmaceutical opioids and to determine patterns of cultural sharing and intra-cultural variation in these views. Methods The qualitative sub-sample (n=47) was selected from a larger sample of 396 young adults (18-23 years old) who were participating in a natural history study of illicit pharmaceutical opioid use. Qualitative life history interviews, a drug ranking task, and cultural consensus analysis were used to elicit participant views about the risks and harms associated with pain pills and other drugs, as well as alcohol and tobacco. Results Cultural consensus analysis revealed that the participants shared a single cultural model of drug risks, but the level of agreement decreased with the increasing range of drugs ever used. Further, those with more extensive drug use histories differed from less "experienced" users in their views about OxyContin and some other drugs. Overall, pain pills were viewed as addicting and potentially deadly substances, but these properties were linked to the patterns and methods of use, as well as to the characteristics of an individual user. Moreover, the perceived risks associated with pharmaceutical opioids were curtailed because they "came from the doctor," and their use thus had a legitimate aspect. Conclusions This study highlights potential problems with universal approaches to substance use prevention and intervention among young people, since such approaches ignore the fact that substance use education messages may be experienced differently depending on an individual's drug use history and his/her perceptions of drug risks.
Findings reported here may be useful in the development of prevention and intervention programs aimed at reducing the harm associated with illicit use of pain pills. PMID:22417823
Guo, Guang-Hui; Wu, Feng-Chang; He, Hong-Ping; Feng, Cheng-Lian; Zhang, Rui-Qing; Li, Hui-Xian
2012-04-01
Probabilistic approaches, such as Monte Carlo Sampling (MCS) and Latin Hypercube Sampling (LHS), and non-probabilistic approaches, such as interval analysis, fuzzy set theory and variance propagation, were used to characterize uncertainties associated with the risk assessment of sigma PAH8 in the surface water of Taihu Lake. The results from MCS and LHS were represented by probability distributions of the hazard quotients of sigma PAH8, which indicated that the confidence intervals of the hazard quotient at the 90% confidence level were in the range of 0.000 18-0.89 and 0.000 17-0.92, with means of 0.37 and 0.35, respectively. In addition, the probabilities that the hazard quotients from MCS and LHS exceed the threshold of 1 were 9.71% and 9.68%, respectively. The sensitivity analysis suggested that the toxicity data contributed the most to the resulting distribution of quotients. The hazard quotient of sigma PAH8 to aquatic organisms ranged from 0.000 17 to 0.99 using interval analysis. The confidence interval was (0.001 5, 0.016 3) at the 90% confidence level calculated using fuzzy set theory, and (0.000 16, 0.88) at the 90% confidence level based on variance propagation. These results indicated that the ecological risk of sigma PAH8 to aquatic organisms was low. Each method has its own set of advantages and limitations, rooted in its underlying theory; therefore, the appropriate method should be selected case by case to quantify the effects of uncertainties on the ecological risk assessment. The approach based on probabilistic theory was selected as the most appropriate method to assess the risk of sigma PAH8 in the surface water of Taihu Lake, providing an important scientific foundation for risk management and control of organic pollutants in water.
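The Monte Carlo side of such an assessment reduces to sampling exposure and toxicity from assumed distributions, forming hazard quotients HQ = exposure / toxicity, and reading off the probability that HQ exceeds 1. The lognormal parameters below are placeholders for illustration, not the Taihu Lake data:

```python
import random

def hazard_quotients(n_sims=10_000, seed=42):
    """Monte Carlo sketch of a hazard-quotient distribution:
    HQ = exposure concentration / toxicity threshold, with both
    inputs drawn from assumed lognormal distributions."""
    rng = random.Random(seed)
    hqs = []
    for _ in range(n_sims):
        exposure = rng.lognormvariate(-3.0, 1.5)  # assumed exposure distribution
        toxicity = rng.lognormvariate(0.0, 0.5)   # assumed toxicity threshold distribution
        hqs.append(exposure / toxicity)
    return hqs

def exceedance_probability(hqs, threshold=1.0):
    """Fraction of simulated hazard quotients above the threshold."""
    return sum(hq > threshold for hq in hqs) / len(hqs)
```

Latin Hypercube Sampling would replace the independent draws with stratified ones to cover the input space more evenly at the same sample size.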
Social Stability and HIV Risk Behavior: Evaluating the Role of Accumulated Vulnerability
German, Danielle; Latkin, Carl A.
2011-01-01
This study evaluated a cumulative and syndromic relationship among commonly co-occurring vulnerabilities (homelessness, incarceration, low income, residential transition) in association with HIV-related risk behaviors among 635 low-income women in Baltimore. Analysis included descriptive statistics, logistic regression, latent class analysis and latent class regression. Both methods of assessing multidimensional instability showed significant associations with risk indicators. The risks of multiple partners, sex exchange, and drug use decreased significantly with each additional domain of stability. Membership in the higher-stability class (77%) was associated with a decreased likelihood of multiple partners, exchange partners, recent drug use, and recent STI. Multidimensional social vulnerabilities were cumulatively and synergistically linked to HIV risk behavior. Independent instability measures may miss important contextual determinants of risk. Social stability offers a useful framework to understand the synergy of social vulnerabilities that shape sexual risk behavior. Social policies and programs aiming to enhance housing and overall social stability are likely to be beneficial for HIV prevention. PMID:21259043
Principal Component and Linkage Analysis of Cardiovascular Risk Traits in the Norfolk Isolate
Cox, Hannah C.; Bellis, Claire; Lea, Rod A.; Quinlan, Sharon; Hughes, Roger; Dyer, Thomas; Charlesworth, Jac; Blangero, John; Griffiths, Lyn R.
2009-01-01
Objective(s) An individual's risk of developing cardiovascular disease (CVD) is influenced by genetic factors. This study focussed on mapping genetic loci for CVD-risk traits in a unique population isolate derived from Norfolk Island. Methods This investigation focussed on 377 individuals descended from the population founders. Principal component analysis was used to extract orthogonal components from 11 cardiovascular risk traits. Multipoint variance component methods, implemented in SOLAR, were used to assess genome-wide linkage to the derived factors. A total of 285 of the 377 related individuals were informative for linkage analysis. Results A total of 4 principal components accounting for 83% of the total variance were derived. Principal component 1 was loaded with body size indicators; principal component 2 with body size, cholesterol and triglyceride levels; principal component 3 with the blood pressures; and principal component 4 with LDL-cholesterol and total cholesterol levels. Suggestive evidence of linkage for principal component 2 (h2 = 0.35) was observed on chromosome 5q35 (LOD = 1.85; p = 0.0008), while peak regions on chromosomes 10p11.2 (LOD = 1.27; p = 0.005) and 12q13 (LOD = 1.63; p = 0.003) were observed to segregate with principal components 1 (h2 = 0.33) and 4 (h2 = 0.42), respectively. Conclusion(s) This study investigated a number of CVD risk traits in a unique isolated population. Findings support the clustering of CVD risk traits and provide interesting evidence of a region on chromosome 5q35 segregating with weight, waist circumference, HDL-c and total triglyceride levels. PMID:19339786
Horne, Benjamin D; Malhotra, Alka; Camp, Nicola J
2003-01-01
Background High triglycerides (TG) and low high-density lipoprotein cholesterol (HDL-C) jointly increase coronary disease risk. We performed linkage analysis for TG/HDL-C ratio in the Framingham Heart Study data as a quantitative trait, using methods implemented in LINKAGE, GENEHUNTER (GH), MCLINK, and SOLAR. Results were compared to each other and to those from a previous evaluation using SOLAR for TG/HDL-C ratio on this sample. We also investigated linked pedigrees in each region using by-pedigree analysis. Results Fourteen regions with at least suggestive linkage evidence were identified, including some that may increase and some that may decrease coronary risk. Ten of the 14 regions were identified by more than one analysis, and several of these regions were not previously detected. The best regions identified for each method were on chromosomes 2 (LOD = 2.29, MCLINK), 5 (LOD = 2.65, GH), 7 (LOD = 2.67, SOLAR), and 22 (LOD = 3.37, LINKAGE). By-pedigree multi-point LOD values in MCLINK showed linked pedigrees for all five regions, ranging from 3 linked pedigrees (chromosome 5) to 14 linked pedigrees (chromosome 7), and suggested localizations of between 9 cM and 27 cM in size. Conclusion Reasonable concordance was found across analysis methods. No single method identified all regions, either by full sample LOD or with by-pedigree analysis. Concordance across methods appeared better at the pedigree level, with many regions showing by-pedigree support in MCLINK when no evidence was observed in the full sample. Thus, investigating by-pedigree linkage evidence may provide a useful tool for evaluating linkage regions. PMID:14975161
Preventing blood transfusion failures: FMEA, an effective assessment method.
Najafpour, Zhila; Hasoumi, Mojtaba; Behzadi, Faranak; Mohamadi, Efat; Jafary, Mohamadreza; Saeedi, Morteza
2017-06-30
Failure Mode and Effect Analysis (FMEA) is a method used to assess the risk of failures and harms to patients during the medical process and to identify the associated clinical issues. The aim of this study was to conduct an assessment of the blood transfusion process in a teaching general hospital, using FMEA as the method. A structured FMEA was conducted in our study, performed in 2014, and corrective actions were implemented and re-evaluated after 6 months. Sixteen 2-h sessions were held to perform FMEA on the blood transfusion process, including five steps: establishing the context, selecting team members, analysis of the processes, hazard analysis, and developing a risk reduction protocol for blood transfusion. Failure modes with the highest risk priority numbers (RPNs) were identified. The overall RPN scores ranged from 5 to 100, among which four failure modes were associated with RPNs over 75. The data analysis indicated that the failures with the highest RPNs were: labelling (RPN: 100), transfusion of blood or the component (RPN: 100), patient identification (RPN: 80) and sampling (RPN: 75). The results demonstrated that mis-transfusion of blood or a blood component is the most important error, which can lead to serious morbidity or mortality. Provision of training to the personnel on blood transfusion, raising knowledge of hazards and appropriate preventative measures, as well as developing standard safety guidelines are essential, and must be implemented during all steps of blood and blood component transfusion.
Apostoli, P; Sala, Emma
2009-01-01
In some sequences of the film "Modern Times", Chaplin is clearly involved in activities at high risk for work-related musculo-skeletal disorders of the upper extremities (UEWMSDs), yet evidence and perception of any complaint are absent. Our aim was to evaluate the extent of the biomechanical risk using current risk assessment methods and to discuss the possible reasons for the lack of complaints. We made an analysis using six of the current methods for ergonomic risk assessment (State of Washington, OCRA checklist, ACGIH HAL, RULA, Strain Index, OREGE). All the methods applied demonstrated high-to-very high levels of biomechanical risk for the upper extremities, with evident psychic effects but without apparent musculo-skeletal disorders. The discrepancy between the evident psychological disorders and the apparent absence of UEWMSDs is discussed as being due either to an artistic choice by Charlie Chaplin, who focused on the aspects thought to be more immediately and easily comic; to the short duration of the physical load exertion; or to a different perception of muscular work and fatigue that was typical until the 1970s and 1980s, and which also informed the principles and practices of our preventive and medical disciplines at that time.
The SOBANE strategy for the management of risk, as applied to whole-body or hand-arm vibration.
Malchaire, J; Piette, A
2006-06-01
The objective was to develop a coherent set of methods to be used effectively in industry to prevent and manage the risks associated with exposure to vibration, by coordinating the progressive intervention of the workers, their management, the occupational health and safety (OHS) professionals and the experts. The methods were developed separately for exposure to whole-body and hand-arm vibration. The SOBANE strategy of risk prevention includes four levels of intervention: level 1, Screening; level 2, Observation; level 3, Analysis; and level 4, Expertise. The methods making it possible to apply this strategy were developed for 14 types of risk factors. The article presents the methods specific to the prevention of the risks associated with exposure to vibration. The strategy is similar to those published for the risks associated with exposure to noise, heat and musculoskeletal disorders. It explicitly recognizes the qualifications of the workers and their management with regard to the work situation and shares the principle that measuring the exposure of the workers is not necessarily the first step in improving these situations. It attempts to optimize the recourse to the competences of the OHS professionals and the experts, in order to arrive more rapidly, effectively and economically at practical control measures.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, and these sources often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or to require a design modification. The actual risk threshold for acceptance may not be fully understood due to the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
Reconceptualising risk: Perceptions of risk in rural and remote maternity service planning.
Barclay, Lesley; Kornelsen, Jude; Longman, Jo; Robin, Sarah; Kruske, Sue; Kildea, Sue; Pilcher, Jennifer; Martin, Tanya; Grzybowski, Stefan; Donoghue, Deborah; Rolfe, Margaret; Morgan, Geoff
2016-07-01
To explore perceptions and examples of risk related to pregnancy and childbirth in rural and remote Australia and how these influence the planning of maternity services. Data collection in this qualitative component of a mixed-methods study included 88 semi-structured individual and group interviews (n=102), three focus groups (n=22) and one group information session (n=17). Researchers identified two categories of risk for exploration: health services risk (including clinical and corporate risks) and social risk (including cultural, emotional and financial risks). Data were aggregated and thematically analysed to identify perceptions and examples of risk related to each category. Fieldwork was conducted in four jurisdictions at nine sites in rural (n=3) and remote (n=6) Australia. Participants were 117 health service employees and 24 consumers. Examples and perceptions relating to each category of risk were identified from the data. Most medical practitioners and health service managers perceived clinical risks related to rural birthing services without access to caesarean section. Consumer participants were more likely to emphasise social risks arising from a lack of local birthing services. Our analysis demonstrated that the closure of services adds social risk, which exacerbates clinical risk. Analysis also highlighted that perceptions of clinical risk are privileged over social risk in decisions about rural and remote maternity service planning. A comprehensive analysis of risk that identifies how social and other forms of risk contribute to adverse clinical outcomes would benefit rural and remote people and their health services. Formal risk analyses should consider the risks associated with failure to provide birthing services in rural and remote communities as well as the risks of maintaining services. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Liang, Yong; Chai, Hua; Liu, Xiao-Ying; Xu, Zong-Ben; Zhang, Hai; Leung, Kwong-Sak
2016-03-01
One of the most important objectives of clinical cancer research is to diagnose cancer more accurately based on patients' gene expression profiles. Both the Cox proportional hazards model (Cox) and the accelerated failure time model (AFT) have been widely adopted for high-risk and low-risk classification and for survival time prediction in patients' clinical treatment. Nevertheless, two main dilemmas limit the accuracy of these prediction methods. One is that small sample sizes and censored data remain a bottleneck for training a robust and accurate Cox classification model. The other is that tumours with similar phenotypes and prognoses may actually be completely different diseases at the genotype and molecular level, so the utility of the AFT model for survival time prediction is limited when such biological differences have not been previously identified. To overcome these two dilemmas, we proposed a novel semi-supervised learning method based on the Cox and AFT models to accurately predict the treatment risk and the survival time of patients. Moreover, we adopted the efficient L1/2 regularization approach in the semi-supervised learning method to select the relevant genes that are significantly associated with the disease. The results of the simulation experiments show that the semi-supervised learning model can significantly improve the predictive performance of the Cox and AFT models in survival analysis. The proposed procedures have been successfully applied to four real microarray gene expression and artificial evaluation datasets. The advantages of our proposed semi-supervised learning method include: 1) a significant increase in available training samples from censored data; 2) high capability for identifying the survival risk classes of patients in the Cox model; 3) high predictive accuracy for patients' survival time in the AFT model; and 4) strong capability for relevant biomarker selection.
Consequently, our proposed semi-supervised learning model is a more appropriate tool for survival analysis in clinical cancer research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
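The probabilistic-branching idea at the heart of OBEST can be sketched in a few lines: each simulated run draws branch outcomes at random, so scenario orderings and their likelihoods emerge from the Monte Carlo sample rather than being enumerated a priori. The events and branch probabilities below are invented illustrations loosely themed on the runway incursion example, not the actual OBEST object model.

```python
import random
from collections import Counter

def simulate_once(rng):
    """One Monte Carlo history with probabilistic branching.

    Hypothetical branch probabilities; not taken from the OBEST study.
    """
    heard = rng.random() < 0.95               # tower hold instruction received
    complied = heard and rng.random() < 0.99  # pilot holds short of the runway
    return "incursion" if not complied else "safe"

def scenario_likelihoods(n=100_000, seed=1):
    """Estimate each end state's likelihood from n simulated histories."""
    rng = random.Random(seed)
    counts = Counter(simulate_once(rng) for _ in range(n))
    return {outcome: c / n for outcome, c in counts.items()}

# Analytic value for comparison: P(incursion) = 1 - 0.95 * 0.99 = 0.0595.
```

Because each history carries its branch probabilities implicitly, the scenario likelihood falls out of the simulation automatically, which is the property the abstract highlights.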
Qin, D L; Jin, X N; Wang, S J; Wang, J J; Mamat, N; Wang, F J; Wang, Y; Shen, Z A; Sheng, L G; Forsman, M; Yang, L Y; Wang, S; Zhang, Z B; He, L H
2018-06-18
To develop a comprehensive assessment method for postural workload that integrates the dynamic and static postural workload of workers during their work process, to analyze its reliability and validity, and to study the relation between workers' postural workload and work-related musculoskeletal disorders (WMSDs). In the study, 844 workers from electronic and railway vehicle manufacturing factories were selected as subjects and investigated using the China Musculoskeletal Questionnaire (CMQ) to form the comprehensive postural workload assessment method. Cronbach's α, cluster analysis and factor analysis were used to assess the reliability and validity of the new assessment method. Unconditional logistic regression was used to analyze the relation between workers' postural workload and WMSDs. Reliability of the assessment method for postural workload: internal consistency analysis showed that Cronbach's α was 0.934; split-half reliability showed a Spearman-Brown coefficient of 0.881, with a correlation coefficient of 0.787 between the first and second halves. Validity of the assessment method for postural workload: cluster analysis indicated that the squared Euclidean distance between the dynamic and static postural workload assessments of the same body part or work posture was the shortest. Factor analysis extracted 2 components, with a cumulative percentage of variance of 65.604%. The postural workload scores of the different occupational groups differed significantly (P<0.05) by covariance analysis. Unconditional logistic regression indicated that alcohol intake (OR=2.141, 95%CI 1.337-3.428) and obesity (OR=3.408, 95%CI 1.629-7.130) were risk factors for WMSDs. The risk of WMSDs rose as workers' postural workload rose (OR=1.035, 95%CI 1.022-1.048).
Risks for WMSDs differed significantly among groups of workers distinguished by work type, gender and age. Female workers exhibited a higher risk of WMSDs (OR=2.626, 95%CI 1.414-4.879), as did workers aged 30-40 years (OR=1.909, 95%CI 1.237-2.946) compared with those under 30. This method for comprehensively assessing postural workload is reliable and valid when used for assembly workers, and there is a clear relation between postural workload and WMSDs.
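The reliability statistics reported above can be reproduced mechanically: Cronbach's α from item variances, and the Spearman-Brown step-up from a split-half correlation. A minimal sketch with toy scores (not the CMQ data):

```python
import math

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns."""
    k = len(items)
    n = len(items[0])
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

def spearman_brown(r):
    """Step up a split-half correlation r to full-test reliability."""
    return 2 * r / (1 + r)

# Consistency check on the reported figures: a half-half correlation of
# 0.787 steps up to a Spearman-Brown coefficient of about 0.881.
```

Two identical items give α = 1 exactly, which makes a convenient sanity check for the implementation.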
Comparison of 3 Methods for Identifying Dietary Patterns Associated With Risk of Disease
DiBello, Julia R.; Kraft, Peter; McGarvey, Stephen T.; Goldberg, Robert; Campos, Hannia
2008-01-01
Reduced rank regression and partial least-squares regression (PLS) are proposed alternatives to principal component analysis (PCA). Using all 3 methods, the authors derived dietary patterns in Costa Rican data collected on 3,574 cases and controls in 1994–2004 and related the resulting patterns to risk of first incident myocardial infarction. Four dietary patterns associated with myocardial infarction were identified. Factor 1, characterized by high intakes of lean chicken, vegetables, fruit, and polyunsaturated oil, was generated by all 3 dietary pattern methods and was associated with a significantly decreased adjusted risk of myocardial infarction (28%–46%, depending on the method used). PCA and PLS also each yielded a pattern associated with a significantly decreased risk of myocardial infarction (31% and 23%, respectively); this pattern was characterized by moderate intake of alcohol and polyunsaturated oil and low intake of high-fat dairy products. The fourth factor derived from PCA was significantly associated with a 38% increased risk of myocardial infarction and was characterized by high intakes of coffee and palm oil. Contrary to previous studies, the authors found PCA and PLS to produce more patterns associated with cardiovascular disease than reduced rank regression. The most effective method for deriving dietary patterns related to disease may vary depending on the study goals. PMID:18945692
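Of the three approaches, PCA is the only one that ignores the outcome: a dietary pattern is simply a leading eigenvector of the food-intake covariance matrix, and each person's pattern score is the projection onto it (RRR and PLS additionally use the disease outcome when extracting components). A minimal sketch of the PCA step, using a made-up intake matrix (rows = subjects, columns = foods):

```python
import math

def center(X):
    """Subtract each column mean (column = one food item)."""
    n = len(X)
    means = [sum(row[j] for row in X) / n for j in range(len(X[0]))]
    return [[x - m for x, m in zip(row, means)] for row in X]

def covariance(X):
    Xc = center(X)
    n, p = len(Xc), len(Xc[0])
    return [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / (n - 1)
             for b in range(p)] for a in range(p)]

def leading_pattern(X, iters=500):
    """First principal component by power iteration (unit-norm loadings)."""
    C = covariance(X)
    v = [1.0] * len(C)
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(len(v))) for i in range(len(C))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def pattern_scores(X):
    v = leading_pattern(X)
    return [sum(x, ) if False else sum(xc * l for xc, l in zip(row, v))
            for row in center(X)]

# Toy data: foods 0 and 1 are eaten together, food 2 varies independently,
# so the first pattern should load mainly on the first two foods.
INTAKE = [
    [1.0, 1.1, 0.1],
    [2.0, 1.9, -0.1],
    [3.0, 3.1, 0.0],
    [4.0, 3.9, 0.1],
    [5.0, 5.0, -0.1],
]
```

In a study like the one above, the subject-level scores would then be entered into a regression against myocardial infarction status to relate the pattern to disease risk.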
Goldstein, Benjamin A.; Navar, Ann Marie; Carter, Rickey E.
2017-01-01
Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black-box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis and that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider trying to predict mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction to the diffuse field of machine learning for those working on risk modelling. PMID:27436868
Credit scoring analysis using weighted k nearest neighbor
NASA Astrophysics Data System (ADS)
Mukid, M. A.; Widiharih, T.; Rusgiyono, A.; Prahutama, A.
2018-05-01
Credit scoring is a quantitative method to evaluate the credit risk of loan applications. Both statistical methods and artificial intelligence are often used by credit analysts to help them decide whether applicants are worthy of credit. These methods aim to predict future behavior in terms of credit risk based on past experience with customers of similar characteristics. This paper reviews the weighted k nearest neighbor (WKNN) method for credit assessment, considering the use of several kernels. We use credit data from a private bank in Indonesia. The results show that the Gaussian and rectangular kernels perform best, each achieving a correctly classified percentage of 82.4%.
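The weighted k-NN idea can be sketched compactly: distances of the k nearest neighbours are standardized by the distance to the (k+1)-th neighbour, a kernel turns them into vote weights, and the rectangular kernel recovers plain majority voting. The applicant records below are invented; the study itself used a private bank's credit data.

```python
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def gaussian(u):
    return math.exp(-u * u / 2)

def rectangular(u):
    return 1.0 if abs(u) <= 1 else 0.0

def wknn_predict(X, y, query, k=3, kernel=rectangular):
    """Weighted k-NN vote; distances standardized by the (k+1)-th neighbour."""
    ranked = sorted((euclid(xi, query), label) for xi, label in zip(X, y))
    scale = ranked[k][0] if len(ranked) > k else ranked[-1][0]
    scale = scale or 1e-12  # guard against ties at distance zero
    votes = {}
    for d, label in ranked[:k]:
        votes[label] = votes.get(label, 0.0) + kernel(d / scale)
    return max(votes, key=votes.get)

# Invented applicant features: (normalized income, debt ratio).
APPLICANTS = [((1.0, 0.1), "good"), ((0.9, 0.2), "good"), ((0.8, 0.15), "good"),
              ((0.2, 0.9), "bad"), ((0.3, 0.8), "bad"), ((0.1, 0.7), "bad")]
X = [features for features, _ in APPLICANTS]
y = [label for _, label in APPLICANTS]
```

Swapping the kernel changes only how sharply nearby applicants dominate the vote, which is the comparison the paper carries out across kernels.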
Dean, Derek J; Teulings, Hans-Leo; Caligiuri, Michael; Mittal, Vijay A
2013-11-21
Growing evidence suggests that movement abnormalities are a core feature of psychosis. One marker of movement abnormality, dyskinesia, is a result of impaired neuromodulation of dopamine in fronto-striatal pathways. The traditional methods for identifying movement abnormalities include observer-based reports and force stability gauges. The drawbacks of these methods are long training times for raters, experimenter bias, large site differences in instrumental apparatus, and suboptimal reliability. Taking these drawbacks into account has guided the development of better standardized and more efficient procedures to examine movement abnormalities through handwriting analysis software and tablet. Individuals at risk for psychosis showed significantly more dysfluent pen movements (a proximal measure for dyskinesia) in a handwriting task. Handwriting kinematics offers a great advance over previous methods of assessing dyskinesia, which could clearly be beneficial for understanding the etiology of psychosis.
Background/Question/Methods Bacterial pathogens in surface water present disease risks to aquatic communities and for human recreational activities. Sources of these pathogens include runoff from urban, suburban, and agricultural point and non-point sources, but hazardous micr...
Rapid Detection Method for the Four Most Common CHEK2 Mutations Based on Melting Profile Analysis.
Borun, Pawel; Salanowski, Kacper; Godlewski, Dariusz; Walkowiak, Jaroslaw; Plawski, Andrzej
2015-12-01
CHEK2 is a tumor suppressor gene, and mutations affecting the functionality of the protein product increase cancer risk in various organs. The elevated risk, in a significant percentage of cases, is determined by the occurrence of one of the four most common mutations in the CHEK2 gene: c.470T>C (p.I157T), c.444+1G>A (IVS2+1G>A), c.1100delC, and c.1037+1538_1224+328del5395 (del5395). We have developed and validated a rapid and effective method for their detection based on high-resolution melting analysis and comparative high-resolution melting, a novel approach enabling simultaneous detection of copy number variations. The analysis is performed in two polymerase chain reactions followed by melting analysis, without any additional reagents or handling beyond that used in standard high-resolution melting. Validation of the method was conducted in a group of 103 patients with diagnosed breast cancer, a group of 240 unrelated patients with a familial history of cancer associated with CHEK2 gene mutations, and a 100-person control group. The results of the analyses for all three groups were fully consistent with the results from other methods. The method we have developed improves the identification of CHEK2 mutation carriers, reduces the cost of such analyses, and facilitates their implementation. Along with the increased efficiency, the method maintains accuracy and reliability comparable to other, more labor-consuming techniques.
Arenal-type pyroclastic flows: A probabilistic event tree risk analysis
NASA Astrophysics Data System (ADS)
Meloy, Anthony F.
2006-09-01
A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (e.g. VERY HIGH, HIGH, MODERATE) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation.
With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such, an ETA is considered to be a valuable quantitative decision support tool.
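The arithmetic behind an event tree risk estimate is simply the product of conditional branch probabilities along a path, with the raw annual probability then mapped onto linguistic classes. The branch values and class thresholds below are illustrative assumptions, not the probabilities or the classification scale used in the Arenal analysis:

```python
import math

# Illustrative thresholds for annual individual risk (assumed, not the
# published classification scale).
RISK_CLASSES = [(1e-3, "VERY HIGH"), (1e-4, "HIGH"),
                (1e-5, "MODERATE"), (1e-6, "LOW")]

def classify(p):
    """Map a raw annual probability onto a linguistic risk class."""
    for threshold, label in RISK_CLASSES:
        if p >= threshold:
            return label
    return "VERY LOW"

def path_probability(branches):
    """Multiply conditional probabilities along one event tree path."""
    return math.prod(branches)

# Hypothetical path for a resident: P(collapse per year) x P(flow reaches
# zone | collapse) x P(person present) x P(fatal | exposed).
resident = path_probability([0.02, 0.3, 0.05, 0.5])  # 1.5e-4 per year
```

Reducing one conditional probability (for example the flow-reach term, by relocating) shifts the product down the scale, which is how a risk map can show class changes between areas.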
Li, Rongxia; Stewart, Brock; Weintraub, Eric
2016-01-01
The self-controlled case series (SCCS) and self-controlled risk interval (SCRI) designs have recently become widely used in the field of post-licensure vaccine safety monitoring to detect potential elevated risks of adverse events following vaccinations. The SCRI design can be viewed as a subset of the SCCS method in that a reduced comparison time window is used for the analysis. Compared to the SCCS method, the SCRI design has less statistical power due to fewer events occurring in the shorter control interval. In this study, we derived the asymptotic relative efficiency (ARE) between these two methods to quantify this loss in power in the SCRI design. The equation is formulated as [Formula: see text] (a: control window-length ratio between SCRI and SCCS designs; b: ratio of risk window length and control window length in the SCCS design; and [Formula: see text]: relative risk of exposed window to control window). According to this equation, the relative efficiency declines as the ratio of control-period length between SCRI and SCCS methods decreases, or with an increase in the relative risk [Formula: see text]. We provide an example utilizing data from the Vaccine Safety Datalink (VSD) to study the potential elevated risk of febrile seizure following seasonal influenza vaccine in the 2010-2011 season.
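The "[Formula: see text]" placeholders above stand in for the closed-form ARE expression. Without reproducing it, the qualitative claim — efficiency falls as the SCRI control window shrinks relative to the SCCS one — can be checked by simulation. The rates and window lengths below are invented for illustration (not VSD data); under the null (relative risk 1), the empirical variance ratio of the aggregate log-rate-ratio estimator plays the role of the ARE:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler (fine for the small per-person means here)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def log_rr_variance(rng, n_persons, rate, risk_len, ctrl_len, reps=300):
    """Empirical variance of log((events_risk/risk_len)/(events_ctrl/ctrl_len))."""
    ests = []
    for _ in range(reps):
        nr = sum(poisson(rng, rate * risk_len) for _ in range(n_persons))
        nc = sum(poisson(rng, rate * ctrl_len) for _ in range(n_persons))
        ests.append(math.log(nr / risk_len) - math.log(nc / ctrl_len))
    m = sum(ests) / len(ests)
    return sum((e - m) ** 2 for e in ests) / (len(ests) - 1)

def relative_efficiency(seed=0):
    # Invented example: 14-day risk window; SCCS keeps a 180-day control
    # window, SCRI only 28 days; background event rate 0.05/day.
    rng = random.Random(seed)
    var_sccs = log_rr_variance(rng, 200, 0.05, 14, 180)
    var_scri = log_rr_variance(rng, 200, 0.05, 14, 28)
    return var_sccs / var_scri
```

With these settings the approximate analytic value is (1/140 + 1/1800) / (1/140 + 1/280) ≈ 0.72, i.e. the SCRI analysis needs roughly 40% more events to match the SCCS precision.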
Simon-Freeman, Rebecca; Bluthenthal, Ricky N.
2013-01-01
The legal environment is one factor that influences injection drug users' (IDUs) risk for HIV and other bloodborne pathogens such as hepatitis C virus (HCV). We examined the association between law enforcement encounters (i.e., arrests and citations) and receptive syringe sharing among IDUs in the context of an intensified policing effort. We conducted a mixed methods analysis of 30 qualitative and 187 quantitative interviews with IDUs accessing services at a Los Angeles, CA syringe exchange program from 2008 to 2009. Qualitative findings illustrate concerns related to visibility, drug withdrawal, and previous history of arrest/incarceration. In quantitative analysis, the number of citations received, current homelessness, and perceiving that being arrested would be a “big problem” were independently associated with recent syringe sharing. Findings illustrate some of the unintended public health consequences associated with intensified street-level policing, including risk for HIV and HCV transmission. PMID:23620243
A Study on Re-entry Predictions of Uncontrolled Space Objects for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Choi, Eun-Jung; Cho, Sungki; Lee, Deok-Jin; Kim, Siwoo; Jo, Jung Hyun
2017-12-01
The key risk analysis technologies for the re-entry of space objects into Earth’s atmosphere are divided into four categories: cataloguing and databases of the re-entry of space objects, lifetime and re-entry trajectory predictions, break-up models after re-entry and multiple debris distribution predictions, and ground impact probability models. In this study, we focused on re-entry prediction, including orbital lifetime assessments, for space situational awareness systems. Re-entry predictions are very difficult and are affected by various sources of uncertainty. In particular, during uncontrolled re-entry, large spacecraft may break into several pieces of debris, and the surviving fragments can be a significant hazard for persons and properties on the ground. In recent years, specific methods and procedures have been developed to provide clear information for predicting and analyzing the re-entry of space objects and for ground-risk assessments. Representative tools include object reentry survival analysis tool (ORSAT) and debris assessment software (DAS) developed by National Aeronautics and Space Administration (NASA), spacecraft atmospheric re-entry and aerothermal break-up (SCARAB) and debris risk assessment and mitigation analysis (DRAMA) developed by European Space Agency (ESA), and semi-analytic tool for end of life analysis (STELA) developed by Centre National d’Etudes Spatiales (CNES). In this study, various surveys of existing re-entry space objects are reviewed, and an efficient re-entry prediction technique is suggested based on STELA, the life-cycle analysis tool for satellites, and DRAMA, a re-entry analysis tool. To verify the proposed method, the re-entry of the Tiangong-1 Space Lab, which is expected to re-enter Earth’s atmosphere shortly, was simulated. Eventually, these results will provide a basis for space situational awareness risk analyses of the re-entry of space objects.
Reliability and risk assessment of structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1991-01-01
Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.
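The core computation behind items (1), (2) and (4) — propagating uncertainty in primitive structural variables through to a failure probability — can be illustrated with the classic stress-strength interference problem. The distributions below are assumed for illustration; with normal load and strength, the Monte Carlo estimate can be checked against the closed form Φ((μ_L − μ_S)/√(σ_S² + σ_L²)):

```python
import math
import random

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mc_failure_probability(mu_s, sd_s, mu_l, sd_l, n=200_000, seed=42):
    """Monte Carlo estimate of P(strength < load) for normal variables."""
    rng = random.Random(seed)
    failures = sum(rng.gauss(mu_s, sd_s) < rng.gauss(mu_l, sd_l)
                   for _ in range(n))
    return failures / n

def analytic_failure_probability(mu_s, sd_s, mu_l, sd_l):
    return phi((mu_l - mu_s) / math.hypot(sd_s, sd_l))

# Assumed primitive variables: strength ~ N(500, 40), load ~ N(350, 30) MPa.
```

The sampling approach generalizes directly to the non-normal, correlated primitive variables of a real probabilistic structural analysis, where no closed form exists.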
Montorselli, Niccolò Brachetti; Lombardini, Carolina; Magagnotti, Natascia; Marchi, Enrico; Neri, Francesco; Picchi, Gianni; Spinelli, Raffaele
2010-11-01
The study compared the performance of four different logging crews with respect to productivity, organization and safety. To this purpose, the authors developed a data collection method capable of providing a quantitative analysis of risk-taking behavior. Four crews were tested under the same working conditions, representative of close-to-nature alpine forestry. Motor-manual working methods were applied, since these methods are still prevalent in the specific study area, despite the growing popularity of mechanical processors. Crews from public companies showed a significantly lower frequency of risk-taking behavior. The best safety performance was offered by the only (public) crew that had received formal safety training. The study appears to refute the common assumption that safety practice is inversely proportional to productivity. Instead, productivity is increased by introducing more efficient working methods and equipment. The quantitative analysis of risk-taking behavior developed in this study can be applied to a number of industrial fields besides forestry. Characterizing risk-taking behavior for a given case may eventually lead to the development of custom-made training programmes, which can address problem areas while avoiding weakening the message with redundant information. In the specific case of logging crews in the central Alps, the study suggests that current training courses may be weak on ergonomics, and advocates a staged training programme, focusing first on accident reduction and then expanding to the prevention of chronic illness. 2010 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baumann, Brian C.; He, Jiwei; Hwang, Wei-Ting
Purpose: To inform prospective trials of adjuvant radiation therapy (adj-RT) for bladder cancer after radical cystectomy, a locoregional failure (LF) risk stratification was proposed. This stratification was developed and validated using surgical databases that may not reflect the outcomes expected in prospective trials. Our purpose was to assess sources of bias that may affect the stratification model's validity or alter the LF risk estimates for each subgroup: time bias due to evolving surgical techniques; trial accrual bias due to inclusion of patients who would be ineligible for adj-RT trials because of early disease progression, death, or loss to follow-up shortly after cystectomy; bias due to different statistical methods to estimate LF; and subgrouping bias due to different definitions of the LF subgroups. Methods and Materials: The LF risk stratification was developed using a single-institution cohort (n=442, 1990-2008) and the multi-institutional SWOG 8710 cohort (n=264, 1987-1998) treated with radical cystectomy with or without chemotherapy. We evaluated the sensitivity of the stratification to sources of bias using Fine-Gray regression and Kaplan-Meier analyses. Results: Year of radical cystectomy was not associated with LF risk on univariate or multivariate analysis after controlling for risk group. By use of more stringent inclusion criteria, 26 SWOG patients (10%) and 60 patients from the single-institution cohort (14%) were excluded. Analysis of the remaining patients confirmed 3 subgroups with significantly different LF risks with 3-year rates of 7%, 17%, and 36%, respectively (P<.01), nearly identical to the rates without correcting for trial accrual bias. Kaplan-Meier techniques estimated higher subgroup LF rates than competing risk analysis. The subgroup definitions used in the NRG-GU001 adj-RT trial were validated.
Conclusions: These sources of bias did not invalidate the LF risk stratification or substantially change the model's LF estimates.
Doohan, Isabelle; Björnstig, Ulf; Östlund, Ulrika; Saveman, Britt-Inger
2017-04-01
The aim of this study was to explore physical and mental consequences and injury mechanisms among bus crash survivors to identify aspects that influence recovery. The study participants were the total population of survivors (N=56) from a bus crash in Sweden. The study had a mixed-methods design that provided quantitative and qualitative data on injuries, mental well-being, and experiences. Results from descriptive statistics and qualitative thematic analysis were interpreted and integrated in a mixed-methods analysis. Among the survivors, 11 passengers (20%) sustained moderate to severe injuries, and the remaining 45 (80%) had minor or no physical injuries. Two-thirds of the survivors screened for posttraumatic stress disorder (PTSD) risk were assessed, during the period of one to three months after the bus crash, as not being at-risk, and the remaining one-third were at-risk. The thematic analysis resulted in themes covering the consequences and varying aspects that affected the survivors' recoveries. The integrated findings are in the form of four "core cases" of survivors who represent a combination of characteristics: injury severity, mental well-being, social context, and other aspects hindering and facilitating recovery. Core case Avery represents a survivor who had minor or no injuries and who demonstrated a successful mental recovery. Core case Blair represents a survivor with moderate to severe injuries who experienced a successful mental recovery. Core case Casey represents a survivor who sustained minor injuries or no injuries in the crash but who was at-risk of developing PTSD. Core case Daryl represents a survivor who was at-risk of developing PTSD and who also sustained moderate to severe injuries in the crash. 
The present study provides a multi-faceted understanding of mass-casualty incident (MCI) survivors (i.e., having minor injuries does not always correspond to minimal risk for PTSD, and moderate to severe injuries do not always correspond to increased risk for PTSD). Injury mitigation measures (e.g., safer roadside material and anti-lacerative windows) would reduce the consequences of bus crashes. A well-educated rescue team and a compassionate and competent social environment will facilitate recovery. Doohan I, Björnstig U, Östlund U, Saveman BI. Exploring injury panorama, consequences, and recovery among bus crash survivors: a mixed-methods research study. Prehosp Disaster Med. 2017;32(2):165-174.
Agro-ecoregionalization of Iowa using multivariate geographical clustering
Carol L. Williams; William W. Hargrove; Matt Leibman; David E. James
2008-01-01
Agro-ecoregionalization is the categorization of landscapes for use in crop suitability analysis, strategic agroeconomic development, risk analysis, and other purposes. Past agro-ecoregionalizations have been subjective, expert-opinion driven, crop-specific, and unsuitable for statistical extrapolation. Use of quantitative analytical methods provides an opportunity for...
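The abstract above does not include the authors' procedure, but the general idea of multivariate geographical clustering can be sketched as follows: each map cell is described by several environmental variables, the variables are standardized so each contributes equally to the distance metric, and k-means groups cells into candidate agro-ecoregions. The variable names and data below are hypothetical illustrations, not the Iowa dataset.

```python
# Minimal sketch of multivariate geographical clustering (assumed workflow,
# not the authors' code): standardize per-cell variables, then run k-means.
import math
import random

def standardize(cells):
    """Scale each variable to zero mean, unit variance so all contribute equally."""
    dims = len(cells[0])
    means = [sum(c[d] for c in cells) / len(cells) for d in range(dims)]
    sds = [math.sqrt(sum((c[d] - means[d]) ** 2 for c in cells) / len(cells))
           for d in range(dims)]
    return [[(c[d] - means[d]) / sds[d] for d in range(dims)] for c in cells]

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: returns a cluster label per point and the k centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each cell to its nearest centroid (Euclidean distance).
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        # Recompute each centroid as the mean of its member cells.
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = [sum(col) / len(members) for col in zip(*members)]
    return labels, centroids

# Hypothetical per-cell variables: precipitation, growing degree days, soil pH.
rng = random.Random(1)
cells = [[rng.gauss(0, 1), rng.gauss(10, 3), rng.gauss(6, 0.5)] for _ in range(300)]
labels, centroids = kmeans(standardize(cells), k=4)
```

Because the labels are attached to georeferenced cells, mapping them back to their coordinates yields the candidate agro-ecoregion map; the standardization step is what makes the result reproducible and statistically defensible, in contrast to the expert-opinion regionalizations the abstract criticizes.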
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco
Environmental auditing is a main issue for any production plant, and assessing environmental performance is crucial to identifying risk factors. The complexity of current plants arises from interactions among technological, human, and organizational system components, which are often transient and not easily detectable. Auditing thus requires a systemic perspective, rather than a focus on individual behaviors, as has emerged in recent safety research on socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work by applying a recent systemic method, the Functional Resonance Analysis Method (FRAM), to define the system structure dynamically. We also present an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk audit in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions, and comprehensive environmental monitoring indicators. Highlights: • We discuss the relevance of a systemic, risk-based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.
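The abstract does not specify the authors' model, but the semi-quantitative idea can be illustrated: each FRAM function has an intrinsic output variability, upstream variability amplifies downstream variability (functional resonance), and Monte Carlo runs estimate how often a monitored function exceeds an acceptable threshold. The network, variability scores, and threshold below are hypothetical placeholders.

```python
# Hedged sketch of a semi-quantitative, Monte Carlo FRAM-style analysis
# (illustrative assumptions, not the authors' sinter-plant model).
import random

# Hypothetical functional network: function -> upstream functions it depends on.
# Insertion order matches dependency order, so a single pass propagates correctly.
NETWORK = {
    "sample_emissions": [],
    "analyze_sample": ["sample_emissions"],
    "report_compliance": ["analyze_sample"],
}
# Hypothetical intrinsic variability scores (0 = stable, higher = more variable).
INTRINSIC = {
    "sample_emissions": 0.3,
    "analyze_sample": 0.5,
    "report_compliance": 0.2,
}

def run_once(rng):
    """Sample each function's variability and propagate it downstream."""
    score = {}
    for fn, upstream_fns in NETWORK.items():
        upstream = max((score[u] for u in upstream_fns), default=0.0)
        # Functional resonance: own variability amplified by upstream variability.
        score[fn] = rng.uniform(0.0, INTRINSIC[fn]) * (1.0 + upstream)
    return score

def monte_carlo(n=10_000, threshold=0.2, seed=42):
    """Estimate how often the monitored function exceeds the threshold."""
    rng = random.Random(seed)
    exceed = sum(run_once(rng)["report_compliance"] > threshold for _ in range(n))
    return exceed / n

risk = monte_carlo()
```

Functions whose simulated exceedance frequency is high would be flagged as the "potential critical activities" the abstract mentions, and become the targets of mitigating actions and monitoring indicators.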
Hossain, Mian B
2005-09-01
With a population of over 131 million and a fertility rate of 29.9 per 1000, population growth constitutes a primary threat to continued economic growth and development in Bangladesh. One strategy that has been used to curb further increases in fertility in Bangladesh involves family planning outreach workers who travel throughout rural and urban areas educating women about contraceptive alternatives. This study uses a longitudinal database to assess the impact of family planning outreach workers' contact upon contraceptive switching and upon the risk of an unintended pregnancy. Using longitudinal data on contraceptive use from the Operations Research Project (ORP) of the International Centre for Diarrhoeal Disease Research (ICDDR,B) in Bangladesh, multiple-decrement life table analysis and multilevel, discrete-time competing risk hazards models were used to estimate the cumulative probabilities of switching to an alternative form of contraceptive use after a woman engaged in a discussion with an outreach worker. After controlling for the effects of socio-demographic and economic characteristics, the analysis revealed that family planning outreach workers' contact with women significantly decreases the risk of transitioning to the non-use of contraceptives. This contact also reduces the risk of an unintended pregnancy. Family planning workers' contact with women is associated with an increased likelihood of a woman switching from one modern method to another modern method. The study results indicate that side-effects and other method-related reasons are the two primary reasons for contraceptive discontinuation in rural Bangladesh.
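The multiple-decrement life table mentioned above can be sketched in miniature: in each interval of contraceptive use a woman can exit by switching methods or by discontinuing, and the cumulative probability of each exit route accumulates interval-specific risks weighted by the probability of still being in use. The counts below are invented for illustration, not the ORP data.

```python
# Toy multiple-decrement life table (hypothetical counts, not the study data):
# two competing exit routes from contraceptive use, switching vs. discontinuing.

# Per-interval data: (women entering the interval, switched, discontinued).
intervals = [
    (1000, 40, 30),
    (930, 35, 25),
    (870, 30, 20),
]

def cumulative_incidence(intervals):
    """Return cumulative switching/discontinuation probabilities and survivors."""
    surv = 1.0                    # probability of remaining in continuous use
    cum_switch = cum_disc = 0.0
    for n, switched, discontinued in intervals:
        q_switch = switched / n         # interval-specific switching risk
        q_disc = discontinued / n       # interval-specific discontinuation risk
        # Weight each interval risk by the probability of reaching the interval.
        cum_switch += surv * q_switch
        cum_disc += surv * q_disc
        surv *= 1.0 - q_switch - q_disc
    return cum_switch, cum_disc, surv

cum_switch, cum_disc, still_using = cumulative_incidence(intervals)
```

The study's competing-risk hazards models extend this bookkeeping with covariates (including outreach-worker contact) and multilevel structure, but the decomposition of exits into competing routes is the same: the three quantities returned here always sum to one.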
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
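The documented PFA software is not reproduced here, but the core idea of modifying an analysis-based failure probability distribution to reflect test or flight experience can be illustrated with a conjugate Beta-Binomial update. The prior parameters and test counts below are hypothetical stand-ins for the engineering-analysis inputs the abstract describes.

```python
# Hedged sketch (not the NASA PFA software): updating a prior failure
# probability distribution with observed test/flight outcomes via a
# Beta-Binomial conjugate update.

def beta_mean(a, b):
    """Mean of a Beta(a, b) distribution over failure probability."""
    return a / (a + b)

def update_with_experience(a, b, trials, failures):
    """Posterior Beta parameters after observing test/flight outcomes."""
    return a + failures, b + (trials - failures)

# Prior from engineering analysis of a failure mode (hypothetical numbers):
a0, b0 = 0.5, 50.0                    # prior mean failure probability ~1%
# Observed experience: 30 hot-fire tests with no failures (hypothetical).
a1, b1 = update_with_experience(a0, b0, trials=30, failures=0)

prior_p = beta_mean(a0, b0)
post_p = beta_mean(a1, b1)
```

A run of failure-free tests pulls the posterior mean below the analysis-based prior, which mirrors the abstract's point: sparse, expensive test experience and engineering models each inform the failure probability, and neither alone has to carry the assessment.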