Failure mode effect analysis and fault tree analysis as a combined methodology in risk management
NASA Astrophysics Data System (ADS)
Wessiani, N. A.; Yoshio, F.
2018-04-01
Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most studies choose only one of these two methods for their risk management methodology, even though combining them reduces the drawbacks each method has when implemented separately. This paper combines FMEA and FTA into a single methodology for assessing risk. A case study in a metal company illustrates how this methodology can be implemented: the combined methodology assesses the internal risks that occur in the production process, and those internal risks are then mitigated according to their risk levels.
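The combination described above can be sketched in a few lines: FMEA ranks failure modes by a Risk Priority Number (RPN), and FTA then quantifies the top-ranked modes through gate logic. A minimal sketch assuming independent basic events; all failure modes, ratings, and probabilities below are hypothetical, not taken from the case study.

```python
# Hedged sketch: combining FMEA risk ranking with a simple fault tree.
# The failure modes, ratings, and probabilities below are hypothetical.

def rpn(severity, occurrence, detection):
    """FMEA Risk Priority Number: product of the three 1-10 ratings."""
    return severity * occurrence * detection

def or_gate(probs):
    """Fault-tree OR gate: P(top) = 1 - prod(1 - p_i) for independent events."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """Fault-tree AND gate: P(top) = prod(p_i) for independent events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# FMEA step: rank failure modes of a hypothetical production line.
modes = {
    "pump seal leak": rpn(severity=8, occurrence=4, detection=6),
    "sensor drift": rpn(severity=5, occurrence=6, detection=3),
    "operator error": rpn(severity=7, occurrence=3, detection=7),
}
ranked = sorted(modes, key=modes.get, reverse=True)

# FTA step: quantify the top-ranked mode as an OR of two basic events.
p_top = or_gate([0.01, 0.005])  # hypothetical basic-event probabilities
```

Ranking by RPN tells the analyst where to spend FTA effort; the fault tree then supplies the quantitative probability that the RPN scale alone cannot.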
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
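The four-step scheme (define the scenario, identify losses, quantify losses, integrate losses) lends itself to a small sketch. The inverted-normal loss shape below is a common form in this literature; the parameter values and the two deviation variables are hypothetical, and this is not the paper's actual model.

```python
import math

# Hedged sketch of an inverted-normal loss function (INLF), one of the
# loss-function shapes this kind of methodology draws on. The parameters
# are hypothetical: max_loss is the worst-case economic loss, target is
# the process set point, and gamma controls how fast loss grows with
# deviation from the target.

def inverted_normal_loss(y, target, max_loss, gamma):
    """Loss is 0 at the target and rises toward max_loss as |y - target| grows."""
    dev = y - target
    return max_loss * (1.0 - math.exp(-dev * dev / (2.0 * gamma * gamma)))

def total_loss(deviations, loss_funcs):
    """Integration step: sum the individual losses for a scenario."""
    return sum(f(y) for f, y in zip(loss_funcs, deviations))

# Hypothetical scenario: pressure (bar) and temperature (K) deviations.
pressure_loss = lambda y: inverted_normal_loss(y, target=10.0, max_loss=5e5, gamma=2.0)
temp_loss = lambda y: inverted_normal_loss(y, target=350.0, max_loss=2e5, gamma=15.0)

loss = total_loss([13.0, 380.0], [pressure_loss, temp_loss])
```

The appeal of the loss-function view is that a process deviation maps directly to money, so risk monitoring and economics share one scale.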
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
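The core PFA idea (propagating uncertainty about model parameters through an engineering failure model to obtain a failure-probability estimate) can be illustrated with a toy stress-strength model. A minimal Monte Carlo sketch; the distributions and parameter values are hypothetical, and real PFA applications use validated engineering models and statistical updating against test and flight data.

```python
import random

# Hedged sketch of the PFA idea: sample uncertain model parameters and
# per-flight variability together, and count how often load exceeds
# strength. All distributions and numbers are hypothetical.

def failure_probability(n_samples=100_000, seed=42):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        # Epistemic uncertainty about a model parameter (mean strength).
        mean_strength = rng.gauss(100.0, 5.0)
        # Aleatory variability: this sample's load and realized strength.
        load = rng.gauss(70.0, 12.0)
        strength = rng.gauss(mean_strength, 8.0)
        if load > strength:
            failures += 1
    return failures / n_samples

p_fail = failure_probability()
```

In the analytic limit, strength minus load here is normal with mean 30 and standard deviation near 15.3, so the failure probability is roughly 2 to 3 percent; the sampled estimate lands in that band.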
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of the methodology presented here is that it provides a quantitative estimate of flooding risk before and after investment in non-structural risk mitigation measures. This can be of great interest to decision makers, as it provides rational and solid information.
DOT National Transportation Integrated Search
2006-11-01
This report discusses data acquisition and analysis for grade crossing risk analysis at the proposed San Joaquin High-Speed Rail Corridor in San Joaquin, California, and documents the data acquisition and analysis methodologies used to collect and an...
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As conventionally done, strategies for incorporating accident-prevention measures in a hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps (risk assessment and hazard reduction, or safety, measures) are not linked interactively in existing methodologies. This prevents a quantitative assessment of the impact of safety measures on risk control. We have attempted to develop a methodology in which the risk assessment steps are interactively linked with the implementation of safety measures. The resultant system indicates the extent to which each successive safety measure reduces risk. It also indicates, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology is illustrated with a case study.
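The interactive linkage the authors describe (re-estimating risk after each candidate safety measure and recording how much risk it removes) can be sketched as a simple loop. The measures, reduction factors, and risk values below are hypothetical placeholders, not the paper's MCAA/PFTA machinery.

```python
# Hedged sketch: re-run the risk estimate after each safety measure and
# record the risk each one removes, stopping once a target is met.
# Units, measures, and reduction factors are hypothetical.

def apply_measures(base_risk, measures, target):
    """Apply measures in order until risk falls to or below the target."""
    risk, applied = base_risk, []
    for name, reduction_factor in measures:
        if risk <= target:
            break
        new_risk = risk * (1.0 - reduction_factor)
        applied.append((name, risk - new_risk))  # risk removed by this measure
        risk = new_risk
    return risk, applied

measures = [
    ("relief valve resizing", 0.40),
    ("gas detection + interlock", 0.50),
    ("dike around storage", 0.30),
]
final_risk, applied = apply_measures(base_risk=1e-3, measures=measures, target=1e-4)
```

If the loop exhausts all measures without reaching the target, the unit cannot be made "safe" under the chosen criterion, which is exactly the kind of verdict the abstract says the system delivers.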
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled as discrete and Boolean variables with constant failure rates via fault trees. Nevertheless, in many cases it is not possible to perform an efficient analysis using only discrete and Boolean variables. The approach put forward by the proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may have any probability distribution. This makes the methodology particularly useful in cases where the available data for quantification of hazardous event probabilities are scarce or nonexistent, where there is dependence among events, or where nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. Thus, this natural gas is liquefied for shipping, and the storage and regasification process usually occurs at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
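A hybrid BN mixes continuous and discrete nodes. One simple way to reason about such a network is forward Monte Carlo sampling, sketched below for a two-node toy net (a continuous leak rate feeding a discrete ignition event). All distributions and parameters are hypothetical, and dedicated BN engines are what practitioners actually use.

```python
import random

# Hedged sketch of inference in a tiny hybrid Bayesian network via forward
# Monte Carlo: a continuous parent (leak rate) drives a discrete child
# (ignition). Distributions and thresholds are hypothetical.

def p_ignition(leak_rate):
    """Conditional probability of the discrete child, given the continuous parent."""
    return min(0.5, 0.02 * leak_rate)

def estimate_p_ignition(n=200_000, seed=7):
    rng = random.Random(seed)
    ignitions = 0
    for _ in range(n):
        leak = rng.expovariate(1.0)          # continuous node, mean 1.0
        if rng.random() < p_ignition(leak):  # discrete child node
            ignitions += 1
    return ignitions / n

p = estimate_p_ignition()
```

No conditional probability table is needed for the continuous parent, which is the practical advantage the abstract highlights: the parent keeps its full distribution rather than being forced into discrete bins.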
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk-significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk-significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
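A semi-quantitative scheme of the kind the paper advocates typically rates each risk and each benefit on ordinal scales and compares the totals. A minimal sketch; the categories, ratings, and decision rule below are hypothetical illustrations, not the paper's actual scoring tables.

```python
# Hedged sketch of a semi-quantitative GMO risk-benefit score: each item
# is rated on ordinal 1-5 scales for likelihood and magnitude, and the
# two sides are totalled and compared. Categories and ratings below are
# hypothetical, not taken from the South African case study.

def score(items):
    """Sum of likelihood x magnitude over all rated items."""
    return sum(likelihood * magnitude for likelihood, magnitude in items.values())

risks = {
    "gene flow to wild relatives": (2, 3),
    "effects on non-target insects": (2, 2),
}
benefits = {
    "reduced insecticide use": (4, 4),
    "higher yield for smallholders": (3, 4),
}

risk_score, benefit_score = score(risks), score(benefits)
ratio = benefit_score / risk_score  # > 1 favours release under this toy rule
```

The attraction for regulators with limited resources is that the ratings demand expert judgment rather than extensive field data, while the arithmetic keeps the comparison transparent and auditable.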
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
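OBEST's probabilistic branching can be sketched as a depth-first enumeration of object responses, with each leaf scenario carrying the product of its branch probabilities. The objects, branches, and probabilities below are hypothetical, loosely echoing the runway-incursion setting rather than reproducing the article's model.

```python
# Hedged sketch of OBEST-style probabilistic branching: each object reacts
# to the evolving state with several possible responses, and enumerating
# the branches yields every scenario with a likelihood attached.
# Objects, actions, and probabilities are hypothetical.

objects = [
    ("tower", [("issues hold", 0.9), ("misses conflict", 0.1)]),
    ("pilot", [("holds short", 0.95), ("crosses runway", 0.05)]),
]

def enumerate_scenarios(objs, prefix=(), prob=1.0):
    """Depth-first enumeration; each leaf is a full scenario with its probability."""
    if not objs:
        return [(prefix, prob)]
    name, branches = objs[0]
    scenarios = []
    for action, p in branches:
        scenarios += enumerate_scenarios(objs[1:], prefix + ((name, action),), prob * p)
    return scenarios

scenarios = enumerate_scenarios(objects)
total = sum(p for _, p in scenarios)  # scenario probabilities sum to 1
```

Because the likelihood is built up branch by branch, no separate quantification pass is needed, which is the "automatic likelihood estimate" the abstract refers to.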
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
By Amanda Donnelly (thesis). This work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and collaboration, and risk assessment. The analysis dissects and compares three potential design methodologies, including net assessment, scenarios and...
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert opinion for quantifying uncertainties as probability distributions so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology's development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
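One common aggregation step consistent with the report's goal (a single probability distribution from multiple experts) is a weighted linear opinion pool. A minimal sketch; the calibration weights and triangular estimates are hypothetical, and the report's ten-phase procedure covers much more (elicitation design, calibration exercises, feedback rounds).

```python
# Hedged sketch of a weighted linear opinion pool over expert-supplied
# triangular estimates (low, mode, high). Weights and estimates are
# hypothetical; real calibration weights come from seed-question exercises.

def pooled_estimate(experts):
    """Weighted mean of each expert's triangular-distribution mean."""
    total_w = sum(w for w, _ in experts)
    mean = 0.0
    for w, (low, mode, high) in experts:
        mean += (w / total_w) * (low + mode + high) / 3.0
    return mean

# (calibration_weight, (low, mode, high)) for a hypothetical
# vehicle dry-mass margin, in percent.
experts = [
    (0.5, (5.0, 8.0, 14.0)),
    (0.3, (4.0, 7.0, 10.0)),
    (0.2, (6.0, 9.0, 15.0)),
]
estimate = pooled_estimate(experts)
```

Weighting by calibration performance rather than equally is the design choice that lets demonstrably better-calibrated experts pull the pooled distribution toward their view.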
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.
1996-08-01
The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. To accomplish this goal, the ACRAM panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but that data is not provided by this document.
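Aircraft crash frequency assessments of this kind commonly use a four-factor model (as in DOE-STD-3014): the frequency is a sum over flight categories of N (operations per year), P (crash rate per operation), f(x, y) (crash location probability per square mile near the site), and A (facility effective area). A sketch with hypothetical placeholder numbers; the generic data in the report itself would supply N, P, and f(x, y).

```python
# Hedged sketch of the widely used four-factor aircraft crash frequency
# model: F = sum over flight categories of N * P * f(x, y) * A.
# All numbers below are hypothetical placeholders, not ACRAM data.

def crash_frequency(categories, effective_area_sq_mi):
    """Expected crashes per year into a facility of the given effective area."""
    return sum(n_ops * p_crash * f_xy * effective_area_sq_mi
               for n_ops, p_crash, f_xy in categories)

# (N operations/yr, P crashes/operation, f(x, y) per square mile) per category.
categories = [
    (50_000, 1e-7, 1e-2),  # general aviation near a hypothetical airport
    (10_000, 5e-8, 5e-3),  # commercial air carrier
]
freq = crash_frequency(categories, effective_area_sq_mi=0.01)
```

Summing by flight category matters because crash rates and flight paths differ by orders of magnitude between general aviation, air carrier, and military traffic.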
Cyber-Informed Engineering: The Need for a New Risk Informed and Design Methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Joseph Daniel; Anderson, Robert Stephen
Current engineering and risk management methodologies do not contain the foundational assumptions required to address the intelligent adversary's capabilities in malevolent cyber attacks. Current methodologies focus on equipment failures or human error as initiating events for a hazard, while cyber attacks use the functionality of a trusted system to perform operations outside of the intended design and without the operator's knowledge. These threats can bypass or manipulate traditionally engineered safety barriers and present false information, invalidating the fundamental basis of a safety analysis. Cyber threats must be fundamentally analyzed from a completely new perspective, in which neither equipment nor human operation can be fully trusted. A new risk analysis and design methodology needs to be developed to address this rapidly evolving threatscape.
RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Justin Leigh
This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and used to address the management of risk methodologically. The framework unites the hierarchical character of decision making, the multiple decision makers at separate levels of the hierarchy, the multiobjective character of large-scale systems, and both the quantitative/empirical and the qualitative/normative/judgmental aspects. Its methodological components are hierarchical-multiobjective coordination, the risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of the trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to address inherent risk successfully.
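The "risk of extreme events" component is associated with conditional expected value measures: alongside the ordinary expected damage, one reports the expected damage given that an upper-tail partition is exceeded. A minimal sketch with hypothetical damage outcomes; this is an illustration of the idea, not the framework's full machinery.

```python
# Hedged sketch of a conditional expected value of extreme events:
# the mean of the worst (1 - exceedance) fraction of outcomes, reported
# alongside the ordinary mean. Damage values are hypothetical.

def expected(values):
    """Ordinary expected value over equally likely outcomes."""
    return sum(values) / len(values)

def conditional_expected_extreme(values, exceedance=0.9):
    """Mean of the worst (1 - exceedance) fraction of outcomes."""
    ordered = sorted(values)
    cut = int(len(ordered) * exceedance)
    tail = ordered[cut:]
    return sum(tail) / len(tail)

damages = [1, 2, 2, 3, 3, 4, 5, 8, 20, 90]  # hypothetical damage outcomes
mean_damage = expected(damages)
extreme_damage = conditional_expected_extreme(damages, 0.9)
```

The point of carrying both numbers is that an average alone hides catastrophic tails: here the ordinary mean is modest while the conditional extreme value is dominated by the single worst outcome.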
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-20
..., Risk Management and Analysis (RAM) ACTION: Notice of request for public comments. SUMMARY: The... of 1995. Title of Information Collection: Risk Analysis and Management. OMB Control Number: None.... Methodology: The State Department is implementing a Risk Analysis and Management Program to vet potential...
Development of risk-based decision methodology for facility design.
DOT National Transportation Integrated Search
2014-06-01
This report develops a methodology for CDOT to use in the risk analysis of various types of facilities and provides illustrative examples for the use of the proposed framework. An overview of the current practices and applications to illustrate t...
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or to the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to support an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
Risk analysis within environmental impact assessment of proposed construction activity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeleňáková, Martina; Zvijáková, Lenka
Environmental impact assessment is an important process, prior to approval of an investment plan, providing a detailed examination of the likely and foreseeable impacts of a proposed construction activity on the environment. The objective of this paper is to develop a specific methodology for the analysis and evaluation of the environmental impacts of selected constructions (flood protection structures) using risk analysis methods. Applying the methodology designed for the environmental impact assessment process will develop assumptions for further improvements or more effective implementation and performance of this process. The main objective of the paper is to improve the implementation of the environmental impact assessment process; through the use of risk analysis methods in that process, the set objective has been achieved. - Highlights: This paper is informed by an effort to develop research with the aim of: • Improving existing qualitative and quantitative methods for assessing the impacts • A better understanding of relations between probabilities and consequences • Methodology for the EIA of flood protection constructions based on risk analysis • Creative approaches in the search for environmentally friendly proposed activities.
Hazmat transport: a methodological framework for the risk analysis of marshalling yards.
Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino
2007-08-17
A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case study. The results showed that "non-accident-induced" leaks in marshalling yards contribute substantially to the overall risk associated with these zones. Furthermore, the results confirmed the considerable contribution of these fixed installations to the overall risk associated with "hazmat" transportation.
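The way the three typologies combine into the yard's overall figure, and the finding that non-accident-induced leaks dominate, can be mirrored in a short sketch. The frequencies below are hypothetical placeholders, not the case-study values.

```python
# Hedged sketch: combine the three accident typologies' expected release
# frequencies into an overall yard figure and report each share.
# Frequencies (per year) are hypothetical placeholders.

typologies = {
    "in-transit-accident-induced": 2.0e-7,
    "shunting-accident-induced": 5.0e-7,
    "non-accident-induced": 1.3e-6,
}

total = sum(typologies.values())
shares = {name: f / total for name, f in typologies.items()}
dominant = max(shares, key=shares.get)
```

Reporting shares rather than raw frequencies makes the dominant contributor visible at a glance, which is how a finding like "non-accident-induced leaks matter most" surfaces from the quantified analysis.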
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks among all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of predefined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Development/Modernization of an Advanced Non-Light Water Reactor Probabilistic Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henneke, Dennis W.; Robinson, James
In 2015, GE Hitachi Nuclear Energy (GEH) teamed with Argonne National Laboratory (Argonne) to perform Research and Development (R&D) of next-generation Probabilistic Risk Assessment (PRA) methodologies for the modernization of an advanced non-Light Water Reactor (non-LWR) PRA. This effort built upon a PRA developed in the early 1990s for GEH’s Power Reactor Inherently Safe Module (PRISM) Sodium Fast Reactor (SFR). The work had four main tasks: internal events development modeling the risk from the reactor for hazards occurring at-power internal to the plant; an all hazards scoping review to analyze the risk at a high level from external hazards such as earthquakes and high winds; an all modes scoping review to understand the risk at a high level from operating modes other than at-power; and risk insights to integrate the results from each of the three phases above. To achieve these objectives, GEH and Argonne used and adapted proven PRA methodologies and techniques to build a modern non-LWR all hazards/all modes PRA. The teams also advanced non-LWR PRA methodologies, which is an important outcome from this work. This report summarizes the project outcomes in two major phases. The first phase presents the methodologies developed for non-LWR PRAs. The methodologies are grouped by scope, from Internal Events At-Power (IEAP) to hazards analysis to modes analysis. The second phase presents details of the PRISM PRA model which was developed as a validation of the non-LWR methodologies. The PRISM PRA was performed in detail for IEAP, and at a broader level for hazards and modes. In addition to contributing methodologies, this project developed risk insights applicable to non-LWR PRA, including focus-areas for future R&D, and conclusions about the PRISM design.
Avaliani, S L; Novikov, S M; Shashina, T A; Dodina, N S; Kislitsin, V A; Mishina, A L
2014-01-01
The lack of an adequate legislative and regulatory framework for minimizing health risks in the field of environmental protection is an obstacle to the application of the risk analysis methodology as a leading tool of administrative activity in Russia. The "Principles of the state policy in the sphere of ensuring chemical and biological safety of the Russian Federation for the period up to 2025 and beyond", approved by the President of the Russian Federation on 1 November 2013, No. PR-2573, are aimed at legal support for the health risk analysis methodology. The article proposes the main stages of operative control of environmental quality that lead to the reduction of health risk to an acceptable level. Further improvement of the health risk analysis methodology in Russia should contribute to the implementation of state policy in the sphere of chemical and biological safety through the introduction of complex measures to neutralize chemical and biological threats to human health and the environment, as well as evaluation of the economic effectiveness of these measures. The primary step should be the legislative securing of a quantitative value for the term "acceptable risk".
Identifying Items to Assess Methodological Quality in Physical Therapy Trials: A Factor Analysis
Cummings, Greta G.; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-01-01
Background Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. Objective The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). Design A methodological research design was used, and an EFA was performed. Methods Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Results Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Limitation Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. Conclusions To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. 
Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. PMID:24786942
Identifying items to assess methodological quality in physical therapy trials: a factor analysis.
Armijo-Olivo, Susan; Cummings, Greta G; Fuentes, Jorge; Saltaji, Humam; Ha, Christine; Chisholm, Annabritt; Pasichnyk, Dion; Rogers, Todd
2014-09-01
Numerous tools and individual items have been proposed to assess the methodological quality of randomized controlled trials (RCTs). The frequency of use of these items varies according to health area, which suggests a lack of agreement regarding their relevance to trial quality or risk of bias. The objectives of this study were: (1) to identify the underlying component structure of items and (2) to determine relevant items to evaluate the quality and risk of bias of trials in physical therapy by using an exploratory factor analysis (EFA). A methodological research design was used, and an EFA was performed. Randomized controlled trials used for this study were randomly selected from searches of the Cochrane Database of Systematic Reviews. Two reviewers used 45 items gathered from 7 different quality tools to assess the methodological quality of the RCTs. An exploratory factor analysis was conducted using the principal axis factoring (PAF) method followed by varimax rotation. Principal axis factoring identified 34 items loaded on 9 common factors: (1) selection bias; (2) performance and detection bias; (3) eligibility, intervention details, and description of outcome measures; (4) psychometric properties of the main outcome; (5) contamination and adherence to treatment; (6) attrition bias; (7) data analysis; (8) sample size; and (9) control and placebo adequacy. Because of the exploratory nature of the results, a confirmatory factor analysis is needed to validate this model. To the authors' knowledge, this is the first factor analysis to explore the underlying component items used to evaluate the methodological quality or risk of bias of RCTs in physical therapy. The items and factors represent a starting point for evaluating the methodological quality and risk of bias in physical therapy trials. Empirical evidence of the association among these items with treatment effects and a confirmatory factor analysis of these results are needed to validate these items. 
© 2014 American Physical Therapy Association.
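The PAF-plus-varimax procedure described in the factor analysis abstracts above can be sketched numerically. The following is a minimal NumPy implementation of the varimax rotation step only (the `varimax` function name and the SVD-based update are choices of this sketch, not the authors' code); since varimax is an orthogonal rotation, it must preserve each item's communality, which makes the sketch easy to verify:

```python
import numpy as np

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Varimax rotation of a p-by-k factor loading matrix (SVD-based iteration)."""
    p, k = loadings.shape
    R = np.eye(k)          # accumulated orthogonal rotation
    var = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        # Gradient-like target: cubed loadings minus a column-variance correction
        u, s, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag(np.sum(L**2, axis=0)))
        )
        R = u @ vt
        new_var = np.sum(s)
        if new_var - var < tol:   # stop when the varimax criterion stalls
            break
        var = new_var
    return loadings @ R
```

Because the returned matrix is `loadings @ R` with `R` orthogonal, the row sums of squared loadings (communalities) are unchanged; only the distribution of loading weight across factors is simplified.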
Method and system for dynamic probabilistic risk assessment
NASA Technical Reports Server (NTRS)
Dugan, Joanne Bechta (Inventor); Xu, Hong (Inventor)
2013-01-01
The DEFT methodology, system and computer readable medium extends the applicability of the PRA (Probabilistic Risk Assessment) methodology to computer-based systems, by allowing DFT (Dynamic Fault Tree) nodes as pivot nodes in the Event Tree (ET) model. DEFT includes a mathematical model and solution algorithm, supports all common PRA analysis functions and cutsets. Additional capabilities enabled by the DFT include modularization, phased mission analysis, sequence dependencies, and imperfect coverage.
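For readers unfamiliar with the fault-tree machinery that DEFT extends, a static fault tree with independent basic events can be evaluated with a few lines of recursion. This sketch covers only static AND/OR gates (the `ft_prob` function and the tuple-based tree encoding are illustrative, not part of DEFT); the dynamic gates and sequence dependencies that DEFT handles require Markov or simulation-based solutions instead:

```python
def ft_prob(node, probs):
    """Top-event probability of a static fault tree, assuming independent basic events.

    Nodes are tuples: ("basic", name), ("and", [children]), or ("or", [children]).
    """
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    children = [ft_prob(c, probs) for c in node[1]]
    if kind == "and":            # all children must fail
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "or":             # at least one child fails
        q = 1.0
        for c in children:
            q *= 1.0 - c
        return 1.0 - q
    raise ValueError("unknown gate: " + kind)
```

For example, with basic-event probabilities 0.1 and 0.2, an OR gate yields 1 - 0.9 x 0.8 = 0.28 and an AND gate yields 0.02.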
Intelligence Support to Supply Chain Risk Management
2012-06-01
of Master of Science in Operations Analysis Charles L. Carter, MA Major, USAF June 2012 DISTRIBUTION STATEMENT A. APPROVED FOR...literature regarding supply chain risk management and intelligence doctrine. This review established the importance of supply chain risk analysis to...risk analysis. This research culminated in the development of a methodology for intelligence professionals to use to support supply chain risk
Crash Simulation and Animation: 'A New Approach for Traffic Safety Analysis'
DOT National Transportation Integrated Search
2001-02-01
This research's objective is to present a methodology to supplement conventional traffic safety analysis techniques. This methodology aims at using computer simulation to animate and visualize crash occurrence at high-risk locations. This methodol...
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce cost, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission-driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks; it represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing, RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/risks (as compared to Root Cause Analysis, which is most often used to look back and discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., the design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What's been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/risk been identified, i.e., is the perceived problem the real problem?
Users of the RoSA methodology have characterized it as a true "bottoms-up" risk assessment.
Risk Analysis Methods for Deepwater Port Oil Transfer Systems
DOT National Transportation Integrated Search
1976-06-01
This report deals with the risk analysis methodology for oil spills from the oil transfer systems in deepwater ports. Failure mode and effect analysis in combination with fault tree analysis are identified as the methods best suited for the assessmen...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
... Numerical Simulations Risk Management Methodology November 1, 2010. I. Introduction On August 25, 2010, The... Analysis and Numerical Simulations ("STANS") risk management methodology. The rule change alters... collateral within the STANS Monte Carlo simulations. OCC believes the approach currently used to...
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented in the R software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, C.; Blanton, M.L.; Dirkes, R.
1995-12-31
Bioconcentration in aquatic systems generally refers to contaminant uptake through non-ingestion pathways (i.e., dermal and respiratory uptake). Ecological risk assessments performed on aquatic systems often rely on published bioconcentration factors (BCFs) to calibrate models of exposure. However, many published BCFs, especially those from in situ studies, are confounded by uptake from ingestion of prey. As part of the exposure assessment and risk analysis of the Columbia River's Hanford Reach, the authors tested a methodology to estimate radionuclide BCFs for several aquatic species in the Hanford Reach of the Columbia River. The iterative methodology solves for BCFs from known body burdens and environmental media concentrations. This paper describes the BCF methodology, compares BCFs from the literature with modeled values, and explains how they were used in the exposure assessment and risk analysis of the Columbia River's Hanford Reach.
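The iterative idea in the abstract above — solving for BCFs from known body burdens and media concentrations while discounting dietary uptake — can be illustrated with a deliberately simplified fixed-point model. All names and the single `diet_frac` parameter here are hypothetical assumptions for illustration; the abstract does not specify the authors' actual model:

```python
def solve_bcf(total_burden, water_conc, diet_frac=0.3, tol=1e-9, max_iter=200):
    """Iteratively separate water-column uptake from dietary uptake (toy model).

    Assumes the predator's dietary intake is diet_frac times the prey burden,
    and that prey are exposed via the water column only.
    """
    bcf = total_burden / water_conc          # initial guess: all uptake from water
    for _ in range(max_iter):
        prey_burden = bcf * water_conc       # prey burden under the current BCF estimate
        new_bcf = (total_burden - diet_frac * prey_burden) / water_conc
        if abs(new_bcf - bcf) < tol:
            break
        bcf = new_bcf
    return bcf
```

In this toy model the iteration converges to total_burden / (water_conc * (1 + diet_frac)): the published in situ "BCF" overstates the true water-only BCF by the dietary fraction.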
NASA Astrophysics Data System (ADS)
Tabibzadeh, Maryam
According to the final Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. A review of the offshore drilling literature indicates that most of the developed risk analysis methodologies do not fully and, more importantly, systematically address the contribution of Human and Organizational Factors (HOFs) in accident causation. Yet the results of a comprehensive study, from 1988 to 2005, of more than 600 well-documented major failures in offshore structures show that approximately 80% of those failures were due to HOFs. In addition, lack of safety culture, an issue related to HOFs, has been identified as a common contributing cause of many accidents in this industry. This dissertation introduces an integrated risk analysis methodology to systematically assess the critical role of human and organizational factors in offshore drilling safety. The proposed methodology focuses on a specific procedure called the Negative Pressure Test (NPT), the primary method to ascertain well integrity during offshore drilling, and analyzes the contributing causes of misinterpreting such a critical test. In addition, the case study of the BP Deepwater Horizon accident and its conducted NPT is discussed. The risk analysis methodology in this dissertation consists of three different approaches whose integration constitutes the complete methodology. The first approach is a comparative analysis of a "standard" NPT, proposed by the author, with the test conducted by the DWH crew. This analysis contributes to identifying the discrepancies between the two test procedures.
The second approach is a conceptual risk assessment framework to analyze the causal factors of the identified mismatches in the previous step, as the main contributors of negative pressure test misinterpretation. Finally, a rational decision making model is introduced to quantify a section of the developed conceptual framework in the previous step and analyze the impact of different decision making biases on negative pressure test results. Along with the corroborating findings of previous studies, the analysis of the developed conceptual framework in this paper indicates that organizational factors are root causes of accumulated errors and questionable decisions made by personnel or management. Further analysis of this framework identifies procedural issues, economic pressure, and personnel management issues as the organizational factors with the highest influence on misinterpreting a negative pressure test. It is noteworthy that the captured organizational factors in the introduced conceptual framework are not only specific to the scope of the NPT. Most of these organizational factors have been identified as not only the common contributing causes of other offshore drilling accidents but also accidents in other oil and gas related operations as well as high-risk operations in other industries. In addition, the proposed rational decision making model in this research introduces a quantitative structure for analysis of the results of a conducted NPT. This model provides a structure and some parametric derived formulas to determine a cut-off point value, which assists personnel in accepting or rejecting an implemented negative pressure test. Moreover, it enables analysts to assess different decision making biases involved in the process of interpreting a conducted negative pressure test as well as the root organizational factors of those biases. 
In general, although the proposed integrated research methodology in this dissertation is developed for the risk assessment of human and organizational factors contributions in negative pressure test misinterpretation, it can be generalized and be potentially useful for other well control situations, both offshore and onshore; e.g. fracking. In addition, this methodology can be applied for the analysis of any high-risk operations, in not only the oil and gas industry but also in other industries such as nuclear power plants, aviation industry, and transportation sector.
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk assessments in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
Tao, Huan; Zhang, Yueyuan; Li, Qian; Chen, Jin
2017-11-01
To assess the methodological quality of systematic reviews (SRs) or meta-analyses concerning the predictive value of ERCC1 in platinum chemotherapy of non-small cell lung cancer (NSCLC), we searched the PubMed, EMbase, Cochrane Library, international prospective register of systematic reviews, Chinese BioMedical Literature Database, China National Knowledge Infrastructure, Wan Fang and VIP databases for SRs or meta-analyses. The methodological quality of the included literature was evaluated with the risk of bias in systematic reviews (ROBIS) scale. Nineteen eligible SRs/meta-analyses were included. The most frequently searched databases were EMbase (74%), PubMed, Medline and CNKI. Fifteen SRs did additional manual retrieval, but none retrieved the registration platform. Only 47% described the two-reviewer model in screening for eligible original articles, and seven SRs described two reviewers extracting data. In the methodological quality assessment, inter-rater reliability (Kappa) was 0.87 between the two reviewers. The research question was well addressed in all SRs in phase 1, the eligibility criteria were suitable for each SR, and these aspects were rated as 'low' risk of bias. However, 'high' risk of bias existed in all SRs regarding the methods used to identify and/or select studies, and data collection and study appraisal. More than two-thirds of the SRs or meta-analyses showed high risk of bias in the synthesis, findings and the final phase. The study demonstrated poor methodological quality of SRs/meta-analyses assessing the predictive value of ERCC1 in chemotherapy among NSCLC patients, especially high performance bias. Registration or publishing the protocol is recommended in future research.
Risk-Based Explosive Safety Analysis
2016-11-30
safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the...
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
Security Events and Vulnerability Data for Cybersecurity Risk Estimation.
Allodi, Luca; Massacci, Fabio
2017-08-01
Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage on the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
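One way to make the two-stage structure described above concrete is a toy overlap model: stage 1 succeeds if any perimeter vulnerability lies in the attacker's weaponized set, and stage 2 likewise for internal vulnerabilities. The uniform hypergeometric draw of the weaponized set, and both function names, are assumptions of this sketch rather than the authors' fitted model:

```python
from math import comb

def p_compromise(host_vulns, weaponized, population):
    """P(at least one host vulnerability falls in the attacker's weaponized set),
    with the weaponized set drawn uniformly from `population` known vulnerabilities."""
    # P(no overlap) is the hypergeometric zero-overlap term; comb() returns 0
    # when the weaponized set cannot avoid the host's vulnerabilities entirely.
    return 1.0 - comb(population - host_vulns, weaponized) / comb(population, weaponized)

def two_stage_attack(perimeter_vulns, internal_vulns, weaponized, population):
    """Stage 1 (perimeter breach) times stage 2 (internal escalation), assumed independent."""
    return (p_compromise(perimeter_vulns, weaponized, population)
            * p_compromise(internal_vulns, weaponized, population))
```

Increasing `weaponized` models a more powerful attacker, mirroring the abstract's notion of attacker power as the number of weaponized vulnerabilities.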
Case Study on Project Risk Management Planning Based on Soft System Methodology
NASA Astrophysics Data System (ADS)
Lifang, Xie; Jun, Li
This paper analyzed the soft system characteristics of construction projects and the applicability of Soft System Methodology (SSM) to risk analysis after a brief review of SSM. Taking a hydropower project as an example, it constructed a general framework for project risk management planning (PRMP) and established a Risk Management Planning (RMP) system from the perspective of coordinating stakeholder interests. The paper provides ideas and methods for construction project RMP under a win-win scenario through the practice of SSM.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Methodology for national risk analysis and prioritization of toxic industrial chemicals.
Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina
2013-01-01
The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.
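A prioritization over criteria like those listed above (acute and chronic health hazards, production and use quantities, incident history) is often implemented as a weighted additive score. The scheme below, including every criterion name and weight, is an illustrative assumption rather than the scoring actually used in the Finnish study:

```python
def priority_score(criteria_scores, weights):
    """Weighted additive risk score over prioritization criteria (illustrative)."""
    return sum(weights[c] * criteria_scores[c] for c in weights)

# Hypothetical 1-5 criterion ratings for two example chemicals
chemicals = {
    "ammonia":  {"acute_health": 4, "chronic_health": 2, "quantity": 5, "incidents": 3},
    "chlorine": {"acute_health": 5, "chronic_health": 3, "quantity": 3, "incidents": 4},
}
weights = {"acute_health": 0.4, "chronic_health": 0.2, "quantity": 0.25, "incidents": 0.15}

# Rank chemicals from highest to lowest priority
ranked = sorted(chemicals, key=lambda name: priority_score(chemicals[name], weights),
                reverse=True)
```

A real exercise of this kind typically also documents how each qualitative hazard classification maps onto the numeric scale, so the ranking can be audited.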
Risk methodology overview. [for carbon fiber release
NASA Technical Reports Server (NTRS)
Credeur, K. R.
1979-01-01
Some considerations of risk estimation, how risk is measured, and how risk analysis decisions are made are discussed. Specific problems of carbon fiber release are discussed by reviewing the objective, describing the main elements, and giving an example of the risk logic and outputs.
INDOOR AIR ASSESSMENT - A REVIEW OF INDOOR AIR QUALITY RISK CHARACTERIZATION
Risk assessment methodologies provide a mechanism for incorporating scientific evidence and judgments into the risk management decision process. A risk characterization framework has been developed to provide a systematic approach for analysis and presentation of risk characterizati...
Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-01-01
Objective To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). Methods PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. Results A total of 248 RCTs were eligible, of which 39 (15.7%) used computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. Conclusion The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
PMID:29511016
A methodology for the assessment of flood hazards at the regional scale
NASA Astrophysics Data System (ADS)
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Zabeo, Alex; Semenzin, Elena; Marcomini, Antonio
2013-04-01
In recent years, the frequency of water-related disasters has increased, and recent flood events in Europe (e.g. 2002 in Central Europe, 2007 in the UK, 2010 in Italy) caused physical-environmental and socio-economic damages. Specifically, floods are the most threatening water-related disaster affecting humans, their lives and properties. Within the KULTURisk project (FP7) a Regional Risk Assessment (RRA) methodology is proposed to evaluate the benefits of risk prevention in terms of reduced environmental risks due to floods. The method is based on the KULTURisk framework and allows the identification and prioritization of targets (i.e. people, buildings, infrastructures, agriculture, natural and semi-natural systems, cultural heritages) and areas at risk from floods in the considered region by comparing the baseline scenario (i.e. current state) with alternative scenarios (i.e. where different structural and/or non-structural measures are planned). The RRA methodology is flexible and can be adapted to different case studies (i.e. large rivers, alpine/mountain catchments, urban areas and coastal areas) and spatial scales (i.e. from the large river to the urban scale). The final aim of RRA is to help decision-makers in examining the possible environmental risks associated with uncertain future flood hazards and in identifying which prevention scenario could be the most suitable one. The RRA methodology employs Multi-Criteria Decision Analysis (MCDA) functions in order to integrate stakeholder preferences and expert judgments into the analysis. Moreover, Geographic Information Systems (GISs) are used to manage, process, analyze, and map data to facilitate the analysis and the sharing of information with different experts and stakeholders. In order to characterize flood risks, the proposed methodology integrates the output of hydrodynamic models with the analysis of site-specific bio-geophysical and socio-economic indicators (e.g. 
slope of the territory, land cover, population density, economic activities) of several case studies in order to develop risk maps that identify and prioritize relative hot-spot areas and targets at risk at the regional scale. The main outputs of the RRA are receptor-based maps of risks useful to communicate the potential implications of floods in non-monetary terms to stakeholders and administrations. These maps can be a basis for the management of flood risks as they can provide information about the indicative number of inhabitants, the type of economic activities, natural systems and cultural heritages potentially affected by flooding. Moreover, they can provide suitable information about flood risk in the considered area in order to define priorities for prevention measures, for land use planning and management. Finally, the outputs of the RRA methodology can be used as data input in the Socio-Economic Regional Risk Assessment methodology for the economic evaluation of different damages (e.g. tangible costs, intangible costs) and for the social assessment considering the benefits of the human dimension of vulnerability (i.e. adaptive and coping capacity). Within the KULTURisk project, the methodology has been applied and validated in several European case studies. Moreover, its generalization to address other types of natural hazards (e.g. earthquakes, forest fires) will be evaluated. The preliminary results of the RRA application in the KULTURisk project will be presented and discussed here.
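The MCDA aggregation at the heart of such an RRA can be sketched as a weighted combination of normalised indicators per map cell. This is a minimal illustration, not the KULTURisk implementation; the indicator names and weights are hypothetical.

```python
# Sketch of an MCDA aggregation step for a Regional Risk Assessment:
# site-specific indicators are min-max normalised to [0, 1] and combined
# with stakeholder-derived weights into a relative risk score per map cell.
# Indicator names and weights are illustrative, not from the KULTURisk study.

def normalise(values):
    """Min-max normalisation of one indicator across all assessed cells."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def relative_risk(indicators, weights):
    """Weighted linear aggregation of normalised indicators, per cell."""
    names = list(indicators)
    norm = {n: normalise(indicators[n]) for n in names}
    n_cells = len(indicators[names[0]])
    total_w = sum(weights[n] for n in names)
    return [sum(weights[n] * norm[n][i] for n in names) / total_w
            for i in range(n_cells)]

# Three map cells, three indicators (slope %, population density, flood depth)
indicators = {
    "slope":       [2.0, 10.0, 5.0],
    "pop_dens":    [50.0, 500.0, 120.0],
    "flood_depth": [0.2, 1.5, 0.8],
}
weights = {"slope": 0.2, "pop_dens": 0.5, "flood_depth": 0.3}

scores = relative_risk(indicators, weights)
# rank cells from highest to lowest relative risk to locate hot-spots
ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
```

In a GIS workflow the same computation would run per raster cell or polygon; here the "map" is just three cells, with cell 1 emerging as the hot-spot.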
Assessing the Fire Risk for a Historic Hangar
NASA Technical Reports Server (NTRS)
Datta, Koushik; Morrison, Richard S.
2010-01-01
NASA Ames Research Center (ARC) is evaluating options for the reuse of its historic Hangar 1. As a part of this evaluation, a qualitative fire risk assessment study was performed to evaluate the potential threat of combustion of the historic hangar. The study focused on the fire risk trade-off of either installing or not installing a Special Hazard Fire Suppression System in the Hangar 1 deck areas. The assessment methodology was useful in discussing the important issues among various groups within the Center. Once the methodology was deemed acceptable, the results were assessed. The results showed that the risk remained in the same risk category, whether Hangar 1 does or does not have a Special Hazard Fire Suppression System. Note that the methodology assessed the risk to Hangar 1 and not the risk to an aircraft in the hangar. If one had a high-value aircraft, the aircraft risk analysis could potentially show a different result. The assessed risk results were then communicated to management and other stakeholders.
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on Fuzzy Mathematics and Grey Relational Theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method for a certain medical device (a C-arm X-ray machine) is described.
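One common way grey relational theory enters an improved FMEA is to rank failure modes by their grey relational grade against a reference (worst-case) series instead of the raw severity x occurrence x detection product. The sketch below shows that grading step under textbook assumptions; the ratings and the distinguishing coefficient rho = 0.5 are illustrative, not the paper's data.

```python
# A minimal sketch of grey relational analysis applied to FMEA ratings:
# each failure mode's (S, O, D) series is graded by its closeness to a
# hypothetical worst-case reference series. rho is the conventional
# distinguishing coefficient; all numbers are illustrative.

def grey_relational_grades(modes, reference, rho=0.5):
    """Return one grade per failure mode; higher = closer to the reference."""
    deltas = [[abs(r - m[i]) for i, r in enumerate(reference)] for m in modes]
    flat = [d for row in deltas for d in row]
    d_min, d_max = min(flat), max(flat)
    coeff = [[(d_min + rho * d_max) / (d + rho * d_max) for d in row]
             for row in deltas]
    return [sum(row) / len(row) for row in coeff]

# (severity, occurrence, detection) on a 1-10 scale for three failure modes
modes = [(9, 3, 2), (5, 7, 6), (8, 8, 4)]
reference = (10, 10, 10)   # hypothetical worst case

grades = grey_relational_grades(modes, reference)
riskiest = max(range(len(modes)), key=lambda i: grades[i])
```

Because the grade averages closeness across all three ratings, the mode that is uniformly high (8, 8, 4) outranks the mode with one extreme rating, which is one motivation for moving beyond the plain RPN product.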
Execution of a self-directed risk assessment methodology to address HIPAA data security requirements
NASA Astrophysics Data System (ADS)
Coleman, Johnathan
2003-05-01
This paper analyzes the method and training of a self directed risk assessment methodology entitled OCTAVE (Operationally Critical Threat Asset and Vulnerability Evaluation) at over 170 DOD medical treatment facilities. It focuses specifically on how OCTAVE built interdisciplinary, inter-hierarchical consensus and enhanced local capabilities to perform Health Information Assurance. The Risk Assessment Methodology was developed by the Software Engineering Institute at Carnegie Mellon University as part of the Defense Health Information Assurance Program (DHIAP). The basis for its success is the combination of analysis of organizational practices and technological vulnerabilities. Together, these areas address the core implications behind the HIPAA Security Rule and can be used to develop Organizational Protection Strategies and Technological Mitigation Plans. A key component of OCTAVE is the inter-disciplinary composition of the analysis team (Patient Administration, IT staff and Clinician). It is this unique composition of analysis team members, along with organizational and technical analysis of business practices, assets and threats, which enables facilities to create sound and effective security policies. The Risk Assessment is conducted in-house, and therefore the process, results and knowledge remain within the organization, helping to build consensus in an environment of differing organizational and disciplinary perspectives on Health Information Assurance.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software either was a potential cause or was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Bernstein, Richard H
2007-01-01
"Care management" purposefully obscures the distinctions between disease and case management and stresses their common features: action in the present to prevent adverse future outcomes and costs. It includes identifying a high-need population by referrals, screening, or data analysis, assessing those likely to benefit from interventions, intervening, evaluating the intervention, and adjusting interventions when needed. High-risk individuals can be identified using at least 9 techniques, from referrals and questionnaires to retrospective claims analysis and predictive models. Other than referrals, software based on the risk-adjustment methodology that we have adapted can incorporate all these methodologies. Because the risk adjustment employs extensive case mix and severity adjustment, it provides care managers with 3 innovative ways to identify not only high-risk individuals but also high-opportunity cases.
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
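The quantitative step that any such logic tree ultimately feeds into is the propagation of basic-event probabilities through AND/OR gates to the top event. The sketch below shows that standard calculation under an independence assumption; the pipeline events and probabilities are hypothetical, not from the case study.

```python
# Propagating independent basic-event probabilities through fault-tree
# gates to a top-event probability. Events and numbers are illustrative.

def p_and(probs):
    """AND gate: all inputs must fail; product of probabilities."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(probs):
    """OR gate: complement rule, 1 - prod(1 - p_i), independence assumed."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Top event "pipeline leak" = (corrosion AND coating failure) OR third-party impact
p_corrosion, p_coating, p_impact = 0.05, 0.10, 0.002
p_leak = p_or([p_and([p_corrosion, p_coating]), p_impact])
```

Part of the simplification claimed for a physically structured tree is that its branches partition failure causes cleanly, so fewer minimal cut sets and fewer of these gate evaluations are needed.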
Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar
2017-03-01
In the context of the underground coal mining industry, the increased economic burden of implementing additional safety measure systems, along with growing public awareness of the need to ensure a high level of worker safety, has put great pressure on managers to find alternatives that are both safe and economically viable. A risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology is proposed for selecting an appropriate safety measure system for an underground coal mining operation with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory to model vagueness and subjectivity in the estimates of fuzzy risk ratings for making appropriate decisions. The methodology is based on aggregative fuzzy risk analysis and multi-criteria decision making. The selection decisions are made within the context of understanding the total integrated risk likely to be incurred in adopting a particular safety system alternative. The effectiveness of the proposed methodology has been validated through a real-life case study, and the resulting final priority ranking appears fairly consistent.
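The aggregation idea behind such interval-valued risk ratings can be sketched with plain [lo, hi] intervals: per-criterion ratings are averaged with criterion weights, and alternatives are ranked by the aggregated interval. This is a simplified stand-in for the paper's interval-valued fuzzy arithmetic; the alternatives, ratings, and weights are invented.

```python
# Aggregating interval-valued risk ratings across criteria (financial,
# operating, maintenance) for each safety-system alternative, then
# ranking by interval midpoint. All numbers are illustrative.

def weighted_interval(ratings, weights):
    """Weighted average of [lo, hi] interval ratings; weights sum to 1."""
    lo = sum(w * r[0] for r, w in zip(ratings, weights))
    hi = sum(w * r[1] for r, w in zip(ratings, weights))
    return (lo, hi)

alternatives = {
    # per-criterion interval ratings on a 0-10 risk scale
    "system_A": [(3, 5), (4, 6), (2, 4)],
    "system_B": [(2, 3), (5, 7), (6, 8)],
}
weights = [0.5, 0.3, 0.2]   # financial, operating, maintenance

aggregated = {k: weighted_interval(v, weights) for k, v in alternatives.items()}
# lower total integrated risk is better; rank by interval midpoint
best = min(aggregated, key=lambda k: sum(aggregated[k]) / 2)
```

A full interval-valued fuzzy treatment would also compare interval widths (decision confidence), not only midpoints; the midpoint rule here is just the simplest defensible ranking.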
Research in Modeling and Simulation for Airspace Systems Innovation
NASA Technical Reports Server (NTRS)
Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.
2007-01-01
This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events, it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk taking into account its various components. The same methodology can be applied to various types of risk at local to global scale. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011. 
New models ranging from global assets exposure to global flood hazard models were also recently developed to improve the resolution of the risk analysis and applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
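The statistical core described above, regressing (log) losses on hazard intensity, exposure, and vulnerability drivers across past events, can be sketched in a few lines. The data here are synthetic with known coefficients, so the fit simply demonstrates that the regression recovers the weight of each risk component; it is not the GAR analysis itself.

```python
# Sketch of a multiple linear regression of log losses on hazard
# intensity, log exposure, and a vulnerability proxy across past events.
# Data are synthetic; the "true" coefficients are chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200
intensity = rng.uniform(1, 5, n)        # e.g. hazard intensity class
log_exposure = rng.uniform(8, 14, n)    # log of exposed population
poverty = rng.uniform(0, 1, n)          # vulnerability proxy

# synthetic relationship plus noise
log_loss = (0.8 * intensity + 0.9 * log_exposure + 2.0 * poverty
            + rng.normal(0, 0.3, n))

X = np.column_stack([np.ones(n), intensity, log_exposure, poverty])
beta, *_ = np.linalg.lstsq(X, log_loss, rcond=None)
# beta[1:] estimate the weight of each risk component in the losses
```

With real event data the same design matrix would carry observed intensities and modelled exposures, and the fitted coefficients would quantify how much each driver explains past disaster magnitudes.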
Developing and validating risk prediction models in an individual participant data meta-analysis
2014-01-01
Background Risk prediction models estimate the risk of developing future outcomes for individuals based on one or more underlying characteristics (predictors). We review how researchers develop and validate risk prediction models within an individual participant data (IPD) meta-analysis, in order to assess the feasibility and conduct of the approach. Methods A qualitative review of the aims, methodology, and reporting in 15 articles that developed a risk prediction model using IPD from multiple studies. Results The IPD approach offers many opportunities but methodological challenges exist, including: unavailability of requested IPD, missing patient data and predictors, and between-study heterogeneity in methods of measurement, outcome definitions and predictor effects. Most articles develop their model using IPD from all available studies and perform only an internal validation (on the same set of data). Ten of the 15 articles did not allow for any study differences in baseline risk (intercepts), potentially limiting their model’s applicability and performance in some populations. Only two articles used external validation (on different data), including a novel method which develops the model on all but one of the IPD studies, tests performance in the excluded study, and repeats by rotating the omitted study. Conclusions An IPD meta-analysis offers unique opportunities for risk prediction research. Researchers can make more of this by allowing separate model intercept terms for each study (population) to improve generalisability, and by using ‘internal-external cross-validation’ to simultaneously develop and validate their model. Methodological challenges can be reduced by prospectively planned collaborations that share IPD for risk prediction. PMID:24397587
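The 'internal-external cross-validation' idea the review recommends, developing the model on all but one IPD study, testing on the omitted study, and rotating, can be sketched with a deliberately tiny model. The one-predictor least-squares fit and the synthetic three-study data are illustrative; a real IPD model would be richer and would allow study-specific intercepts.

```python
# Internal-external cross-validation over studies: fit on all studies
# but one, evaluate on the held-out study, rotate. The "model" is a
# one-predictor least-squares fit; data are synthetic.

def fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def mse(model, xs, ys):
    a, b = model
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# three studies, each a list of (predictor, outcome) pairs
studies = [
    [(1, 2.1), (2, 3.9), (3, 6.2)],
    [(1, 2.0), (2, 4.1), (3, 5.8)],
    [(2, 4.3), (3, 6.1), (4, 8.0)],
]

external_mse = []
for k in range(len(studies)):
    train = [p for j, s in enumerate(studies) if j != k for p in s]
    model = fit([x for x, _ in train], [y for _, y in train])
    held = studies[k]
    external_mse.append(mse(model, [x for x, _ in held], [y for _, y in held]))
# each entry estimates performance in an 'external' population
```

The point of the rotation is that every study serves once as external validation data, so generalisability is assessed without setting aside any IPD permanently.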
The physical vulnerability of elements at risk: a methodology based on fluid and classical mechanics
NASA Astrophysics Data System (ADS)
Mazzorana, B.; Fuchs, S.; Levaggi, L.
2012-04-01
The impacts of the flood events that occurred in autumn 2011 in the Italian regions of Liguria and Tuscany revived the engagement of public decision makers to enhance, in synergy, flood control and land use planning. In this context, the design of efficient flood risk mitigation strategies and their subsequent implementation critically relies on a careful vulnerability analysis of both the immobile and the mobile elements at risk potentially exposed to flood hazards. Based on fluid and classical mechanics notions, we developed computation schemes enabling a dynamic vulnerability and risk analysis facing a broad typological variety of elements at risk. The methodological skeleton consists of (1) hydrodynamic computation of the time-varying flood intensities resulting for each element at risk in a succession of loading configurations; (2) modelling the mechanical response of the impacted elements through static, elasto-static and dynamic analyses; (3) characterising the mechanical response through proper structural damage variables and (4) economic valuation of the expected losses as a function of the quantified damage variables. From a computational perspective, we coupled the description of the hydrodynamic flow behaviour and the induced structural modifications of the exposed elements at risk. Valuation methods suitable to support a correct mapping from the value domains of the physical damage variables to the economic loss values are discussed. In this way we aim to complement, from a methodological perspective, the existing, mainly empirical, vulnerability and risk assessment approaches and to refine the conceptual framework of the cost-benefit analysis. Moreover, we aim to support the design of effective flood risk mitigation strategies by diminishing the main criticalities within the systems prone to flood risk.
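Step (1) of the skeleton above can be illustrated with textbook fluid mechanics: the load on an exposed element combines a hydrostatic resultant with hydrodynamic drag, F_dyn = 0.5 * rho * C_d * A * v^2. The geometry, drag coefficient, and hydrograph below are invented for the sketch, not taken from the paper's computation schemes.

```python
# Sketch of the time-varying flood load on one exposed element:
# hydrostatic resultant (triangular pressure distribution) plus
# hydrodynamic drag on the wetted face. All parameters illustrative.

RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def flood_load(depth, velocity, width, cd=2.0):
    """Load on the element [N] for one loading configuration."""
    hydrostatic = 0.5 * RHO * G * depth ** 2 * width
    drag = 0.5 * RHO * cd * (depth * width) * velocity ** 2
    return hydrostatic + drag

# a simple loading succession: (depth m, velocity m/s) over the event
hydrograph = [(0.5, 1.0), (1.2, 2.5), (0.8, 1.5)]
loads = [flood_load(d, v, width=4.0) for d, v in hydrograph]
peak = max(loads)   # the load governing the structural checks of steps (2)-(3)
```

In the full methodology each such loading configuration would feed a static, elasto-static, or dynamic structural analysis; here the succession is collapsed to its peak load for brevity.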
Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-29
Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate-regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low-bias risk showed higher total PRISMA-A values than reviews with high-bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. 
Reviews with a total PRISMA-A score < 6, lacking identification as an SR or MA in the title, and lacking an explanation of the bias risk assessment methods were classified as being of low methodological quality. Abstracts with a total PRISMA-A score ≥ 9 that included main outcome results and an explanation of the bias risk assessment method were classified as having low bias risk. The methodological quality and bias risk of SRs may thus be determined by analysing the quality and completeness of their abstracts. Our proposal aims to facilitate the evaluation of synthesised evidence by clinical professionals lacking methodological skills. External validation is necessary.
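The two decision rules reported above reduce to plain threshold checks on an abstract's PRISMA-A assessment. The thresholds (< 6, ≥ 9) follow the abstract; the function and field names are illustrative, and the real classifiers were decision-tree models rather than hand-written rules.

```python
# The abstract's two reported classification rules, written out as
# threshold checks. Field names are illustrative stand-ins.

def methodological_quality(score, sr_in_title, bias_method_explained):
    """Rule 1: low total score plus two missing abstract items => low quality."""
    if score < 6 and not sr_in_title and not bias_method_explained:
        return "low"
    return "not classified low"

def bias_risk(score, outcomes_reported, bias_method_explained):
    """Rule 2: high total score plus two reported items => low bias risk."""
    if score >= 9 and outcomes_reported and bias_method_explained:
        return "low"
    return "not classified low"

q = methodological_quality(4, sr_in_title=False, bias_method_explained=False)
b = bias_risk(10, outcomes_reported=True, bias_method_explained=True)
```

The value of stating the rules this explicitly is that a reader can apply them to an abstract in seconds, which is exactly the screening use case the study targets.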
ERIC Educational Resources Information Center
Keller-Margulis, Milena; McQuillin, Samuel D.; Castañeda, Juan Javier; Ochs, Sarah; Jones, John H.
2018-01-01
Multitiered systems of support depend on screening technology to identify students at risk. The purpose of this study was to examine the use of a computer-adaptive test and latent class growth analysis (LCGA) to identify students at risk in reading with focus on the use of this methodology to characterize student performance in screening.…
NASA Astrophysics Data System (ADS)
Tesfamichael, Aklilu A.; Caplan, Arthur J.; Kaluarachchi, Jagath J.
2005-05-01
This study provides an improved methodology for investigating the trade-offs between the health risks and economic benefits of using atrazine in the agricultural sector by incorporating public attitude to pesticide management in the analysis. Regression models are developed to predict finished water atrazine concentration in high-risk community water supplies in the United States. The predicted finished water atrazine concentrations are then used in a health risk assessment. The computed health risks are compared with the total economic surplus in the U.S. corn market for different atrazine application rates using estimated demand and supply functions developed in this work. Analysis of different scenarios with consumer price premiums for chemical-free and reduced-chemical corn indicate that if the society is willing to pay a price premium, risks can be reduced without a large reduction in the total economic surplus and net benefits may be higher. The results also show that this methodology provides an improved scientific framework for future decision making and policy evaluation in pesticide management.
Carter, D A; Hirst, I L
2000-01-07
This paper considers the application of one of the weighted risk indicators used by the Major Hazards Assessment Unit (MHAU) of the Health and Safety Executive (HSE) in formulating advice to local planning authorities on the siting of new major accident hazard installations. In such cases the primary consideration is to ensure that the proposed installation would not be incompatible with existing developments in the vicinity, as identified by the categorisation of the existing developments and the estimation of individual risk values at those developments. In addition, a simple methodology, described here, based on MHAU's "Risk Integral" and a single "worst case" event analysis, is used to enable the societal risk aspects of the hazardous installation to be considered at an early stage of the proposal, and to determine the degree of analysis that will be necessary to enable HSE to give appropriate advice.
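A societal-risk summary of the kind such a "Risk Integral" condenses can be sketched from an F-N analysis: each outcome case i with frequency f_i and N_i people affected contributes f_i * N_i**alpha, with alpha > 1 expressing aversion to large accidents. This is a generic, hedged illustration; the exact MHAU formulation differs, and the cases and numbers below are invented.

```python
# A generic risk-aversion-weighted societal risk aggregate over outcome
# cases (frequency per year, number of people affected). alpha > 1
# penalises large-N accidents; alpha = 1 gives the plain expectation.
# This is NOT the MHAU formula, only an illustration of the idea.

def risk_integral(cases, alpha=2.0):
    return sum(f * n ** alpha for f, n in cases)

# (frequency per year, people affected) for three hypothetical cases
cases = [(1e-4, 1), (1e-5, 10), (1e-6, 50)]

ri = risk_integral(cases)                              # aversion-weighted
expected_fatalities = risk_integral(cases, alpha=1.0)  # plain expectation
```

Note how the rare 50-person case dominates the weighted aggregate while contributing least to the plain expectation, which is the behaviour a societal-risk indicator is designed to capture.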
The Contribution of Human Factors in Military System Development: Methodological Considerations
1980-07-01
[The methods of assessment of health risk from exposure to radon and radon daughters].
Demin, V F; Zhukovskiy, M V; Kiselev, S M
2014-01-01
A critical analysis of existing models of the dose-effect relationship (RDE) for radon exposure effects on human health has been performed, and a conclusion about the necessity and possibility of improving these models has been reached. A new, improved version of the RDE has been developed. A technique for assessing the human health risk of exposure to radon is described, comprising a method for estimating radon exposure doses, the improved RDE model, and a proper risk assessment methodology. The methodology is proposed for use in the territory of Russia.
Using landslide risk analysis to protect fish habitat
R. M. Rice
1986-01-01
The protection of anadromous fish habitat is an important water quality concern in the Pacific Northwest. Sediment from logging-related debris avalanches can cause habitat degradation. Research on conditions associated with the sites where debris avalanches originate has resulted in a risk assessment methodology based on linear discriminant analysis. The probability...
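The way a fitted linear discriminant function is applied in this kind of site screening can be sketched as follows: site attributes are combined with discriminant coefficients into a score, and the score relative to a cutoff assigns the site to the high-risk class. The coefficients, attributes, and cutoff are hypothetical, not those from the cited research.

```python
# Applying a (hypothetical) fitted linear discriminant function to
# classify a site's debris-avalanche risk from its attributes.

COEFFS = {"slope_pct": 0.04, "soil_depth_m": 0.8, "road_cut": 1.2}
CUTOFF = 3.0   # hypothetical decision boundary from the fitted model

def discriminant_score(site):
    """Linear combination of site attributes and discriminant coefficients."""
    return sum(COEFFS[k] * site[k] for k in COEFFS)

def high_risk(site):
    return discriminant_score(site) > CUTOFF

# a steep site with a road cut scores 0.04*70 + 0.8*1.0 + 1.2*1 = 4.8
site = {"slope_pct": 70, "soil_depth_m": 1.0, "road_cut": 1}
flagged = high_risk(site)
```

In practice the coefficients come from discriminant analysis of historical avalanche and non-avalanche sites, and the cutoff is chosen to trade off missed failures against over-restricting harvest areas.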
Jia, Pengli; Tang, Li; Yu, Jiajie; Lee, Andy H; Zhou, Xu; Kang, Deying; Luo, Yanan; Liu, Jiali; Sun, Xin
2018-03-06
To assess risk of bias and to investigate methodological issues concerning the design, conduct and analysis of randomised controlled trials (RCTs) testing acupuncture for knee osteoarthritis (KOA). PubMed, EMBASE, Cochrane Central Register of Controlled Trials and four major Chinese databases were searched for RCTs that investigated the effect of acupuncture for KOA. The Cochrane tool was used to examine the risk of bias of eligible RCTs. Their methodological details were examined using a standardised and pilot-tested questionnaire of 48 items, together with the association between four predefined factors and important methodological quality indicators. A total of 248 RCTs were eligible, of which 39 (15.7%) used computer-generated randomisation sequence. Of the 31 (12.5%) trials that stated the allocation concealment, only one used central randomisation. Twenty-five (10.1%) trials mentioned that their acupuncture procedures were standardised, but only 18 (7.3%) specified how the standardisation was achieved. The great majority of trials (n=233, 94%) stated that blinding was in place, but 204 (87.6%) did not clarify who was blinded. Only 27 (10.9%) trials specified the primary outcome, for which 7 used intention-to-treat analysis. Only 17 (6.9%) trials included details on sample size calculation; none preplanned an interim analysis and associated stopping rule. In total, 46 (18.5%) trials explicitly stated that loss to follow-up occurred, but only 6 (2.4%) provided some information to deal with the issue. No trials prespecified, conducted or reported any subgroup or adjusted analysis for the primary outcome. The overall risk of bias was high among published RCTs testing acupuncture for KOA. Methodological limitations were present in many important aspects of design, conduct and analyses. These findings inform the development of evidence-based methodological guidance for future trials assessing the effect of acupuncture for KOA. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Systematic risk assessment methodology for critical infrastructure elements - Oil and Gas subsectors
NASA Astrophysics Data System (ADS)
Gheorghiu, A.-D.; Ozunu, A.
2012-04-01
The concern for the protection of critical infrastructure has been rapidly growing in the last few years in Europe. The level of knowledge and preparedness in this field is beginning to develop in a lawfully organized manner, for the identification and designation of critical infrastructure elements of national and European interest. Oil and gas production, refining, treatment, storage and transmission by pipelines facilities, are considered European critical infrastructure sectors, as per Annex I of the Council Directive 2008/114/EC of 8 December 2008 on the identification and designation of European critical infrastructures and the assessment of the need to improve their protection. Besides identifying European and national critical infrastructure elements, member states also need to perform a risk analysis for these infrastructure items, as stated in Annex II of the above mentioned Directive. In the field of risk assessment, there are a series of acknowledged and successfully used methods in the world, but not all hazard identification and assessment methods and techniques are suitable for a given site, situation, or type of hazard. As Theoharidou, M. et al. noted (Theoharidou, M., P. Kotzanikolaou, and D. Gritzalis 2009. Risk-Based Criticality Analysis. In Critical Infrastructure Protection III. Proceedings. Third Annual IFIP WG 11.10 International Conference on Critical Infrastructure Protection. Hanover, New Hampshire, USA, March 23-25, 2009: revised selected papers, edited by C. Palmer and S. Shenoi, 35-49. Berlin: Springer.), despite the wealth of knowledge already created, there is a need for simple, feasible, and standardized criticality analyses. The proposed systematic risk assessment methodology includes three basic steps: the first step (preliminary analysis) includes the identification of hazards (including possible natural hazards) for each installation/section within a given site, followed by a criterial analysis and then a detailed analysis step. 
The criterial evaluation is used as a ranking system in order to establish the priorities for the detailed risk assessment. This criterial analysis stage is necessary because the total number of installations and sections on a site can be quite large. As not all installations and sections on a site contribute significantly to the risk of a major accident occurring, it is not efficient to include all installations and sections in the detailed risk assessment, which can be time and resource consuming. The selected installations are then taken into consideration in the detailed risk assessment, which is the third step of the systematic risk assessment methodology. Following this step, conclusions can be drawn related to the overall risk characteristics of the site. The proposed methodology can as such be successfully applied to the assessment of risk related to critical infrastructure elements falling under the energy sector of Critical Infrastructure, mainly the sub-sectors oil and gas. Key words: Systematic risk assessment, criterial analysis, energy sector critical infrastructure elements
Risk-based maintenance of ethylene oxide production facilities.
Khan, Faisal I; Haddara, Mahmoud R
2004-05-20
This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Of the many likely failure scenarios, those that are most probable are subjected to a detailed study. A detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model, as well as errors in the distribution parameters, on the maintenance interval.
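The interval-setting idea can be sketched directly: with a lognormal time-to-failure model, risk at time t is the failure probability F(t) times the consequence, and the maintenance interval is the largest t whose risk stays below the acceptable level. The parameters below are illustrative, and the simple bisection stands in for the paper's reverse fault tree analysis.

```python
# Finding a maintenance interval such that risk(t) = F(t) * consequence
# stays below an acceptable level, with lognormal time-to-failure.
# All parameter values are illustrative.
import math

MU, SIGMA = math.log(10.0), 0.5     # lognormal time-to-failure (years)
CONSEQUENCE = 5e6                   # loss per failure, currency units
ACCEPTABLE = 1e5                    # acceptable risk level

def failure_prob(t):
    """Lognormal CDF at time t via the error function."""
    return 0.5 * (1.0 + math.erf((math.log(t) - MU) / (SIGMA * math.sqrt(2))))

def risk(t):
    return failure_prob(t) * CONSEQUENCE

# bisection for the largest interval with risk(t) <= ACCEPTABLE
lo, hi = 0.01, 10.0
for _ in range(60):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if risk(mid) <= ACCEPTABLE else (lo, mid)
interval = lo   # years between maintenance actions
```

Maintaining before this interval elapses resets the failure clock while the accumulated risk is still acceptable; tightening ACCEPTABLE or raising CONSEQUENCE shortens the interval, which is exactly the sensitivity the paper studies.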
Assessing secondary soil salinization risk based on the PSR sustainability framework.
Zhou, De; Lin, Zhulu; Liu, Liming; Zimmermann, David
2013-10-15
Risk assessment of secondary soil salinization, which is caused in part by the way people manage the land, is an essential challenge to agricultural sustainability. The objective of our study was to develop a soil salinity risk assessment methodology by selecting a consistent set of risk factors based on the conceptual Pressure-State-Response (PSR) sustainability framework and incorporating the grey relational analysis and the Analytic Hierarchy Process methods. The proposed salinity risk assessment methodology was demonstrated through a case study of developing composite risk index maps for the Yinchuan Plain, a major irrigation agriculture district in northwest China. Fourteen risk factors were selected in terms of the three PSR criteria: pressure, state, and response. The results showed that the salinity risk in the Yinchuan Plain was strongly influenced by the subsoil and groundwater salinity, land use, distance to irrigation canals, and depth to groundwater. To maintain agricultural sustainability in the Yinchuan Plain, a suite of remedial and preventative actions were proposed to manage soil salinity risk in the regions that are affected by salinity at different levels and by different salinization processes. The weight sensitivity analysis results also showed that the overall salinity risk of the Yinchuan Plain would increase or decrease as the weights for pressure or response risk factors increased, signifying the importance of human activities on secondary soil salinization. Ideally, the proposed methodology will help us develop more consistent management tools for risk assessment and management and for control of secondary soil salinization. Copyright © 2013 Elsevier Ltd. All rights reserved.
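The AHP step of such a methodology derives criterion weights from a reciprocal pairwise-comparison matrix. A minimal sketch using the row geometric-mean approximation to the principal eigenvector is shown below; the comparison judgments for the three PSR criteria are hypothetical, not taken from the paper:

```python
import math

def ahp_weights(pairwise):
    """AHP priority weights from a reciprocal pairwise-comparison matrix,
    using the row geometric-mean approximation to the principal eigenvector."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments for the PSR criteria (pressure, state, response):
# e.g. "pressure is moderately more important than state (3), strongly more
# important than response (5)".
w = ahp_weights([[1,   3,   5],
                 [1/3, 1,   3],
                 [1/5, 1/3, 1]])
```

A full implementation would also check the consistency ratio of the judgments before accepting the weights.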
A comparison of various approaches to evaluating erosion risks and designing erosion control measures
NASA Astrophysics Data System (ADS)
Kapicka, Jiri
2015-04-01
At present, the Czech Republic uses a single methodology to compute and compare erosion risks; it also contains a method for designing erosion control measures. The basis of this methodology is the Universal Soil Loss Equation (USLE) and its result, the long-term average annual rate of erosion (G), and it is used by landscape planners. Data and statistics from the database of erosion events in the Czech Republic show that many problems and damages arise from local erosion episodes. The extent and impact of these events depend on the local precipitation event, the current crop growth stage, and soil conditions. Such erosion events can cause damage to agricultural land, municipal property, and hydraulic structures even at locations that appear to be in good condition in terms of the long-term average annual rate of erosion. An alternative way to compute and compare erosion risks is an event-based (episode) approach. This paper presents a comparison of various approaches to computing erosion risks. The comparison was carried out for a locality from the database of erosion events on agricultural land in the Czech Republic where two erosion events had been recorded. The study area is a simple agricultural parcel without barriers that could strongly influence water flow and sediment transport. For all methodologies, the computation of erosion risks was based on laboratory analysis of soil samples collected in the study area. Results from USLE, MUSLE, and the mathematical model Erosion 3D were compared. Differences in the spatial distribution of the locations with the highest soil erosion were compared and discussed. A further part presents differences in the designed erosion control measures resulting from the different methodologies. The results show the variance in computed erosion risks obtained with the different methodologies. 
These variances can open a discussion about different approaches to computing and evaluating erosion risks in areas of differing importance.
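Both soil-loss models being compared are simple factor products, so the contrast between the long-term (USLE) and event-based (MUSLE) views can be sketched directly. All factor values below are illustrative, not data from the study site:

```python
def usle_soil_loss(R, K, L, S, C, P):
    """USLE long-term average annual soil loss: A = R*K*L*S*C*P
    (rainfall erosivity, soil erodibility, slope length, slope steepness,
    cover-management, support practice)."""
    return R * K * L * S * C * P

def musle_event_soil_loss(Q, q_p, K, LS, C, P):
    """MUSLE (Williams) event soil loss: the rainfall factor R is replaced by
    a runoff energy term 11.8*(Q*q_p)**0.56, so single storm episodes can be
    rated (Q: runoff volume, q_p: peak runoff rate)."""
    return 11.8 * (Q * q_p) ** 0.56 * K * LS * C * P

# Illustrative factor values only:
A_long_term = usle_soil_loss(R=45, K=0.32, L=1.2, S=1.1, C=0.2, P=0.9)
A_event = musle_event_soil_loss(Q=10.0, q_p=2.0, K=0.32, LS=1.32, C=0.2, P=0.9)
```

The difference in driving term (average rainfall erosivity vs. per-storm runoff) is exactly why a site can look safe on the long-term average yet suffer damaging individual episodes.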
Ruiz-Goikoetxea, Maite; Cortese, Samuele; Aznarez-Sanado, Maite; Magallón, Sara; Alvarez Zallo, Noelia; Luis, Elkin O; de Castro-Manglano, Pilar; Soutullo, Cesar; Arrondo, Gonzalo
2018-01-01
A systematic review with meta-analyses was performed to: 1) quantify the association between ADHD and risk of unintentional physical injuries in children/adolescents ("risk analysis"); 2) assess the effect of ADHD medications on this risk ("medication analysis"). We searched 114 databases through June 2017. For the risk analysis, studies reporting sex-controlled odds ratios (ORs) or hazard ratios (HRs) estimating the association between ADHD and injuries were combined. Pooled ORs (28 studies, 4,055,620 individuals without and 350,938 with ADHD) and HRs (4 studies, 901,891 individuals without and 20,363 with ADHD) were 1.53 (95% CI=1.40,1.67) and 1.39 (95% CI=1.06,1.83), respectively. For the medication analysis, we meta-analysed studies that avoided the confounding-by-indication bias [four studies with a self-controlled methodology and another comparing risk over time and groups (a "difference in differences" methodology)]. The pooled effect size was 0.879 (95% CI=0.838,0.922) (13,254 individuals with ADHD). ADHD is significantly associated with an increased risk of unintentional injuries, and ADHD medications have a protective effect, at least in the short term, as indicated by self-controlled studies. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
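Inverse-variance pooling of odds ratios, the basic operation behind pooled estimates like those above, can be sketched as follows. This is a simplified fixed-effect version with hypothetical study inputs; reviews like this one typically apply more elaborate (e.g., random-effects) models:

```python
import math

def pool_fixed_effect(odds_ratios, cis):
    """Inverse-variance (fixed-effect) pooling of odds ratios. Each 95% CI
    yields the standard error of the log OR as (ln(hi) - ln(lo)) / (2*1.96);
    studies are weighted by 1/SE^2."""
    num = den = 0.0
    for or_, (lo, hi) in zip(odds_ratios, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * math.log(or_)
        den += w
    log_pooled = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(log_pooled),
            (math.exp(log_pooled - 1.96 * se_pooled),
             math.exp(log_pooled + 1.96 * se_pooled)))

# Two hypothetical studies (OR with 95% CI):
pooled_or, pooled_ci = pool_fixed_effect([1.6, 1.4], [(1.3, 1.97), (1.1, 1.78)])
```

Working on the log scale keeps the pooling symmetric, since ORs are multiplicative.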
THE ROLE OF EXPOSURE ANALYSIS IN HUMAN HEALTH RISK ASSESSMENT
This presentation will cover the basic methodologies used for assessing human exposures to environmental pollutants, and some of the scientific challenges involved in conducting exposure and risk assessments in support of regulatory evaluations.
Does Metformin Reduce Cancer Risks? Methodologic Considerations.
Golozar, Asieh; Liu, Shuiqing; Lin, Joeseph A; Peairs, Kimberly; Yeh, Hsin-Chieh
2016-01-01
The substantial burden of cancer and diabetes, and the association between the two conditions, has motivated researchers to look for targeted strategies that can simultaneously affect both diseases and reduce their overlapping burden. In the absence of randomized clinical trials, researchers have taken advantage of the availability and richness of administrative databases and electronic medical records to investigate the effects of drugs on cancer risk among diabetic individuals. The majority of these studies suggest that metformin could potentially reduce cancer risk. However, the validity of this purported reduction in cancer risk is limited by several methodological flaws in either the study design or the analysis. Whether metformin use decreases cancer risk relies heavily on the availability of valid data sources with complete information on confounders, accurate assessment of drug use, appropriate study design, and robust analytical techniques. The majority of the observational studies assessing the association between metformin and cancer risk suffer from methodological shortcomings, and efforts to address these issues have been incomplete. Future investigations of the association between metformin and cancer risk should clearly address the methodological issues due to confounding by indication, prevalent-user bias, and time-related biases. Although the proposed strategies do not guarantee a bias-free estimate of the association between metformin and cancer, they will reduce the synthesis and reporting of erroneous results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szilard, Ronaldo Henriques
A Risk Informed Safety Margin Characterization (RISMC) toolkit and methodology are proposed for investigating nuclear power plant core and fuel design and safety analysis, including postulated Loss-of-Coolant Accident (LOCA) analysis. This toolkit, under an integrated evaluation model framework, is named the LOCA toolkit for the US (LOTUS). The demonstration includes coupled analysis of core design, fuel design, thermal hydraulics, and systems analysis, using advanced risk analysis tools and methods to investigate a wide range of results.
Gregori, Dario; Rosato, Rosalba; Zecchin, Massimo; Di Lenarda, Andrea
2005-01-01
This paper discusses the use of bivariate survival curve estimators within the competing risks framework. Competing risks models are used for the analysis of medical data with more than one cause of death. The case of dilated cardiomyopathy is explored. Bivariate survival curves plot the joint mortality processes. This different graphical representation of bivariate survival analysis is the major contribution of this methodology to competing risks analysis.
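Under competing risks, the quantity usually plotted per cause is the cumulative incidence function rather than one minus a Kaplan-Meier curve. A minimal discrete Aalen-Johansen-style estimator, shown here independently of the bivariate extension the paper proposes, might look like:

```python
def cumulative_incidence(times, events, cause):
    """Nonparametric cumulative incidence for one cause under competing risks
    (discrete Aalen-Johansen form). events: 0 = censored, 1, 2, ... = cause codes.
    Returns a list of (time, CIF) step points."""
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0        # overall event-free survival just before t
    cif = 0.0
    at_risk = n
    i = 0
    out = []
    while i < n:
        t = data[i][0]
        d_cause = d_all = censored = 0
        while i < n and data[i][0] == t:   # gather ties at time t
            if data[i][1] == 0:
                censored += 1
            else:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        if at_risk > 0:
            cif += surv * d_cause / at_risk      # hazard for this cause
            surv *= 1.0 - d_all / at_risk        # all causes deplete survival
        at_risk -= d_all + censored
        out.append((t, cif))
    return out
```

Note that events from *all* causes deplete the at-risk survival, which is what distinguishes the CIF from a naive per-cause Kaplan-Meier estimate.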
de Dianous, Valérie; Fiévez, Cécile
2006-03-31
Over the last two decades, growing interest in risk analysis has been noted in industry. The ARAMIS project has defined a methodology for risk assessment, built to help industrial operators demonstrate that they have sufficient risk control on their sites. Risk analysis consists first of identifying all the major accidents, assuming that the safety functions in place are inefficient. This identification step uses bow-tie diagrams. Secondly, the safety barriers actually implemented on the site are taken into account. The barriers are identified on the bow-ties, and an evaluation of their performance (response time, efficiency, and level of confidence) is performed to validate that they are relevant for the expected safety function. Finally, evaluating their probability of failure makes it possible to assess the frequency of occurrence of the accident. The demonstration of risk control, based on a severity/frequency-of-occurrence pair, is then possible for all accident scenarios. During the risk analysis, a practical tool called a risk graph is used to assess whether the number and reliability of the safety functions for a given cause are sufficient to achieve good risk control.
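The frequency assessment along one bow-tie branch reduces to multiplying the initiating-event frequency by each barrier's probability of failure on demand. The sketch below uses the order-of-magnitude convention that a barrier with level of confidence k is credited with PFD = 10^-k; the numbers are illustrative, not ARAMIS reference values:

```python
def scenario_frequency(initiating_freq_per_year, barrier_confidence_levels):
    """Expected accident frequency along one bow-tie branch: the initiating
    event frequency times the probability of failure on demand (PFD) of each
    independent safety barrier. A barrier with level of confidence k is
    credited here (roughly, in the ARAMIS spirit) with PFD = 10**-k."""
    f = initiating_freq_per_year
    for k in barrier_confidence_levels:
        f *= 10.0 ** (-k)
    return f

# Illustrative: a 0.1/yr initiating event mitigated by barriers with LC 1 and 2.
f_accident = scenario_frequency(0.1, [1, 2])
```

Pairing the resulting frequency with the scenario's severity class then places it on the severity/frequency grid used for the risk-control demonstration.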
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-02
... Change To Adopt Changes That Would Affect Its Standard Portfolio Analysis of Risk Methodology for Certain... Rule Change CME proposes to adopt certain changes that would affect its Standard Portfolio Analysis of... calibrates the risk of portfolios, consisting of positions in highly similar and correlated futures and...
Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, Carlo; Prescott, Steve; Ma, Zhegang
This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC) program, Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements made during the previous FYs (toolkit identification, model development), FY2017 focused on increasing the level of realism of the analysis and improving the tools and the coupling methodologies. In particular, the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of a SAPHIRE PRA model for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.
Decision-theoretic methodology for reliability and risk allocation in nuclear power plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.
1985-01-01
This paper describes a methodology for allocating reliability and risk to various reactor systems, subsystems, components, operations, and structures in a consistent manner, based on a set of global safety criteria that are not rigid. The problem is formulated as a multiattribute decision analysis paradigm; the multiobjective optimization, performed on a PRA model and reliability cost functions, serves as the guiding principle for reliability and risk allocation. The concept of noninferiority is used in the multiobjective optimization problem, and finding the noninferior solution set is the main theme of the current approach. The assessment of the decision maker's preferences can then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues, such as generic allocation and preference assessment, are discussed.
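The notion of noninferiority used here is the standard Pareto one: an allocation alternative is noninferior if no other alternative is at least as good in every objective and strictly better in at least one. A minimal sketch with hypothetical (risk, cost) pairs, both to be minimized:

```python
def noninferior(points):
    """Noninferior (Pareto-optimal) subset under minimization of every
    objective: a point is kept unless some other point dominates it, i.e.
    is at least as good in all objectives and strictly better in one."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical allocation alternatives as (risk, reliability-improvement cost):
frontier = noninferior([(1, 5), (2, 4), (3, 3), (2, 6), (4, 4)])
```

Restricting the preference assessment to this frontier is what makes the decision maker's task tractable: dominated alternatives never need to be ranked.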
Malinowski, M L; Beling, P A; Haimes, Y Y; LaViers, A; Marvel, J A; Weiss, B A
2015-01-01
The fields of risk analysis and prognostics and health management (PHM) have developed in a largely independent fashion. However, both fields share a common core goal. They aspire to manage future adverse consequences associated with prospective dysfunctions of the systems under consideration due to internal or external forces. This paper describes how two prominent risk analysis theories and methodologies - Hierarchical Holographic Modeling (HHM) and Risk Filtering, Ranking, and Management (RFRM) - can be adapted to support the design of PHM systems in the context of smart manufacturing processes. Specifically, the proposed methodologies will be used to identify targets - components, subsystems, or systems - that would most benefit from a PHM system in regards to achieving the following objectives: minimizing cost, minimizing production/maintenance time, maximizing system remaining usable life (RUL), maximizing product quality, and maximizing product output. HHM is a comprehensive modeling theory and methodology that is grounded on the premise that no system can be modeled effectively from a single perspective. It can also be used as an inductive method for scenario structuring to identify emergent forced changes (EFCs) in a system. EFCs connote trends in external or internal sources of risk to a system that may adversely affect specific states of the system. An important aspect of proactive risk management includes bolstering the resilience of the system for specific EFCs by appropriately controlling the states. Risk scenarios for specific EFCs can be the basis for the design of prognostic and diagnostic systems that provide real-time predictions and recognition of scenario changes. The HHM methodology includes visual modeling techniques that can enhance stakeholders' understanding of shared states, resources, objectives and constraints among the interdependent and interconnected subsystems of smart manufacturing systems. 
In risk analysis, HHM is often paired with Risk Filtering, Ranking, and Management (RFRM). The RFRM process provides users (e.g., technology developers, original equipment manufacturers (OEMs), technology integrators, and manufacturers) with the most critical risks to the objectives, which can be used to identify the components and subsystems that would most benefit from a PHM system. A case study is presented in which HHM and RFRM are adapted for PHM in the context of an active manufacturing facility located in the United States. The methodologies help to identify the critical risks to the manufacturing process and the major components and subsystems that would most benefit from a developed PHM system.
A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators.
Beccari, Benjamin
2016-03-14
In the past decade significant attention has been given to the development of tools that attempt to measure the vulnerability, risk or resilience of communities to disasters. Particular attention has been given to the development of composite indices to quantify these concepts, mirroring their deployment in other fields such as sustainable development. Whilst some authors have published reviews of disaster vulnerability, risk and resilience composite indicator methodologies, these have been of a limited nature. This paper seeks to dramatically expand these efforts by analysing 106 composite indicator methodologies to understand the breadth and depth of practice. An extensive search of the academic and grey literature was undertaken for composite indicator and scorecard methodologies that addressed multiple/all hazards; included social and economic aspects of risk, vulnerability or resilience; were sub-national in scope; explained the method and variables used; focussed on the present-day; and had been tested or implemented. Information on the index construction, geographic areas of application, variables used and other relevant data was collected and analysed. Substantial variety in the construction practices of composite indicators of risk, vulnerability and resilience was found. Five key approaches were identified in the literature, with the use of hierarchical or deductive indices being the most common. Typically variables were chosen by experts, came from existing statistical datasets and were combined by simple addition with equal weights. A minimum of 2 variables and a maximum of 235 were used, although approximately two thirds of methodologies used fewer than 40 variables. The 106 methodologies used 2298 unique variables, the most frequently used being common statistical variables such as population density and unemployment rate. 
Classification of variables found that on average 34% of the variables used in each methodology related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised only 12%, on average, of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. A number of potential limitations of the present state of practice, and how these might impact on decision makers, are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development.
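The most common construction the review identifies (expert-chosen statistical variables, min-max normalized and combined by equal-weighted addition) can be sketched as follows; the region values are hypothetical:

```python
def minmax_normalize(values):
    """Rescale a list of raw indicator values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def composite_index(indicators, weights=None):
    """Composite indicator in the most common style found in the review:
    min-max normalize each indicator across regions, then combine by
    weighted addition, with equal weights by default.
    `indicators` maps indicator name -> list of per-region raw values."""
    names = list(indicators)
    if weights is None:
        weights = {n: 1.0 / len(names) for n in names}   # equal weights
    normed = {n: minmax_normalize(indicators[n]) for n in names}
    n_regions = len(next(iter(indicators.values())))
    return [sum(weights[n] * normed[n][i] for n in names)
            for i in range(n_regions)]

# Hypothetical regions scored on two of the most frequently used variables:
scores = composite_index({"population_density": [10, 20, 30],
                          "unemployment_rate": [1, 2, 3]})
```

The review's criticism applies directly to this construction: the equal weights and the choice of normalization are modelling decisions that sensitivity analysis should probe, yet rarely does.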
Recommendations for benefit-risk assessment methodologies and visual representations.
Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain
2016-03-01
The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
Costing the satellite power system
NASA Technical Reports Server (NTRS)
Hazelrigg, G. A., Jr.
1978-01-01
The paper presents a methodology for satellite power system costing, places approximate limits on the accuracy possible in cost estimates made at this time, and outlines the use of probabilistic cost information in support of the decision-making process. Reasons for using probabilistic costing or risk analysis procedures instead of standard deterministic costing procedures are considered. Components of cost, costing estimating relationships, grass roots costing, and risk analysis are discussed. Risk analysis using a Monte Carlo simulation model is used to estimate future costs.
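The probabilistic-costing idea, replacing each deterministic component estimate with a distribution and propagating by Monte Carlo simulation, can be sketched as follows. The triangular distributions and component values are illustrative, not actual satellite power system cost figures:

```python
import random

def monte_carlo_cost(components, n_trials=50_000, seed=7):
    """Probabilistic (risk-analysis) costing: each cost component gets a
    (low, mode, high) triangular distribution instead of a single point
    estimate; summing sampled draws over many trials yields a cost
    distribution rather than one deterministic number."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in components)
        for _ in range(n_trials)
    )
    return {"mean": sum(totals) / n_trials,
            "p10": totals[int(0.10 * n_trials)],
            "p90": totals[int(0.90 * n_trials)]}

# Two hypothetical cost components as (low, mode, high), e.g. in $B:
result = monte_carlo_cost([(8.0, 10.0, 14.0), (3.0, 5.0, 6.0)])
```

Reporting a p10-p90 band instead of a single figure is exactly what lets a decision maker see the accuracy limits the paper emphasizes.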
Comparative quantification of health risks: Conceptual framework and methodological issues
Murray, Christopher JL; Ezzati, Majid; Lopez, Alan D; Rodgers, Anthony; Vander Hoorn, Stephen
2003-01-01
Reliable and comparable analysis of risks to health is key for preventing disease and injury. Causal attribution of morbidity and mortality to risk factors has traditionally been conducted in the context of methodological traditions of individual risk factors, often in a limited number of settings, restricting comparability. In this paper, we discuss the conceptual and methodological issues for quantifying the population health effects of individual or groups of risk factors in various levels of causality using knowledge from different scientific disciplines. The issues include: comparing the burden of disease due to the observed exposure distribution in a population with the burden from a hypothetical distribution or series of distributions, rather than a single reference level such as non-exposed; considering the multiple stages in the causal network of interactions among risk factor(s) and disease outcome to allow making inferences about some combinations of risk factors for which epidemiological studies have not been conducted, including the joint effects of multiple risk factors; calculating the health loss due to risk factor(s) as a time-indexed "stream" of disease burden due to a time-indexed "stream" of exposure, including consideration of discounting; and the sources of uncertainty. PMID:12780936
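The comparison of burden under the observed exposure distribution with burden under a counterfactual distribution (rather than a single non-exposed reference) is captured by the generalized attributable, or potential-impact, fraction. A minimal sketch with a hypothetical binary exposure:

```python
def attributable_fraction(observed_prev, counterfactual_prev, relative_risks):
    """Generalized potential-impact fraction: the proportional reduction in
    burden from shifting the observed exposure distribution p_i to a
    counterfactual distribution p'_i (not necessarily 'all unexposed'):
        PIF = (sum p_i*RR_i - sum p'_i*RR_i) / sum p_i*RR_i
    """
    observed = sum(p * rr for p, rr in zip(observed_prev, relative_risks))
    counter = sum(p * rr for p, rr in zip(counterfactual_prev, relative_risks))
    return (observed - counter) / observed

# Hypothetical binary exposure: 30% exposed with RR = 2, counterfactual of
# complete removal of exposure (reduces to the classic Levin formula):
paf = attributable_fraction([0.7, 0.3], [1.0, 0.0], [1.0, 2.0])
```

With a multi-level exposure, the same function compares the observed distribution with any "series of distributions" the assessor chooses, as the paper describes.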
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities' which addressed fire risk for at power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods. 
This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. The document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification; it is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
NASA Astrophysics Data System (ADS)
Martino, P.
1980-12-01
A general methodology is presented for conducting an analysis of the various aspects of the hazards associated with the storage and transportation of liquefied natural gas (LNG) which should be considered during the planning stages of a typical LNG ship terminal. The procedure includes the performance of a hazards and system analysis of the proposed site, a probability analysis of accident scenarios and safety impacts, an analysis of the consequences of credible accidents such as tanker accidents, spills and fires, the assessment of risks and the design and evaluation of risk mitigation measures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun
This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk, and to update the risk estimates in real time based on ECA, will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on informing active component O&M with real-time equipment condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve the affordability of advanced reactors.
Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D
2014-11-01
Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.
Regional risk assessment for contaminated sites part 2: ranking of potentially contaminated sites.
Pizzol, Lisa; Critto, Andrea; Agostini, Paola; Marcomini, Antonio
2011-11-01
Environmental risks are traditionally assessed and presented in non-spatial ways, although the heterogeneity of contaminant spatial distributions, the spatial positions of and relations between receptors and stressors, and the spatial distribution of the variables involved in the risk assessment strongly influence exposure estimates and hence risks. Taking spatial variability into account is increasingly recognized as a further and essential step in sound exposure and risk assessment. To address this issue, an innovative methodology integrating spatial analysis and a relative risk approach was developed. Its purpose is to prioritize sites at the regional scale where a preliminary site investigation may be required. The methodology, aimed at supporting the inventory of contaminated sites, was implemented within the spatial decision support sYstem for Regional rIsk Assessment of DEgraded land (SYRIADE) and applied to the case study of the Upper Silesia region (Poland). The developed methodology and tool are both flexible and easy to adapt to different regional contexts, allowing the user to introduce the relevant regional parameters identified on the basis of user expertise and regional data availability. Moreover, the GIS functionalities, integrated with mathematical approaches, make it possible to consider, all at once, the multiple sources and impacted receptors within the region of concern, to assess the risks posed by all contaminated sites in the region, and, finally, to provide a risk-based ranking of the potentially contaminated sites. Copyright © 2011. Published by Elsevier Ltd.
Li, Daiqing; Zhang, Chen; Pizzol, Lisa; Critto, Andrea; Zhang, Haibo; Lv, Shihai; Marcomini, Antonio
2014-04-01
The rapid industrial development and urbanization processes that occurred in China over the past 30 years have dramatically increased the consumption of natural resources and raw materials, thus exacerbating human pressure on environmental ecosystems. As a result, large-scale environmental pollution of soil, natural waters and urban air has been recorded. The development of effective industrial planning to support sustainable regional economic development has become an issue of serious concern for local authorities, which need to select safe sites for new industrial settlements (i.e., industrial plants) according to assessment approaches that consider cumulative impacts, synergistic pollution effects and risks of accidental releases. In order to support decision makers in developing efficient and effective regional land-use plans, encompassing the identification of suitable areas for new industrial settlements and areas in need of intervention measures, this study provides a spatial regional risk assessment methodology which integrates relative risk assessment (RRA) and socio-economic assessment (SEA) and makes use of spatial analysis (GIS) methodologies and multicriteria decision analysis (MCDA) techniques. The proposed methodology was applied to the Chinese region of Hulunbeier, located in the eastern Inner Mongolia Autonomous Region, adjacent to the Republic of Mongolia. The application results demonstrated the effectiveness of the proposed methodology in identifying the most hazardous and risky industrial settlements, the most vulnerable regional receptors, and the regional districts most relevant for intervention measures, since these districts are characterized by high regional risk and excellent socio-economic development conditions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach capturing the full range of uncertainty increased the risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and to design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
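The risk calculus this abstract summarizes can be sketched as a simple quantitative microbial risk assessment (QMRA) chain: pathogen density in raw wastewater, attenuated by the treatment train's log-reduction credit, feeds a dose-response model, and independent daily risks compound into an annual risk. The function names and all parameter values below are illustrative assumptions, not the paper's fitted inputs:

```python
import math

def daily_infection_risk(raw_density, log_reduction, volume_l, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose).

    raw_density   -- pathogens per liter in raw wastewater
    log_reduction -- log10 reduction achieved by the treatment train
    volume_l      -- daily ingested volume in liters
    r             -- pathogen-specific dose-response parameter
    """
    dose = raw_density * 10.0 ** (-log_reduction) * volume_l
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily, days=365):
    """Compound independent daily exposures into an annual infection risk."""
    return 1.0 - (1.0 - p_daily) ** days

# Illustrative values only (not from the study): protozoan-like r = 0.09,
# 2 L/day ingestion, 10-log treatment credit
p_day = daily_infection_risk(raw_density=1e5, log_reduction=10, volume_l=2.0, r=0.09)
p_year = annual_risk(p_day)
```

Repeating this calculation against a benchmark such as one infection per 10,000 person-years is the kind of exercise from which the paper's "14 logs or more" conclusion for viruses follows.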
Advances in Risk Analysis with Big Data.
Choi, Tsan-Ming; Lambert, James H
2017-08-01
With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
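The two-stage estimation this abstract describes, a regression-based rate for route-dependent variables adjusted by fuzzy logic for route-independent ones, can be sketched as follows. The coefficients, the membership function, and the multiplier range are hypothetical placeholders, not the paper's fitted model:

```python
import math

def base_accident_rate(beta0, betas, covariates):
    """Mean accident frequency from a (negative binomial) regression:
    lambda = exp(beta0 + sum(beta_i * x_i)), where the x_i are
    route-dependent variables such as road class or traffic volume."""
    return math.exp(beta0 + sum(b * x for b, x in zip(betas, covariates)))

def fuzzy_modifier(weather_severity):
    """Toy fuzzy adjustment for one route-independent variable (weather):
    a ramp membership in the 'adverse' set on a 0-1 severity scale,
    defuzzified to a multiplier between 1.0 and 1.5."""
    mu_adverse = max(0.0, min(1.0, (weather_severity - 0.3) / 0.4))
    return 1.0 + 0.5 * mu_adverse

# Hypothetical coefficients and covariates (intercept, road-class and
# traffic-volume effects); the fuzzy multiplier scales the base rate
rate = base_accident_rate(-8.0, [0.4, 0.02], [1.0, 12.0]) * fuzzy_modifier(0.8)
```

The design choice mirrors the abstract: variables with enough database support are fitted statistically, while sparse or judgment-based factors enter through expert-defined fuzzy rules.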
Cognitive mapping tools: review and risk management needs.
Wood, Matthew D; Bostrom, Ann; Bridges, Todd; Linkov, Igor
2012-08-01
Risk managers are increasingly interested in incorporating stakeholder beliefs and other human factors into the planning process. Effective risk assessment and management requires understanding perceptions and beliefs of involved stakeholders, and how these beliefs give rise to actions that influence risk management decisions. Formal analyses of risk manager and stakeholder cognitions represent an important first step. Techniques for diagramming stakeholder mental models provide one tool for risk managers to better understand stakeholder beliefs and perceptions concerning risk, and to leverage this new understanding in developing risk management strategies. This article reviews three methodologies for assessing and diagramming stakeholder mental models--decision-analysis-based mental modeling, concept mapping, and semantic web analysis--and assesses them with regard to their ability to address risk manager needs. © 2012 Society for Risk Analysis.
The United States Environmental Protection Agency (EPA) is developing a comprehensive environmental exposure and risk analysis software system for agency-wide application using the methodology of a Multi-media, Multi-pathway, Multi-receptor Risk Assessment (3MRA) model. This sof...
Carbon Fiber Risk Analysis. [conference
NASA Technical Reports Server (NTRS)
1979-01-01
The scope and status of the effort to assess the risks associated with the accidental release of carbon/graphite fibers from civil aircraft is presented. Vulnerability of electrical and electronic equipment to carbon fibers, dispersal of carbon fibers, effectiveness of filtering systems, impact of fiber induced failures, and risk methodology are among the topics covered.
Vehicle mass and injury risk in two-car crashes: A novel methodology.
Tolouei, Reza; Maher, Mike; Titheridge, Helena
2013-01-01
This paper introduces a novel methodology based on disaggregate analysis of two-car crash data to estimate the partial effects of mass, through the velocity change, on absolute driver injury risk in each of the vehicles involved in the crash when absolute injury risk is defined as the probability of injury when the vehicle is involved in a two-car crash. The novel aspect of the introduced methodology is in providing a solution to the issue of lack of data on the speed of vehicles prior to the crash, which is required to calculate the velocity change, as well as a solution to the issue of lack of information on non-injury two-car crashes in national accident data. These issues have often led to focussing on relative measures of injury risk that are not independent of risk in the colliding cars. Furthermore, the introduced methodology is used to investigate whether there is any effect of vehicle size above and beyond that of mass ratio, and whether there are any effects associated with the gender and age of the drivers. The methodology was used to analyse two-car crashes to investigate the partial effects of vehicle mass and size on absolute driver injury risk. The results confirmed that in a two-car collision, vehicle mass has a protective effect on its own driver injury risk and an aggressive effect on the driver injury risk of the colliding vehicle. The results also confirmed that there is a protective effect of vehicle size above and beyond that of vehicle mass for frontal and front to side collisions. Copyright © 2012 Elsevier Ltd. All rights reserved.
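The physics underlying the mass effect is momentum conservation: in a (near-)plastic two-car collision, each driver's velocity change is set by the other car's share of the total mass. A minimal sketch with illustrative masses and closing speed (the paper's statistical estimation of these partial effects is far more involved):

```python
def delta_v(m_self, m_other, closing_speed):
    """Velocity change of the subject vehicle in a plastic head-on collision,
    from conservation of momentum:
        delta_v_self = m_other / (m_self + m_other) * closing_speed
    """
    return m_other / (m_self + m_other) * closing_speed

# Illustrative 1000 kg vs 2000 kg collision at a 50 km/h closing speed.
# The lighter car takes a delta-v twice that of the heavier car; since
# injury risk rises with delta-v, this is the protective (own driver) and
# aggressive (other driver) mass effect the abstract reports.
dv_light = delta_v(1000.0, 2000.0, 50.0)
dv_heavy = delta_v(2000.0, 1000.0, 50.0)
```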
Wang, Ting-Ting; Li, Jin-Mei; Zhou, Dong
2016-01-01
With great interest, we read the paper "Polymorphisms in IL-4/IL-13 pathway genes and glioma risk: an updated meta-analysis" (by Chen PQ et al.) [1], which has reached important conclusions about the relationship between polymorphisms in interleukin (IL)-4/IL-13 pathway genes and glioma risk. Through quantitative analysis, the meta-analysis found no association between IL-4/IL-13 pathway genetic polymorphisms and glioma risk (Chen et al. in Tumor Biol 36:121-127, 2015). The meta-analysis is the most comprehensive study of polymorphisms in the IL-4/IL-13 pathway and glioma risk. Nevertheless, some deficiencies still exist in this meta-analysis that we would like to raise.
SAMCO: Society Adaptation for coping with Mountain risks in a global change COntext
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Bernardie, Severine; Malet, Jean-Philippe; Puissant, Anne; Houet, Thomas; Berger, Frederic; Fort, Monique; Pierre, Daniel
2013-04-01
The SAMCO project aims to develop a proactive resilience framework enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) definition of the potential impacts of global environmental changes (climate system, ecosystem e.g. land use, socio-economic system) on landslide hazards; (2) analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, together with the development of a GIS-based demonstration platform. The strength and originality of the SAMCO project lie in combining different techniques, methodologies and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) and in gathering interdisciplinary expertise in earth sciences, environmental sciences, and social sciences. The multidisciplinary background of the members could lead to the development of new concepts and emerging strategies for mountain hazard/risk adaptation. The research areas, characterized by a variety of environmental, economic and social settings, are severely affected by landslides, and have experienced significant land use modifications (reforestation, abandonment of traditional agricultural practices) and human interference (urban expansion, ski resort construction) over the last century.
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static, failing to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
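A loss-function consequence model of the kind described can be sketched with an inverted normal loss function, in which economic loss grows from zero at the target operating point toward a maximum as the monitored variable (here, bottom-hole pressure) deviates. The function shape and all numbers below are illustrative assumptions, not the paper's calibrated models:

```python
import math

def inverted_normal_loss(x, target, max_loss, gamma):
    """Inverted normal loss function: zero loss when x is at the target,
    rising toward max_loss as x departs from the target; gamma controls
    how quickly the loss saturates."""
    return max_loss * (1.0 - math.exp(-((x - target) ** 2) / (2.0 * gamma ** 2)))

def operational_risk(p_event, x, target, max_loss, gamma):
    """Dynamic risk = updated event probability x loss-function consequence."""
    return p_event * inverted_normal_loss(x, target, max_loss, gamma)

# Hypothetical snapshot: blowout probability 1e-4, bottom-hole pressure at
# 5200 psi drifting from a 5000 psi target, $50M maximum loss
risk_now = operational_risk(1e-4, 5200.0, 5000.0, 50e6, 150.0)
```

Re-evaluating this product as the probability and pressure deviation update with each drilling step is what makes the analysis dynamic rather than static.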
WE-B-BRC-01: Current Methodologies in Risk Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rath, F.
Prospective quality management techniques, long used by engineering and industry, have become a growing part of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for the use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by a discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define the risks (and their importance) that we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
Looking Closer at the Effects of Framing on Risky Choice: An Item Response Theory Analysis.
Zickar; Highhouse
1998-07-01
Item response theory (IRT) methodology allowed an in-depth examination of several issues that would be difficult to explore using traditional methodology. IRT models were estimated for 4 risky-choice items, answered by students under either a gain or loss frame. Results supported the typical framing finding of risk-aversion for gains and risk-seeking for losses but also suggested that a latent construct we label preference for risk was influential in predicting risky choice. Also, the Asian Disease item, most often used in framing research, was found to have anomalous statistical properties when compared to other framing items. Copyright 1998 Academic Press.
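The latent-trait modeling behind such an analysis can be sketched with a two-parameter logistic (2PL) IRT model: the probability of endorsing the risky option depends on a respondent's latent "preference for risk" and each item's discrimination and location. The parameters below are hypothetical, not the fitted values from the study:

```python
import math

def irt_2pl(theta, a, b):
    """Two-parameter logistic IRT model: probability of choosing the risky
    option for latent risk preference theta, with item discrimination a and
    item location (difficulty) b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# Hypothetical item: framing the outcomes as losses might shift the item
# location downward, making the risky choice more likely at the same theta
p_gain = irt_2pl(theta=0.0, a=1.2, b=0.5)   # gain-framed version
p_loss = irt_2pl(theta=0.0, a=1.2, b=-0.5)  # loss-framed version
```

Comparing item parameters across frames in this way is what lets IRT detect items with anomalous statistical properties, such as the Asian Disease item noted in the abstract.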
NASA Astrophysics Data System (ADS)
Dobes, P.; Hrdina, P.; Kotatko, A.; Danihelka, P.; Bednarik, M.; Krejci, O.; Kasperakova, D.
2009-04-01
One question in the context of natural and technological risk mapping that has become important in recent years is the analysis and assessment of selected types of multi-risks. This follows from relevant R&D projects as well as from international workshops and conferences. Various surveys and activities show that abundant data and methodological approaches exist for single risk categories, but tested methodological approaches for multi-risks are lacking. Within the framework of the working group, a literature search on multi-risk assessment methodologies and innovations was carried out. The idea of this relatively small, local-scale case study arose during the 3rd Risk Mapping Workshop, coordinated by EC DG JRC, IPSC in November 2007. The proposal was based on a previous risk analysis and assessment project carried out for the Frydek-Mistek County area (Czech Republic) in 2002. Several industrial facilities in Trinec are partly situated in the inundation area of the river Olše and are partly protected by concrete barriers built on the banks of the Olše. It has to be mentioned that these banks are unstable and in permanent slow movement. If the reinforced-concrete barriers were overtopped by water as the result of a sudden bank landslide or flood wave, several industrial accidents at steel and energy production facilities could be triggered. The area is highly developed from a demographic and socioeconomic point of view, and has been thoroughly investigated geologically, engineering-geologically and hydrogeologically. The most important accident scenarios in the area were developed by What-If analysis and black-box analysis (development of several different scenarios; qualitative analysis). A few years later, further QRA analyses of industrial risks were carried out separately, prompted by the District Office, the public and Seveso II Directive requirements. General scenarios of multi-hazard events were considered.
In the case study, three methodologies were applied to assess hazard and risk: a qualitative approach based on the German risk matrix compilation methodology; a quantitative approach based on statistical methods previously used for the area between the towns of Hlohovec and Sered in Slovakia; and a quantitative approach for modelling floods on the river Olše based on the HEC-RAS model. Expert assessment was also used to evaluate the impacts of selected scenarios on the facilities and on the public, including evaluation of the existing barriers. The preliminary results suggest that flooding of the industrial facilities is less probable thanks to the existing barriers, but several useful recommendations for similarly prone areas could be derived. Acknowledgements: This work is partially supported by the Czech Ministry of the Environment within the R&D project "Comprehensive Interactions between Natural Processes and Industry with Regard to Major Accident Prevention and Emergency Planning" (Registration Number: SPII 1a10 45/07).
Modeling Payload Stowage Impacts on Fire Risks On-Board the International Space Station
NASA Technical Reports Server (NTRS)
Anton, Kellie E.; Brown, Patrick F.
2010-01-01
The purpose of this presentation is to determine the risks of fire on board the ISS due to non-standard stowage. ISS stowage is constantly being reexamined for optimality. Non-standard stowage involves stowing items outside of rack drawers; fire risk is a key concern and is heavily mitigated. A methodology is needed to capture and account for the fire risk due to non-standard stowage. The contents include: 1) Fire Risk Background; 2) General Assumptions; 3) Modeling Techniques; 4) Event Sequence Diagram (ESD); 5) Qualitative Fire Analysis; 6) Sample Qualitative Results for Fire Risk; 7) Qualitative Stowage Analysis; 8) Sample Qualitative Results for Non-Standard Stowage; and 9) Quantitative Analysis Basic Event Data.
Tailoring a Human Reliability Analysis to Your Industry Needs
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2016-01-01
Industries at risk of human-error accidents with catastrophic consequences include: airlines (mishaps), medicine (malpractice and medication mistakes), aerospace (failures), oil production (major spills), transportation (mishaps), power production (failures) and manufacturing (facility incidents). Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can identify where errors are most likely to arise and the potential risks involved if they do occur. Building on the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human error. There are a number of concerns and difficulties in "tailoring" an HRA for different industries. Although a variety of HRA methodologies are available for analyzing human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors, including: 1) how people act and react in different industries; 2) expectations based on industry standards; 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures; 4) the type and availability of data; 5) how the industry views risk and reliability; and 6) the types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment.
If the principal concern is determining the primary risk factors contributing to potential human error, a more detailed analysis method may be employed, as opposed to a requirement to provide a numerical value as part of a probabilistic risk assessment. Industries in which humans operate large equipment or transport systems (e.g., railroads or airlines) have more need to address the man-machine interface than medical workers administering medications. Human error occurs in every industry; in most cases the consequences are relatively benign and occasionally even beneficial. In cases where the results can have disastrous consequences, using human reliability techniques to identify and classify the risk of human errors gives a company more opportunities to mitigate or eliminate these types of risks and prevent costly tragedies.
Persechino, Benedetta; Valenti, Antonio; Ronchetti, Matteo; Rondinone, Bruna Maria; Di Tecco, Cristina; Vitali, Sara; Iavicoli, Sergio
2013-06-01
Work-related stress is one of the major causes of occupational ill health. In line with the regulatory framework on occupational health and safety (OSH), adequate models for assessing and managing risk need to be identified so as to minimize the impact of this stress not only on workers' health, but also on productivity. After close analysis of the Italian and European reference regulatory framework and work-related stress assessment and management models used in some European countries, we adopted the UK Health and Safety Executive's (HSE) Management Standards (MS) approach, adapting it to the Italian context in order to provide a suitable methodological proposal for Italy. We have developed a work-related stress risk assessment strategy, meeting regulatory requirements, now available on a specific web platform that includes software, tutorials, and other tools to assist companies in their assessments. This methodological proposal is new on the Italian work-related stress risk assessment scene. Besides providing an evaluation approach using scientifically validated instruments, it ensures the active participation of occupational health professionals in each company. The assessment tools provided enable companies not only to comply with the law, but also to contribute to a database for monitoring and assessment and give access to a reserved area for data analysis and comparisons.
Patel, Teresa; Fisher, Stanley P.
2016-01-01
Objective This study aimed to utilize failure modes and effects analysis (FMEA) to transform clinical insights into a risk mitigation plan for intrathecal (IT) drug delivery in pain management. Methods The FMEA methodology, which has been used for quality improvement, was adapted to assess risks (i.e., failure modes) associated with IT therapy. Ten experienced pain physicians scored 37 failure modes in the following categories: patient selection for therapy initiation (efficacy and safety concerns), patient safety during IT therapy, and product selection for IT therapy. Participants assigned severity, probability, and detection scores for each failure mode, from which a risk priority number (RPN) was calculated. Failure modes with the highest RPNs (i.e., most problematic) were discussed, and strategies were proposed to mitigate risks. Results Strategic discussions focused on 17 failure modes with the most severe outcomes, the highest probabilities of occurrence, and the most challenging detection. The topic of the highest‐ranked failure mode (RPN = 144) was manufactured monotherapy versus compounded combination products. Addressing failure modes associated with appropriate patient and product selection was predicted to be clinically important for the success of IT therapy. Conclusions The methodology of FMEA offers a systematic approach to prioritizing risks in a complex environment such as IT therapy. Unmet needs and information gaps are highlighted through the process. Risk mitigation and strategic planning to prevent and manage critical failure modes can contribute to therapeutic success. PMID:27477689
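The risk prioritization described above reduces to a simple arithmetic rule: RPN = severity × occurrence × detection, each factor conventionally scored from 1 (best) to 10 (worst). A minimal sketch follows; the failure-mode names and scores are hypothetical illustrations, not data from the study:

```python
# Illustrative FMEA ranking: RPN = severity x occurrence x detection.
# Failure modes and scores below are hypothetical, not the study's data.

def rpn(severity, occurrence, detection):
    """Risk priority number; each factor scored 1 (best) to 10 (worst)."""
    return severity * occurrence * detection

failure_modes = [
    ("product selection: monotherapy vs. compounded combination", 8, 6, 3),
    ("patient selection: unrealistic efficacy expectations",      7, 5, 3),
    ("safety: catheter-related complication",                     9, 3, 4),
]

# Highest RPN first: these failure modes get mitigation priority.
for name, s, o, d in sorted(failure_modes, key=lambda m: -rpn(*m[1:])):
    print(f"RPN {rpn(s, o, d):3d}  {name}")
```

Note that a top score such as 144 can arise from many factor combinations; 8 × 6 × 3 is just one of them, not the panel's actual scoring.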
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine the small amounts of DNA (around 100 pg) that may be present in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, through quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the greatest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points in the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, this technique is challenging to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of future events or circumstances, and their effects on method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, in light of the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the procedural steps with the greatest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Bastiaensen, P; Abernethy, D; Etter, E
2017-04-01
African countries that wish to export are increasingly faced with import risk assessments from importing countries concerned about the sources of their imported goods. Other risk analysis methodologies and approaches are also employed, which focus on animal and human health within countries and communities. Based on an analysis of evaluations conducted by the World Organisation for Animal Health (OIE), using the Performance of Veterinary Services Tool, the authors attempt to define current practice in Africa and degrees of compliance with the World Trade Organization Agreement on the Application of Sanitary and Phytosanitary Measures ('SPS Agreement') and OIE standards. To assist in this task, the authors also make use of a review of selected risk assessment reports. Results point to a lack of technical capacity and capability to conduct risk assessments in compliance with OIE standards (except in the case of three countries), ranging from an outright absence of any form of (documented) risk assessment and consecutive risk management decisions (level of advancement 1) to shortcomings in one or several aspects of the risk assessment process. This is confirmed by a number of case studies, half of which have been produced by international consultants. The major recommendations of this paper are i) to strengthen the human resources pool for conducting risk assessments and ii) to establish dedicated risk assessment units, with clear terms of reference, job descriptions and policies, procedures and protocols.
Automotive Manufacturer Risk Analysis : Meeting the Automotive Fuel Economy Standards
DOT National Transportation Integrated Search
1979-08-01
An overview of the methodology and some findings are presented of a study which assessed the impact of the automotive fuel economy standards (AFES) on the four major U.S. automakers. A risk model was used to estimate the financial performance of the ...
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper contrasts the functional resonance analysis method (FRAM), a modern approach, with fault tree analysis (FTA), a traditional method, for assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. In addition, a FRAM network is executed with regard to the nonlinear interactions of human and organizational levels to assess the safety of technological systems. The methodology is implemented for lifting structures in deep offshore operations. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will yield a plausible outcome for a predefined accident scenario.
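As a minimal illustration of the quantitative side of FTA (FRAM, by contrast, is largely qualitative), a top-event probability can be propagated through AND/OR gates from independent basic-event probabilities. The gate structure and probabilities below are hypothetical, not taken from the offshore lifting case study:

```python
# Hypothetical fault tree for a lifting operation:
# top event = (rigging failure OR crane control failure) AND interlock failure.
# Basic events are assumed statistically independent.

def p_or(*ps):
    """Probability that at least one of several independent events occurs."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def p_and(*ps):
    """Probability that all of several independent events occur."""
    q = 1.0
    for p in ps:
        q *= p
    return q

p_top = p_and(p_or(0.01, 0.02), 0.05)
print(f"top-event probability = {p_top:.5f}")
```

The same two helpers compose to arbitrary gate trees, which is what makes FTA amenable to quantification once basic-event probabilities are available.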
Jiang, Jheng Jie; Lee, Chon Lin; Fang, Meng Der; Boyd, Kenneth G.; Gibb, Stuart W.
2015-01-01
This paper presents a methodology based on multivariate data analysis for characterizing the potential source contributions of emerging contaminants (ECs) detected in 26 river water samples across multi-scape regions during dry and wet seasons. Based on this methodology, we unveil an approach toward the potential source contributions of ECs, a concept we refer to as the “Pharmaco-signature.” Exploratory analysis of the data was carried out by unsupervised pattern recognition (hierarchical cluster analysis, HCA) and a receptor model (principal component analysis-multiple linear regression, PCA-MLR) to demonstrate significant source contributions of ECs in different land-use zones. Robust cluster solutions grouped the database according to different EC profiles. PCA-MLR identified that 58.9% of the mean summed ECs were contributed by domestic impact, 9.7% by antibiotics application, and 31.4% by drug abuse. Diclofenac, ibuprofen, codeine, ampicillin, tetracycline, and erythromycin-H2O have significant pollution risk quotients (RQ>1), indicating potentially high risk to aquatic organisms in Taiwan. PMID:25874375
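The risk quotients (RQ) cited above are conventionally computed as the ratio of a measured environmental concentration (MEC) to a predicted no-effect concentration (PNEC), with RQ > 1 flagging potential risk to aquatic organisms. A sketch under that convention; the concentrations are invented for illustration, not values from the study:

```python
# Risk quotient screening: RQ = MEC / PNEC; RQ > 1 flags potential risk.
# Concentrations (ng/L) below are illustrative, not the study's measurements.

samples = {
    "diclofenac": {"mec": 120.0, "pnec": 100.0},
    "ibuprofen":  {"mec":  50.0, "pnec": 200.0},
}

for compound, c in samples.items():
    rq = c["mec"] / c["pnec"]
    verdict = "potential risk" if rq > 1 else "low risk"
    print(f"{compound}: RQ = {rq:.2f} -> {verdict}")
```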
Khakzad, Nima; Khan, Faisal; Amyotte, Paul
2015-07-01
Compared to the remarkable progress in the risk analysis of normal accidents, the risk analysis of major accidents is not as well established, partly due to the complexity of such accidents and partly due to the low probabilities involved. The issue of low probabilities normally arises from the scarcity of relevant data on major accidents, since such accidents are few and far between. In this work, knowing that major accidents are frequently preceded by accident precursors, a novel precursor-based methodology has been developed for likelihood modeling of major accidents in critical infrastructures, based on a unique combination of accident precursor data, information theory, and approximate reasoning. For this purpose, we have introduced an innovative application of information analysis to identify the most informative near accident of a major accident. The observed data of the near accident were then used to establish predictive scenarios to foresee the occurrence of the major accident. We verified the methodology using offshore blowouts in the Gulf of Mexico, and then demonstrated its application to dam breaches in the United States. © 2015 Society for Risk Analysis.
The SAMCO Web-platform for resilience assessment in mountainous valleys impacted by landslide risks.
NASA Astrophysics Data System (ADS)
Grandjean, Gilles; Thomas, Loic; Bernardie, Severine
2016-04-01
The ANR-SAMCO project aims to develop a proactive framework for enhancing the overall resilience of societies to the impacts of mountain risks. The project elaborates methodological tools to characterize and measure ecosystem and societal resilience from an operative perspective on three representative mountain case studies. To achieve this objective, the methodology is split into several steps: (1) the definition of the potential impacts of global environmental changes (climate system, ecosystem, e.g. land use, socio-economic system) on landslide hazards; (2) the analysis of these consequences in terms of vulnerability (e.g. changes in the location and characteristics of the impacted areas and the level of their perturbation); and (3) the implementation of a methodology for quantitatively investigating and mapping indicators of mountain slope vulnerability exposed to several hazard types, together with the development of a GIS-based demonstration platform available on the web. The strength and originality of the SAMCO project lie in the combination of different techniques, methodologies, and models (multi-hazard assessment, risk evolution in time, vulnerability functional analysis, and governance strategies) implemented in a user-oriented web platform, currently in development. We present the first results of this development task, the architecture and functions of the web tools, and the case-study database showing the multi-hazard maps and the stakes at risk. Risk assessments over several areas of interest in Alpine and Pyrenean valleys are still in progress, but the first analyses are presented for current and future periods, for which climate change and land-use (economic, geographical, and social) scenarios are taken into account. This tool, dedicated to stakeholders, is ultimately intended to be used to evaluate the resilience of mountainous regions, since multiple scenarios can be tested and compared.
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-06-01
There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested; it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. The temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of ¹³⁷Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. For cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during the phases of high and medium potential risk, based on a unit hypothetical release (e.g. 1 Bq), are performed. The analysis showed that possible deposition fractions of 10⁻¹¹ Bq/m² over the Kola Peninsula, and 10⁻¹² to 10⁻¹³ Bq/m² for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of a nuclear, chemical, or biological nature.
Development of Management Methodology for Engineering Production Quality
NASA Astrophysics Data System (ADS)
Gorlenko, O.; Miroshnikov, V.; Borbatc, N.
2016-04-01
The authors propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.
1982-08-01
[OCR fragments from a compilation on explosives safety management, contrasting regulatory regimes that range from total protection of life and property to a "laissez-faire" approach. Recoverable contents include "General Risk Analysis Methodological Implications to Explosives Risk Management Systems" (AD-P000 456), "Risk Analysis for Explosives" (AD-P000 457), and "The Effects of the Health and Safety at Work Act, 1974, on Military Explosives Safety Management in the United Kingdom".]
The ethics of placebo-controlled trials: methodological justifications.
Millum, Joseph; Grady, Christine
2013-11-01
The use of placebo controls in clinical trials remains controversial. Ethical analysis and international ethical guidance permit the use of placebo controls in randomized trials when scientifically indicated in four cases: (1) when there is no proven effective treatment for the condition under study; (2) when withholding treatment poses negligible risks to participants; (3) when there are compelling methodological reasons for using placebo, and withholding treatment does not pose a risk of serious harm to participants; and, more controversially, (4) when there are compelling methodological reasons for using placebo, and the research is intended to develop interventions that can be implemented in the population from which trial participants are drawn, and the trial does not require participants to forgo treatment they would otherwise receive. The concept of methodological reasons is essential to assessing the ethics of placebo controls in these controversial last two cases. This article sets out key considerations relevant to considering whether methodological reasons for a placebo control are compelling. © 2013.
Guglielminotti, Jean; Dechartres, Agnès; Mentré, France; Montravers, Philippe; Longrois, Dan; Laouénan, Cedric
2015-10-01
Prognostic research studies in anesthesiology aim to identify risk factors for an outcome (explanatory studies) or calculate the risk of this outcome on the basis of patients' risk factors (predictive studies). Multivariable models express the relationship between predictors and an outcome and are used in both explanatory and predictive studies. Model development demands a strict methodology and a clear reporting to assess its reliability. In this methodological descriptive review, we critically assessed the reporting and methodology of multivariable analysis used in observational prognostic studies published in anesthesiology journals. A systematic search was conducted on Medline through Web of Knowledge, PubMed, and journal websites to identify observational prognostic studies with multivariable analysis published in Anesthesiology, Anesthesia & Analgesia, British Journal of Anaesthesia, and Anaesthesia in 2010 and 2011. Data were extracted by 2 independent readers. First, studies were analyzed with respect to reporting of outcomes, design, size, methods of analysis, model performance (discrimination and calibration), model validation, clinical usefulness, and STROBE (i.e., Strengthening the Reporting of Observational Studies in Epidemiology) checklist. A reporting rate was calculated on the basis of 21 items of the aforementioned points. Second, they were analyzed with respect to some predefined methodological points. Eighty-six studies were included: 87.2% were explanatory and 80.2% investigated a postoperative event. The reporting was fairly good, with a median reporting rate of 79% (75% in explanatory studies and 100% in predictive studies). 
Six items had a reporting rate <36% (i.e., the 25th percentile), with some of them not identified in the STROBE checklist: blinded evaluation of the outcome (11.9%), reason for sample size (15.1%), handling of missing data (36.0%), assessment of collinearity (17.4%), assessment of interactions (13.9%), and calibration (34.9%). When reported, a few methodological shortcomings were observed in both explanatory and predictive studies, such as an insufficient number of events of the outcome (44.6%), exclusion of cases with missing data (93.6%), or categorization of continuous variables (65.1%). The reporting of multivariable analysis was fairly good and could be further improved by consulting reporting guidelines and the EQUATOR Network website. Limiting the number of candidate variables, including cases with missing data, and not arbitrarily categorizing continuous variables should be encouraged.
Shield, Kevin D.; Parkin, D. Maxwell; Whiteman, David C.; Rehm, Jürgen; Viallon, Vivian; Micallef, Claire Marant; Vineis, Paolo; Rushton, Lesley; Bray, Freddie; Soerjomataram, Isabelle
2016-01-01
The proportions of new cancer cases and deaths that are caused by exposure to risk factors and that could be prevented are key statistics for public health policy and planning. This paper summarizes the methodologies for estimating population attributable and preventable fractions, the challenges in their analysis, and their utility, for cancers caused by major risk factors such as tobacco smoking, dietary factors, high body fat, physical inactivity, alcohol consumption, infectious agents, occupational exposure, air pollution, sun exposure, and insufficient breastfeeding. For population attributable and preventable fractions, evidence of a causal relationship between a risk factor and cancer, the outcome (such as incidence and mortality), the exposure distribution, the relative risk, the theoretical minimum risk, and the counterfactual scenarios need to be clearly defined and congruent. Despite the limitations of the methodology and the data used for estimation, population attributable and preventable fractions are a useful tool for public health policy and planning. PMID:27547696
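Population attributable fractions of the kind summarized above are often estimated with Levin's formula, PAF = p(RR − 1) / (p(RR − 1) + 1), where p is the prevalence of exposure and RR the relative risk. A sketch with round illustrative numbers, not estimates from the paper:

```python
# Levin's formula for the population attributable fraction (PAF).
# Inputs below are round illustrative values, not estimates from the paper.

def paf(prevalence, relative_risk):
    """Fraction of cases in the population attributable to an exposure."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (excess + 1.0)

# e.g. 20% exposure prevalence and a relative risk of 10
print(f"PAF = {paf(0.20, 10.0):.1%}")
```

The formula assumes an unconfounded relative risk; in practice, adjusted relative risks and exposure distributions with several levels require the generalized multi-category form.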
Tavares, Alexandre Oliveira; Barros, José Leandro; Santos, Angela
2017-04-01
This study presents a new multidimensional methodology for tsunami vulnerability assessment that combines the morphological, structural, social, and tax components of vulnerability. This new approach can be distinguished from previous methodologies, which focused primarily on the evaluation of potentially affected buildings and did not use tsunami numerical modeling. The methodology was applied to the Figueira da Foz and Vila do Bispo municipalities in Portugal. For each area, the potential tsunami-inundated areas were calculated considering the 1755 Lisbon tsunami, the greatest disaster caused by natural hazards that ever occurred in Portugal. Furthermore, the four components of vulnerability were calculated to obtain a composite vulnerability index. This methodology enables us to differentiate the two areas in their vulnerability, highlighting the characteristics of the territorial components. This methodology can be a starting point for the creation of a local assessment framework at the municipal scale related to tsunami risk. In addition, the methodology is an important support for the different local stakeholders. © 2016 Society for Risk Analysis.
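A composite vulnerability index of the kind described above is commonly built as a weighted mean of normalized component scores. The component names follow the abstract, but the scores and the equal weighting are assumptions for illustration; the study's actual aggregation scheme may differ:

```python
# Composite vulnerability index as a weighted mean of normalized components.
# Scores (0 = least, 1 = most vulnerable) and equal weights are assumed.

components = {"morphological": 0.6, "structural": 0.4,
              "social": 0.7, "tax": 0.3}
weights = {name: 0.25 for name in components}  # equal weighting assumed

cvi = sum(weights[n] * components[n] for n in components)
print(f"composite vulnerability index = {cvi:.2f}")
```

Because the components are normalized before aggregation, the index stays in [0, 1] and remains comparable across the two municipalities.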
Aktipis, Athena
2016-01-01
In a meta-analysis published by myself and co-authors, we report differences in the life history risk factors for estrogen receptor negative (ER-) and estrogen receptor positive (ER+) breast cancers. Our meta-analysis did not find the association of ER- breast cancer risk with fast life history characteristics that Hidaka and Boddy suggest in their response to our article. There are a number of possible explanations for the differences between their conclusions and the conclusions we drew from our meta-analysis, including limitations of our meta-analysis and methodological challenges in measuring and categorizing estrogen receptor status. These challenges, along with the association of ER+ breast cancer with slow life history characteristics, may make it challenging to find a clear signal of ER- breast cancer with fast life history characteristics, even if that relationship does exist. The contradictory results regarding breast cancer risk and life history characteristics illustrate a more general challenge in evolutionary medicine: often different sub-theories in evolutionary biology make contradictory predictions about disease risk. In this case, life history models predict that breast cancer risk should increase with faster life history characteristics, while the evolutionary mismatch hypothesis predicts that breast cancer risk should increase with delayed reproduction. Whether life history tradeoffs contribute to ER- breast cancer is still an open question, but current models and several lines of evidence suggest that it is a possibility. © The Author(s) 2016. Published by Oxford University Press on behalf of the Foundation for Evolution, Medicine, and Public Health.
Systems Analysis of NASA Aviation Safety Program: Final Report
NASA Technical Reports Server (NTRS)
Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen
2013-01-01
A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe program was conducted using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.
ERIC Educational Resources Information Center
Metzger, Isha; Cooper, Shauna M.; Zarrett, Nicole; Flory, Kate
2013-01-01
The current review conducted a systematic assessment of culturally sensitive risk prevention programs for African American adolescents. Prevention programs meeting the inclusion and exclusion criteria were evaluated across several domains: (1) theoretical orientation and foundation; (2) methodological rigor; (3) level of cultural integration; (4)…
ERIC Educational Resources Information Center
Fallon, Barbara; Trocme, Nico; MacLaurin, Bruce; Sinha, Vandna; Black, Tara
2011-01-01
This paper describes the methodological changes that occurred across cycles of the Canadian Incidence Study of Reported Child Abuse and Neglect (CIS), specifically outlining the rationale for tracking investigations of families with children at risk of maltreatment in the CIS-2008 cycle. This paper also presents analysis of data from the CIS-2008…
NASA Astrophysics Data System (ADS)
Gastounioti, Aimilia; Keller, Brad M.; Hsieh, Meng-Kang; Conant, Emily F.; Kontos, Despina
2016-03-01
Growing evidence suggests that quantitative descriptors of parenchymal texture patterns hold a valuable role in assessing an individual woman's risk for breast cancer. In this work, we assess the hypothesis that breast cancer risk factors are not uniformly expressed in the breast parenchymal tissue and, therefore, that breast-anatomy-weighted parenchymal texture descriptors, in which different breast ROIs make nonuniform contributions, may enhance breast cancer risk assessment. To this end, we introduce an automated breast-anatomy-driven methodology which generates a breast atlas that is then used to produce a weight map reinforcing the contributions of the central and upper-outer breast areas. We incorporate this methodology into our previously validated lattice-based strategy for parenchymal texture analysis. In the framework of a pilot case-control study including digital mammograms from 424 women, our proposed breast-anatomy-weighted texture descriptors are optimized and evaluated against non-weighted texture features, using regression analysis with leave-one-out cross validation. The classification performance is assessed in terms of the area under the curve (AUC) of the receiver operating characteristic. The collective discriminatory capacity of the weighted texture features was maximized (AUC=0.87) when the central breast area was considered more important than the upper-outer area, with a significant performance improvement (DeLong's test, p-value<0.05) over the non-weighted texture features (AUC=0.82). Our results suggest that breast-anatomy-driven methodologies have the potential to further upgrade the promising role of parenchymal texture analysis in breast cancer risk assessment and may serve as a reference in the design of future studies towards image-driven personalized recommendations for women's cancer risk evaluation.
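The anatomy-weighted pooling of lattice texture features described above can be sketched as a weight-map-scaled average over ROIs. The feature values and weights here are hypothetical placeholders; the study derives its weight map from a breast atlas, which is not reproduced:

```python
# Anatomy-weighted pooling of per-ROI texture features: ROIs in the central
# and upper-outer breast areas are upweighted before averaging.
# Feature values and weights are hypothetical placeholders.

roi_features = [0.8, 0.5, 0.9, 0.4]   # e.g. per-ROI texture energy
roi_weights  = [2.0, 1.0, 2.0, 1.0]   # central / upper-outer ROIs upweighted

pooled = sum(w * f for w, f in zip(roi_weights, roi_features)) / sum(roi_weights)
print(f"weighted texture descriptor = {pooled:.3f}")
```

With uniform weights this reduces to the non-weighted descriptor the study uses as its baseline.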
Bucher Della Torre, Sophie; Keller, Amélie; Laure Depeyre, Jocelyne; Kruseman, Maaike
2016-04-01
In the context of a worldwide high prevalence of childhood obesity, the role of sugar-sweetened beverage (SSB) consumption as a cause of excess weight gain remains controversial. Conflicting results may be due to methodological issues in original studies and in reviews. The aim of this review was to systematically analyze the methodology of studies investigating the influence of SSB consumption on the risk of overweight and obesity among children and adolescents, and the studies' ability to answer this research question. A systematic review of cohort and experimental studies published until December 2013 in peer-reviewed journals was performed on Medline, CINAHL, Web of Knowledge, and ClinicalTrials.gov. Studies investigating the influence of SSB consumption on the risk of overweight and obesity among children and adolescents were included, and methodological quality to answer this question was assessed independently by two investigators using the Academy of Nutrition and Dietetics Quality Criteria Checklist. Among the 32 identified studies, nine had positive quality ratings and 23 studies had at least one major methodological issue. The main methodological issues included the definition of SSBs and inadequate measurement of exposure. Studies with positive quality ratings found an association between SSB consumption and overweight or obesity (n=5) (ie, when SSB consumption increased so did obesity) or mixed results (n=4). Studies with a neutral quality rating found a positive association (n=7), mixed results (n=9), or no association (n=7). The present review shows that the majority of studies with strong methodology indicated a positive association between SSB consumption and overweight or obesity, especially among overweight children. In addition, the study findings highlight the need for careful and precise measurement of the consumption of SSBs and of important confounders. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Probabilistic risk analysis and terrorism risk.
Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J
2010-04-01
Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.
Stingray Failure Mode, Effects and Criticality Analysis: WEC Risk Registers
Ken Rhinefrank
2016-07-25
An analysis method to systematically identify all potential failure modes and their effects on the Stingray WEC system. This analysis is incorporated early in the development cycle so that the identified failure modes can be mitigated cost-effectively and efficiently. The FMECA can begin once there is enough detail about the functions and failure modes of a given system, and its interfaces with other systems. The FMECA is carried out concurrently with the design process and is iterative, allowing design changes to overcome the deficiencies identified in the analysis. Risk registers for the major subsystems were completed according to the methodology described in the "Failure Mode Effects and Criticality Analysis Risk Reduction Program Plan.pdf" document below, in compliance with the DOE Risk Management Framework developed by NREL.
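As a loose illustration of how an FMECA-style risk register ranks failure modes, the sketch below computes a classical risk priority number (RPN = severity × occurrence × detection); the actual Stingray registers follow the criticality methodology referenced above, and all failure modes and scores here are invented.

```python
# Hypothetical FMECA-style risk register: each entry scores a failure mode
# on severity (S), occurrence (O), and detection (D), each on a 1-10 scale.
failure_modes = [
    {"mode": "mooring line fatigue", "S": 9, "O": 4, "D": 6},
    {"mode": "hydraulic seal leak",  "S": 5, "O": 7, "D": 3},
    {"mode": "PTO controller fault", "S": 7, "O": 3, "D": 8},
]

def rpn(entry):
    """Risk priority number: severity x occurrence x detection."""
    return entry["S"] * entry["O"] * entry["D"]

# Rank failure modes so the highest-priority ones are mitigated first.
for entry in sorted(failure_modes, key=rpn, reverse=True):
    print(f'{entry["mode"]:22s} RPN = {rpn(entry)}')
```

Ranking by RPN early in the design cycle is what lets mitigation effort be spent where it matters most, as the abstract describes.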
Prieto, M.L.; Cuéllar-Barboza, A.B.; Bobo, W.V.; Roger, V.L.; Bellivier, F.; Leboyer, M.; West, C.P.; Frye, M.A.
2016-01-01
Objective To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. Method A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946 – May, 2013) was conducted. Case-control and cohort studies of bipolar disorder patients age 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Results Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction: [relative risk (RR): 1.09, 95% CI 0.96–1.24, P = 0.20; I2 = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29–2.35; P = 0.0003; I2 = 83%). Conclusion There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. PMID:24850482
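The random-effects pooling used in the meta-analysis above can be sketched in a few lines. The study values below are hypothetical, not those extracted in the review, and the DerSimonian-Laird estimator shown is one common (not the only) choice for the between-study variance.

```python
import math

# Hypothetical per-study relative risks with 95% CIs: (RR, lower, upper).
studies = [(1.10, 0.90, 1.34), (1.50, 1.10, 2.05), (2.00, 1.40, 2.86)]

# Pooling is done on the log scale: log-RR and its standard error per study.
y = [math.log(rr) for rr, lo, hi in studies]
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for rr, lo, hi in studies]
w = [1 / s**2 for s in se]  # fixed-effect (inverse-variance) weights

# DerSimonian-Laird estimate of the between-study variance tau^2.
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
df = len(studies) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (Q - df) / c)

# Random-effects weights, pooled RR, and the I^2 heterogeneity index.
w_re = [1 / (s**2 + tau2) for s in se]
pooled = math.exp(sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re))
i2 = max(0.0, 100 * (Q - df) / Q)
print(f"pooled RR = {pooled:.2f}, I^2 = {i2:.0f}%")
```

The I² index reported in the abstract (6% for myocardial infarction, 83% for stroke) is exactly this measure of how much of the observed variation exceeds what sampling error alone would produce.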
A Methodology to Support Decision Making in Flood Plan Mitigation
NASA Astrophysics Data System (ADS)
Biscarini, C.; di Francesco, S.; Manciola, P.
2009-04-01
The focus of the present document is on specific decision-making aspects of flood risk analysis. A flood is the result of runoff from rainfall in quantities too great to be confined in the low-water channels of streams. Little can be done to prevent a major flood, but we may be able to minimize damage within the flood plain of the river. This broad definition encompasses many possible mitigation measures. Floodplain management takes an integrated view of all engineering, non-structural, and administrative measures for managing (minimizing) losses due to flooding on a comprehensive scale. The structural measures are the flood-control facilities designed according to flood characteristics; they include reservoirs, diversions, levees or dikes, and channel modifications. Measures that modify the damage susceptibility of floodplains, and those designed to reduce the damage potential of permanent facilities, are usually referred to as non-structural measures; they require at most minor engineering works and reduce the potential damage during a flood event. Technical information is required to support the tasks of problem definition, plan formulation, and plan evaluation. The specific information needed and the related level of detail depend on the nature of the problem, the potential solutions, and the sensitivity of the findings to the basic information. Actions performed to set up and lay out the study are preliminary to the detailed analysis. They include: defining the study scope and detail, collecting field data, reviewing previous studies and reports, and assembling the needed maps and surveys. Risk analysis can be viewed as having several components: risk assessment, risk communication, and risk management.
Risk assessment comprises an analysis of the technical aspects of the problem, risk communication deals with conveying the information, and risk management involves the decision process. In the present paper we propose a novel methodology for supporting priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contributions and management aspects are merged to create a simplified method for basin plan implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for selecting a flood-damage reduction plan. Among the criteria relevant to the decision process, safety and quality of human life, economic damage, expenses related to the chosen measures, and environmental issues should play a fundamental role in the decisions made by the authorities. Several numerical indices, taking into account administrative, technical, economic, and risk aspects, are defined and combined in a mathematical formula that defines a Priority Index (PI). In particular, the priority index ranks candidate interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, together with measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. A great effort is devoted to making this index easy to formulate and effective in allowing a clear and transparent comparison between the alternatives.
In summary, this document describes the major steps for incorporating risk analysis into the decision-making process: framing the problem in terms of risk analysis, applying appropriate tools and techniques to obtain quantified results, and using the quantified results in the choice of structural and non-structural measures. To demonstrate the reliability of the proposed methodology and to show how risk-based information can be incorporated into a flood analysis process, its application to some central Italy river basins is presented. The methodology assessment is performed by comparing different scenarios and showing that the optimal decision stems from a feasibility evaluation.
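The abstract does not reproduce the PI formula itself; as a hedged sketch, a weighted sum of normalized indices is one common way to combine administrative, technical, economic, and risk aspects into a single ranking score. All alternatives, weights, and scores below are invented for illustration.

```python
# Hypothetical mitigation alternatives scored on normalized (0-1) indices.
# The actual PI formula is defined in the paper; a weighted sum is one
# common aggregation choice, with weights reflecting decision priorities.
weights = {"risk": 0.4, "economic": 0.3, "technical": 0.2, "administrative": 0.1}

alternatives = {
    "levee raising":       {"risk": 0.8, "economic": 0.5, "technical": 0.7, "administrative": 0.6},
    "upstream reservoir":  {"risk": 0.9, "economic": 0.3, "technical": 0.4, "administrative": 0.3},
    "floodplain rezoning": {"risk": 0.6, "economic": 0.8, "technical": 0.9, "administrative": 0.7},
}

def priority_index(scores):
    """Weighted sum of normalized indices (higher = higher priority)."""
    return sum(weights[k] * scores[k] for k in weights)

# Rank the interventions to build the investment plan.
ranking = sorted(alternatives, key=lambda a: priority_index(alternatives[a]), reverse=True)
for name in ranking:
    print(f"{name:20s} PI = {priority_index(alternatives[name]):.2f}")
```

The ranking, not the absolute PI values, is what drives the formulation of the investment plan described above.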
Multi-hazard risk analysis related to hurricanes
NASA Astrophysics Data System (ADS)
Lin, Ning
Hurricanes present major hazards to the United States. Associated with extreme winds, heavy rainfall, and storm surge, landfalling hurricanes often cause enormous structural damage to coastal regions. Hurricane damage risk assessment provides the basis for loss mitigation and related policy-making. Current hurricane risk models, however, often oversimplify the complex processes of hurricane damage. This dissertation aims to improve existing hurricane risk assessment methodology by coherently modeling the spatial-temporal processes of storm landfall, hazards, and damage. Numerical modeling technologies are used to investigate the multiplicity of hazards associated with landfalling hurricanes. The application and effectiveness of current weather forecasting technologies to predict hurricane hazards are investigated. In particular, the Weather Research and Forecasting model (WRF), with the Geophysical Fluid Dynamics Laboratory (GFDL)'s hurricane initialization scheme, is applied to the simulation of the wind and rainfall environment during hurricane landfall. The WRF model is further coupled with the Advanced Circulation (ADCIRC) model to simulate storm surge in coastal regions. A case study examines the multiple hazards associated with Hurricane Isabel (2003). Also, a risk assessment methodology is developed to estimate the probability distribution of hurricane storm surge heights along the coast, particularly for data-scarce regions, such as New York City. This methodology makes use of relatively simple models, specifically a statistical/deterministic hurricane model and the Sea, Lake and Overland Surges from Hurricanes (SLOSH) model, to simulate large numbers of synthetic surge events, and conducts statistical analysis. Estimates of hurricane landfall probability and hazards are combined with structural vulnerability models to estimate hurricane damage risk. Wind-induced damage mechanisms are extensively studied. 
An innovative windborne debris risk model is developed based on the theory of Poisson random measure, substantiated by a large amount of empirical data. An advanced vulnerability assessment methodology is then developed, by integrating this debris risk model and a component-based pressure damage model, to predict storm-specific or annual damage to coastal residential neighborhoods. The uniqueness of this vulnerability model lies in its detailed description of the interaction between wind pressure and windborne debris effects over periods of strong winds, which is a major mechanism leading to structural failures during hurricanes.
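The debris model itself is far richer than can be shown here; as a minimal sketch of the Poisson idea underlying it, if damaging debris impacts on a building envelope arrive as a Poisson process with an assumed mean rate per storm, the probability of at least one breach-causing impact follows directly. The rates below are arbitrary assumptions for illustration.

```python
import math

# Minimal Poisson impact sketch (assumed parameters, not the dissertation's
# calibrated model): with a mean of `lam` damaging debris impacts per storm,
#   P(at least one impact) = 1 - exp(-lam).
def breach_probability(lam):
    return 1.0 - math.exp(-lam)

for lam in (0.1, 0.5, 2.0):
    print(f"mean impacts/storm = {lam:.1f} -> P(breach) = {breach_probability(lam):.3f}")
```

A breach of the envelope then feeds back into the pressure damage model, which is the wind-debris interaction the vulnerability methodology above captures.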
A methodology for estimating risks associated with landslides of contaminated soil into rivers.
Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars
2014-02-15
Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is in general not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslides in contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring. 
The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
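The probabilistic chain behind these findings can be sketched in miniature: a low overall failure probability dominated by the landslide probability, with a high conditional probability of exceeding the EQS once a landslide occurs. The numbers below are illustrative assumptions, not values from the Göta Älv case study.

```python
# Sketch of the probabilistic chain in the suggested methodology
# (hypothetical numbers): the probability of an environmental failure is
# the landslide probability times the conditional probability that, given
# a landslide, the environmental quality standard (EQS) is exceeded.
p_landslide = 0.002              # assumed annual landslide probability
p_exceed_eqs_given_slide = 0.8   # assumed P(EQS exceeded | landslide)

p_failure = p_landslide * p_exceed_eqs_given_slide
print(f"annual probability of environmental failure: {p_failure:.4f}")
```

This factorisation is why the defined failures are "governed primarily by the probability of a landslide occurring": the small first factor dominates the product.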
Gallina, Valentina; Torresan, Silvia; Critto, Andrea; Sperotto, Anna; Glade, Thomas; Marcomini, Antonio
2016-03-01
This paper presents a review of existing multi-risk assessment concepts and tools applied by organisations and projects, providing the basis for the development of a multi-risk methodology in a climate change perspective. Relevant initiatives were developed for the assessment of multiple natural hazards (e.g. floods, storm surges, droughts) affecting the same area in a defined timeframe (e.g. year, season, decade). Major research efforts were focused on the identification and aggregation of multiple hazard types (e.g. independent, correlated, cascading hazards) by means of quantitative and semi-quantitative approaches. Moreover, several methodologies aim to assess the vulnerability of multiple targets to specific natural hazards by means of vulnerability functions and indicators at the regional and local scale. The overall results of the review show that multi-risk approaches do not consider the effects of climate change and mostly rely on the analysis of static vulnerability (i.e. no time-dependent vulnerabilities, no changes among exposed elements). A relevant challenge is therefore to develop comprehensive formal approaches for the assessment of different climate-induced hazards and risks, including dynamic exposure and vulnerability. This requires the selection and aggregation of suitable hazard and vulnerability metrics to synthesise information about multiple climate impacts, and the spatial analysis and ranking of risks, including their visualization and communication to end-users. To face these issues, climate impact assessors should develop cross-sectorial collaborations among different areas of expertise (e.g. modellers, natural scientists, economists), integrating information on climate change scenarios with sectorial climate impact assessment, towards the development of a comprehensive multi-risk assessment process. Copyright © 2015 Elsevier Ltd. All rights reserved.
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays risk management is a carefully planned process, and the task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is first necessary to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and assessing them quantitatively; that is, risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are to understand the purpose of FTA, understand and apply the rules of Boolean algebra, analyse a simple system using FTA, and weigh FTA's advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the top event. The steps of the analysis are: examination of the system from the top down; the use of symbols to represent events; the use of mathematical tools for critical areas; and the use of fault tree logic diagrams to identify the cause of the top event. Results: The study yields the critical areas, the fault tree logic diagrams, and the probability of the top event; these results can then be used in risk assessment analyses.
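The gate logic described above can be sketched directly. The system, basic events, and probabilities below are hypothetical; the point is only the standard AND/OR probability rules for independent basic events.

```python
# Minimal fault tree for an illustrative (hypothetical) system: the top
# event "pump system fails" occurs if the power supply fails OR both
# redundant pumps fail. For independent events the gate rules are:
#   AND: p1 * p2            OR: 1 - (1 - p1)(1 - p2)
p_power = 0.01    # basic event: power supply failure
p_pump_a = 0.05   # basic event: pump A failure
p_pump_b = 0.05   # basic event: pump B failure

def and_gate(*ps):
    prob = 1.0
    for p in ps:
        prob *= p
    return prob

def or_gate(*ps):
    prob = 1.0
    for p in ps:
        prob *= (1.0 - p)
    return 1.0 - prob

p_both_pumps = and_gate(p_pump_a, p_pump_b)   # intermediate event
p_top = or_gate(p_power, p_both_pumps)        # top event probability
print(f"P(top event) = {p_top:.6f}")
```

Working from the top event down to the basic events, then propagating probabilities back up through the gates, is exactly the top-down examination the abstract describes.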
Mt-Isa, Shahrul; Hallgreen, Christine E; Wang, Nan; Callréus, Torbjörn; Genov, Georgy; Hirsch, Ian; Hobbiger, Stephen F; Hockley, Kimberley S; Luciani, Davide; Phillips, Lawrence D; Quartey, George; Sarac, Sinan B; Stoeckert, Isabelle; Tzoulaki, Ioanna; Micaleff, Alain; Ashby, Deborah
2014-07-01
The need for formal and structured approaches to the benefit-risk assessment of medicines is increasing, as is the complexity of the scientific questions addressed before decisions are made on the benefit-risk balance of medicines. We systematically collected, appraised and classified available benefit-risk methodologies to facilitate and inform their future use. A systematic review of publications identified benefit-risk assessment methodologies. Methodologies were appraised on their fundamental principles, features, graphical representations, assessability and accessibility. We created a taxonomy of methodologies to facilitate understanding and choice. We identified 49 methodologies, critically appraised them, and classified them into four categories: frameworks, metrics, estimation techniques and utility survey techniques. Eight frameworks describe qualitative steps in benefit-risk assessment and eight quantify the benefit-risk balance. Nine metric indices include threshold indices to measure either benefit or risk; health indices measure quality of life over time; and trade-off indices integrate benefits and risks. Six estimation techniques support benefit-risk modelling and evidence synthesis. Four utility survey techniques elicit robust value preferences from stakeholders relevant to the benefit-risk decisions. Methodologies to help benefit-risk assessments of medicines are diverse, and each is associated with different limitations and strengths. There is no 'one-size-fits-all' method, and a combination of methods may be needed for each benefit-risk assessment. The taxonomy introduced herein may guide the choice of adequate methodologies. Finally, we recommend 13 of the 49 methodologies for further appraisal for use in real-life benefit-risk assessment of medicines. Copyright © 2014 John Wiley & Sons, Ltd.
From event analysis to global lessons: disaster forensics for building resilience
NASA Astrophysics Data System (ADS)
Keating, Adriana; Venkateswaran, Kanmani; Szoenyi, Michael; MacClune, Karen; Mechler, Reinhard
2016-04-01
With unprecedented growth in disaster risk, there is an urgent need for enhanced learning about and understanding disasters, particularly in relation to the trends in the drivers of increasing risk. Building on the disaster forensics field, we introduce the Post Event Review Capability (PERC) methodology for systematically and holistically analyzing disaster events, and identifying actionable recommendations. PERC responds to a need for learning about the successes and failures in disaster risk management and resilience, and uncovers the underlying drivers of increasing risk. We draw generalizable insights identified from seven applications of the methodology to date, where we find that across the globe policy makers and practitioners in disaster risk management face strikingly similar challenges despite variations in context, indicating encouraging potential for mutual learning. These lessons highlight the importance of integrated risk reduction strategies. We invite others to utilize the freely available PERC approach and contribute to building a repository of learnings on disaster risk management and resilience.
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and previously validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
NASA Astrophysics Data System (ADS)
Ronco, P.; Bullo, M.; Torresan, S.; Critto, A.; Olschewski, R.; Zappa, M.; Marcomini, A.
2014-07-01
The main objective of the paper is the application of the KULTURisk Regional Risk Assessment (KR-RRA) methodology, presented in the companion paper (Part 1, Ronco et al., 2014), to the Sihl River valley in Switzerland. Through a process of tuning the methodology to the site-specific context and features, flood-related risks have been assessed for different receptors in the Sihl River valley, including the city of Zurich, which represents a typical case of river flooding in an urban area. After characterizing the peculiarities of the specific case study, risk maps have been developed under a 300-year return period scenario (selected as the baseline) for six relevant targets exposed to flood risk in the Sihl valley, namely: people, economic activities (including buildings, infrastructure and agriculture), natural and semi-natural systems, and cultural heritage. Finally, the total risk index map, which allows areas and hotspots at risk to be identified and ranked by means of Multi-Criteria Decision Analysis tools, has been produced to visualize the spatial pattern of flood risk within the study area. By means of a tailored participative approach, the total risk maps supplement the judgements of technical experts with the (essential) point of view of the relevant stakeholders in appraising the specific scores and weights of the receptor-relative risks. The total risk maps obtained for the Sihl River case study fall into the lower classes of risk. In general, higher relative risks are concentrated in the deeply urbanized area within and around the Zurich city centre and in areas lying just behind the Sihl River course. 
Here, forecasted injuries and potential fatalities are mainly due to the high population density and the high proportion of old (vulnerable) people; inundated buildings are mainly classified as continuous and discontinuous urban fabric; and flooded roads, pathways and railways, most of them around Zurich's main train station (Hauptbahnhof), are at high risk of inundation, causing large indirect damages. The analysis of flood risk to agriculture, natural and semi-natural systems and cultural heritage has pointed out that these receptors could be relatively less impacted by the selected flood scenario, mainly because of their scattered presence. Finally, the application of the KR-RRA methodology to the Sihl River case study, as well as to several other sites across Europe (not presented here), has demonstrated its flexibility and adaptability to different geographical and socio-economic contexts, depending on data availability and the peculiarities of the sites, as well as to other hazard scenarios.
Grigorev, Yu I; Lyapina, N V
2014-01-01
A hygienic analysis of the centralized drinking water supply in the Tula region was performed, and the priority contaminants of drinking water were established. Applying risk assessment methodology, the carcinogenic risk to children's health was calculated. A direct relationship between certain classes of diseases and the pollution of drinking water with chemical contaminants was determined.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... Numerical Simulations Risk Management Methodology September 7, 2010. Pursuant to Section 19(b)(1) of the... for incorporation in the System for Theoretical Analysis and Numerical Simulations (``STANS'') risk... ETFs \\3\\ in the STANS margin calculation process.\\4\\ When OCC began including common stock and ETFs in...
Getting Clean in a Drug Rehabilitation Program in Prison: A Grounded Theory Analysis
ERIC Educational Resources Information Center
Smith, Sharon; Ferguson, Neil
2005-01-01
High-risk drug use is prevalent among UK prison populations (Lipton, 1995) while recovery in prison is both complex and variable. Grounded theory methodology was employed to gain a greater understanding of the perceptions and conceptualisations of "risk," "need" and "motivation" in relation to prisoner drug abusing…
REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION
In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...
Nelson, Kristen C; Andow, David A; Banker, Michael J
2009-01-01
Societal evaluation of new technologies, specifically nanotechnology and genetically engineered organisms (GEOs), challenges current practices of governance and science. Employing environmental risk assessment (ERA) for governance and oversight assumes we have a reasonable ability to understand consequences and predict adverse effects. However, traditional ERA has come under considerable criticism for its many shortcomings and current governance institutions have demonstrated limitations in transparency, public input, and capacity. Problem Formulation and Options Assessment (PFOA) is a methodology founded on three key concepts in risk assessment (science-based consideration, deliberation, and multi-criteria analysis) and three in governance (participation, transparency, and accountability). Developed through a series of international workshops, the PFOA process emphasizes engagement with stakeholders in iterative stages, from identification of the problem(s) through comparison of multiple technology solutions that could be used in the future with their relative benefits, harms, and risk. It provides "upstream public engagement" in a deliberation informed by science that identifies values for improved decision making.
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper shows a method to assess the vulnerability of coastal areas to risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP-based methodology. The coastal risk vulnerability mapping considers multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to an urban area. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The result shows that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The most vulnerable areas are situated in the east, at the Monika and Sablette beaches. This approach, combining the efficiency of GIS tools with the Fuzzy Analytic Hierarchy Process, helps decision makers find optimal strategies to minimize coastal risks.
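As a simplified illustration of how criteria weights arise from pairwise comparisons, the sketch below uses the crisp AHP geometric-mean method; the paper's FAHP replaces crisp judgements with fuzzy numbers, and the criteria and comparison values here are assumed for illustration.

```python
import math

# Hypothetical 3-criterion pairwise comparison matrix: A[i][j] is the
# relative importance of criterion i over criterion j (reciprocal matrix).
criteria = ["sea level rise", "wave height", "elevation"]
A = [
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
]

# Crisp AHP geometric-mean method: normalize the row geometric means.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]

for name, w in zip(criteria, weights):
    print(f"{name:15s} weight = {w:.3f}")
```

The resulting weights are what get combined with the spatial factor layers in GIS to produce the vulnerability map.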
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology
Huang, Y.-N.; Whittaker, A.S.; Luco, N.
2011-01-01
A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
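Step (c) can be sketched with an assumed lognormal fragility curve, a common choice in performance-based earthquake engineering (not necessarily the form used in the paper): a component with median capacity θ and logarithmic standard deviation β is damaged in a trial when its sampled capacity falls below the demand. All parameter values below are assumptions.

```python
import math
import random

# Monte Carlo damage-state sketch with an assumed lognormal fragility:
# component capacity ~ theta * exp(N(0, beta^2)); damage occurs when the
# sampled capacity is below the (deterministic, for simplicity) demand.
def damage_probability(demand, theta=1.0, beta=0.4, n=100_000, seed=1):
    rng = random.Random(seed)
    damaged = 0
    for _ in range(n):
        capacity = theta * math.exp(rng.gauss(0.0, beta))
        if capacity < demand:
            damaged += 1
    return damaged / n

# At demand equal to the median capacity, the fragility is ~0.5 by construction.
print(f"P(damage | demand = theta) ~= {damage_probability(1.0):.3f}")
```

In the full procedure the demand would itself be a sample from the nonlinear response-history analyses, which is how correlation between component responses enters the risk estimate.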
ADVISORY ON UPDATED METHODOLOGY FOR ...
The National Academy of Sciences (NAS) published the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in 2006. The Committee analyzed the most recent epidemiology from the important exposed cohorts and factored in changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee also considered relevant radiobiological data, including that from the Department of Energy's low dose effects research program. Based on the review of this information, the Committee proposed a set of models for estimating risks from low-dose ionizing radiation. ORIA then prepared a white paper revising the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This is the first product to be developed as a result of the BEIR VII report. We requested that the SAB conduct an advisory during the development of this methodology. The second product to be prepared will be a revised version of the document,
Concerns related to Safety Management of Engineered Nanomaterials in research environment
NASA Astrophysics Data System (ADS)
Groso, A.; Meyer, Th
2013-04-01
Since the rise of occupational safety and health research on nanomaterials, much progress has been made in generating health-effects and exposure data. However, where detailed quantitative risk analysis is concerned, more research is needed, especially quantitative measures of worker exposure and standards for categorizing toxicity/hazardousness data. In the absence of dose-response relationships and quantitative exposure measurements, control banding (CB) has been widely adopted by the OHS community as a pragmatic tool for implementing a risk management strategy based on a precautionary approach. Being in charge of health and safety at a Swiss university where nanomaterials are widely used and produced, we also face the challenges of nanomaterials' occupational safety. In this work, we discuss the field application of an in-house risk management methodology similar to CB, as well as some other methodologies, and the challenges and issues the process raises. Since exact data on nanomaterials' hazardousness are missing for most situations, we deduce that the outcome of the analysis for a particular process is essentially the same whether one uses a simple methodology that determines only exposure potential or one that also takes into account the hazardousness of ENPs. Evidently, when reliable data on hazardousness factors (such as surface chemistry, solubility, carcinogenicity, and toxicity) become available, more differentiation will be possible in determining the risk for different materials. On the protective-measures side, all CB methodologies lean toward overprotection; some suggest comprehensive protective/preventive measures, while others offer only basic advice. The implementation and control of protective measures in the research environment is also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaylock, B.P.; Legg, J.; Travis, C.C.
1995-06-01
This document describes a worker health risk evaluation methodology for assessing risks associated with Environmental Restoration (ER) and Waste Management (WM). The methodology is appropriate for estimating worker risks across the Department of Energy (DOE) Complex at both programmatic and site-specific levels. This document supports the worker health risk methodology used to perform the human health risk assessment portion of the DOE Programmatic Environmental Impact Statement (PEIS) although it has applications beyond the PEIS, such as installation-wide worker risk assessments, screening-level assessments, and site-specific assessments.
Walia, Gurjot S; Wong, Alison L; Lo, Andrea Y; Mackert, Gina A; Carl, Hannah M; Pedreira, Rachel A; Bello, Ricardo; Aquino, Carla S; Padula, William V; Sacks, Justin M
2016-12-01
To present a systematic review of the literature assessing the efficacy of monitoring devices for reducing the risk of developing pressure injuries. This continuing education activity is intended for physicians, physician assistants, nurse practitioners, and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Explain the methodology of the literature review and its results. 2. Discuss the scope of the problem and the implications of the research. OBJECTIVE: To assess the efficacy of monitoring devices for reducing the risk of developing pressure injuries (PIs). The authors systematically reviewed the literature by searching PubMed/MEDLINE and CINAHL databases through January 2016. Articles included clinical trials and cohort studies that tested monitoring devices, evaluating PI risk factors in patients in acute and skilled nursing settings. The articles were scored using the Methodological Index for Non-randomized Studies. Using a standardized extraction form, the authors extracted patient inclusion/exclusion criteria, care setting, key baseline characteristics, description of monitoring device and methodology, number of patients included in each group, description of any standard of care, follow-up period, and outcomes. Of the 1866 identified publications, 9 met the inclusion criteria. The high-quality studies averaged Methodological Index for Non-randomized Studies scores of 19.4 for clinical trials and 12.2 for observational studies. These studies evaluated monitoring devices that measured interface pressure, subdermal tissue stress, motion, and moisture. Most studies found a statistically significant decrease in PIs; 2 studies were eligible for meta-analysis, demonstrating that use of monitoring devices was associated with an 88% reduction in the risk of developing PIs (Mantel-Haenszel risk ratio, 0.12; 95% confidence interval, 0.04-0.41; I² = 0%).
Pressure injury monitoring devices are associated with a strong reduction in the risk of developing PIs. These devices provide clinicians and patients with critical information to implement prevention guidelines. Randomized controlled trials would help assess which technologies are most effective at reducing the risk of developing PIs.
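The pooled effect reported above (Mantel-Haenszel risk ratio 0.12, from 2 studies) uses standard fixed-effect pooling of 2x2 tables. A minimal sketch of the Mantel-Haenszel risk-ratio calculation; the study counts below are hypothetical, chosen only so the pooled value lands near the reported 0.12 (they are not the reviewed trial data).

```python
def mantel_haenszel_rr(studies):
    """Fixed-effect Mantel-Haenszel pooling of risk ratios across 2x2 tables.

    Each study is a tuple (events_trt, n_trt, events_ctl, n_ctl).
    """
    num = sum(a * n_c / (n_t + n_c) for a, n_t, c, n_c in studies)
    den = sum(c * n_t / (n_t + n_c) for a, n_t, c, n_c in studies)
    return num / den

# Hypothetical data: two small monitoring-device trials, (events, n) per arm.
studies = [(2, 100, 15, 100), (1, 80, 10, 80)]
rr = mantel_haenszel_rr(studies)
print(round(rr, 3))  # pooled risk ratio across both tables
```

The confidence interval reported in the abstract would additionally require the Robins-Breslow-Greenland variance estimate, omitted here for brevity.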
Whalley, H C; Kestelman, J N; Rimmington, J E; Kelso, A; Abukmeil, S S; Best, J J; Johnstone, E C; Lawrie, S M
1999-07-30
The Edinburgh High Risk Project is a longitudinal study of brain structure (and function) in subjects at high risk of developing schizophrenia in the next 5-10 years for genetic reasons. In this article we describe the methods of volumetric analysis of structural magnetic resonance images used in the study. We also consider potential sources of error in these methods: the validity of our image analysis techniques; inter- and intra-rater reliability; possible positional variation; and thresholding criteria used in separating brain from cerebro-spinal fluid (CSF). Investigation with a phantom test object (of similar imaging characteristics to the brain) provided evidence for the validity of our image acquisition and analysis techniques. Both inter- and intra-rater reliability were found to be good in whole brain measures but less so for smaller regions. There were no statistically significant differences in positioning across the three study groups (patients with schizophrenia, high risk subjects and normal volunteers). A new technique for thresholding MRI scans longitudinally is described (the 'rescale' method) and compared with our established method (thresholding by eye). Few differences between the two techniques were seen at 3- and 6-month follow-up. These findings demonstrate the validity and reliability of the structural MRI analysis techniques used in the Edinburgh High Risk Project, and highlight methodological issues of general concern in cross-sectional and longitudinal studies of brain structure in healthy control subjects and neuropsychiatric populations.
Clark, Renee M; Besterfield-Sacre, Mary E
2009-03-01
We take a novel approach to analyzing hazardous materials transportation risk in this research. Previous studies analyzed this risk from an operations research (OR) or quantitative risk assessment (QRA) perspective by minimizing or calculating risk along a transport route. Further, even though the majority of incidents occur when containers are unloaded, the research has not focused on transportation-related activities, including container loading and unloading. In this work, we developed a decision model of a hazardous materials release during unloading using actual data and an exploratory data modeling approach. Previous studies have had a theoretical perspective in terms of identifying and advancing the key variables related to this risk, and there has not been a focus on probability and statistics-based approaches for doing this. Our decision model empirically identifies the critical variables using an exploratory methodology for a large, highly categorical database involving latent class analysis (LCA), loglinear modeling, and Bayesian networking. Our model identified the most influential variables and countermeasures for two consequences of a hazmat incident, dollar loss and release quantity, and is one of the first models to do this. The most influential variables were found to be related to the failure of the container. In addition to analyzing hazmat risk, our methodology can be used to develop data-driven models for strategic decision making in other domains involving risk.
ERIC Educational Resources Information Center
Moore, John W., Ed.; Moore, Elizabeth A., Ed.
1977-01-01
Discusses the role of the US Food and Drug Administration (FDA) in protecting the American public from carcinogens. Describes scientific testing methodology, risk-benefit analysis and the Delaney clause with its application to saccharin. (CP)
Total Risk Integrated Methodology (TRIM) - TRIM.Expo
The Exposure Event module of TRIM (TRIM.Expo), similar to most human exposure models, provides an analysis of the relationships between various chemical concentrations in the environment and exposure levels of humans.
NASA Astrophysics Data System (ADS)
Papathoma-Köhle, Maria
2016-08-01
The assessment of the physical vulnerability of elements at risk as part of the risk analysis is an essential aspect of the development of strategies and structural measures for risk reduction. Understanding, analysing and, if possible, quantifying physical vulnerability is a prerequisite for designing strategies and adopting tools for its reduction. The most common methods for assessing physical vulnerability are vulnerability matrices, vulnerability curves and vulnerability indicators; however, in most cases, these methods are used in a conflicting way rather than in combination. The article focuses on two of these methods: vulnerability curves and vulnerability indicators. Vulnerability curves express physical vulnerability as a function of the intensity of the process and the degree of loss, considering, in individual cases only, some structural characteristics of the affected buildings. However, a considerable number of studies argue that vulnerability assessment should focus on the identification of those variables that influence the vulnerability of an element at risk (vulnerability indicators). In this study, an indicator-based methodology (IBM) for mountain hazards including debris flow (Kappes et al., 2012) is applied to a case study for debris flows in South Tyrol, where in the past a vulnerability curve has been developed. The relatively "new" indicator-based method is scrutinised and recommendations for its improvement are outlined. The comparison of the two methodological approaches and their results is challenging since the two approaches deal with vulnerability in different ways. However, it is still possible to highlight their weaknesses and strengths, show clearly that both methodologies are necessary for the assessment of physical vulnerability, and provide a preliminary "holistic methodological framework" for physical vulnerability assessment showing how the two approaches may be used in combination in the future.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how best to integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
NASA Astrophysics Data System (ADS)
Guzman, Diego; Mohor, Guilherme; Câmara, Clarissa; Mendiondo, Eduardo
2017-04-01
Researchers from around the world relate global environmental changes to increased vulnerability to extreme events, such as heavy or scarce precipitation - floods and droughts. Hydrological disasters have caused increasing losses in recent years. Thus, risk transfer mechanisms, such as insurance, are being implemented to mitigate impacts, finance the recovery of the affected population, and promote the reduction of hydrological risks. However, among the main problems in implementing these strategies are: first, only partial knowledge of natural and anthropogenic climate change in terms of intensity and frequency; second, the fact that efficient risk reduction policies require accurate risk assessment, with careful consideration of costs; third, the uncertainty associated with the numerical models and input data used. The objective of this document is to introduce and discuss the feasibility of applying Hydrological Risk Transfer Models (HRTMs) as a strategy of adaptation to global climate change. The article shows the development of a methodology for collective and multi-sectoral vulnerability management, facing long-term hydrological risk, under an insurance funds simulator. The methodology estimates the optimized premium as a function of willingness to pay (WTP) and the potential direct loss derived from hydrological risk. The proposed methodology structures the watershed insurance scheme in three analysis modules. First, the hazard module characterizes the hydrologic threat from recorded input series or from series modelled under IPCC/RCM-generated scenarios. Second, the vulnerability module calculates the potential economic loss for each sector evaluated as a function of the return period "TR".
Finally, the finance module determines the value of the optimal aggregate premium by evaluating equiprobable scenarios of water vulnerability, taking into account variables such as the maximum limit of coverage, deductibles, reinsurance schemes, and incentives for risk reduction. The methodology, tested by members of the Integrated Nucleus of River Basins (NIBH) (University of Sao Paulo (USP), School of Engineering of São Carlos (EESC), Brazil), presents an alternative for the analysis and planning of insurance funds, aiming to mitigate the impacts of hydrological droughts and flash floods. The presented procedure is especially important where the information needed to develop and implement insurance funds is difficult to access and complex to evaluate. A sequence of academic applications has been carried out in Brazil, in the South American context, where hydrological insurance has low market penetration compared to developed economies and more established insurance markets such as those of the United States and Europe, producing relevant information and demonstrating the potential of the methodology under development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA. These properties include the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology, and includes an example application of the RDB methodology for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources, and assesses their applicability for the PRA of interest through the use of the relevancy test.
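The relevancy test above screens candidate reliability records on three component properties: function, failure modes, and environment/boundary conditions. A minimal sketch of how such a screen might look; the property names, example components, and grade labels below are illustrative assumptions, not the project's actual criteria.

```python
def relevancy_test(candidate, target):
    """Screen a reliability-data record against the PRA component of interest.

    Scores similarity on the three properties named in the RDB methodology.
    The grading thresholds here are illustrative, not the standard's.
    """
    score = 0
    score += candidate["function"] == target["function"]
    score += bool(set(candidate["failure_modes"]) & set(target["failure_modes"]))
    score += candidate["environment"] == target["environment"]
    return {3: "directly applicable", 2: "applicable with judgment"}.get(
        score, "not applicable")

# Hypothetical sodium fast reactor heat exchanger vs. a water-cooled analogue.
target = {"function": "heat transfer",
          "failure_modes": ["tube rupture", "fouling"],
          "environment": "liquid sodium"}
record = {"function": "heat transfer",
          "failure_modes": ["tube rupture"],
          "environment": "water"}
print(relevancy_test(record, target))
```

In practice a partial match like this would flag the record for expert judgment before the data enter the PRA quantification.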
Prediction of road accidents: A Bayesian hierarchical approach.
Deublein, Markus; Schubert, Matthias; Adey, Bryan T; Köhler, Jochen; Faber, Michael H
2013-03-01
In this paper a novel methodology for the prediction of the occurrence of road accidents is presented. The methodology utilizes a combination of three statistical methods: (1) gamma-updating of the occurrence rates of injury accidents and injured road users, (2) hierarchical multivariate Poisson-lognormal regression analysis taking into account correlations amongst multiple dependent model response variables and effects of discrete accident count data e.g. over-dispersion, and (3) Bayesian inference algorithms, which are applied by means of data mining techniques supported by Bayesian Probabilistic Networks in order to represent non-linearity between risk indicating and model response variables, as well as different types of uncertainties which might be present in the development of the specific models. Prior Bayesian Probabilistic Networks are first established by means of multivariate regression analysis of the observed frequencies of the model response variables, e.g. the occurrence of an accident, and observed values of the risk indicating variables, e.g. degree of road curvature. Subsequently, parameter learning is done using updating algorithms, to determine the posterior predictive probability distributions of the model response variables, conditional on the values of the risk indicating variables. The methodology is illustrated through a case study using data of the Austrian rural motorway network. In the case study, on randomly selected road segments the methodology is used to produce a model to predict the expected number of accidents in which an injury has occurred and the expected number of light, severe and fatally injured road users. Additionally, the methodology is used for geo-referenced identification of road sections with increased occurrence probabilities of injury accident events on a road link between two Austrian cities. 
It is shown that the proposed methodology can be used to develop models to estimate the occurrence of road accidents for any road network provided that the required data are available. Copyright © 2012 Elsevier Ltd. All rights reserved.
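Step (1) above, gamma-updating of occurrence rates, is the standard conjugate gamma-Poisson update: a Gamma(α, β) prior on the accident rate combined with k observed accidents over exposure t yields a Gamma(α+k, β+t) posterior. A minimal sketch with hypothetical prior parameters and counts (not the Austrian network data):

```python
def gamma_update(alpha, beta, events, exposure):
    """Conjugate gamma-Poisson update of an accident occurrence rate.

    Prior Gamma(alpha, beta) in the rate parameterization; observing
    `events` accidents over `exposure` (e.g. million vehicle-km) gives
    the posterior Gamma(alpha + events, beta + exposure).
    """
    return alpha + events, beta + exposure

# Hypothetical: prior mean 4/2 = 2.0 accidents per exposure unit,
# then 6 accidents observed over 5 exposure units.
a, b = gamma_update(4.0, 2.0, 6, 5.0)
posterior_mean = a / b  # (4 + 6) / (2 + 5)
print(posterior_mean)
```

The posterior mean shrinks the raw observed rate (6/5) toward the prior mean, with the weighting controlled by the relative exposures.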
Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.
Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D
2016-04-01
Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. 
This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
Role of scientific data in health decisions.
Samuels, S W
1979-01-01
The distinction between reality and models or methodological assumptions is necessary for an understanding of the use of data--economic, technical or biological--in decision-making. The traditional modes of analysis used in decisions are discussed historically and analytically. Utilitarian-based concepts such as cost-benefit analysis and cannibalistic concepts such as "acceptable risk" are rejected on logical and moral grounds. Historical reality suggests the concept of socially necessary risk determined through the dialectic process in democracy. PMID:120251
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described in a narrative. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and quantitative methodology. We compare two such frameworks, applying multi-criteria decision-analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
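At its core, MCDA as applied in such frameworks scores each option as a weighted sum of normalized criterion values, with weights elicited from experts. A minimal sketch with hypothetical criteria, weights, and scores (not the natalizumab case-study values):

```python
def mcda_score(values, weights):
    """Weighted-sum MCDA score: `values` are criterion scores normalized
    to [0, 1]; `weights` reflect elicited criterion importance and sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    return sum(v * w for v, w in zip(values, weights))

# Hypothetical benefit-risk profile over three criteria:
# (efficacy, serious-AE avoidance, tolerability).
weights = [0.5, 0.3, 0.2]
drug_a = mcda_score([0.8, 0.6, 0.7], weights)
drug_b = mcda_score([0.6, 0.9, 0.8], weights)
print(drug_a, drug_b)
```

With these illustrative numbers the safer but less efficacious option edges ahead; shifting weight toward efficacy reverses the ranking, which is why weight elicitation and sensitivity analysis are central to structured benefit-risk assessment.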
Risk Metrics for Android (trademark) Devices
2017-02-01
allows for easy distribution of malware. This report surveys malware distribution methodologies, then describes current work being done to determine the...given a standard weight of wi = 1. Two data sets were used for testing this methodology. Because the authors are Chinese, they chose to download apps...Order Analysis excels at handling non-obfuscated apps, but may not be able to detect malware that employs encryption or dynamically changes its payload
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
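The empirical hazard curve described above can be built by multiplying the mean annual rate of the simulated tsunamigenic events by the fraction of simulations whose intensity measure exceeds each threshold. A minimal sketch with hypothetical inundation depths and an assumed event rate (the Bayesian "robust" fitting is omitted):

```python
def hazard_curve(intensities, event_rate, thresholds):
    """Empirical hazard curve: mean annual rate of exceedance of each
    intensity threshold, from equally likely Monte Carlo simulations.

    `event_rate` is the assumed mean annual rate of the simulated events.
    """
    n = len(intensities)
    return [event_rate * sum(x > t for x in intensities) / n
            for t in thresholds]

# Hypothetical: 8 simulated inundation depths (m) at a site, with
# tsunamigenic events assumed to occur at 0.1 per year.
depths = [0.2, 0.5, 0.8, 1.1, 1.6, 2.3, 3.0, 4.2]
rates = hazard_curve(depths, 0.1, [1.0, 2.0, 3.0])
print(rates)
```

Plotting threshold against rate gives the hazard curve; the robust Bayesian variant in the paper instead fits a parametric form to these points, reducing the number of simulations needed.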
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-24
... methodological issues that arise in the use of meta-analyses to evaluate safety risks, followed by a discussion... design, conduct and use of meta-analysis. Although many external stakeholders conduct meta-analyses, FDA... meeting. FDA expects that this meeting will build upon prior stakeholder feedback on the design, conduct...
Prieto, M L; Cuéllar-Barboza, A B; Bobo, W V; Roger, V L; Bellivier, F; Leboyer, M; West, C P; Frye, M A
2014-11-01
To review the evidence on and estimate the risk of myocardial infarction and stroke in bipolar disorder. A systematic search using MEDLINE, EMBASE, PsycINFO, Web of Science, Scopus, Cochrane Database of Systematic Reviews, and bibliographies (1946-May 2013) was conducted. Case-control and cohort studies of patients with bipolar disorder aged 15 or older with myocardial infarction or stroke as outcomes were included. Two independent reviewers extracted data and assessed quality. Estimates of effect were summarized using random-effects meta-analysis. Five cohort studies including 13 115 911 participants (27 092 bipolar) were included. Due to the use of registers, different statistical methods, and inconsistent adjustment for confounders, there was significant methodological heterogeneity among studies. The exploratory meta-analysis yielded no evidence for a significant increase in the risk of myocardial infarction [relative risk (RR): 1.09, 95% CI 0.96-1.24, P = 0.20; I² = 6%]. While there was evidence of significant study heterogeneity, the risk of stroke in bipolar disorder was significantly increased (RR 1.74, 95% CI 1.29-2.35; P = 0.0003; I² = 83%). There may be a differential risk of myocardial infarction and stroke in patients with bipolar disorder. Confidence in these pooled estimates was limited by the small number of studies, significant heterogeneity and dissimilar methodological features. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
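Random-effects pooling of the kind used in this meta-analysis is commonly done with the DerSimonian-Laird estimator on log risk ratios, which inflates study variances by an estimated between-study variance τ². A minimal sketch on hypothetical log-RRs and variances (not the review's data):

```python
import math

def random_effects_pool(log_rrs, variances):
    """DerSimonian-Laird random-effects pooling of log risk ratios.

    Returns the pooled RR and its 95% confidence interval (back-transformed
    from the log scale)."""
    w = [1 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    # Cochran's Q and the method-of-moments between-study variance tau^2.
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rrs) - 1)) / c)
    # Re-weight with tau^2 added to each within-study variance.
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

# Hypothetical log-RRs and variances for three cohort studies of stroke risk.
rr, lo, hi = random_effects_pool([0.2, 0.9, 0.5], [0.02, 0.05, 0.03])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

When the studies are heterogeneous (large Q, hence large τ²), the re-weighting widens the confidence interval, which is exactly the caution the abstract raises about its stroke estimate.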
Analysis of potential trade-offs in regulation of disinfection by-products
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cromwell, J.E.; Zhang, X.; Regli, S.
1992-11-01
Executive Order 12291 requires the preparation of a Regulatory Impact Analysis (RIA) on all new major federal regulations. The goal of an RIA is to develop and organize information on benefits, costs, and economic impacts so as to clarify trade-offs among alternative regulatory options. This paper outlines explicit methodology for assessing the technical potential for risk-risk tradeoffs. The strategies used to cope with complexities and uncertainties in developing the Disinfection By-Products Regulatory Analysis Model are explained. Results are presented and discussed in light of uncertainties, and in light of the analytical requirements for regulatory impact analysis.
Designing trials for pressure ulcer risk assessment research: methodological challenges.
Balzer, K; Köpke, S; Lühmann, D; Haastert, B; Kottner, J; Meyer, G
2013-08-01
For decades various pressure ulcer risk assessment scales (PURAS) have been developed and implemented into nursing practice despite uncertainty whether use of these tools helps to prevent pressure ulcers. According to current methodological standards, randomised controlled trials (RCTs) are required to conclusively determine the clinical efficacy and safety of this risk assessment strategy. In these trials, PURAS-aided risk assessment has to be compared to nurses' clinical judgment alone in terms of its impact on pressure ulcer incidence and adverse outcomes. However, RCTs evaluating diagnostic procedures are prone to specific risks of bias and threats to the statistical power which may challenge their validity and feasibility. This discussion paper critically reflects on the rigour and feasibility of experimental research needed to substantiate the clinical efficacy of PURAS-aided risk assessment. Based on reflections of the methodological literature, a critical appraisal of available trials on this subject and an analysis of a protocol developed for a methodologically robust cluster-RCT, this paper arrives at the following conclusions: First, available trials do not provide reliable estimates of the impact of PURAS-aided risk assessment on pressure ulcer incidence compared to nurses' clinical judgement alone due to serious risks of bias and insufficient sample size. Second, it seems infeasible to assess this impact by means of rigorous experimental studies since sample size would become extremely high if likely threats to validity and power are properly taken into account. Third, means of evidence linkages seem to currently be the most promising approaches for evaluating the clinical efficacy and safety of PURAS-aided risk assessment. With this kind of secondary research, the downstream effect of use of PURAS on pressure ulcer incidence could be modelled by combining best available evidence for single parts of this pathway. 
However, to yield reliable modelling results, more robust experimental research evaluating specific parts of the pressure ulcer risk assessment-prevention pathway is needed. Copyright © 2013 Elsevier Ltd. All rights reserved.
Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Richard Yorg
2011-03-01
The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in 2 phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which are discussed in the full paper.
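The importance of dependent failure analysis noted under Task 14 can be illustrated with a simple beta-factor sketch: treating two safe-shutdown components as independent understates the joint failure probability when a common cause (here, the fire) couples them. The beta value and failure probabilities below are invented for illustration and are not values from the referenced fire PRA.

```python
def independent_both_fail(p1, p2):
    """Probability both components fail, assuming independence."""
    return p1 * p2

def beta_factor_both_fail(p, beta):
    """Simple beta-factor common-cause model for two identical components:
    a fraction beta of each component's failures is a shared failure."""
    independent_part = ((1 - beta) * p) ** 2   # both fail independently
    common_cause_part = beta * p               # one shared cause fails both
    return independent_part + common_cause_part

p = 0.01  # per-demand failure probability of one SSD component (invented)
print(independent_both_fail(p, p))    # independence assumption: 1e-4
print(beta_factor_both_fail(p, 0.1))  # dominated by the common-cause term
```

Even a modest beta of 0.1 raises the joint failure probability by an order of magnitude, which is why screening-level quantification done without dependency treatment can be badly optimistic.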
Quantile uncertainty and value-at-risk model risk.
Alexander, Carol; Sarabia, José María
2012-08-01
This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
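The quantile machinery this abstract builds on can be illustrated with a toy historical-simulation sketch; the function names, the add-on rule, and the synthetic P&L series are illustrative assumptions, not the paper's actual benchmark framework.

```python
import math

def historical_var(pnl, alpha=0.01):
    """Historical-simulation Value-at-Risk: the loss level exceeded with
    probability alpha in a sample of daily profit-and-loss values."""
    losses = sorted((-x for x in pnl), reverse=True)  # largest loss first
    k = max(1, math.ceil(alpha * len(losses)))
    return losses[k - 1]

def model_risk_add_on(model_var, benchmark_var):
    """Capital add-on when the bank's model understates the benchmark
    quantile (a simplified reading of the paper's adjustment idea)."""
    return max(0.0, benchmark_var - model_var)

# Toy daily P&L series (negative numbers are losses)
pnl = [-5.0, -4.0, -3.0, -2.0, -1.0, 0.0, 1.0, 2.0, 3.0, 4.0]
var_10 = historical_var(pnl, alpha=0.10)   # worst 10% loss level -> 5.0
add_on = model_risk_add_on(var_10, benchmark_var=6.5)
print(var_10, add_on)
```

In practice the benchmark quantile would come from the regulator's state-of-knowledge model and the add-on would be computed over a controlled simulation, as in the paper's experiment.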
Soldatini, Cecilia; Albores-Barajas, Yuri Vladimir; Lovato, Tomas; Andreon, Adriano; Torricelli, Patrizia; Montemaggiori, Alessandro; Corsa, Cosimo; Georgalas, Vyron
2011-01-01
The presence of wildlife in airport areas poses substantial hazards to aviation. Wildlife-aircraft collisions (hereafter wildlife strikes) cause losses in terms of human lives and direct monetary losses for the aviation industry. In recent years, wildlife strikes have increased in parallel with growth in air traffic and species habituation to anthropic areas. In this paper, we applied an ecological approach to wildlife strike risk assessment at eight Italian international airports. The main achievement is a site-specific analysis that avoids flattening wildlife strike events on a large scale while maintaining comparable airport risk assessments. This second version of the Birdstrike Risk Index (BRI2) is a sensitive tool that provides results at different time scales, allowing appropriate management planning. The methodology applied has been developed in accordance with the Italian Civil Aviation Authority, which recognizes it as a national standard implemented in the advisory circular ENAC APT-01B. PMID:22194950
Occupational Noise and Ischemic Heart Disease: A Systematic Review
Dzhambov, Angel M; Dimitrova, Donka D
2016-01-01
Noise exposure might be a risk factor for ischemic heart disease (IHD). Unlike residential exposure, however, evidence for occupational noise is limited. Given that high-quality quantitative synthesis of existing data is highly warranted for occupational safety and policy, we aimed at conducting a systematic review and meta-analysis of the risks of IHD morbidity and mortality because of occupational noise exposure. We carried out a systematic search in MEDLINE, EMBASE, and on the Internet since April 2, 2015, in English, Spanish, Russian, and Bulgarian. A quality-scoring checklist was developed a priori to assess different sources of methodological bias. A qualitative data synthesis was performed. Conservative assumptions were applied when appropriate. A meta-analysis was not feasible because of unresolvable methodological discrepancies between the studies. On the basis of five studies, there was some evidence to suggest higher risk of IHD among workers exposed to objectively assessed noise >75–80 dB for <20 years (supported by one high, one moderate, and one low quality study, opposed by one high and one moderate quality study). Three moderate and two low quality studies out of six found self-rated exposure to be associated with higher risk of IHD, and only one moderate quality study found no effect. Out of four studies, a higher mortality risk was suggested by one moderate quality study relying on self-rated exposure and one high-quality study using objective exposure. Sensitivity analyses showed that at higher exposures and in some vulnerable subgroups, such as women, the adverse effects were considerably stronger. Despite methodological discrepancies and limitations of the included studies, occupational noise appeared to be a risk factor for IHD morbidity. Results suggested higher risk for IHD mortality only among vulnerable subgroups. Workers exposed to high occupational noise should be considered at higher overall risk of IHD. PMID:27569404
Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D
To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners; 2) management by process and superspecialisation; and 3) improvement of indicators (continuous improvement). The indicators were obtained from the Hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other regional and national urology departments was performed through the same platform with the help of the Hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Published by Elsevier España, S.L.U. All rights reserved.
Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A
2018-06-15
The potential impact of a technological accident can be assessed by risk estimation. Taking this into account, the latent or potential condition can be anticipated and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components. The first is the processing of meteorological databases to define the most probable and conservative scenario of study; the second is the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia in a meat-packing plant in the city of La Plata, Argentina. The method consists of integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with higher risks of exposure to ammonia, which warrant attention for the prevention of disasters in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied in various scenarios based on the available information on both the exposed population and its meteorology. Furthermore, this methodology optimizes the processing of the input data and its calculation. Copyright © 2018 Elsevier B.V. All rights reserved.
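The overlay step described above can be sketched as a simple product of threat-zone level and social vulnerability per population unit; the threat levels, index values, and block labels below are invented for illustration and are not the authors' data.

```python
# Hypothetical census blocks: (ALOHA-style threat-zone level 1-3,
# local social vulnerability index 0-1). All values are invented.
blocks = {
    "block_A": (3, 0.8),  # highest-concentration zone, high vulnerability
    "block_B": (2, 0.4),  # intermediate zone, moderate vulnerability
    "block_C": (3, 0.2),  # highest-concentration zone, low vulnerability
}

def risk_score(threat_level, vulnerability):
    """Relative exposure risk: threat-zone severity times vulnerability."""
    return threat_level * vulnerability

# Rank blocks from highest to lowest combined risk
ranked = sorted(blocks, key=lambda b: risk_score(*blocks[b]), reverse=True)
print(ranked)
```

The ranking shows why the overlay matters: a block inside the worst threat zone but with low vulnerability can rank below a less exposed but more vulnerable one.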
Simulation of investment returns of toll projects.
DOT National Transportation Integrated Search
2013-08-01
This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...
Critical asset and portfolio risk analysis: an all-hazards framework.
Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark
2007-08-01
This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
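The notional product described in this abstract (risk as threat × vulnerability × consequence) can be sketched for a small portfolio; the numeric values and the simple multiplicative interdependency factor are invented assumptions, not CAPRA's calibrated parameters.

```python
def asset_risk(threat, vulnerability, consequence):
    """Notional all-hazards risk for one asset-hazard pair:
    annual threat likelihood x P(success | attempt) x loss if successful."""
    return threat * vulnerability * consequence

def portfolio_risk(assets, interdependency=1.0):
    """First-order portfolio estimate; interdependency > 1 inflates losses
    to reflect cascading effects after a successful attack on an asset."""
    return interdependency * sum(asset_risk(*a) for a in assets)

# (threat/yr, vulnerability, consequence in $M) -- invented values
assets = [(0.02, 0.5, 100.0), (0.10, 0.2, 30.0)]
print(portfolio_risk(assets, interdependency=1.2))
```

As the abstract notes, each parameter can be set at a screening level (as here) or derived from detailed systems analysis, without changing the structure of the calculation.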
A methodology for modeling regional terrorism risk.
Chatterjee, Samrat; Abkowitz, Mark D
2011-07-01
Over the past decade, terrorism risk has become a prominent consideration in protecting the well-being of individuals and organizations. More recently, there has been interest in not only quantifying terrorism risk, but also placing it in the context of an all-hazards environment in which consideration is given to accidents and natural hazards, as well as intentional acts. This article discusses the development of a regional terrorism risk assessment model designed for this purpose. The approach taken is to model terrorism risk as a dependent variable, expressed in expected annual monetary terms, as a function of attributes of population concentration and critical infrastructure. This allows for an assessment of regional terrorism risk in and of itself, as well as in relation to man-made accident and natural hazard risks, so that mitigation resources can be allocated in an effective manner. The adopted methodology incorporates elements of two terrorism risk modeling approaches (event-based models and risk indicators), producing results that can be utilized at various jurisdictional levels. The validity, strengths, and limitations of the model are discussed in the context of a case study application within the United States. © 2011 Society for Risk Analysis.
Mobile phone use and risk of tumors: a meta-analysis.
Myung, Seung-Kwon; Ju, Woong; McDonnell, Diana D; Lee, Yeon Ji; Kazinets, Gene; Cheng, Chih-Tao; Moskowitz, Joel M
2009-11-20
Case-control studies have reported inconsistent findings regarding the association between mobile phone use and tumor risk. We investigated these associations using a meta-analysis. We searched MEDLINE (PubMed), EMBASE, and the Cochrane Library in August 2008. Two evaluators independently reviewed and selected articles based on predetermined selection criteria. Of 465 articles meeting our initial criteria, 23 case-control studies, which involved 37,916 participants (12,344 patient cases and 25,572 controls), were included in the final analyses. Compared with never or rarely having used a mobile phone, the odds ratio for overall use was 0.98 for malignant and benign tumors (95% CI, 0.89 to 1.07) in a random-effects meta-analysis of all 23 studies. However, a significant positive association (harmful effect) was observed in a random-effects meta-analysis of eight studies using blinding, whereas a significant negative association (protective effect) was observed in a fixed-effects meta-analysis of 15 studies not using blinding. Mobile phone use of 10 years or longer was associated with a risk of tumors in 13 studies reporting this association (odds ratio = 1.18; 95% CI, 1.04 to 1.34). Further, these findings were also observed in the subgroup analyses by methodologic quality of study. Blinding and methodologic quality of study were strongly associated with the research group. The current study found that there is possible evidence linking mobile phone use to an increased risk of tumors from a meta-analysis of low-biased case-control studies. Prospective cohort studies providing a higher level of evidence are needed.
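The fixed-effects pooling mentioned in this abstract can be illustrated with a standard inverse-variance sketch on log odds ratios; the helper name and the toy ORs/CIs are illustrative, and a real meta-analysis would use dedicated software and the random-effects variant as well.

```python
import math

def pooled_or_fixed(odds_ratios, cis, z=1.96):
    """Fixed-effects inverse-variance pooling of odds ratios, with each
    study's standard error backed out of its 95% confidence interval."""
    num = den = 0.0
    for or_, (lo, hi) in zip(odds_ratios, cis):
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # SE of log OR
        weight = 1.0 / se ** 2                        # inverse variance
        num += weight * math.log(or_)
        den += weight
    return math.exp(num / den)

# Toy example: two equally precise studies with OR = 2.0 pool to 2.0
print(pooled_or_fixed([2.0, 2.0], [(1.0, 4.0), (1.0, 4.0)]))
```

A random-effects pooled estimate adds a between-study variance component to each weight, which is why the abstract's fixed-effects and random-effects subgroup results can point in different directions.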
HIV RISK REDUCTION INTERVENTIONS AMONG SUBSTANCE-ABUSING REPRODUCTIVE-AGE WOMEN: A SYSTEMATIC REVIEW
Weissman, Jessica; Kanamori, Mariano; Dévieux, Jessy G.; Trepka, Mary Jo; De La Rosa, Mario
2017-01-01
HIV/AIDS is one of the leading causes of death among reproductive-age women throughout the world, and substance abuse plays a major role in HIV infection. We conducted a systematic review, in accordance with the 2015 Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) tool, to assess HIV risk-reduction intervention studies among reproductive-age women who abuse substances. We initially identified 6,506 articles during our search and, after screening titles and abstracts, examining articles in greater detail, and finally excluding those rated methodologically weak, a total of 10 studies were included in this review. Studies that incorporated behavioral skills training into the intervention and were based on theoretical model(s) were in general the most effective at decreasing sex and drug risk behaviors. Additional HIV risk-reduction intervention research with improved methodological designs is warranted to determine the most efficacious HIV risk-reduction intervention for reproductive-age women who abuse substances. PMID:28467160
WE-B-BRC-02: Risk Analysis and Incident Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraass, B.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: 1. Description of risk assessment methodologies used in healthcare and industry. 2. Discussion of radiation oncology-specific risk assessment strategies and issues. 3. Evaluation of risk in the context of medical imaging and image quality. Disclosure: E. Samei: research grants from Siemens and GE.
Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig
2012-09-01
Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong", and food handling practices associated with a high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys was developed, and HACCP (hazard analysis and critical control point) methodology was used to select relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right"/"wrong" classification and with the risk-based grading system. The two methods gave very different results. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of the risks associated with domestic food handling practices. The method distinguished important violations from minor errors that most people commit and that are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.
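The difference between "right/wrong" counting and risk-based grading can be sketched as a weighting scheme; the practice names and grade values below are invented examples, not the survey's actual grading table.

```python
# Invented risk grades: higher numbers mean higher infection risk.
RISK_GRADE = {
    "undercooked_chicken": 5,  # lacking heat treatment
    "no_handwashing": 3,
    "room_temp_storage": 1,    # frequent violation, but lower risk
}

def traditional_score(violations):
    """Classic "right"/"wrong" scoring: every violation counts equally."""
    return len(violations)

def risk_based_score(violations):
    """Risk-based scoring: each violation weighted by its risk grade."""
    return sum(RISK_GRADE.get(v, 1) for v in violations)

a = ["room_temp_storage", "room_temp_storage"]  # frequent, low risk
b = ["undercooked_chicken"]                     # rarer, high risk
print(traditional_score(a), traditional_score(b))  # 2 1
print(risk_based_score(a), risk_based_score(b))    # 2 5
```

Under equal counting respondent b looks safer than a; under risk grading the ranking flips, which mirrors the study's finding that frequent violations and high-risk violations are largely different practices.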
Vande Putte, Danny; Verhelst, Marc
Risk management has never been easy. Finding efficient mitigating measures is not always straightforward. Finding measures for cyber crime, however, is a really huge challenge because cyber threats are changing all the time. As the sophistication of these threats is growing, their impact increases. Moreover, society and its economy have become increasingly dependent on information and communication technologies. Standard risk analysis methodologies will help to score the cyber risk and to place it in the risk tolerance matrix. This will allow business continuity managers to figure out if there is still a gap with the maximum tolerable outage for time-critical business processes and if extra business continuity measures are necessary to fill the gap.
Hoskin, Jordan D; Miyatani, Masae; Craven, B Catharine
2017-03-30
Carotid intima-media thickness (cIMT) may be used increasingly as a cardiovascular disease (CVD) screening tool in individuals with spinal cord injury (SCI), as other routine invasive diagnostic tests are often unfeasible. However, variation in cIMT acquisition and analysis methods is an issue in the current published literature. The growth of the field is dependent on quality cIMT acquisition and analysis to ensure accurate reporting of CVD risk. The purpose of this study is to evaluate the quality of the reported methodology used to collect cIMT values in SCI. Data from 12 studies, which measured cIMT in individuals with SCI, were identified from the Medline, Embase and CINAHL databases. The quality of the reported methodologies was scored based on adherence to cIMT methodological guidelines abstracted from two consensus papers. Five studies were scored as 'moderate quality' in methodological reporting, having specified 9 to 11 of 15 quality reporting criteria. The remaining seven studies were scored as 'low quality', having reported fewer than 9 of 15 quality reporting criteria. No study had methodological reporting that was scored as 'high quality'. The overall reporting of quality methodology was poor in the published SCI literature. Greater adherence to current methodological guidelines is needed to advance the field of cIMT in SCI. Further research is necessary to refine cIMT acquisition and analysis guidelines to aid authors designing research and journals in screening manuscripts for publication.
The application of seismic risk-benefit analysis to land use planning in Taipei City.
Hung, Hung-Chih; Chen, Liang-Chun
2007-09-01
In the developing countries of Asia local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan, the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the level of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations, and the authors intend to assist city planners in evaluating the appropriateness of their planning decisions.
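The risk-benefit ratio used to compare land use plans can be sketched as annualised expected loss divided by annual benefit; the plan names and figures below are invented, not values from the HAZ-Taiwan case study.

```python
def risk_benefit_ratio(annualised_loss, annual_benefit):
    """Seismic risk attached to a plan per unit of economic benefit."""
    return annualised_loss / annual_benefit

# Invented alternatives: (annualised direct loss, annual benefit), NT$M
plans = {"plan_A": (120.0, 800.0), "plan_B": (90.0, 450.0)}
ratios = {p: risk_benefit_ratio(*plans[p]) for p in plans}
preferred = min(ratios, key=ratios.get)  # lower ratio = better trade-off
print(preferred, ratios[preferred])
```

Note that the plan with the smaller absolute loss (plan_B) is not preferred here, because the ratio normalises risk by the benefit each plan delivers.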
Threat Assessment and Remediation Analysis (TARA)
2014-10-01
…of countermeasure selection strategies that prescribe the application of countermeasures based on level of risk tolerance. This paper outlines the… catalog data, which are discussed later in this paper. The methodology can be described as conjoined trade studies, where the first trade identifies and… ranks vulnerabilities based on assessed risk, and the second identifies and selects countermeasures based on assessed utility and cost. This paper…
Developing a Methodology for Risk-Informed Trade-Space Analysis in Acquisition
2015-01-01
…Research, Development, Test, and Evaluation Cost Distribution, Technology 1 Mitigation of… Research, Development, Test, and Evaluation Cost Distribution, Technology 3 Mitigation of the Upgrade Alternative… courses of action, or risk-mitigation behaviors, which take place in the event that the technology is not developed by the milestone date (e.g.…
Emerging Demands for Public Policies in Rio De Janeiro: Educational Prevention of Social Risks
ERIC Educational Resources Information Center
Gomes Da Silva, Magda Maria Ventura; Garcia, Maria del Pilar Quicios
2016-01-01
This paper disseminates some results of an international research project on social risk manifestations published in eight periodicals in Rio de Janeiro from July 2013 to December 2014. The research sample coincides with the population: 541 news items, constituting 1,255 analytical units. The methodology consisted of a content analysis of the news,…
Development Risk Methodology for Whole Systems Trade Analysis
2016-08-01
Glossary: JCIDS - Joint Capabilities Integration and Development System; L - Likelihood; MS - Milestone; O&S - Operations and Sustainment; P.95 - 95th… and their consequences. These dimensions are: performance, unit cost, operations & sustainment (O&S) cost, development risk, and growth potential
A new approach to flood vulnerability assessment for historic buildings in England
NASA Astrophysics Data System (ADS)
Stephenson, V.; D'Ayala, D.
2014-05-01
The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is presented here that studies the nature of the vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. Key findings of the work include determining the applicability of these indicators to fragility analysis, and the determination of the relative vulnerability of the three case study sites.
Quesada, Jose Antonio; Melchor, Inmaculada; Nolasco, Andreu
2017-05-26
The analysis of spatio-temporal patterns of disease or death in urban areas has been developed mainly through the ecological studies approach. These designs may have limitations, such as the ecological fallacy and instability when cases are few. The objective of this study was to apply the point process methodology, as a complement to aggregated-data approaches, to study HIV/AIDS mortality in men in the city of Alicante (Spain). A case-control study of residents in the city during the period 2004-2011 was designed. Cases were men who died from HIV/AIDS, and controls represented the general population, matched by age to cases. The risk surfaces of death over the city were estimated using the log-risk function of intensities, and we contrasted their temporal variations over the two periods. Significant high-risk areas of death from HIV/AIDS, which coincide with the most deprived areas of the city, were detected. No significant spatial change in the areas at risk between the periods studied was detected. The point process methodology is a useful tool for analysing the patterns of death from HIV/AIDS in urban areas.
Prasad, Manya; Kathuria, Prachi; Nair, Pallavi; Kumar, Amit; Prasad, Kameshwar
2017-05-01
Mobile phones emit electromagnetic radiation that is classified as possibly carcinogenic to humans. Evidence for increased risk of brain tumours accumulated in parallel by epidemiologic investigations remains controversial. This paper aims to investigate whether the methodological quality of studies and the source of funding can explain the variation in results. PubMed and Cochrane CENTRAL searches were conducted from 1966 to December 2016, supplemented with relevant articles identified in the references. Twenty-two case-control studies were included for systematic review. Meta-analysis of 14 case-control studies showed practically no increase in risk of brain tumour [OR 1.03 (95% CI 0.92-1.14)]. However, for mobile phone use of 10 years or longer (or >1640 h), the overall meta-analysis showed a significant 1.33-fold increase in risk. The summary estimate of government-funded as well as phone-industry-funded studies showed a non-significant 1.07-fold increase in odds, while studies with mixed funding did not show any increase in risk of brain tumour. Metaregression analysis indicated that the association was significantly related to methodological study quality (p < 0.019, 95% CI 0.009-0.09). The relationship between source of funding and log OR for each study was not statistically significant (p < 0.32, 95% CI 0.036-0.010). We found evidence linking mobile phone use and risk of brain tumours, especially in long-term users (≥10 years). Higher-quality studies showed a trend towards higher risk of brain tumour, while lower-quality studies showed a trend towards lower risk/protection.
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that allow risks in the construction of high-rise buildings to be better identified and managed throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessment of the potential impact of project risks and of their probabilities. The paper reviews the most popular methods of risk probability assessment and seeks to show the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, the robust approach of P. Huber is applied and extended to tasks of regression analysis of project data. The suggested algorithms for estimating the parameters of statistical models yield reliable estimates. A review of the theoretical problems in developing robust models built on the methodology of minimax estimates was carried out, and an algorithm for the situation of asymmetric "contamination" was developed.
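The robust-regression idea referred to above can be sketched with Huber weights in an iteratively reweighted least-squares loop for a straight-line fit; this is a generic textbook scheme under assumed defaults (delta = 1.345), not the authors' extended algorithm.

```python
def huber_line_fit(xs, ys, delta=1.345, iters=50):
    """Fit y = a + b*x by iteratively reweighted least squares with
    Huber weights, which downweight large residuals (outliers)."""
    a, b = 0.0, 0.0
    for _ in range(iters):
        # Huber weight: 1 inside the delta band, delta/|r| outside it
        w = []
        for x, y in zip(xs, ys):
            r = y - (a + b * x)
            w.append(1.0 if abs(r) <= delta else delta / abs(r))
        # Weighted least-squares normal equations for a line
        sw = sum(w)
        swx = sum(wi * x for wi, x in zip(w, xs))
        swy = sum(wi * y for wi, y in zip(w, ys))
        swxx = sum(wi * x * x for wi, x in zip(w, xs))
        swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        b = (sw * swxy - swx * swy) / (sw * swxx - swx * swx)
        a = (swy - b * swx) / sw
    return a, b

# Perfectly linear data recovers the exact line y = 1 + 2x
print(huber_line_fit([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]))
```

On contaminated project data the same loop pulls the fitted line toward the bulk of the observations instead of letting a single outlying record dominate, which is the practical appeal of the robust approach the paper advocates.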
Anthropic Risk Assessment on Biodiversity
NASA Astrophysics Data System (ADS)
Piragnolo, M.; Pirotti, F.; Vettore, A.; Salogni, G.
2013-01-01
This paper presents a methodology for the risk assessment of anthropic activities on habitats and species. The method has been developed for the Veneto Region in order to simplify and improve the quality of the EIA procedure (VINCA). Habitats and species, both animals and plants, are protected by European Directives 92/43/EEC and 2009/147/EC, but they are exposed to hazard from pollution produced by human activities. Biodiversity risks may lead to deterioration and disturbance of ecological niches, with consequent loss of biodiversity. Ecological risk assessment applied to the Natura 2000 network is needed for best practice in the management and monitoring of the environment and natural resources. Threats, pressures and activities, stresses and indicators can be managed in a geodatabase and analysed using GIS technology. The method is classic risk assessment in an ecological context: it defines the natural hazard as influence, the element at risk as interference, and vulnerability, and it also defines a new parameter called pressure. It uses a risk matrix for risk analysis on spatial and temporal scales. The methodology is qualitative and applies the precautionary principle in environmental assessment. The final product is a matrix which allows risk to be excluded and could find application in the development of a territorial information system.
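The risk matrix described above can be sketched as a simple qualitative combination of ordinal scores. The 1-5 scoring scale and the class thresholds below are illustrative assumptions, not the VINCA parameters:

```python
def risk_class(hazard, vulnerability):
    """Combine ordinal hazard and vulnerability scores (1-5 each)
    into a qualitative risk class via a multiplicative risk matrix."""
    score = hazard * vulnerability
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"
```

A matrix like this supports the precautionary screening step: combinations that cannot reach the "medium" band can be excluded from further assessment.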
The US EPA’s ToxCastTM program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
Cooper, Guy Paul; Yeager, Violet; Burkle, Frederick M.; Subbarao, Italo
2015-01-01
Background: This article describes a novel triangulation methodological approach for identifying the Twitter activity of regionally active Twitter users during the 2013 Hattiesburg EF-4 tornado. Methodology: A data extraction and geographically centered filtration approach was used to generate Twitter data for 48 hours pre- and post-tornado. The data were further validated with a six sigma approach using GPS data. Results: The regional analysis revealed a total of 81,441 tweets, 10,646 Twitter users, 27,309 retweets and 2,637 tweets with GPS coordinates. Conclusions: Tweet activity increased 5-fold during the response to the Hattiesburg tornado, retweeting activity increased 2.2-fold, and tweets with a hashtag increased 1.4-fold. Twitter was an effective disaster risk reduction tool for the 2013 Hattiesburg EF-4 tornado. PMID:26203396
Assessment of health risks of policies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ádám, Balázs, E-mail: badam@cmss.sdu.dk; Department of Preventive Medicine, Faculty of Public Health, University of Debrecen, P.O. Box 9, H-4012 Debrecen; Molnár, Ágnes, E-mail: MolnarAg@smh.ca
The assessment of health risks of policies is an inevitable, although challenging, prerequisite for the inclusion of health considerations in political decision making. The aim of our project was to develop a so far missing methodological guide for assessing the complex impact structure of policies. The guide was developed consensually, based on experience gathered during the assessment of specific national policies selected by the partners of an EU project. Methodological considerations were discussed and summarized in workshops and pilot tested on the EU Health Strategy for finalization. The combined tool, which includes a textual guidance and a checklist, follows the top-down approach; that is, it guides the analysis of causal chains from the policy through related health determinants and risk factors to health outcomes. The tool discusses the most important practical issues of assessment by impact level. It emphasises the transparent identification and prioritisation of factors and the consideration of the feasibility of exposure and outcome assessment, with special focus on quantification. The developed guide provides useful methodological instructions for the comprehensive assessment of health risks of policies that can be effectively used in the health impact assessment of policy proposals. - Highlights: • A methodological guide for the assessment of health risks of policies is introduced. • The tool is developed based on experience from several case studies. • The combined tool consists of a textual guidance and a checklist. • The top-down approach is followed through the levels of the full impact chain. • The guide provides assistance for the health impact assessment of policy proposals.
Applying machine learning to pattern analysis for automated in-design layout optimization
NASA Astrophysics Data System (ADS)
Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh
2018-04-01
Building on previous work on cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk scoring methodology is used to rank patterns by manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk, and the higher-risk patterns are replaced in the design with their lower-risk equivalents. The pattern selection and replacement are fully automated and suitable for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns, with a quantifiable positive impact on the risk score distribution after replacement.
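The replacement step described above can be sketched generically: each pattern whose risk score exceeds a threshold is mapped to the lowest-scoring functionally equivalent pattern available. All names, scores, and the threshold below are hypothetical; the paper's actual scoring model is not shown here:

```python
def replace_risky_patterns(scores, equivalents, threshold):
    """Map each high-risk pattern to its lowest-risk functional equivalent.

    scores:      {pattern_id: risk_score}
    equivalents: {pattern_id: [functionally equivalent pattern_ids]}
    Returns a replacement map {risky_pattern: safer_pattern}.
    """
    replacement = {}
    for pattern, score in scores.items():
        if score > threshold:
            candidates = equivalents.get(pattern, [])
            # keep only equivalents that actually reduce the risk score
            safer = [p for p in candidates if scores.get(p, score) < score]
            if safer:
                replacement[pattern] = min(safer, key=scores.get)
    return replacement
```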
[Perinatal mortality research in Brazil: review of methodology and results].
Fonseca, Sandra Costa; Coutinho, Evandro da Silva Freire
2004-01-01
The perinatal mortality rate remains a public health problem, demanding epidemiological studies to describe its magnitude and time trends, identify risk factors, and define adequate interventions. Methodological controversies persist, resulting in heterogeneous studies and possible biases. In Brazil, there has been growing scientific output on this theme, mainly in the South and Southeast of the country. Twenty-four articles from 1996 to 2003 were reviewed, focusing on definitions and classifications, data sources, study designs, measurement of variables, statistical analysis, and results. The review showed increasing use of databases (mainly SINASC and SIM), few studies on stillbirth, the incorporation of classification schemes, and disagreement concerning risk factors.
Baines, Janis; Cunningham, Judy; Leemhuis, Christel; Hambridge, Tracy; Mackerras, Dorothy
2011-01-01
The approach used by food regulation agencies to examine the literature and forecast the impact of possible food regulations has many features in common with the approach used in nutritional epidemiological research. We outline the Risk Analysis Framework described by FAO/WHO, in which there is formal progression from identification of the nutrient or food chemical of interest, through describing its effect on health, to assessing whether there is a risk to the population based on dietary exposure estimates. We then discuss some important considerations for the dietary modeling component of the Framework, including several methodological issues that also arise in nutritional epidemiology research. Finally, we give several case studies that illustrate how the different methodological components are used together to inform decisions about how to manage the regulatory problem. PMID:22254081
MASQOT: a method for cDNA microarray spot quality control
Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan
2005-01-01
Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but it still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to avoid misleading analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots numbering in the hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior at identifying unreliable data compared to the other methodologies evaluated. Conclusion The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality, which can be used as weights in subsequent analysis procedures or to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
NASA Astrophysics Data System (ADS)
Alexakis, Dimitrios D.; Sarris, Apostolos; Papadopoulos, Nikos; Soupios, Pantelis; Doula, Maria; Cavvadias, Victor
2014-08-01
The olive-oil industry is one of the most important sectors of agricultural production in Greece, the third-largest olive-oil-producing country worldwide. Olive oil mill wastes (OOMW) are a major source of pollution in olive-growing regions and an important problem to be solved for the agricultural industry. The wastes are normally deposited in tanks, directly in the soil, or even in adjacent torrents, rivers and lakes, posing a high risk of environmental pollution and harm to community health. The GEODIAMETRIS project aims to develop integrated geoinformatic methodologies for monitoring land pollution from the disposal of OOMW on the island of Crete, Greece. These methodologies integrate GPS surveys, satellite remote sensing and risk assessment analysis in a GIS environment, the application of in situ and laboratory geophysical methods, and soil and water physicochemical analysis. Among the project's preliminary results, all operating OOMW areas in Crete have already been registered through extensive GPS field campaigns. Their spatial and attribute information has been stored in an integrated GIS database, and an overall OOMW spectral signature database has been constructed through the analysis of multi-temporal Landsat-8 OLI satellite images. In addition, a specific OOMW area located in Alikianos village (Chania, Crete) has been selected as one of the main case-study areas. Various geophysical methods, such as Electrical Resistivity Tomography, Induced Polarization, multifrequency electromagnetics, Self Potential measurements and Ground Penetrating Radar, have already been applied, and soil and liquid samples have been collected for physicochemical analysis. The preliminary results have already contributed to the gradual development of an integrated environmental monitoring tool for studying and understanding environmental degradation from the disposal of OOMW.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tortorelli, J.P.
A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994, on the topic of risk assessment of medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology for evaluating these devices, and to develop a program plan and a scoping document for future methodology development. This report contains presentation material and a transcript of the workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in the NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report.
Fujinaga, Aiichiro; Uchiyama, Iwao; Morisawa, Shinsuke; Yoneda, Minoru; Sasamoto, Yuzuru
2012-01-01
In Japan, environmental standards for contaminants in groundwater and in leachate from soil are set on the assumption that the water is used for drinking over a human lifetime. Where there is no well and groundwater is not used for drinking, the standard is therefore too severe, and remediation based on these standards incurs excessive effort and cost. In contrast, the environmental-assessment procedures used in the United States and the Netherlands consider site conditions (land use, existing wells, etc.); however, a risk assessment is then required for each site. This study therefore proposes a new framework for judging contamination in Japan that combines the merits of the environmental standards with a method for risk assessment. The framework involves setting risk-based concentrations that are attainable remediation goals for contaminants in soil and groundwater. The framework was then applied to a model contaminated site for risk management, and the results are discussed regarding the effectiveness and applicability of the new methodology. © 2011 Society for Risk Analysis.
The ARGO Project: assessing NA-TECH risks on off-shore oil platforms
NASA Astrophysics Data System (ADS)
Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo
2017-04-01
ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a 2-year project funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. To achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. Finally, a probabilistic multi-risk framework has been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.
General RMP Guidance - Chapter 4: Offsite Consequence Analysis
This chapter provides basic compliance information, not modeling methodologies, for people who plan to do their own air dispersion modeling. OCA is a required part of the risk management program, and involves worst-case and alternative release scenarios.
Mapping Natech risk due to earthquakes using RAPID-N
NASA Astrophysics Data System (ADS)
Girgin, Serkan; Krausmann, Elisabeth
2013-04-01
Natural hazard-triggered technological accidents (so-called Natech accidents) at hazardous installations are an emerging risk with possibly serious consequences due to the potential for releases of hazardous materials, fires or explosions. For the reduction of Natech risk, one of the highest-priority needs is the identification of Natech-prone areas and the systematic assessment of Natech risks. With hardly any Natech risk maps existing within the EU, the European Commission's Joint Research Centre has developed a Natech risk analysis and mapping tool called RAPID-N, which estimates the overall risk of natural-hazard impact on industrial installations and its possible consequences. The results are presented as risk summary reports and interactive risk maps that can be used for decision making. Currently, RAPID-N focuses on Natech risk due to earthquakes at industrial installations, but it will be extended to analyse and map Natech risk due to floods in the near future. The RAPID-N methodology is based on the estimation of on-site natural hazard parameters, the use of fragility curves to determine damage probabilities of plant units for various damage states, and the calculation of the spatial extent, severity, and probability of Natech events potentially triggered by the natural hazard. The methodology was implemented as a web-based risk assessment and mapping software tool which allows easy data entry and rapid local or regional risk assessment and mapping. RAPID-N features an innovative property estimation framework to calculate on-site natural hazard parameters, industrial plant and plant unit characteristics, and hazardous substance properties. Custom damage states and fragility curves can be defined for different types of plant units. Conditional relationships can be specified between damage states and Natech risk states, which describe probable Natech event scenarios. Natech consequences are assessed using a custom implementation of the U.S.
EPA's Risk Management Program (RMP) Guidance for Offsite Consequence Analysis methodology. This custom implementation is based on the property estimation framework and allows easy modification of model parameters and substitution of equations with alternatives. RAPID-N can be applied at different stages of the Natech risk management process: on the one hand, it allows the analysis of hypothetical Natech scenarios to prevent or prepare for a Natech accident by supporting land-use and emergency planning; on the other hand, once a natural disaster occurs, RAPID-N can be used to rapidly locate facilities with potential Natech accident damage based on actual natural-hazard information. This provides a means to warn the population in the vicinity of the facilities in a timely manner. This presentation will introduce the specific features of RAPID-N and demonstrate the tool by application to a case-study area.
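The fragility-curve step of the methodology is commonly modeled as a lognormal cumulative distribution of the natural-hazard intensity. The sketch below assumes peak ground acceleration (PGA) as the intensity measure; the median and dispersion values in the test are illustrative, not RAPID-N's actual parameters:

```python
import math

def damage_probability(pga, median, beta):
    """Lognormal fragility curve: P(damage state reached | PGA).

    median: PGA (in g) at which the damage state is reached with
            50% probability
    beta:   lognormal standard deviation (dispersion) of the curve
    """
    z = math.log(pga / median) / beta
    # standard normal CDF expressed via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

By construction the curve passes through 0.5 at the median intensity and increases monotonically with PGA, which is what allows damage probabilities of plant units to be compared across damage states.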
Framing risk in pandemic influenza policy and control.
Seetoh, Theresa; Liverani, Marco; Coker, Richard
2012-01-01
This article explores differing understandings of 'risk' in relation to pandemic influenza policy and control. After a preliminary overview of methodological and practical problems in risk analysis, the ways in which risk was framed and managed in three historical cases are examined. The interdependence between scientific empiricism and political decision-making led to the mismanagement of the 1976 swine influenza scare in the USA. The response to the 2004 H5N1 avian influenza outbreak in Thailand, on the other hand, was undermined by questions of national economic interest and concerns over global health security. Finally, the global emergence of pandemic influenza H1N1 in 2009 demonstrated the difficulties of risk management in a context of pre-established perceptions about the characteristics and inevitability of a pandemic. Following the analysis of these cases, a conceptual framework is presented to illustrate ways in which changing relationships between risk assessment, risk perception and risk management can result in differing policy strategies.
NASA Astrophysics Data System (ADS)
Moglia, Magnus; Sharma, Ashok K.; Maheepala, Shiroma
2012-07-01
Planning of regional and urban water resources, in particular with Integrated Urban Water Management approaches, often considers the inter-relationships between human uses of water, the health of the natural environment, and the cost of various management strategies. Decision makers hence typically need to consider a combination of social, environmental and economic goals. The strategies employed can include water efficiency measures, water sensitive urban design, stormwater management, or catchment management. Decision makers therefore need to choose between different scenarios and evaluate them against a number of criteria. An entire discipline, Multi-Criteria Decision Analysis, is devoted to this type of problem and has often been applied in water management contexts. This paper describes the application of Subjective Logic in a basic Bayesian Network to a Multi-Criteria Decision Analysis problem, outlining a novel methodology that explicitly incorporates uncertainty and information reliability. Application of the methodology to a known case study allows for exploration. By making uncertainty and the reliability of assessments explicit, it allows the risks of various options to be assessed, which may help alleviate cognitive biases and move towards a well-formulated risk management policy.
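In subjective logic, an assessment is expressed as an opinion (belief, disbelief, uncertainty) whose expected probability is b + a·u for base rate a, and independent opinions can be combined with the cumulative fusion operator. The sketch below illustrates these standard operators generically; it is not the paper's Bayesian-network implementation, and the opinion values in the test are hypothetical:

```python
def expectation(b, d, u, a=0.5):
    """Expected probability of a subjective-logic opinion: b + a*u."""
    assert abs(b + d + u - 1.0) < 1e-9   # opinion components must sum to 1
    return b + a * u

def cumulative_fusion(op1, op2):
    """Cumulative fusion of two opinions (b, d, u); assumes u1, u2 > 0.

    The fused opinion has lower uncertainty than either input,
    reflecting the accumulation of independent evidence.
    """
    b1, d1, u1 = op1
    b2, d2, u2 = op2
    k = u1 + u2 - u1 * u2
    return ((b1 * u2 + b2 * u1) / k,
            (d1 * u2 + d2 * u1) / k,
            (u1 * u2) / k)
```

This is what lets the methodology carry information reliability through the analysis: an unreliable source simply contributes an opinion with high u, which pulls the expectation toward the base rate.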
A simple landslide susceptibility analysis for hazard and risk assessment in developing countries
NASA Astrophysics Data System (ADS)
Guinau, M.; Vilaplana, J. M.
2003-04-01
In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methods to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Working with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows) triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photograph interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized to obtain a failure zone digital map. A terrain unit digital map, representing a series of physical-environmental terrain factors, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allowed us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this zone by the degree of overlap between the susceptibility map and the failure zone map.
The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows landslide susceptibility analysis to be carried out where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine actions for mitigating landslide effects, e.g. land planning and emergency aid actions.
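The abstract does not specify the "numerical expression" of the relationship between terrain units and failure zones; a common choice in such susceptibility analyses is the frequency ratio, sketched here as a hypothetical stand-in:

```python
def frequency_ratio(unit_area, unit_failure_area,
                    total_area, total_failure_area):
    """Frequency ratio of a terrain unit.

    FR = (share of all failure-zone area falling in the unit) /
         (share of total area occupied by the unit).
    FR > 1 means failures are over-represented in the unit,
    i.e. higher landslide susceptibility.
    """
    return ((unit_failure_area / total_failure_area) /
            (unit_area / total_area))
```

Computed over zone A, such ratios can then be summed per terrain unit in zone B to rank units by susceptibility without a failure-zone map.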
45 CFR 153.510 - Risk corridors establishment and payment methodology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Risk corridors establishment and payment methodology. 153.510 Section 153.510 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS....510 Risk corridors establishment and payment methodology. (a) General requirement. A QHP issuer must...
Pelletier, K R
1997-12-01
This paper is a critical review of the clinical and cost outcome evaluation studies of multifactorial, comprehensive, cardiovascular risk management programs in worksites. A comprehensive international literature search conducted under the auspices of the National Heart, Lung and Blood Institute identified 17 articles based on 12 studies that examined the clinical outcomes of multifactorial, comprehensive programs. These articles were identified through MEDLINE, manual searches of recent journals, and through direct inquiries to worksite health promotion researchers. All studies were conducted between 1978 and 1995, with 1978 being the date of the first citation of a methodologically rigorous evaluation. Of the 12 research studies, only 8 utilized the worksite as both the unit of assignment and as the unit of analysis. None of the studies analyzed adequately for cost effectiveness. Given this limitation, this review briefly considers the relevant worksite research that has demonstrated cost outcomes. Worksite-based, multifactorial cardiovascular intervention programs reviewed for this article varied widely in the comprehensiveness, intensity, and duration of both the interventions and evaluations. Results from randomized trials suggest that providing opportunities for individualized, cardiovascular risk reduction counseling for high-risk employees within the context of comprehensive programming may be the critical component of an effective worksite intervention. 
Despite the many limitations of the current methodologies of the 12 studies, the majority of the research to date indicates the following: (1) favorable clinical and cost outcomes; (2) that more recent and more rigorously designed research tends to support rather than refute earlier and less rigorously designed studies; and (3) that rather than interpreting the methodological flaws and diversity as inherently negative, one may consider them indicative of a robust phenomenon evident in many types of worksites, with diverse employees, differing interventions, and varying degrees of methodological sophistication. Results of the studies reviewed provide both cautious optimism about the effectiveness of these worksite programs and insights regarding the essential components and characteristics of successful programs.
Contribution of European research to risk analysis.
Boenke, A
2001-12-01
The European Commission's Quality of Life Research Programme, Key Action 1 - Health, Food & Nutrition, is mission-oriented and aims, amongst other things, at providing a healthy, safe and high-quality food supply, leading to reinforced consumer confidence in the safety of European food. Its objectives also include enhancing the competitiveness of the European food supply. Key Action 1 is currently supporting a number of different types of European collaborative projects in the area of risk analysis. The objectives of these projects range from the development and validation of prevention strategies, including the reduction of consumer risks; the development and validation of new modelling approaches; the harmonization of risk assessment principles, methodologies and terminology; the standardization of methods and systems used for the safety evaluation of transgenic food; the provision of tools for the evaluation of human viral contamination of shellfish and quality control; new methodologies for assessing the potential for unintended effects of genetically modified (GM) foods; and the development of a risk assessment model for Cryptosporidium parvum related to the food and water industries; to the development of a communication platform for GM-organism producers, retailers, regulatory authorities and consumer groups to improve safety assessment procedures, risk management strategies and risk communication; the development and validation of new methods for safety testing of transgenic food; the evaluation of the safety and efficacy of iron supplementation in pregnant women; and the evaluation of the potential cancer-preventing activity of pro- and pre-biotic ('synbiotic') combinations in human volunteers. An overview of these projects is presented here.
Software Risk Identification for Interplanetary Probes
NASA Technical Reports Server (NTRS)
Dougherty, Robert J.; Papadopoulos, Periklis E.
2005-01-01
The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that use increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology for identifying risks in interplanetary probes. The proposed methodology is based upon tailoring the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. Use of this methodology will ensure a more consistent and complete identification of software risks in these probes.
Yamato, Tie Parma; Maher, Chris; Koes, Bart; Moseley, Anne
2017-06-01
The Physiotherapy Evidence Database (PEDro) scale has been widely used to investigate methodological quality in physiotherapy randomized controlled trials; however, its validity has not been tested for pharmaceutical trials. The aim of this study was to investigate the validity and interrater reliability of the PEDro scale for pharmaceutical trials. The reliability was also examined for the Cochrane Back and Neck (CBN) Group risk of bias tool. This is a secondary analysis of data from a previous study. We considered randomized placebo-controlled trials evaluating any pain medication for chronic spinal pain or osteoarthritis. Convergent validity was evaluated by correlating the PEDro score with the summary score of the CBN risk of bias tool. The construct validity was tested using a linear regression analysis to determine the degree to which the total PEDro score is associated with treatment effect sizes, journal impact factor, and the summary score for the CBN risk of bias tool. The interrater reliability was estimated using the Prevalence and Bias Adjusted Kappa coefficient and 95% confidence interval (CI) for the PEDro scale and the CBN risk of bias tool. Fifty-three trials, contributing 91 treatment effect sizes, were included in the analyses. The correlation between the PEDro scale and the CBN risk of bias tool was 0.83 (95% CI 0.76-0.88) after adjusting for reliability, indicating strong convergence. The PEDro score was inversely associated with effect sizes, significantly associated with the summary score for the CBN risk of bias tool, and not associated with the journal impact factor. The interrater reliability was at least substantial for most items of the PEDro scale and CBN risk of bias tool (>0.60). The intraclass correlation coefficient for the PEDro score was 0.80 (95% CI 0.68-0.88), and for the CBN risk of bias tool it was 0.81 (95% CI 0.69-0.88).
There was evidence for the convergent and construct validity for the PEDro scale when used to evaluate methodological quality of pharmacological trials. Both risk of bias tools have acceptably high interrater reliability. Copyright © 2017 Elsevier Inc. All rights reserved.
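For two raters scoring binary items, the Prevalence and Bias Adjusted Kappa reported above reduces to PABAK = 2·p_o − 1, where p_o is the observed proportion of agreement. A minimal sketch (the example ratings are hypothetical):

```python
def pabak(ratings_a, ratings_b):
    """Prevalence- and Bias-Adjusted Kappa for two raters' binary
    item ratings: PABAK = 2 * p_o - 1, with p_o the observed
    proportion of agreement."""
    agree = sum(a == b for a, b in zip(ratings_a, ratings_b))
    p_o = agree / len(ratings_a)
    return 2.0 * p_o - 1.0
```

Unlike Cohen's kappa, PABAK fixes the chance-agreement term at 0.5, so it is not deflated when one rating category is rare, which is why it is often preferred for quality-assessment checklists.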
Risk-based economic decision analysis of remediation options at a PCE-contaminated site.
Lemming, Gitte; Friis-Hansen, Peter; Bjerg, Poul L
2010-05-01
Remediation methods for contaminated sites cover a wide range of technical solutions with different remedial efficiencies and costs. Additionally, they may vary in their secondary impacts on the environment, i.e., the potential impacts of emissions and resource use caused by the remediation activities. Increasing attention is being given to these secondary environmental impacts when evaluating remediation options. This paper presents a methodology for an integrated economic decision analysis which combines assessments of remediation costs, health risk costs and potential environmental costs. The health risk costs are associated with the residual contamination left at the site and its migration to groundwater used for drinking water. A probabilistic exposure model using first- and second-order reliability methods (FORM/SORM) is used to estimate the contaminant concentrations at a downstream groundwater well. Potential environmental impacts on the local, regional and global scales due to the site remediation activities are evaluated using life cycle assessment (LCA). The potential impacts on health and environment are converted to monetary units using a simplified cost model. A case study based on the developed methodology is presented in which the following remediation scenarios are analyzed and compared: (a) no action, (b) excavation and off-site treatment of soil, (c) soil vapor extraction and (d) thermally enhanced soil vapor extraction by electrical heating of the soil. Ultimately, the developed methodology facilitates societal cost estimation for remediation scenarios, which can be used for internal ranking of the analyzed options. Despite the inherent uncertainties of placing a value on health and environmental impacts, the presented methodology is believed to be valuable in supporting decisions on remedial interventions. Copyright 2010 Elsevier Ltd. All rights reserved.
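FORM/SORM approximate the probability that the contaminant concentration at the well exceeds a threshold analytically; as a conceptual stand-in, the same exceedance probability can be estimated by Monte Carlo simulation. The source-concentration and dilution distributions below are entirely hypothetical and serve only to illustrate the calculation:

```python
import random

def exceedance_probability(threshold, n=100_000, seed=1):
    """Monte Carlo estimate of P(well concentration > threshold).

    The lognormal source term and uniform dilution factor are
    hypothetical inputs, chosen only to illustrate the computation
    that FORM/SORM perform analytically.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        source = rng.lognormvariate(2.0, 0.5)   # source concentration
        dilution = rng.uniform(10.0, 100.0)     # plume dilution factor
        if source / dilution > threshold:
            hits += 1
    return hits / n
```

The resulting probability can then be multiplied by a unit health cost to obtain the health-risk-cost term in the integrated analysis; FORM/SORM replace the sampling loop with a search for the most probable failure point, which is far cheaper for small probabilities.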
Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaChance, Jeffrey L.; Hansen, Clifford W.
2010-09-01
The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that is being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additional Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield-related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE), this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.
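Crash-frequency methodologies of the kind compared here generally share a multiplicative structure: annual operations times a per-operation crash rate times a spatial crash density times an effective target area. The sketch below assumes that generic four-factor form; the factor names and every number are hypothetical, not Dungeness B values.

```python
# Hedged sketch of a generic aircraft crash frequency model:
# F = N * P * f(x, y) * A, where N is annual operations, P the crash
# rate per operation, f(x, y) a spatial crash density (per mi^2) at the
# site, and A the facility's effective target area (mi^2).

def crash_frequency(n_ops: float, crash_rate: float,
                    crash_density: float, target_area: float) -> float:
    """Expected crashes per year onto the facility footprint."""
    return n_ops * crash_rate * crash_density * target_area

# Illustrative inputs: 50,000 operations/yr, 1e-7 crashes per operation,
# spatial density 0.2 per square mile, effective area 0.01 mi^2.
f = crash_frequency(50_000, 1e-7, 0.2, 0.01)
print(f"{f:.2e} crashes/yr")
```

The methodologies differ mainly in how each factor is derived (e.g. how the effective area accounts for skid and shadow), which is what the Task 2 comparison examines.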
Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-01
No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in risk analysis and assessment. Risk analysis has to consider both the hazard occurrence and the consequences. The consequences can be direct or indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time or, if applied at a small (for example national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed with a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows a variety of methods for assessing the probability of occurrence exists. To assess the annual probability of failure of retaining walls we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor, together with anchorage design and construction parameters.
The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). This method differs from other methodologies for landslide-related hazards in the hazard scenarios and consequence profiles that are investigated. The depth of analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
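Expressing all hazards through their probability of occurrence lets the risk for a roadway section be aggregated as a probability-weighted sum of consequences. The toy sketch below shows that aggregation only; the hazard names follow the abstract, but every probability and cost figure is invented.

```python
# Minimal sketch of a common-term multi-hazard risk aggregation:
# annual risk = sum over hazards of P(occurrence) * (direct + indirect cost).
# All figures below are hypothetical illustration values.

def annual_risk(hazards: dict) -> float:
    """Expected annual loss summed over all hazard types."""
    return sum(p * (direct + indirect)
               for p, direct, indirect in hazards.values())

hazards = {
    "rockfall":     (0.10, 50_000, 20_000),    # P/yr, direct, indirect cost
    "debris_flow":  (0.02, 200_000, 80_000),
    "wall_failure": (0.005, 400_000, 150_000),
}
print(f"expected annual loss: {annual_risk(hazards):,.0f}")
```

In the full methodology the consequence terms are themselves functions of mechanism, magnitude and frequency, and traffic parameters enter the indirect-cost side.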
NASA Astrophysics Data System (ADS)
Gordon, K.; Houser, T.; Kopp, R. E., III; Hsiang, S. M.; Larsen, K.; Jina, A.; Delgado, M.; Muir-Wood, R.; Rasmussen, D.; Rising, J.; Mastrandrea, M.; Wilson, P. S.
2014-12-01
The United States faces a range of economic risks from global climate change - from increased flooding and storm damage, to climate-driven changes in crop yields and labor productivity, to heat-related strains on energy and public health systems. The Risky Business Project commissioned a groundbreaking new analysis of these and other climate risks by region of the country and sector of the economy. Linking state-of-the-art climate models with econometric research on human responses to climate variability and cutting-edge private-sector risk assessment tools, the American Climate Prospectus (ACP) offers decision-makers a data-driven assessment of the specific risks they face. We describe the challenge, methods, findings, and policy implications of the national risk analysis, with particular focus on methodological innovations and novel insights.
NASA Astrophysics Data System (ADS)
Ronco, P.; Bullo, M.; Torresan, S.; Critto, A.; Olschewski, R.; Zappa, M.; Marcomini, A.
2015-03-01
The aim of this paper is the application of the KULTURisk regional risk assessment (KR-RRA) methodology, presented in the companion paper (Part 1, Ronco et al., 2014), to the Sihl River basin in northern Switzerland. Flood-related risks have been assessed for different receptors in the Sihl River valley, including Zurich, which represents a typical case of river flooding in an urban area, by calibrating the methodology to the site-specific context and features. Risk maps and statistics have been developed using a 300-year return period scenario for six relevant targets exposed to flood risk: people; economic activities (buildings, infrastructure and agriculture); natural and semi-natural systems; and cultural heritage. Finally, the total risk index map has been produced to visualize the spatial pattern of flood risk within the target area and, therefore, to identify and rank areas and hotspots at risk by means of multi-criteria decision analysis (MCDA) tools. Through a tailored participatory approach, risk maps supplement the assessments of technical experts with the (essential) point of view of relevant stakeholders for the appraisal of the score weightings for the different receptor-relative risks. The total risk maps obtained for the Sihl River case study are associated with the lower classes of risk. In general, higher (relative) risk scores are spatially concentrated in the deeply urbanized city centre and in areas that lie close to the river course. Here, predicted injuries and potential fatalities are mainly due to high population density and to the presence of vulnerable people; flooded buildings are mainly classified as continuous and discontinuous urban fabric; and flooded roads, pathways and railways, most of them around the Zurich central station (Hauptbahnhof), are at high risk of inundation, causing severe indirect damage.
Moreover, the risk pattern for agriculture, natural and semi-natural systems and cultural heritage is relatively less important, mainly because of the scattered presence of these assets. Finally, the application of the KR-RRA methodology to the Sihl River case study, as well as to several other sites across Europe (not presented here), has demonstrated its flexibility and adaptability to different geographical and socioeconomic contexts, depending on data availability and the particulars of the sites, and for other (hazard) scenarios.
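A total risk index of the kind described above is typically an MCDA-style weighted combination of per-receptor risk scores on a raster. The sketch below assumes a simple weighted-sum aggregation; the receptor list follows the abstract, but the weights, scores and grid are invented.

```python
import numpy as np

# Sketch of an MCDA-style total risk index: per-receptor relative risk
# scores (0-1) on a raster are combined with stakeholder-elicited weights
# into a single index per cell. Weights and scores are hypothetical.

receptors = ["people", "buildings", "infrastructure",
             "agriculture", "natural_systems", "cultural_heritage"]
weights = np.array([0.30, 0.20, 0.20, 0.10, 0.10, 0.10])  # sum to 1

# Toy raster of risk scores per receptor: shape (receptor, rows, cols)
scores = np.random.default_rng(0).uniform(0.0, 1.0, (6, 2, 3))

total_risk = np.tensordot(weights, scores, axes=1)  # weighted sum per cell
assert total_risk.shape == (2, 3)
```

In the paper the weights come from the participatory process with stakeholders rather than being fixed a priori.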
Torre, Michele; Digka, Nikoletta; Anastasopoulou, Aikaterini; Tsangaris, Catherine; Mytilineou, Chryssi
2016-12-15
Research studies on the effects of microlitter on marine biota have become increasingly frequent in the last few years. However, there is strong evidence that scientific results based on microlitter analyses can be biased by contamination from airborne fibres. This study demonstrates a low-cost and easy-to-apply methodology to minimize background contamination and thus increase the validity of results. Contamination during the gastrointestinal content analysis of 400 fishes was tested for several sample processing steps with a high risk of airborne contamination (e.g. dissection, stereomicroscopic analysis, and chemical digestion treatment for microlitter extraction). It was demonstrated that, using our methodology based on hermetic enclosure devices isolating the working areas during the various processing steps, airborne contamination was reduced by 95.3%. The simplicity and low cost of this methodology mean that it could be applied not only in the laboratory but also in field or onboard work. Copyright © 2016 Elsevier Ltd. All rights reserved.
Ge, Long; Tian, Jin-hui; Li, Xiu-xia; Song, Fujian; Li, Lun; Zhang, Jun; Li, Ge; Pei, Gai-qin; Qiu, Xia; Yang, Ke-hu
2016-01-01
Because of the methodological complexity of network meta-analyses (NMAs), NMAs may be more vulnerable to methodological risks than conventional pair-wise meta-analyses. Our study aims to investigate the epidemiological characteristics, conduct of the literature search, methodological quality and reporting of the statistical analysis process in the field of cancer, based on the PRISMA extension statement and a modified AMSTAR checklist. We identified and included 102 NMAs in the field of cancer. 61 NMAs were conducted using a Bayesian framework; of them, more than half did not report an assessment of convergence (60.66%). Inconsistency was assessed in 27.87% of NMAs. Assessment of heterogeneity in traditional meta-analyses was more common (42.62%) than in NMAs (6.56%). Most NMAs did not report an assessment of similarity (86.89%) and did not use the GRADE tool to assess the quality of evidence (95.08%). 43 NMAs were adjusted indirect comparisons; the methods used were described in 53.49% of them. Only 4.65% of NMAs described the details of handling multi-group trials and 6.98% described the methods of similarity assessment. The median total AMSTAR score was 8.00 (IQR: 6.00–8.25). Methodological quality and reporting of statistical analysis did not substantially differ by selected general characteristics. Overall, the quality of NMAs in the field of cancer was generally acceptable. PMID:27848997
Retinal Image Quality Assessment for Spaceflight-Induced Vision Impairment Study
NASA Technical Reports Server (NTRS)
Vu, Amanda Cadao; Raghunandan, Sneha; Vyas, Ruchi; Radhakrishnan, Krishnan; Taibbi, Giovanni; Vizzeri, Gianmarco; Grant, Maria; Chalam, Kakarla; Parsons-Wingerter, Patricia
2015-01-01
Long-term exposure to space microgravity poses significant risks for visual impairment. Evidence suggests such vision changes are linked to cephalad fluid shifts, prompting a need to directly quantify microgravity-induced retinal vascular changes. The quality of retinal images used for such vascular remodeling analysis, however, is dependent on imaging methodology. For our exploratory study, we hypothesized that retinal images captured using fluorescein imaging methodologies would be of higher quality in comparison to images captured without fluorescein. A semi-automated image quality assessment was developed using Vessel Generation Analysis (VESGEN) software and MATLAB® image analysis toolboxes. An analysis of ten images found that the fluorescein imaging modality provided a 36% increase in overall image quality (two-tailed p=0.089) in comparison to nonfluorescein imaging techniques.
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
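Probabilistic schedule-risk evaluation of this kind is commonly done by Monte Carlo sampling of per-technology maturation delays, with the critical (maximum) delay gating the DDT&E start. The sketch below assumes triangular delay distributions; the technology count, parameters and threshold are all hypothetical, not the paper's lunar-architecture inputs.

```python
import random

# Monte Carlo sketch of schedule risk from advanced technology development:
# sample a maturation delay for each enabling technology, take the critical
# (maximum) delay, and estimate the probability that the DDT&E start slips
# beyond a threshold. All distribution parameters are invented.

def p_slip(techs, threshold_months, n=100_000, seed=42):
    """Estimate P(max technology delay > threshold_months)."""
    rng = random.Random(seed)
    slips = 0
    for _ in range(n):
        # random.triangular takes (low, high, mode)
        delay = max(rng.triangular(lo, hi, mode) for lo, mode, hi in techs)
        slips += delay > threshold_months
    return slips / n

techs = [(0, 3, 12),   # (low, mode, high) delay in months per technology
         (0, 6, 18),
         (0, 2, 9)]
print(f"P(DDT&E start slips > 12 mo) ~ {p_slip(techs, 12):.3f}")
```

A fuller model would also sample the cost and design impacts of each delay, as the paper's capability does.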
REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING ...
In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainties in the numerical estimates. In 2006, the National Research Council of the National Academy of Sciences released a report on the health risks from exposure to low levels of ionizing radiation. Cosponsored by the EPA and several other Federal agencies, Health Risks from Exposure to Low Levels of Ionizing Radiation BEIR VII Phase 2 (BEIR VII) primarily addresses cancer and genetic risks from low doses of low-LET radiation. In the draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (White Paper), ORIA proposed changes in EPA’s methodology for estimating radiogenic cancers, based on the contents of BEIR VII and some ancillary information. For the most part, it proposed to adopt the models and methodology recommended in BEIR VII; however, certain modifications and expansions are considered to be desirable or necessary for EPA’s purposes. EPA sought advice from the Agency’s Science Advisory Board on the application of BEIR VII and on issues relating to these modifications and expansions in the Advisory on EPA’s Draft White Paper: Modifying EPA Radiation Risk Models Based on BEIR VII (record # 83044). The SAB issued its Advisory on Jan. 31, 2008 (EPA-SAB-08-
A Model-Free Machine Learning Method for Risk Classification and Survival Probability Prediction.
Geng, Yuan; Lu, Wenbin; Zhang, Hao Helen
2014-01-01
Risk classification and survival probability prediction are two major goals in survival data analysis since they play an important role in patients' risk stratification, long-term diagnosis, and treatment selection. In this article, we propose a new model-free machine learning framework for risk classification and survival probability prediction based on weighted support vector machines. The new procedure does not require any specific parametric or semiparametric model assumption on the data, and is therefore capable of capturing nonlinear covariate effects. We use numerous simulation examples to demonstrate the finite-sample performance of the proposed method under various settings. Applications to glioma tumor data and to breast cancer gene expression survival data illustrate the new methodology in real data analysis.
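The core building block named above, a weighted support vector machine, can be sketched with a hinge-loss subgradient descent in which each sample carries its own weight. This is a toy stand-in, not the authors' procedure: in their framework the weights encode the survival/censoring structure, whereas here they are simply given, and the data are synthetic.

```python
import numpy as np

def weighted_svm(X, y, w, lam=0.01, lr=0.5, epochs=300):
    """Linear SVM via weighted hinge-loss subgradient descent.
    y in {-1, +1}; w are nonnegative per-sample weights."""
    n, d = X.shape
    beta, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ beta + b)
        active = margins < 1                      # margin violators
        grad_beta = lam * beta - (w[active, None] * y[active, None]
                                  * X[active]).sum(0) / n
        grad_b = -(w[active] * y[active]).sum() / n
        beta -= lr * grad_beta
        b -= lr * grad_b
    return beta, b

# Two synthetic "risk groups" as toy classification data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([2.0, 2.0], 1.0, (100, 2)),
               rng.normal([-2.0, -2.0], 1.0, (100, 2))])
y = np.concatenate([np.ones(100), -np.ones(100)])
w = np.ones(200)                                  # uniform weights here
beta, b = weighted_svm(X, y, w)
accuracy = np.mean(np.sign(X @ beta + b) == y)
```

Replacing the linear score with a kernel expansion gives the nonlinear variant the paper relies on.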
Fuzzy risk analysis of a modern γ-ray industrial irradiator.
Castiglia, F; Giardina, M
2011-06-01
Fuzzy fault tree analyses were used to investigate accident scenarios that involve radiological exposure to operators working in industrial γ-ray irradiation facilities. The HEART method, a first-generation human reliability analysis method, was used to evaluate the probability of adverse human error in these analyses. This technique was modified on the basis of fuzzy set theory to more directly take into account the uncertainties in the error-promoting factors on which the methodology is based. Moreover, for some identified accident scenarios, the fuzzy radiological exposure risk, expressed in terms of potential annual deaths, was evaluated. The calculated fuzzy risks for the examined plant were determined to be well below the reference risk suggested by the International Commission on Radiological Protection.
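A standard device in fuzzy fault tree analysis is to represent each basic-event probability as a triangular fuzzy number and propagate it through AND/OR gates. The sketch below shows only that gate arithmetic, not the paper's full HEART-based model; the event names and numbers are hypothetical. Because both gate operations are monotone in each argument, operating on the three vertices (low, mode, high) is exact.

```python
# Triangular fuzzy probabilities as (low, mode, high) tuples.

def fuzzy_and(*events):
    """AND gate: probabilities multiply, vertex by vertex."""
    lo = mode = hi = 1.0
    for a, m, b in events:
        lo, mode, hi = lo * a, mode * m, hi * b
    return (lo, mode, hi)

def fuzzy_or(*events):
    """OR gate: 1 - product of complements, vertex by vertex."""
    lo = mode = hi = 1.0
    for a, m, b in events:
        lo, mode, hi = lo * (1 - a), mode * (1 - m), hi * (1 - b)
    return (1 - lo, 1 - mode, 1 - hi)

# Hypothetical basic events for an exposure scenario:
human_error = (0.001, 0.005, 0.02)    # HEART-style error probability
barrier_fail = (0.01, 0.05, 0.10)     # interlock failure probability
exposure = fuzzy_and(human_error, barrier_fail)
print(exposure)
```

Multiplying such a fuzzy scenario probability by a consequence term yields the fuzzy risk figures the abstract refers to.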
Ergonomic initiatives at Inmetro: measuring occupational health and safety.
Drucker, L; Amaral, M; Carvalheira, C
2012-01-01
This work studies the biomechanical hazards to which the workforce of the Instituto Nacional de Metrologia, Qualidade e Tecnologia Industrial (Inmetro) is exposed. It suggests a model for the ergonomic evaluation of work based on the concepts of resilience engineering, which take into consideration the institute's ability to manage risk and deal with its consequences. The methodology includes the stages of identification, inventory, analysis, and risk management. Diagnosis of the workplace uses as parameters the minimal criteria stated in Brazilian legislation. The approach encompasses several perspectives, including the points of view of public management, safety engineering, physical therapy and ergonomics-oriented design. The suggested solution integrates all aspects of the problem: biological, psychological, sociological and organizational. Results obtained from a pilot project made it possible to build a significant sample of Inmetro's workforce, identifying problems and validating the methodology employed as a tool to be applied to the whole institution. Finally, this work intends to draw risk maps and to support goals and methods based on resilience engineering to assess environmental and ergonomic risk management.
DeAngelo, Jacob; Shervais, John W.; Glen, Jonathan; Nielson, Dennis L.; Garg, Sabodh; Dobson, Patrick; Gasperikova, Erika; Sonnenthal, Eric; Visser, Charles; Liberty, Lee M.; Siler, Drew; Evans, James P.; Santellanes, Sean
2016-01-01
Play fairway analysis in geothermal exploration derives from a systematic methodology originally developed within the petroleum industry and is based on a geologic and hydrologic framework of identified geothermal systems. We are tailoring this methodology to study the geothermal resource potential of the Snake River Plain and surrounding region. This project has contributed to the success of this approach by cataloging the critical elements controlling exploitable hydrothermal systems, establishing risk matrices that evaluate these elements in terms of both probability of success and level of knowledge, and building automated tools to process results. ArcGIS was used to compile a range of different data types, which we refer to as ‘elements’ (e.g., faults, vents, heatflow…), with distinct characteristics and confidence values. Raw data for each element were transformed into data layers with a common format. Because different data types have different uncertainties, each evidence layer had an accompanying confidence layer, which reflects spatial variations in these uncertainties. Risk maps represent the product of evidence and confidence layers, and are the basic building blocks used to construct Common Risk Segment (CRS) maps for heat, permeability, and seal. CRS maps quantify the variable risk associated with each of these critical components. In a final step, the three CRS maps were combined into a Composite Common Risk Segment (CCRS) map for analysis that reveals favorable areas for geothermal exploration. Python scripts were developed to automate data processing and to enhance the flexibility of the data analysis. Python scripting provided the structure that makes a custom workflow possible. Nearly every tool available in the ArcGIS ArcToolbox can be executed using commands in the Python programming language. This enabled the construction of a group of tools that could automate most of the processing for the project. 
Currently, our tools are repeatable, scalable, modifiable, and transferable, allowing us to automate the task of data analysis and the production of CRS and CCRS maps. Our ultimate goal is to produce a toolkit that can be imported into ArcGIS and applied to any geothermal play type, with fully tunable parameters that will allow the production of multiple versions of the CRS and CCRS maps in order to better test for sensitivity and to validate results.
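The layer algebra described above — risk as the product of evidence and confidence, evidence layers combined into a Common Risk Segment map, and the three CRS maps combined into the composite map — can be sketched with plain arrays. This is a numpy stand-in for the ArcGIS/Python workflow, not the project's actual tools; the combination rules (mean within a component, product across components) and all values are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
shape = (4, 4)                        # toy raster grid

def crs(layers):
    """Combine (evidence, confidence) raster pairs into one CRS map:
    risk = evidence * confidence, averaged across layers element-wise."""
    return np.mean([ev * conf for ev, conf in layers], axis=0)

def layer():
    """One toy evidence/confidence pair in [0, 1] / [0.5, 1]."""
    return (rng.uniform(0.0, 1.0, shape), rng.uniform(0.5, 1.0, shape))

heat = crs([layer() for _ in range(3)])   # e.g. heat flow, vents, geothermometry
perm = crs([layer() for _ in range(2)])   # e.g. faults, seismicity
seal = crs([layer()])

ccrs = heat * perm * seal                 # composite common risk segment map
```

High-CCRS cells would then flag favorable exploration targets, mirroring the map products described in the abstract.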
NASA Astrophysics Data System (ADS)
Bouillard, Jacques X.; Vignes, Alexis
2014-02-01
In this paper, an inhalation health and explosion safety risk assessment methodology for nanopowders is described. Since toxicological threshold limit values are still unknown for nanosized substances, detailed risk assessments of specific plants cannot yet be carried out. A simple approach based on occupational hazard/exposure bands expressed in mass concentrations is proposed for nanopowders. This approach is consolidated with an iso-surface toxicological scaling method which, although incomplete, has the merit of providing concentration threshold levels for which new metrological instruments should be developed for proper air monitoring in order to ensure safety. Whenever the processing or use of nanomaterials introduces a risk to the worker, a specific nano pictogram is proposed to inform the worker. Examples of the risk assessment of process equipment (i.e., containment valves) processing various nanomaterials are provided. Explosion risks related to very reactive nanomaterials such as aluminum nanopowders can be assessed using this new analysis methodology adapted to nanopowders. It is nevertheless found that, to formalize and extend this approach, it is absolutely necessary to develop new relevant standard apparatuses and to qualify individual and collective safety barriers with respect to health and explosion risks. In spite of these uncertainties, it appears, as shown in the second paper (Part II), that the health and explosion risks evaluated for given MWCNTs and aluminum nanoparticles remain manageable in their continuous fabrication mode, considering the current individual and collective safety barriers that can be put in place. The authors would, however, underline that particular attention must be paid to non-continuous modes of operation, such as process equipment cleaning steps, which are often under-analyzed and too often forgotten critical steps requiring vigilance in order to minimize potential toxic and explosion risks.
Application of Bayesian and cost benefit risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined is a scaled parabolic fine that varies with the degree of over-pumping violation, in contrast to common practices that usually consider fixed short-term fines. The methodological steps are presented analytically, together with originally developed code. Such an application, in this level of detail, represents a novel contribution. The results indicate that the uncertainty in the violation probability is the driving issue that determines the optimal decision under each methodology, and, depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using the two approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause within an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
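The contrast drawn above — a point-estimate cost-benefit comparison versus a Bayesian treatment of the unknown violation probability — can be shown with a toy expected-cost calculation. This is not the note's actual model: the reservoir cost, fine structure, audit counts and Beta prior below are all invented for illustration.

```python
# Toy build-vs-no-build comparison: building the reservoir costs a fixed
# amount; not building risks over-pumping fines across a horizon of audits.

def expected_cost(build: bool, p_violation: float,
                  build_cost=2.0e6, fine=5.0e5, audits=20) -> float:
    """Expected cost of each action given a violation probability."""
    if build:
        return build_cost                  # reservoir removes over-pumping
    return p_violation * fine * audits     # expected fines over the horizon

# Classical cost-benefit analysis with a point estimate of the probability:
p_hat = 0.3
decision_cb = ("build" if expected_cost(True, p_hat)
               < expected_cost(False, p_hat) else "no build")

# Bayesian variant: Beta(1, 1) prior updated with 8 hypothetical violations
# observed in 20 audits; decide with the posterior mean probability.
alpha, beta = 1 + 8, 1 + 12
p_post = alpha / (alpha + beta)
decision_bayes = ("build" if expected_cost(True, p_post)
                  < expected_cost(False, p_post) else "no build")
print(decision_cb, decision_bayes)
```

The note's point is precisely that the two routes can disagree when the probability is handled differently; with these toy numbers they happen to agree.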
NASA Astrophysics Data System (ADS)
McKinney, D. C.; Cuellar, A. D.
2015-12-01
Climate change has accelerated glacial retreat in the high-altitude glaciated regions of Nepal, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOFs) are sudden events triggered by an earthquake, moraine failure or other shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris that rapidly floods downstream areas. Imja Lake in the Himalaya of Nepal has experienced accelerated growth since it first appeared in the 1960s. Communities threatened by a flood from Imja Lake have advocated for projects to adapt to the increasing threat of a GLOF. Nonetheless, discussions surrounding projects for Imja have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects, in part because this information is unknown or uncertain. This work presents a demonstration of a decision-making methodology developed to rationally analyze the risks posed by Imja Lake and the various adaptation projects proposed, using available information. In this work the authors use decision analysis, data envelopment analysis (DEA), and sensitivity analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding, and estimate fatalities using an empirical method developed for dam failures. The DEA methodology allows us to estimate the value of a statistical life implied by each project, given the cost of the project and the number of lives saved, to determine which project is the most efficient. In contrast, the decision analysis methodology requires fatalities to be assigned a cost but allows the inclusion of uncertainty in the decision-making process.
We compare the output of these two methodologies and determine the sensitivity of the conclusions to changes in uncertain input parameters including project cost, value of a statistical life, and time to a GLOF event.
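The implied value-of-statistical-life comparison described above reduces to cost per expected life saved per project, with the lowest value flagging the most cost-efficient option. The sketch below shows just that ranking step; the project names, costs and fatality estimates are hypothetical, not the Imja Lake figures.

```python
# Each adaptation project implies a value of statistical life:
# implied VSL = project cost / expected lives saved. All figures invented.

projects = {
    "lower_lake_3m":  {"cost": 3.0e6, "lives_saved": 10},
    "early_warning":  {"cost": 0.5e6, "lives_saved": 4},
    "lower_lake_10m": {"cost": 9.0e6, "lives_saved": 12},
}

implied_vsl = {name: p["cost"] / p["lives_saved"]
               for name, p in projects.items()}
best = min(implied_vsl, key=implied_vsl.get)
print(best, f"{implied_vsl[best]:,.0f} per life saved")
```

The full DEA treatment generalizes this single-ratio view to multiple inputs and outputs, and the decision-analysis route instead monetizes fatalities and propagates uncertainty.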
Translational benchmark risk analysis
Piegorsch, Walter W.
2010-01-01
Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
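The benchmark paradigm discussed above can be made concrete with a worked example: under a simple one-hit dose-response model P(d) = p0 + (1 - p0)(1 - e^(-bd)), the extra risk over background is ER(d) = 1 - e^(-bd), so the benchmark dose at benchmark response BMR has the closed form BMD = -ln(1 - BMR)/b. This is a generic textbook illustration, not an example from the paper; the slope value is invented.

```python
from math import log

def benchmark_dose(b: float, bmr: float = 0.10) -> float:
    """Dose producing extra risk equal to bmr under the one-hit model:
    ER(d) = 1 - exp(-b * d)  =>  BMD = -ln(1 - bmr) / b."""
    return -log(1.0 - bmr) / b

bmd10 = benchmark_dose(b=0.05)   # hypothetical slope, per mg/kg-day
print(f"BMD10 ~ {bmd10:.2f} mg/kg-day")
```

In practice the slope is estimated from dose-response data and a lower confidence bound on the BMD (the BMDL) is used as the point of departure.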
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. Post-processing and analyzing such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is analyzing the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. to recognize patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
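The scenario-clustering idea behind such data mining can be sketched with a tiny k-means: group many sampled scenario outcomes so analysts inspect a few representative clusters instead of every run. This stands in for RAVEN's own API, which is not reproduced here; the data and cluster count are invented.

```python
import numpy as np

def kmeans(X, k, iters=30):
    """Tiny Lloyd's k-means with deterministic seed points."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].astype(float)
    for _ in range(iters):
        # assign each point to its nearest center
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):                  # guard against empty clusters
                centers[j] = pts.mean(axis=0)
    return labels, centers

# Two synthetic "scenario outcome" groups, e.g. (peak value, timing):
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([0.0, 0.0], 0.3, (50, 2)),
               rng.normal([3.0, 3.0], 0.3, (50, 2))])
labels, centers = kmeans(X, k=2)
```

Each cluster center then serves as a representative scenario for closer deterministic analysis.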
Di Guardo, Andrea; Finizio, Antonio
2018-01-01
In the last decades, several monitoring programs have been established as an effect of EU Directives addressing the quality of water resources (drinking water, groundwater and surface water). Plant Protection Products (PPPs) are an obvious target of monitoring activities, since they are directly released into the environment. One of the challenges in managing the risk of pesticides at the territorial scale is identifying the locations in water bodies needing implementation of risk mitigation measures. In this, national pesticide monitoring plans could be very helpful. However, monitoring of pesticides is a challenging task because of the high number of registered pesticides, the cost of analyses, and the periodicity of sampling related to pesticide application and use. Extensive high-quality data-sets are consequently often missing. More generally, the information that can be obtained from monitoring studies is frequently undervalued by risk managers. In this study, we propose a new methodology providing indications about the need to implement mitigation measures in stretches of surface water bodies across a territory by combining historical series of monitoring data and GIS. The methodology is articulated in two distinct phases: a) acquisition of monitoring data and set-up of informative layers of georeferenced data (phase 1) and b) statistical and expert analysis for the identification of areas where the implementation of limitation or mitigation measures is suggested (phase 2). Our methodology identifies potentially vulnerable water bodies, considering temporal contamination trends and relative risk levels at selected monitoring stations. A case study is presented considering glyphosate monitoring data in the Lombardy Region (Northern Italy) for the 2008-2014 period. Copyright © 2017 Elsevier B.V. All rights reserved.
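Detecting temporal contamination trends at a monitoring station is commonly done with a nonparametric test such as Mann-Kendall, which tolerates non-normal data and censored-value workarounds. The abstract does not specify the exact statistical procedure used, so the sketch below is illustrative, and the concentration series is invented.

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum over all pairs i < j of
    sign(x[j] - x[i]). S > 0 suggests an increasing trend,
    S < 0 a decreasing one (significance needs the variance of S)."""
    x = np.asarray(x, dtype=float)
    s = 0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return int(s)

# Hypothetical yearly glyphosate concentrations (ug/L), 2008-2014:
series = [0.08, 0.10, 0.09, 0.12, 0.15, 0.14, 0.18]
print("S =", mann_kendall_s(series))
```

In a territorial workflow, stations with significant upward trends and high relative risk would be the ones flagged for mitigation measures on the GIS layers.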
NASA Astrophysics Data System (ADS)
Valentina, Gallina; Silvia, Torresan; Anna, Sperotto; Elisa, Furlan; Andrea, Critto; Antonio, Marcomini
2014-05-01
Nowadays, the challenge for coastal stakeholders and decision makers is to incorporate climate change into land and policy planning in order to ensure sustainable integrated coastal zone management aimed at preserving coastal environments and socio-economic activities. Consequently, an increasing amount of information on climate variability and its impacts on human and natural ecosystems is needed. Climate risk services help bridge the gap between climate experts and decision makers by communicating timely, science-based information about impacts and risks related to climate change that can be incorporated into land planning, policy and practice. Within the CLIM-RUN project (FP7), a participatory Regional Risk Assessment (RRA) methodology was applied for the evaluation of water-related hazards in coastal areas (i.e. pluvial flood and sea-level rise inundation risks), taking into consideration future climate change scenarios, in the case study of the North Adriatic Sea for the period 2040-2050. Specifically, through the analysis of hazard, exposure, vulnerability and risk and the application of Multi-Criteria Decision Analysis (MCDA), the RRA methodology allowed the identification and prioritization of targets (i.e. residential and commercial-industrial areas, beaches, infrastructures, wetlands, agricultural typologies) and sub-areas that are more likely to be affected by pluvial flood and sea-level rise impacts in the same region. From the early stages of the climate risk service development and application, the RRA followed a bottom-up approach, taking into account the needs, knowledge and perspectives of local stakeholders dealing with Integrated Coastal Zone Management (ICZM), by means of questionnaires, workshops and focus groups organized within the project.
Specifically, stakeholders were asked to provide their needs in terms of time scenarios, geographical scale and resolution, choice of receptors, vulnerability factors and thresholds, which were considered in the implementation of the RRA methodology. The main outputs of the analysis are climate risk products produced with the DEcision support SYstem for COastal climate change impact assessment (DESYCO), represented by GIS-based maps and statistics of hazard, exposure, physical and environmental vulnerability, risk and damage. These maps are useful for transferring information about climate change impacts to stakeholders and decision makers, for classifying and prioritizing areas that are likely to be affected by climate change impacts more severely than others in the same region, and therefore for supporting the identification of suitable areas for infrastructure, economic activities and human settlements toward the development of regional adaptation plans. The climate risk products and the results of the North Adriatic case study will be presented and discussed here.
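The MCDA aggregation step above can be sketched as a weighted sum of normalized hazard, exposure and vulnerability scores per receptor. The weights, receptor names and scores below are hypothetical, not DESYCO's actual values.

```python
def risk_score(hazard, exposure, vulnerability, weights=(0.4, 0.3, 0.3)):
    """Weighted-sum MCDA aggregation of factor scores already normalized to [0, 1].
    The weights would come from stakeholder/expert elicitation."""
    wh, we, wv = weights
    return wh * hazard + we * exposure + wv * vulnerability

# Hypothetical normalized (hazard, exposure, vulnerability) scores per coastal sub-area
areas = {
    "beach": (0.9, 0.6, 0.8),
    "wetland": (0.7, 0.3, 0.9),
    "industrial": (0.5, 0.9, 0.4),
}
# Rank sub-areas from highest to lowest aggregated risk
ranked = sorted(areas, key=lambda a: risk_score(*areas[a]), reverse=True)
```

The ranked list is the kind of prioritization that the GIS-based risk maps then communicate spatially.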
Application of Six Sigma methodology to a diagnostic imaging process.
Taner, Mehmet Tolga; Sezen, Bulent; Atwat, Kamal M
2012-01-01
This paper aims to apply the Six Sigma methodology to improve workflow by eliminating the causes of failure in the medical imaging department of a private Turkish hospital. Implementation of the define, measure, analyse, improve and control (DMAIC) improvement cycle, workflow charts, fishbone diagrams and Pareto charts was employed, together with rigorous data collection in the department. The identification of root causes of repeat sessions and delays was followed by failure mode and effect analysis, hazard analysis and decision tree analysis. The most frequent causes of failure were malfunction of the RIS/PACS system and improper positioning of patients. Subsequent to extensive training of professionals, the sigma level was increased from 3.5 to 4.2. The data were collected over only four months. Six Sigma's data measurement and process improvement methodology is an impetus for health care organisations to rethink their workflow and reduce malpractice. It involves measuring, recording and reporting data on a regular basis, which enables the administration to monitor workflow continuously. The improvements in the workflow under study, made by determining the failures and potential risks associated with radiologic care, will have a positive impact on society in terms of patient safety. By eliminating repeat examinations, the risk of exposure to additional radiation was also minimised. This paper supports the need to apply Six Sigma and presents an evaluation of the process in an imaging department.
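The sigma levels quoted above (3.5 rising to 4.2) relate to process yield through the standard normal quantile, conventionally with a 1.5-sigma long-term shift added. A sketch of that conversion follows; it is a standard Six Sigma convention, not the study's own calculation.

```python
from statistics import NormalDist

def sigma_level(process_yield):
    """Convert a long-term process yield (fraction defect-free, 0 < yield < 1)
    to a short-term sigma level using the conventional 1.5-sigma shift."""
    return NormalDist().inv_cdf(process_yield) + 1.5

# Example: a yield of 99.379% corresponds to roughly a 4.0 sigma process
level = sigma_level(0.99379)
```

Under this convention, raising the sigma level from 3.5 to 4.2 corresponds to cutting the defect (repeat/delay) rate several-fold.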
[Failure mode effect analysis applied to preparation of intravenous cytostatics].
Santos-Rubio, M D; Marín-Gil, R; Muñoz-de la Corte, R; Velázquez-López, M D; Gil-Navarro, M V; Bautista-Paloma, F J
2016-01-01
To proactively identify risks in the preparation of intravenous cytostatic drugs, and to prioritise and establish measures to improve safety procedures. Failure Mode Effect Analysis methodology was used. A multidisciplinary team identified potential failure modes of the procedure through a brainstorming session. The impact associated with each failure mode was assessed with the Risk Priority Number (RPN), which involves three variables: occurrence, severity, and detectability. Improvement measures were established for all identified failure modes, with those with RPN > 100 considered critical. The final (theoretical) RPN that would result from the proposed measures was also calculated, and the process was redesigned. A total of 34 failure modes were identified. The initial accumulated RPN was 3022 (range: 3-252), and after the recommended actions the final RPN was 1292 (range: 3-189). RPN scores > 100 were obtained in 13 failure modes; only the dispensing sub-process was free of critical points (RPN > 100). A final RPN reduction of > 50% was achieved in 9 failure modes. This prospective risk analysis methodology allows the weaknesses of the procedure to be prioritised, optimizes the use of resources, and substantially improves the safety of the preparation of cytostatic drugs through the introduction of double checking and intermediate product labelling. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
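The RPN scheme described above multiplies the three ratings and flags scores above 100 as critical. A minimal sketch with hypothetical failure modes and ratings (not the study's actual data):

```python
def rpn(occurrence, severity, detectability):
    """Risk Priority Number: product of the three 1-10 ratings."""
    return occurrence * severity * detectability

# Hypothetical failure modes: (occurrence, severity, detectability)
failure_modes = {
    "wrong diluent volume": (4, 9, 5),   # RPN 180
    "label mix-up": (3, 10, 4),          # RPN 120
    "delayed dispensing": (5, 3, 2),     # RPN 30
}

# Failure modes with RPN > 100 are treated as critical points
critical = {m for m, ratings in failure_modes.items() if rpn(*ratings) > 100}
```

Recomputing the theoretical RPN after each proposed measure (lower occurrence or better detectability) gives the before/after comparison the study reports.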
Assessing the Climate Resilience of Transport Infrastructure Investments in Tanzania
NASA Astrophysics Data System (ADS)
Hall, J. W.; Pant, R.; Koks, E.; Thacker, S.; Russell, T.
2017-12-01
Whilst there is an urgent need for infrastructure investment in developing countries, there is a risk that poorly planned and built infrastructure will introduce new vulnerabilities. As climate change increases the magnitude and frequency of natural hazard events, disruptive infrastructure failures are likely to become more frequent. It is therefore important that infrastructure planning and investment be underpinned by climate risk assessment that can inform adaptation planning. Tanzania's rapid economic growth is placing considerable strain on the country's transportation infrastructure (roads, railways, shipping and aviation), especially at the port of Dar es Salaam and its linking transport corridors. A growing number of natural hazard events, in particular flooding, are impacting the reliability of this already over-used network. Here we report on a new methodology to analyse vulnerabilities and risks due to failures of key locations in the intermodal transport network of Tanzania, including strategic connectivity to neighboring countries. To perform the national-scale risk analysis we utilize a system-of-systems methodology. The main components of this general risk assessment, when applied to transportation systems, are: (1) assembling data on spatially coherent extreme hazards and intermodal transportation networks; (2) intersecting hazards with transport network models to initiate failure conditions that trigger failure propagation across interdependent networks; (3) quantifying failure outcomes in terms of social impacts (customers/passengers disrupted) and/or macroeconomic consequences (across multiple sectors); and (4) simulating, testing and collecting multiple failure scenarios to perform an exhaustive risk assessment in terms of probabilities and consequences. The methodology is being used to pinpoint vulnerability and reduce climate risks to transport infrastructure investments.
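Step (2), propagating a hazard-induced failure through the network, can be sketched as a reachability check on a graph with the failed nodes removed. The toy corridor below is hypothetical and is not the Tanzanian network data.

```python
from collections import deque

def reachable(adj, source, failed):
    """Breadth-first search over an adjacency dict, skipping failed nodes.
    Returns the set of nodes still reachable from `source`."""
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen and v not in failed:
                seen.add(v)
                queue.append(v)
    return seen

# Hypothetical corridor: port -> hub -> two cities; flooding takes out the hub
adj = {"port": ["hub"], "hub": ["city_a", "city_b"], "city_a": [], "city_b": []}
cut_off = {"city_a", "city_b"} - reachable(adj, "port", failed={"hub"})
```

The disconnected demand nodes (`cut_off`) are then costed in terms of disrupted passengers or macroeconomic losses, and the exercise is repeated over many sampled hazard scenarios.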
De Ambrogi, Francesco; Ratti, Elisabetta Ceppi
2011-01-01
Today the Italian national debate over the Work-Related Stress Risk Assessment methodology is rather heated. Several methodological proposals and guidelines have been published in recent months, not least those by the "Commissione Consultiva". But despite this wide range of proposals, it appears that there is still a lack of attention to some of the basic methodological issues that must be taken into account in order to correctly implement the above-mentioned guidelines. The aim of this paper is to outline these methodological issues. In order to achieve this, the most authoritative methodological proposals and guidelines have been reviewed. The study focuses in particular on the methodological issues that could lead to important biases if not considered properly. The study leads to some considerations about the methodological validity of a Work-Related Stress Risk Assessment based exclusively on the literal interpretation of the considered proposals. Furthermore, the study provides some hints and working hypotheses on how to overcome these methodological limits. This study should be considered as a starting point for further investigations and debate on the Work-Related Stress Risk Assessment methodology on a national level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tortorelli, J.P.
1995-08-01
A workshop was held at the Idaho National Engineering Laboratory, August 16-18, 1994, on the topic of risk assessment of medical devices that use radioactive isotopes. Its purpose was to review past efforts to develop a risk assessment methodology to evaluate these devices, and to develop a program plan and a scoping document for future methodology development. This report contains a summary of that workshop. Participants included experts in the fields of radiation oncology, medical physics, risk assessment, human-error analysis, and human factors. Staff from the US Nuclear Regulatory Commission (NRC) associated with the regulation of medical uses of radioactive materials and with research into risk-assessment methods participated in the workshop. The workshop participants concurred in NRC's intended use of risk assessment as an important technology in the development of regulations for the medical use of radioactive material and encouraged the NRC to proceed rapidly with a pilot study. Specific recommendations are included in the executive summary and the body of this report. An appendix contains the 8 papers presented at the conference: NRC proposed policy statement on the use of probabilistic risk assessment methods in nuclear regulatory activities; NRC proposed agency-wide implementation plan for probabilistic risk assessment; Risk evaluation of high dose rate remote afterloading brachytherapy at a large research/teaching institution; The pros and cons of using human reliability analysis techniques to analyze misadministration events; Review of medical misadministration event summaries and comparison of human error modeling; Preliminary examples of the development of error influences and effects diagrams to analyze medical misadministration events; Brachytherapy risk assessment program plan; and Principles of brachytherapy quality assurance.
The Methodology of Clinical Studies Used by the FDA for Approval of High-Risk Orthopaedic Devices.
Barker, Jordan P; Simon, Stephen D; Dubin, Jonathan
2017-05-03
The purpose of this investigation was to examine the methodology of clinical trials used by the U.S. Food and Drug Administration (FDA) to determine the safety and effectiveness of high-risk orthopaedic devices approved between 2001 and 2015. Utilizing the FDA's online public database, this systematic review audited study design and methodological variables intended to minimize bias and confounding. An additional analysis of blinding, as well as the Checklist to Evaluate a Report of a Nonpharmacological Trial (CLEAR NPT), was applied to the randomized controlled trials (RCTs). Of the 49 studies, 46 (94%) were prospective and 37 (76%) were randomized. Forty-seven (96%) of the studies were controlled in some form. Of 35 studies that reported it, blinding was utilized in 21 (60%), of which 8 (38%) were reported as single-blinded and 13 (62%) were reported as double-blinded. Of the 37 RCTs, outcome assessors were clearly blinded in 6 (16%), whereas 15 (41%) were deemed impossible to blind as implants could be readily discerned on imaging. When the CLEAR NPT was applied to the 37 RCTs, >70% of studies were deemed "unclear" in describing generation of allocation sequences, treatment allocation concealment, and adequate blinding of participants and outcome assessors. This study demonstrates the highly variable reporting and strength of clinical research methodology accepted by the FDA to approve high-risk orthopaedic devices.
2015-11-05
This paper does not seek to justify the EPA MHB approach, but explains its fundamentals and describes how the MHB concept satisfactorily encompasses the fundamentals of environmental health risk and can be applied to all mobile and stationary equipment types.
Multimedia-modeling integration development environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelton, Mitchell A.; Hoopes, Bonnie L.
2002-09-02
There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR) while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.
WE-B-BRC-03: Risk in the Context of Medical Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samei, E.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; Discussion of radiation oncology-specific risk assessment strategies and issues; Evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
WE-B-BRC-00: Concepts in Risk-Based Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
10 CFR 300.11 - Independent verification.
Code of Federal Regulations, 2011 CFR
2011-01-01
... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...
10 CFR 300.11 - Independent verification.
Code of Federal Regulations, 2010 CFR
2010-01-01
... verifiers, and has been empowered to make decisions relevant to the provision of a verification statement... methods; and (v) Risk assessment and methodologies and materiality analysis procedures outlined by other... Accreditation Board program for Environmental Management System auditors (ANSI-RAB-EMS); Board of Environmental...
Analysis of interactions among barriers in project risk management
NASA Astrophysics Data System (ADS)
Dandage, Rahul V.; Mantha, Shankar S.; Rane, Santosh B.; Bhoola, Vanita
2018-03-01
In the context of the scope, time, cost, and quality constraints, failure is not uncommon in project management. While small projects have a 70% chance of success, large projects have virtually no chance of meeting the quadruple constraints. While there is no dearth of research on project risk management, barriers to project risk management are a less-explored topic. The success of project management is often based on understanding the barriers to effective risk management, application of an appropriate risk management methodology, proactive leadership to avoid barriers, workers' attitudes, adequate resources, organizational culture, and the involvement of top management. This paper presents various risk categories and barriers to risk management in domestic and international projects through a literature survey and feedback from project professionals. After analysing the various modelling methods used in the project risk management literature, interpretive structural modelling (ISM) and MICMAC analysis have been used to analyse interactions among the barriers and prioritize them. The analysis indicates that lack of top management support, lack of formal training, and lack of addressing cultural differences are the high-priority barriers, among many others.
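In ISM/MICMAC analysis, each barrier's driving power is its row sum in the final reachability matrix and its dependence is its column sum; these two scores place barriers into the MICMAC quadrants. A minimal sketch with a hypothetical 3x3 reachability matrix (not the paper's actual matrix):

```python
# Hypothetical barriers; reachability[i][j] = 1 if barrier i drives barrier j
barriers = ["top mgmt support", "formal training", "cultural differences"]
reachability = [
    [1, 1, 1],   # lack of top management support drives the other two
    [0, 1, 1],
    [0, 0, 1],
]

# Driving power: row sums; dependence: column sums
driving = {b: sum(row) for b, row in zip(barriers, reachability)}
dependence = {b: sum(reachability[i][j] for i in range(len(barriers)))
              for j, b in enumerate(barriers)}
```

A barrier with high driving power and low dependence (here the top-management one) sits at the bottom of the ISM hierarchy and is a natural high-priority target, matching the paper's conclusion.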
[Generalization of the results of clinical studies through the analysis of subgroups].
Costa, João; Fareleira, Filipa; Ascensão, Raquel; Vaz Carneiro, António
2012-01-01
Subgroup analyses in clinical trials are usually performed to explore potential heterogeneity of the treatment effect in relation to baseline risk, pathophysiology, practical application of therapy, or the under-utilization in clinical practice of effective interventions due to uncertainties about their benefit/risk ratio. When appropriately planned, subgroup analyses are a valid methodology to define benefits in subgroups of patients, thus providing good-quality evidence to support clinical decision making. However, to be correct, subgroup analyses should be defined a priori, limited in number, fully reported and, most important, must include statistical tests for interaction. In this paper we present an example from the treatment of post-menopausal osteoporosis, in which the benefits of an intervention with a specific agent (bazedoxifene), greater the higher the fracture risk, were only disclosed after a post-hoc analysis of the initial global trial sample.
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably, resulting in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
Goodson, Patricia
2015-01-01
Background. Documented trends in health-related risk behaviors among US adolescents have remained high over time. Studies indicate that relationships among mutual friends are a major influence on adolescents' risky behaviors. Social Network Analysis (SNA) can help in understanding the friendship ties affecting individual adolescents' engagement in these behaviors. Moreover, a systematic literature review can synthesize findings from a range of studies using SNA, as well as assess these studies' methodological quality. Review findings can also help health educators and promoters develop more effective programs. Objective. This review systematically examined studies of the influence of friendship networks on adolescents' risk behaviors, which utilized SNA and the Add Health data (a nationally representative sample). Methods. We employed the Matrix Method to synthesize and evaluate 15 published studies that met our inclusion and exclusion criteria, retrieved from the Add Health website and 3 major databases (Medline, ERIC, and PsycINFO). Moreover, we assigned each study a methodological quality score (MQS). Results. In all studies, friendship networks among adolescents promoted their risky behaviors, including drinking alcohol, smoking, sexual intercourse, and marijuana use. The average MQS was 4.6, an indicator of methodological rigor (scale: 1-9). Conclusion. Better understanding of risky behaviors influenced by friends can be useful for health educators and promoters, as programs targeting friendships might be more effective. Additionally, the overall methodological quality of the reviewed studies was good, as average scores fell above the scale's mid-point. PMID:26157622
DOE Office of Scientific and Technical Information (OSTI.GOV)
Domínguez-Gómez, J. Andrés, E-mail: andres@uhu.es
In the last twenty years, both the increase in academic production and the expansion of professional involvement in Environmental Impact Assessment (EIA) and Social Impact Assessment (SIA) have evidenced growing scientific and business interest in risk and impact analysis. However, this growth has not brought with it parallel progress in addressing the main shortcomings of EIA/SIA, i.e. insufficient integration of environmental and social factors into development project analyses and, in cases where the social aspects are considered, technical-methodological failings in their analysis and assessment. It is clear that these weaknesses carry with them substantial threats to the sustainability (social, environmental and economic) of projects which impact on the environment, and consequently to the local contexts where they are carried out and to the delicate balance of the global ecosystem. This paper argues that, in a sociological context of complexity and dynamism, four conceptual elements should underpin approaches to socio-environmental risk and impact assessment in development projects: a theoretical base in actor–network theory; an ethical grounding in values which are internationally recognized (though not always fulfilled in practice); a (new) epistemological-scientific base; and a methodological foundation in social participation. - Highlights: • A theoretical foundation in actor–network theory • An ethical grounding in values which are internationally recognized, but rarely carried through into practice • A (new) epistemological-scientific base • A methodological foundation in social participation.
Zhai, Xiao; Wang, Yiran; Mu, Qingchun; Chen, Xiao; Huang, Qin; Wang, Qijin; Li, Ming
2015-07-01
To appraise the current methodological reporting quality of randomized clinical trials (RCTs) in 3 leading diabetes journals, we systematically searched the literature for RCTs in Diabetes Care, Diabetes and Diabetologia from 2011 to 2013. Characteristics were extracted based on the Consolidated Standards of Reporting Trials (CONSORT) statement. Generation of allocation, concealment of allocation, intention-to-treat (ITT) analysis and handling of dropouts were defined as the primary outcome and "low risk of bias." Sample size calculation, type of intervention, country, number of patients, and funding source were also recorded and descriptively reported. Trials were compared among journals, study years, and other characteristics. A total of 305 RCTs were enrolled in this study. One hundred eight (35.4%) trials reported adequate generation of allocation, 87 (28.5%) trials reported adequate concealment of allocation, 53 (23.8%) trials used ITT analysis, and 130 (58.3%) trials were adequate in handling of dropouts. Only 15 (4.9%) were "low risk of bias" trials. Studies at a large scale (n > 100) or from Europe presented more "low risk of bias" trials than those at a small scale (n ≤ 100) or from other regions. No improvements were found over these 3 years. This study shows that the methodological reporting quality of RCTs in the major diabetes journals remains suboptimal. It can be further improved to meet and keep up with the standards of the CONSORT statement.
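The "low risk of bias" label above requires all four primary criteria to be adequate at once, which is why only 15 of 305 trials qualified. A minimal sketch of that classification rule; the trial record below is invented:

```python
# The four primary criteria from the review
CRITERIA = ("allocation generation", "allocation concealment",
            "ITT analysis", "dropout handling")

def low_risk_of_bias(trial):
    """A trial counts as 'low risk of bias' only if every criterion is adequate."""
    return all(trial[c] for c in CRITERIA)

# Hypothetical trial: adequate on three criteria, inadequate ITT reporting
trial = {"allocation generation": True, "allocation concealment": True,
         "ITT analysis": False, "dropout handling": True}
```

Because the rule is conjunctive, a single inadequate item disqualifies a trial even when the other three are well reported.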
The effect of routine early amniotomy on spontaneous labor: a meta-analysis.
Brisson-Carroll, G; Fraser, W; Bréart, G; Krauss, I; Thornton, J
1996-05-01
To obtain estimates of the effects of amniotomy on the risk of cesarean delivery and on other indicators of maternal and neonatal morbidity (Apgar score less than 7 at 5 minutes, admission to neonatal intensive care unit [NICU]). Published studies were identified through manual and computerized searches using Medline and the Cochrane Collaboration Pregnancy and Childbirth Database. Our search identified ten trials, all published in peer-reviewed journals. Trials were assigned a methodological quality score based on a standardized rating system. Three trials were excluded from the analysis for methodological limitations. Data were abstracted by two trained reviewers. Typical odds ratios (OR) were calculated. Amniotomy was associated with a reduction in labor duration varying from 0.8 to 2.3 hours. There was a statistically nonsignificant increase in the risk of cesarean delivery; OR 1.2, 95% confidence interval (CI) 0.9-1.6. The risk of a 5-minute Apgar score less than 7 was reduced in association with early amniotomy (OR 0.5, 95% CI 0.3-0.9). Groups were similar with respect to other indicators of neonatal status (arterial cord pH, NICU admissions). Routine early amniotomy is associated with both benefits and risks. Benefits include a reduction in labor duration and a possible reduction in abnormal 5-minute Apgar scores. This meta-analysis provides no support for the hypothesis that routine early amniotomy reduces the risk of cesarean delivery. An association between early amniotomy and cesarean delivery for fetal distress was noted in one large trial, suggesting that amniotomy should be reserved for patients with abnormal labor progress.
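The typical (pooled) odds ratios reported above are commonly computed with the Mantel-Haenszel estimator across the per-trial 2x2 tables. A minimal sketch follows; the counts are invented for illustration, not the trial data, and the sketch omits the confidence-interval calculation.

```python
def mantel_haenszel_or(tables):
    """Mantel-Haenszel pooled odds ratio over 2x2 tables given as
    (a, b, c, d) = (events_trt, non-events_trt, events_ctl, non-events_ctl).
    Assumes at least one table with b*c > 0 so the denominator is nonzero."""
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

# One hypothetical trial: 10/100 events on treatment, 5/100 on control
pooled = mantel_haenszel_or([(10, 90, 5, 95)])
```

With a single table the pooled estimate reduces to the crude odds ratio (a*d)/(b*c); with several trials, each table is weighted by b*c/n, giving more influence to larger trials.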
NASA Astrophysics Data System (ADS)
Rizzi, Jonathan; Torresan, Silvia; Gallina, Valentina; Critto, Andrea; Marcomini, Antonio
2013-04-01
Europe's coast faces a variety of climate change threats from extreme high tides, storm surges and rising sea levels. In particular, it is very likely that mean sea level rise will contribute to upward trends in extreme coastal high water levels, thus posing higher risks to coastal locations currently experiencing coastal erosion and inundation processes. In 2007 the European Commission approved the Floods Directive (2007/60/EC), whose main purpose is to establish a framework for the assessment and management of flood risks for inland and coastal areas, thus reducing the adverse consequences for human health, the environment, cultural heritage and economic activities. Improvements in scientific understanding are thus needed to inform decision-making about the best strategies for mitigating and managing storm surge risks in coastal areas. The CLIMDAT project is aimed at improving the understanding of the risks related to extreme storm surge events in the coastal area of the North Adriatic Sea (Italy), considering potential climate change scenarios. The project implements a Regional Risk Assessment (RRA) methodology developed in the FP7 KULTURisk project for the assessment of physical/environmental impacts posed by flood hazards, and employs the DEcision support SYstem for Coastal climate change impact assessment (DESYCO) for the application of the methodology to the case study area. The proposed RRA methodology is aimed at the identification and prioritization of targets and areas at risk from water-related natural hazards in the considered region at the meso-scale. To this aim, it integrates information about extreme storm surges with bio-geophysical and socio-economic information (e.g. vegetation cover, slope, soil type, population density) of the analyzed receptors (i.e. people, economic activities, cultural heritage, natural and semi-natural systems).
Extreme storm surge hazard scenarios are defined using tide gauge time series coming from 28 tide gauge stations located in the North Adriatic coastal areas from 1989 to 2011. These data, together with the sea-level rise scenarios for the considered future timeframe, represent the input for the application of the Joint Probability method (Pugh and Vassie, 1979), which allows the evaluation of the maximum height of extreme storm surge events with different return periods and the number of extreme events per year. The methodology uses Geographic Information Systems to manage, process, analyse, and visualize data and employs Multi-Criteria Decision Analysis to integrate stakeholders' preferences and experts' judgments into the analysis in order to obtain a total risk index in the considered region. The final outputs are GIS-based risk maps which allow the communication of the potential consequences of extreme storm surges to decision makers and stakeholders. Moreover, they can support the establishment of relative priorities for intervention through the identification of suitable areas for human settlements, infrastructures and economic activities. Finally, the produced output can represent a basis for the definition of storm surge hazard and storm surge risk management plans according to the Floods Directive. The preliminary results of the RRA application in the CLIMDAT project will be presented and discussed here.
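The return-level estimation step can be illustrated with a much simpler annual-maxima approach than the one the project uses: the sketch below fits a Gumbel distribution by the method of moments. This is only a stand-in for the Joint Probability method of Pugh and Vassie (which separates tide and surge components), and the surge heights are hypothetical.

```python
import math

def gumbel_return_level(annual_maxima, return_period):
    """Estimate the water level with a given return period by fitting a
    Gumbel distribution to annual maxima via the method of moments.
    Illustrative only -- not the Joint Probability method used in CLIMDAT."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    beta = math.sqrt(6 * var) / math.pi          # Gumbel scale parameter
    mu = mean - 0.5772 * beta                    # location (Euler-Mascheroni constant)
    p = 1 - 1 / return_period                    # annual non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# Hypothetical annual-maximum surge heights (metres) at one tide gauge
maxima = [1.02, 1.15, 0.98, 1.31, 1.07, 1.24, 1.11, 1.45, 1.00, 1.19]
print(round(gumbel_return_level(maxima, 50), 2))
```

Longer return periods yield higher estimated levels, which is the monotonicity a hazard scenario set relies on.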
NASA Astrophysics Data System (ADS)
Andersson-sköld, Y. B.; Tremblay, M.
2011-12-01
In most parts of Sweden, climate change is expected to result in increased precipitation and higher sea water levels, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate-related risks, and on the commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for landslide ex-ante consequence analysis, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a (negative) event). The risk analysis is GIS-aided, presenting and visualising the risk and drawing on existing databases to quantify the consequences, represented by ex-ante estimated monetary losses. The results will be used at the national and regional levels, and as an indication of the risk at the local level, to assess the need for measures to mitigate the risk. The costs and the environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Therefore, civil servants have expressed a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures.
At SGI, a tool for the inclusion of sustainability aspects in the decision-making process on adaptation measures has been developed and is currently being tested in municipalities including central Gothenburg and smaller municipalities in Sweden and Norway. The tool is a matrix-based decision support tool (MDST) aimed at encouraging discussion among experts and stakeholders. The first steps in the decision process include the identification, inventory and assessment of the potential impacts of climate change such as landslides (or other events or actions). These steps are also included in general technical/physical risk and vulnerability analyses such as the risk analysis of the Göta älv valley. The MDST also includes the subsequent steps of the risk management process; the full sequence of the MDST comprises risk identification, risk specification, risk assessment, identification of measures, impact analysis of measures including an assessment of environmental, social and economic costs and benefits, a weighting process and visualisation of the result. Here the MDST will be presented, with some examples from the methodology for the Göta river valley analysis and the risk mitigation analyses from Sweden and Norway.
Polycystic ovary syndrome and risk of endometrial, ovarian, and breast cancer: a systematic review.
Harris, Holly R; Terry, Kathryn L
2016-01-01
Polycystic ovary syndrome (PCOS) is a complex endocrine disorder with an estimated prevalence of 4-21% in reproductive-aged women. The altered metabolic and hormonal environment among women with PCOS may increase their risk of some types of cancer. We performed a comprehensive review of the literature using numerous search terms for all studies examining the associations between polycystic ovary syndrome and related characteristics and cancer published in English through October 2016. This review summarizes the epidemiological findings on the associations between PCOS and endometrial, ovarian, and breast cancers and discusses the methodological issues, complexities, and underlying mechanisms of these associations. We identified 11 individual studies and 3 meta-analyses on the associations between PCOS and endometrial cancer, 8 studies and 1 meta-analysis for ovarian cancer, and 10 studies and 1 meta-analysis for breast cancer. Multiple studies reported that women with PCOS were at a higher risk for endometrial cancer; however, many did not take into account body mass index (BMI), a strong and well-established risk factor for endometrial cancer. The association with ovarian cancer was less clear, but a potentially increased risk of the borderline serous subtype was reported by two studies. No consistent association between PCOS risk and breast cancer was observed. The associations between PCOS and endometrial, ovarian, and breast cancer are complex, with the need to consider many methodological issues in future analyses. Larger well-designed studies, or pooled analyses, may help clarify these complex associations.
Young, April M.; Halgin, Daniel S.; DiClemente, Ralph J.; Sterk, Claire E.; Havens, Jennifer R.
2014-01-01
Background An HIV vaccine could substantially impact the epidemic. However, risk compensation (RC), or post-vaccination increase in risk behavior, could present a major challenge. The methodology used in previous studies of risk compensation has been almost exclusively individual-level in focus, and has not explored how increased risk behavior could affect the connectivity of risk networks. This study examined the impact of anticipated HIV vaccine-related RC on the structure of high-risk drug users' sexual and injection risk network. Methods A sample of 433 rural drug users in the US provided data on their risk relationships (i.e., those involving recent unprotected sex and/or injection equipment sharing). Dyad-specific data were collected on likelihood of increasing/initiating risk behavior if they, their partner, or they and their partner received an HIV vaccine. Using these data and social network analysis, a "post-vaccination network" was constructed and compared to the current network on measures relevant to HIV transmission, including network size, cohesiveness (e.g., diameter, component structure, density), and centrality. Results Participants reported 488 risk relationships. Few reported an intention to decrease condom use or increase equipment sharing (4% and 1%, respectively). RC intent was reported in 30 existing risk relationships and vaccination was anticipated to elicit the formation of five new relationships. RC resulted in a 5% increase in risk network size (n = 142 to n = 149) and a significant increase in network density. The initiation of risk relationships resulted in the connection of otherwise disconnected network components, with the largest doubling in size from five to ten. Conclusions This study demonstrates a new methodological approach to studying RC and reveals that behavior change following HIV vaccination could potentially impact risk network connectivity. 
These data will be valuable in parameterizing future network models that can determine if network-level change precipitated by RC would appreciably impact the vaccine's population-level effectiveness. PMID:24992659
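The network comparison at the heart of the study above (density and component structure before and after risk compensation adds relationships) can be sketched in plain Python. The toy network below is hypothetical; the actual analysis used 433 participants and 488 risk relationships.

```python
def density(n_nodes, edges):
    """Undirected network density: observed edges over possible edges."""
    possible = n_nodes * (n_nodes - 1) / 2
    return len(edges) / possible

def components(nodes, edges):
    """Connected components via iterative depth-first search."""
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, comps = set(), []
    for v in nodes:
        if v in seen:
            continue
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(adj[u] - comp)
        seen |= comp
        comps.append(comp)
    return comps

# Hypothetical toy network: two components joined by one RC-initiated relationship
nodes = list(range(8))
current = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (6, 7)]
post_vaccination = current + [(2, 3)]   # new risk relationship after vaccination
print(max(len(c) for c in components(nodes, current)))          # largest component, current
print(max(len(c) for c in components(nodes, post_vaccination))) # largest component, post
```

A single new edge can merge two components, illustrating how even rare risk compensation could reshape transmission pathways.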
Risk Assessment Methodology for Hazardous Waste Management (1998)
A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.
A stochastic conflict resolution model for trading pollutant discharge permits in river systems.
Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram
2009-07-01
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems, considering the conflicting interests of the decision-makers and stakeholders involved. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm technique known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is defined using the Young conflict resolution theory, which considers the utility functions of decision makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using Monte Carlo analysis. Finally, an optimization model provides the discharge permit trading policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
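The trade-off curve produced by NSGA-II is a set of non-dominated (cost, risk) alternatives. A minimal sketch of the non-domination test itself, on hypothetical waste-load allocation alternatives (not the full genetic algorithm), is:

```python
def pareto_front(points):
    """Return the non-dominated points for two minimisation objectives,
    e.g. (total treatment cost, risk of water-quality violation).
    A point is dominated if another point is no worse in both objectives."""
    front = []
    for p in points:
        dominated = any(
            q[0] <= p[0] and q[1] <= p[1] and q != p
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Hypothetical (cost, risk) alternatives
alts = [(10, 0.9), (12, 0.5), (15, 0.4), (14, 0.45), (20, 0.1), (22, 0.3)]
print(sorted(pareto_front(alts)))
```

NSGA-II evolves a population toward exactly this front; a conflict resolution rule (here, Young's theory) then picks a single compromise point from it.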
Czolowski, Eliza D; Santoro, Renee L; Srebotnjak, Tanja; Shonkoff, Seth B C
2017-08-23
Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health-relevant assessment and decision-making. We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Almajali, Anas; Rice, Eric; Viswanathan, Arun
This paper presents a systems analysis approach to characterizing the risk of a Smart Grid to a load-drop attack. A characterization of the risk is necessary for the design of detection and remediation strategies to address the consequences of such attacks. Using concepts from systems health management and system engineering, this work (a) first identifies metrics that can be used to generate constraints for security features, and (b) lays out an end-to-end integrated methodology using separate network and power simulations to assess system risk. We demonstrate our approach by performing a systems-style analysis of a load-drop attack implemented over the AMI subsystem and targeted at destabilizing the underlying power grid.
Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...
2017-08-23
A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points which are applied to seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, the example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as the mean annual frequency of unacceptable performance at 1x10^-4, 4x10^-5 and 1x10^-5.
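The comparison against a target performance goal can be sketched as a discrete risk integral: the annual frequency of ground motions in each intensity bin times the conditional probability of unacceptable performance (fragility) in that bin. All numbers below are hypothetical, not drawn from the SSHAC Level 1 study.

```python
def annual_frequency_unacceptable(hazard_curve, fragility):
    """Approximate the mean annual frequency of unacceptable performance by
    summing, over discrete ground-motion bins, the annual frequency of each
    bin times the conditional failure probability in that bin."""
    return sum(df * pf for df, pf in zip(hazard_curve, fragility))

# Hypothetical binned hazard (annual frequency of motions falling in each PGA bin)
# and fragility (probability of unacceptable performance given motion in that bin)
hazard = [1e-2, 1e-3, 1e-4, 1e-5]
frag = [0.0001, 0.005, 0.2, 0.8]
paf = annual_frequency_unacceptable(hazard, frag)
print(paf < 4e-5)   # does the facility meet a 4x10^-5 performance goal?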
[Risk Management: concepts and chances for public health].
Palm, Stefan; Cardeneo, Margareta; Halber, Marco; Schrappe, Matthias
2002-01-15
Errors are a common problem in medicine and occur as a result of a complex process involving many contributing factors. Medical errors significantly reduce the safety margin for the patient and add costs to health care delivery. In most cases, adverse events cannot be attributed to a single underlying cause. Therefore, an effective risk management strategy must follow a systems approach based on the counting and analysis of near misses. The main focus should be the development of defenses against the undesired effects of errors, rather than asking "Who blundered?". Analysis of near misses (which in this context can be compared to indicators) offers several methodological advantages over the analysis of errors and adverse events. Risk management is an integral element of quality management.
Through ARIPAR-GIS the quantified area risk analysis supports land-use planning activities.
Spadoni, G; Egidi, D; Contini, S
2000-01-07
The paper first summarises the main aspects of the ARIPAR methodology, whose steps can be applied to quantify the impact on a territory of major accident risks due to processing, storing and transporting dangerous substances. Then the capabilities of the new decision support tool ARIPAR-GIS, implementing the mentioned procedure, are described, together with its main features and types of results. These are clearly shown through a short description of the updated ARIPAR study (reference year 1994), in which the impact of changes due to industrial and transportation dynamics on the Ravenna territory in Italy was evaluated. A brief explanation of how the results have been used by local administrations offers the opportunity to discuss the advantages of the quantitative area risk analysis tool in supporting risk management, risk control and land-use planning activities.
Khadam, Ibrahim; Kaluarachchi, Jagath J
2003-07-01
Decision analysis in subsurface contamination management is generally carried out through a traditional engineering economic viewpoint. However, new advances in human health risk assessment, namely, probabilistic risk assessment, and the growing awareness of the importance of soft data in the decision-making process, require decision analysis methodologies that are capable of accommodating non-technical and politically biased qualitative information. In this work, we discuss the major limitations of the currently practiced decision analysis framework, which revolves around the definition of risk and cost of risk, and its poor ability to communicate risk-related information. A demonstration using a numerical example was conducted to provide insight into these limitations of the current decision analysis framework. The results from this simple ground water contamination and remediation scenario were identical to those obtained from studies carried out on existing Superfund sites, which suggests serious flaws in the current risk management framework. In order to provide a perspective on how these limitations may be avoided in future formulations of the management framework, more mature and well-accepted approaches to decision analysis in dam safety and the utility industry, where public health and public investment are of great concern, are presented and their applicability in subsurface remediation management is discussed. Finally, in light of the success of the application of risk-based decision analysis in dam safety and the utility industry, potential options for decision analysis in subsurface contamination management are discussed.
Risk Factors for Low Back Pain in Childhood and Adolescence: A Systematic Review.
Calvo-Muñoz, Inmaculada; Kovacs, Francisco M; Roqué, Marta; Gago Fernández, Inés; Seco Calvo, Jesús
2018-05-01
To identify factors associated with low back pain (LBP) in children and adolescents. A systematic review was conducted (Prospero CRD42016038186). Observational studies analyzing LBP risk factors among participants aged between 9 and 16 were searched for in 13 electronic databases and 8 specialized journals until March 31, 2016, with no language restrictions. In addition, references in the identified studies were manually tracked. All identified studies that included ≥50 participants aged 9 to 16 were reviewed. Their methodological quality was assessed by 2 reviewers separately, using validated tools, which scored, from worst to best, 0 to 100 for cross-sectional and 0 to 12 for cohort studies. A sensitivity analysis only included studies that had adjusted for confounders, had ≥500 participants, and had a methodological score of ≥50%. A total of 5142 citations were screened and 61 studies, including 137,877 participants from 5 continents, were reviewed. Their mean (range) methodological scores were 74.56 (50 to 100) for cross-sectional studies and 7.36 (5 to 9) for cohort studies. The studies had assessed 35 demographic, clinical, biological, family, psychological, ergonomic, and lifestyle risk factors. The mean (range) prevalence of LBP ranged between 15.25% (3.20 to 57.00) for point prevalence and 38.98% (11.60 to 85.56) for lifetime prevalence. Results on the association between LBP and risk factors were inconsistent. In the sensitivity analysis, "older age" and "participation in competitive sports" showed a consistent association with LBP. Future studies should focus on muscle characteristics, the relationship between body and backpack weights, duration of carrying the backpack, characteristics of sport practice, and the factors specifically associated with chronic pain.
Use of benzodiazepine and risk of cancer: A meta-analysis of observational studies.
Kim, Hong-Bae; Myung, Seung-Kwon; Park, Yon Chul; Park, Byoungjin
2017-02-01
Several observational epidemiological studies have reported inconsistent results on the association between the use of benzodiazepine and the risk of cancer. We investigated the association by using a meta-analysis. We searched PubMed, EMBASE, and the bibliographies of relevant articles to locate additional publications in January 2016. Three evaluators independently reviewed and selected eligible studies based on predetermined selection criteria. Of 796 articles meeting our initial criteria, a total of 22 observational epidemiological studies with 18 case-control studies and 4 cohort studies were included in the final analysis. Benzodiazepine use was significantly associated with an increased risk of cancer (odds ratio [OR] or relative risk [RR] 1.19; 95% confidence interval 1.16-1.21) in a random-effects meta-analysis of all studies. Subgroup meta-analyses by various factors such as study design, type of case-control study, study region, and methodological quality of study showed consistent findings. Also, a significant dose-response relationship was observed between the use of benzodiazepine and the risk of cancer (p for trend <0.01). The current meta-analysis of observational epidemiological studies suggests that benzodiazepine use is associated with an increased risk of cancer. © 2016 UICC.
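The pooled estimate above (OR 1.19, 95% CI 1.16-1.21) comes from a random-effects meta-analysis, conventionally computed by the DerSimonian-Laird procedure. A sketch with hypothetical study-level odds ratios and variances (not the 22 studies of this review):

```python
import math

def random_effects_pool(odds_ratios, variances):
    """DerSimonian-Laird random-effects pooling of log odds ratios.
    `variances` are the within-study variances of each log OR."""
    y = [math.log(or_) for or_ in odds_ratios]
    w = [1 / v for v in variances]
    k = len(y)
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                        # between-study variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se = math.sqrt(1 / sum(w_star))
    return math.exp(pooled), (math.exp(pooled - 1.96 * se),
                              math.exp(pooled + 1.96 * se))

# Hypothetical per-study ORs and variances of their log ORs
ors = [1.10, 1.25, 1.18, 1.05]
var = [0.010, 0.020, 0.015, 0.012]
pooled_or, ci = random_effects_pool(ors, var)
print(round(pooled_or, 2))
```

When between-study heterogeneity (tau-squared) is zero the result reduces to the fixed-effect inverse-variance estimate; otherwise the weights shrink toward equality.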
Improving default risk prediction using Bayesian model uncertainty techniques.
Kazemi, Reza; Mosleh, Ali
2012-11-01
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. The potential exposure is measured in terms of probability of default. Many models have been developed to estimate credit risk, and rating agencies have done so since the 19th century; they provide their assessments of probability of default and transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses the techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding the estimations of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
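One elementary way to fold an agency's historical accuracy into a default-probability estimate is a Beta-binomial update, where the prior pseudo-count encodes how much the agency's track record is trusted. This is a toy stand-in for the article's fuller model-uncertainty framework; all numbers are hypothetical.

```python
def posterior_default_prob(agency_pd, accuracy_strength, defaults, exposures):
    """Beta-binomial update: encode an agency's PD estimate as a Beta prior
    whose pseudo-count reflects the agency's historical accuracy, then update
    with observed defaults among comparable obligors. Returns the posterior
    mean probability of default."""
    alpha = agency_pd * accuracy_strength          # prior "defaults"
    beta = (1 - agency_pd) * accuracy_strength     # prior "survivals"
    alpha += defaults
    beta += exposures - defaults
    return alpha / (alpha + beta)

# Agency says PD = 2%; its track record is trusted as worth 200 observations;
# we then observe 8 defaults among 100 similar obligors.
print(round(posterior_default_prob(0.02, 200, 8, 100), 3))
```

A more accurate agency gets a larger pseudo-count, so its estimate moves less when new default data arrive; a weak track record lets the data dominate.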
CONSULTATION ON UPDATED METHODOLOGY FOR ...
The National Academy of Sciences (NAS) expects to publish the Biological Effects of Ionizing Radiation (BEIR) committee's report (BEIR VII) on risks from ionizing radiation exposures in calendar year 2005. The committee is expected to have analyzed the most recent epidemiology from the important exposed cohorts and to have factored in any changes resulting from the updated analysis of dosimetry for the Japanese atomic bomb survivors. To the extent practical, the Committee will also consider any relevant radiobiological data, including those from the Department of Energy's low dose effects research program. Based on their evaluation of relevant information, the Committee is then expected to propose a set of models for estimating risks from low-dose ionizing radiation. ORIA will review the BEIR VII report and consider revisions to the Agency's methodology for estimating cancer risks from exposure to ionizing radiation in light of this report and other relevant information. This will be the subject of the Consultation. This project supports a major risk management initiative to improve the basis on which radiation risk decisions are made. This project, funded by several Federal Agencies, reflects an attempt to characterize risks where there are substantial uncertainties. The outcome will improve our ability to assess risks well into the future and will strengthen EPA's overall capability for assessing and managing radiation risks. the BEIR VII report is funde
Early Detection for Dengue Using Local Indicator of Spatial Association (LISA) Analysis.
Parra-Amaya, Mayra Elizabeth; Puerta-Yepes, María Eugenia; Lizarralde-Bejarano, Diana Paola; Arboleda-Sánchez, Sair
2016-03-29
Dengue is a viral disease caused by a flavivirus that is transmitted by mosquitoes of the genus Aedes . There is currently no specific treatment or commercial vaccine for its control and prevention; therefore, mosquito population control is the only alternative for preventing the occurrence of dengue. For this reason, entomological surveillance is recommended by the World Health Organization (WHO) to measure dengue risk in endemic areas; however, several works have shown that the current methodology (aedic indices) is not sufficient for predicting dengue. In this work, we modified indices proposed for epidemic periods. The raw value of the epidemiological wave could be useful for detecting risk in epidemic periods; however, risk can only be detected if analyses incorporate the maximum epidemiological wave. Risk classification was performed according to the Local Indicators of Spatial Association (LISA) methodology. The modified indices were analyzed using several hypothetical scenarios to evaluate their sensitivity. We found that the modified indices could detect spatial and differential risks in epidemic and endemic years, which makes them a useful tool for the early detection of a dengue outbreak. In conclusion, the modified indices could predict risk at the spatio-temporal level in endemic years and could be incorporated into surveillance activities in endemic places.
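The LISA classification rests on Local Moran's I, which flags areal units whose values cluster with similarly high (or low) neighbours. A minimal sketch on a hypothetical four-unit map (binary adjacency weights; real applications use row-standardised weight matrices and significance testing):

```python
def local_morans_i(values, weights):
    """Local Moran's I for each areal unit. `weights[i][j]` is the spatial
    weight between units i and j (zero diagonal). Positive values indicate
    clustering of similar values (high-high or low-low)."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    m2 = sum(d * d for d in dev) / n          # variance normaliser
    return [
        (dev[i] / m2) * sum(weights[i][j] * dev[j] for j in range(n))
        for i in range(n)
    ]

# Hypothetical 4-unit toy map: units 0-1 adjacent, units 2-3 adjacent
w = [
    [0, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]
cases = [10.0, 12.0, 2.0, 1.0]   # hypothetical dengue incidence per unit
li = local_morans_i(cases, w)
print([round(x, 2) for x in li])
```

Here units 0-1 form a high-high cluster and units 2-3 a low-low cluster, so all four local statistics come out positive.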
A phased approach to induced seismicity risk management
White, Joshua A.; Foxall, William
2014-01-01
This work describes strategies for assessing and managing induced seismicity risk during each phase of a carbon storage project. We consider both nuisance and damage potential from induced earthquakes, as well as the indirect risk of enhancing fault leakage pathways. A phased approach to seismicity management is proposed, in which operations are continuously adapted based on available information and an on-going estimate of risk. At each project stage, specific recommendations are made for (a) monitoring and characterization, (b) modeling and analysis, and (c) site operations. The resulting methodology can help lower seismic risk while ensuring site operations remain practical and cost-effective.
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and the power industry. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named AS-II), which makes the application of FTA simpler, quicker, and cheaper, thus opening up the possibility of its wider use in risk assessment in the chemical process industries. Based on this methodology, we have developed a computer-automated tool. The details are presented in this paper.
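Under the usual assumption of independent basic events, gate probabilities in a fault tree combine multiplicatively. A minimal sketch of AND/OR gate evaluation on a hypothetical two-branch tree (this illustrates conventional FTA quantification, not the AS-II algorithm itself):

```python
def or_gate(probs):
    """P(output) for an OR gate over independent input events:
    1 minus the probability that none of the inputs occur."""
    p = 1.0
    for q in probs:
        p *= (1 - q)
    return 1 - p

def and_gate(probs):
    """P(output) for an AND gate over independent input events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical mini fault tree:
#   top = OR( AND(pump_fails, backup_fails), valve_fails )
pump_fails, backup_fails, valve_fails = 0.05, 0.10, 0.01
top = or_gate([and_gate([pump_fails, backup_fails]), valve_fails])
print(round(top, 5))
```

Real trees add complications (repeated events, common-cause failures, minimal cut sets) that make naive gate-by-gate evaluation incorrect, which is part of why FTA is cumbersome at scale.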
Methodological quality of systematic reviews on influenza vaccination.
Remschmidt, Cornelius; Wichmann, Ole; Harder, Thomas
2014-03-26
There is a growing body of evidence on the risks and benefits of influenza vaccination in various target groups. Systematic reviews are of particular importance for policy decisions. However, their methodological quality can vary considerably. To investigate the methodological quality of systematic reviews on influenza vaccination (efficacy, effectiveness, safety) and to identify influencing factors. A systematic literature search on systematic reviews on influenza vaccination was performed, using MEDLINE, EMBASE and three additional databases (1990-2013). Review characteristics were extracted and the methodological quality of the reviews was evaluated using the assessment of multiple systematic reviews (AMSTAR) tool. U-test, Kruskal-Wallis test, chi-square test, and multivariable linear regression analysis were used to assess the influence of review characteristics on AMSTAR-score. Forty-six systematic reviews fulfilled the inclusion criteria. Average methodological quality was high (median AMSTAR-score: 8), but variability was large (AMSTAR range: 0-11). Quality did not differ significantly according to vaccination target group. Cochrane reviews had higher methodological quality than non-Cochrane reviews (p=0.001). Detailed analysis showed that this was due to better study selection and data extraction, inclusion of unpublished studies, and better reporting of study characteristics (all p<0.05). In the adjusted analysis, no other factor, including industry sponsorship or journal impact factor, had an influence on AMSTAR score. Systematic reviews on influenza vaccination showed large differences regarding their methodological quality. Reviews conducted by the Cochrane collaboration were of higher quality than others. When using systematic reviews to guide the development of vaccination recommendations, the methodological quality of a review should be considered in addition to its content. Copyright © 2014 Elsevier Ltd. All rights reserved.
Analysis of labour risks in the Spanish industrial aerospace sector.
Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael
2016-01-01
Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work (Delt@) was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and Services Subsector (SS) groupings were defined and the relevant accident-rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon with the considered pattern: accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phases. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS analysis, results that can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety. © 2016 Society for Risk Analysis.
Risk Analysis of a Fuel Storage Terminal Using HAZOP and FTA
Fuentes-Bargues, José Luis; González-Cruz, Mª Carmen; González-Gaya, Cristina; Baixauli-Pérez, Mª Piedad
2017-01-01
The size and complexity of industrial chemical plants, together with the nature of the products handled, means that an analysis and control of the risks involved is required. This paper presents a methodology for risk analysis in chemical and allied industries that is based on a combination of HAZard and OPerability analysis (HAZOP) and a quantitative analysis of the most relevant risks through the development of fault trees, fault tree analysis (FTA). Results from FTA allow prioritizing the preventive and corrective measures to minimize the probability of failure. An analysis of a case study is performed; it consists in the terminal for unloading chemical and petroleum products, and the fuel storage facilities of two companies, in the port of Valencia (Spain). HAZOP analysis shows that loading and unloading areas are the most sensitive areas of the plant and where the most significant danger is a fuel spill. FTA analysis indicates that the most likely event is a fuel spill in tank truck loading area. A sensitivity analysis from the FTA results show the importance of the human factor in all sequences of the possible accidents, so it should be mandatory to improve the training of the staff of the plants. PMID:28665325
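The FTA quantification step described in this abstract can be sketched as follows. This is a minimal illustration of how a top-event probability (a fuel spill in the loading area) is approximated from minimal cut sets; the basic events, their probabilities, and the cut-set structure are hypothetical placeholders, not values from the paper.

```python
# Hypothetical FTA quantification sketch: top-event probability from
# minimal cut sets via the rare-event upper bound. Event names and
# probabilities are illustrative assumptions only.

basic_events = {
    "hose_rupture": 1e-4,
    "operator_overfill": 5e-4,   # human error
    "level_alarm_fails": 1e-2,
    "valve_leak": 2e-4,
}

# Each minimal cut set lists the basic events that must all occur.
minimal_cut_sets = [
    ["hose_rupture"],
    ["operator_overfill", "level_alarm_fails"],
    ["valve_leak"],
]

def cut_set_prob(cs):
    # Independent basic events: cut-set probability is the product.
    p = 1.0
    for e in cs:
        p *= basic_events[e]
    return p

# Rare-event approximation: P(top) <= sum of cut-set probabilities.
top_event = sum(cut_set_prob(cs) for cs in minimal_cut_sets)
print(f"P(fuel spill) ~ {top_event:.2e}")

# Share of top-event probability per cut set, used to prioritize
# preventive and corrective measures (e.g. human-error dominated sets).
for cs in minimal_cut_sets:
    print(cs, f"share = {cut_set_prob(cs) / top_event:.2f}")
```

Ranking cut sets by their share of the top-event probability is what lets the analysis prioritize measures such as staff training against human-error events.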
Carlsson, Ing-Marie; Blomqvist, Marjut; Jormfeldt, Henrika
2017-01-01
Undertaking research studies in the field of mental health is essential in mental health nursing. Qualitative research methodologies enable human experiences to become visible and recognize the importance of lived experiences. This paper argues that involving people with schizophrenia in research is critical to promote their health and well-being. The quality of qualitative research needs scrutinizing according to methodological issues such as trustworthiness and ethical standards that are a fundamental part of qualitative research and nursing curricula. The aim of this study was to critically review recent qualitative studies involving people with severe and persistent mental illness such as schizophrenia and other psychotic conditions, regarding descriptions of ethical and methodological issues in data collection and analysis. A search for relevant papers was conducted in three electronic databases, in December 2016. Fifteen qualitative interview studies were included and reviewed regarding methodological issues related to ethics, and data collection and analysis. The results revealed insufficient descriptions of methodology regarding ethical considerations and issues related to recruitment and sampling in qualitative interview studies with individuals with severe mental illness, putting trustworthiness at risk despite detailed descriptions of data analysis. Knowledge from the perspective of individuals with their own experience of mental illness is essential. Issues regarding sampling and trustworthiness in qualitative studies involving people with severe mental illness are vital to counteract the stigmatization of mental illness.
Dynamic safety assessment of natural gas stations using Bayesian network.
Zarei, Esmaeil; Azadeh, Ali; Khakzad, Nima; Aliabadi, Mostafa Mirzaei; Mohammadfam, Iraj
2017-01-05
Pipelines are one of the most popular and effective ways of transporting hazardous materials, especially natural gas. However, the rapid development of gas pipelines and stations in urban areas has introduced a serious threat to public safety and assets. Although different methods have been developed for risk analysis of gas transportation systems, a comprehensive methodology for risk analysis is still lacking, especially for natural gas stations. The present work is aimed at developing a dynamic and comprehensive quantitative risk analysis (DCQRA) approach for accident scenario and risk modeling of natural gas stations. In this approach, an FMEA is used for hazard analysis, while a bow-tie diagram and Bayesian network are employed to model the worst-case accident scenario and to assess the risks. The results indicated that the failure of the regulator system was the worst-case accident scenario, with human error as the most contributing factor. Thus, in the risk management plan of natural gas stations, priority should be given to the most probable root events and main contributing factors, which have been identified in the present study, in order to reduce the occurrence probability of the accident scenarios and thus alleviate the risks. Copyright © 2016 Elsevier B.V. All rights reserved.
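The dynamic element of a Bayesian-network approach like the one above is the updating of cause probabilities as evidence arrives. The toy model below is not the paper's network: it is a two-cause sketch (hypothetical human-error and mechanical-fault priors, hypothetical conditional table) showing how the posterior contribution of human error to a regulator failure is computed by exact enumeration.

```python
# Illustrative two-cause Bayesian network for regulator-system failure,
# solved by exact enumeration. All probabilities are hypothetical.

p_human_error = 0.05          # prior P(H)
p_mech_fault = 0.02           # prior P(M)

# Conditional probability of regulator failure given (H, M).
p_fail = {
    (True, True): 0.99,
    (True, False): 0.70,
    (False, True): 0.50,
    (False, False): 0.001,
}

def prob(h_val, m_val):
    # Joint prior of the two independent causes.
    ph = p_human_error if h_val else 1 - p_human_error
    pm = p_mech_fault if m_val else 1 - p_mech_fault
    return ph * pm

# Prior probability of failure, marginalizing over both causes.
p_failure = sum(prob(h, m) * p_fail[(h, m)]
                for h in (True, False) for m in (True, False))

# Posterior P(human error | failure observed): Bayesian updating is what
# makes the bow-tie "dynamic" as evidence accumulates.
p_h_given_fail = sum(prob(True, m) * p_fail[(True, m)]
                     for m in (True, False)) / p_failure
print(f"P(failure) = {p_failure:.4f}")
print(f"P(human error | failure) = {p_h_given_fail:.2f}")
```

Even with a small human-error prior, the posterior given an observed failure is dominated by human error here, which mirrors the abstract's conclusion about its contribution.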
NASA Astrophysics Data System (ADS)
Sperotto, Anna; Torresan, Silvia; Gallina, Valentina; Coppola, Erika; Critto, Andrea; Marcomini, Antonio
2015-04-01
Global climate change is expected to affect the intensity and frequency of extreme events (e.g. heat waves, drought, heavy precipitation events) leading to increasing natural disasters and damaging events (e.g. storms, pluvial floods and coastal flooding) worldwide. Especially in urban areas, disaster risks can be exacerbated by changes in exposure and vulnerability patterns (i.e. urbanization, population growth) and should be addressed by adopting a multi-disciplinary approach. A Regional Risk Assessment (RRA) methodology integrating climate and environmental sciences with bottom-up participative processes was developed and applied to the urban territory of the municipality of Venice in order to evaluate the potential consequences of climate change on pluvial flood risk in urban areas. Based on the consecutive analysis of hazard, exposure, vulnerability and risks, the RRA methodology is a screening risk tool to identify and prioritize major elements at risk (e.g. residential, commercial areas and infrastructures) and to localize sub-areas that are more likely to be affected by flood risk due to heavy precipitation events, in the future scenario (2041-2050). From the early stages of its development and application, the RRA followed a bottom-up approach to select and score site-specific vulnerability factors (e.g. slope, permeability of the soil, past flooded areas) and to consider the requests and perspectives of local stakeholders of the North Adriatic region, by means of interactive workshops, surveys and discussions. The main outputs of the assessment are risk and vulnerability maps and statistics aimed at increasing awareness about the potential effect of climate change on pluvial flood risks and at identifying hot-spot areas where future adaptation actions should be required to decrease physical-environmental vulnerabilities or to build resilience and coping capacity of human society to climate change.
The overall risk assessment methodology and the results of its application to the territory of the municipality of Venice are presented and discussed.
Health risks of energy systems.
Krewitt, W; Hurley, F; Trukenmüller, A; Friedrich, R
1998-08-01
Health risks from fossil, renewable and nuclear reference energy systems are estimated following a detailed impact pathway approach. Using a set of appropriate air quality models and exposure-effect functions derived from the recent epidemiological literature, a methodological framework for risk assessment has been established and consistently applied across the different energy systems, including the analysis of consequences from a major nuclear accident. A wide range of health impacts resulting from increased air pollution and ionizing radiation is quantified, and the transferability of results derived from specific power plants to a more general context is discussed.
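The core arithmetic of the impact-pathway approach described above is the conversion of an incremental pollutant concentration into expected health effects through an exposure-effect (concentration-response) function. The one-line sketch below uses a linear function; every number in it is an illustrative assumption, not a value from the study.

```python
# Minimal impact-pathway sketch: incremental concentration -> exposed
# population -> expected additional cases, via a linear
# exposure-effect function. All values are hypothetical.

delta_conc = 1.5          # ug/m3 pollutant increment attributable to the plant
population = 2_000_000    # exposed population in the modelled domain
baseline_rate = 0.01      # baseline annual cases per person
slope = 0.006             # fractional increase in rate per ug/m3

extra_cases = population * baseline_rate * slope * delta_conc
print(f"Attributable cases per year ~ {extra_cases:.0f}")
```

In a full assessment this calculation is repeated per pollutant, per health endpoint, and per grid cell of the air-quality model, then summed; the sketch shows only a single cell and endpoint.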
Bero, L; Anglemyer, A; Vesterinen, H; Krauth, D
2016-01-01
A critical component of systematic review methodology is the assessment of the risks of bias of studies that are included in the review. There is controversy about whether funding source should be included in a risk of bias assessment of animal toxicology studies. To determine whether industry research sponsorship is associated with methodological biases, the results, or conclusions of animal studies examining the effect of exposure to atrazine on reproductive or developmental outcomes. We searched multiple electronic databases and the reference lists of relevant articles to identify original research studies examining the effect of any dose of atrazine exposure at any life stage on reproduction or development in non-human animals. We compared methodological risks of bias, the conclusions of the studies, the statistical significance of the findings, and the magnitude of effect estimates between industry sponsored and non-industry sponsored studies. Fifty-one studies met the inclusion criteria. There were no differences in methodological risks of bias in industry versus non-industry sponsored studies. 39 studies tested environmentally relevant concentrations of atrazine (11 industry sponsored, 24 non-industry sponsored, 4 with no funding disclosures). Non-industry sponsored studies (12/24, 50.0%) were more likely to conclude that atrazine was harmful compared to industry sponsored studies (2/11, 18.1%) (p value=0.07). A higher proportion of non-industry sponsored studies reported statistically significant harmful effects (8/24, 33.3%) compared to industry-sponsored studies (1/11; 9.1%) (p value=0.13). The association of industry sponsorship with decreased effect sizes for harm outcomes was inconclusive. Our findings support the inclusion of research sponsorship as a risk of bias criterion in tools used to assess risks of bias in animal studies for systematic reviews. 
The reporting of other empirically based risk of bias criteria for animal studies, such as blinded outcome assessment, randomization, and all animals included in analyses, needs to improve to facilitate the assessment of studies for systematic reviews. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zarlenga, Antonio; de Barros, Felipe; Fiori, Aldo
2016-04-01
We present a probabilistic framework for assessing human health risk due to groundwater contamination. Our goal is to quantify how physical hydrogeological and biochemical parameters control the magnitude and uncertainty of human health risk. Our methodology captures the whole risk chain from aquifer contamination to tap-water consumption by the human population. The contaminant concentration, the key parameter for the risk estimation, is governed by the interplay between the large-scale advection, caused by heterogeneity, and the degradation processes strictly related to the local-scale dispersion processes. The core of the hazard identification and of the methodology is the reactive transport model: erratic displacement of contaminant in groundwater, due to the spatial variability of hydraulic conductivity (K), is characterized by a first-order Lagrangian stochastic model; different dynamics are considered as possible ways of biodegradation in aerobic and anaerobic conditions. With the goal of quantifying uncertainty, the Beta distribution is assumed for the concentration probability density function (pdf) model, while different levels of approximation are explored for the estimation of the one-point concentration moments. The information pertaining to the flow and transport is connected with a proper dose-response assessment, which generally involves the estimation of physiological parameters of the exposed population. Human health response depends on the exposed individual's metabolism and is subject to variability and uncertainty; the health parameters are therefore intrinsically stochastic. As a consequence, we provide an integrated, global probabilistic human health risk framework which allows the propagation of uncertainty from multiple sources.
The final result, the health risk pdf, is expressed as a function of a few relevant, physically-based parameters such as the size of the injection area, the Péclet number, the K structure metrics and covariance shape, the reaction parameters pertaining to aerobic and anaerobic degradation, as well as the dose-response parameters. Even though the final result assumes a relatively simple form, a few numerical quadratures are required in order to evaluate the trajectory moments of the solute plume. In order to perform a sensitivity analysis we apply the methodology to a hypothetical case study. The scenario investigated consists of an aquifer that constitutes a water supply for a population, where a continuous source of NAPL contaminant feeds a steady plume. The risk analysis is limited to carcinogenic compounds, for which the well-known linear relation for human risk is assumed. The analysis shows several interesting findings: the risk distribution is strictly dependent on the pore-scale dynamics that trigger dilution and mixing, and biodegradation may involve a significant reduction of the risk.
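The risk chain described in this abstract (Beta-distributed concentration, linear carcinogenic dose-response, stochastic physiological parameters) can be sketched with plain Monte Carlo sampling. This is not the authors' semi-analytical solution: the Beta parameters, slope factor, intake rate, and body-weight distribution below are all hypothetical placeholders chosen only to show how uncertainty propagates from multiple sources into the risk pdf.

```python
# Hedged Monte Carlo sketch of the probabilistic risk chain: Beta pdf
# for concentration, linear dose-response R = SF * CDI, and stochastic
# physiology. All parameter values are hypothetical.
import random

random.seed(1)

alpha, beta_p = 2.0, 8.0      # Beta parameters (would be fitted to concentration moments)
c_max = 0.05                  # mg/L scale of the concentration support

slope_factor = 0.1            # (mg/kg-day)^-1, carcinogenic potency
intake_rate = 2.0             # L/day tap-water ingestion
exposure_factor = 30 / 70     # exposure duration / averaging time (years)

n = 100_000
risks = []
for _ in range(n):
    c = random.betavariate(alpha, beta_p) * c_max   # concentration sample
    body_weight = max(random.gauss(70.0, 10.0), 40.0)  # physiological variability
    cdi = c * intake_rate * exposure_factor / body_weight  # chronic daily intake
    risks.append(slope_factor * cdi)

mean_risk = sum(risks) / n
p_exceed = sum(r > 1e-6 for r in risks) / n   # common regulatory threshold
print(f"mean risk = {mean_risk:.2e}, P(risk > 1e-6) = {p_exceed:.2f}")
```

The empirical distribution of `risks` is the sampled analogue of the health risk pdf; summarizing it by a mean and an exceedance probability is one common way to present it to decision makers.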
NASA Astrophysics Data System (ADS)
Babovic, Filip; Mijic, Ana; Madani, Kaveh
2017-04-01
Urban areas around the world are growing in size and importance; however, cities experience elevated risks of pluvial flooding due to the prevalence of impermeable land surfaces within them. Urban planners and engineers encounter a great deal of uncertainty when planning adaptations to these flood risks, due to the interaction of multiple factors such as climate change and land use change. This leads to conditions of deep uncertainty. Blue-Green (BG) solutions utilise natural vegetation and processes to absorb and retain runoff while providing a host of other social, economic and environmental services. When utilised in conjunction with Decision Making under Deep Uncertainty (DMDU) methodologies, BG infrastructure provides a flexible and adaptable method of "no-regret" adaptation; resulting in a practical, economically efficient, and socially acceptable solution for flood risk mitigation. This work presents the methodology for analysing the impact of BG infrastructure in the context of the Adaptation Tipping Points approach to protect against pluvial flood risk in an iterative manner. An economic analysis of the adaptation pathways is also conducted in order to better inform decision-makers on the benefits and costs of the adaptation options presented. The methodology was applied to a case study in the Cranbrook Catchment in the North East of London. Our results show that BG infrastructure performs better under conditions of uncertainty than traditional grey infrastructure.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard, DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Category (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.
Comparison of Traditional and Trial-Based Methodologies for Conducting Functional Analyses
ERIC Educational Resources Information Center
LaRue, Robert H.; Lenard, Karen; Weiss, Mary Jane; Bamond, Meredith; Palmieri, Mark; Kelley, Michael E.
2010-01-01
Functional analysis represents a sophisticated and empirically supported functional assessment procedure. While these procedures have garnered considerable empirical support, they are often underused in clinical practice. Safety risks resulting from the evocation of maladaptive behavior and the length of time required to conduct functional…
Quantified Risk Ranking Model for Condition-Based Risk and Reliability Centered Maintenance
NASA Astrophysics Data System (ADS)
Chattopadhyaya, Pradip Kumar; Basu, Sushil Kumar; Majumdar, Manik Chandra
2017-06-01
In the recent past, the risk and reliability centered maintenance (RRCM) framework was introduced with a shift in the methodological focus from reliability and probabilities (expected values) to reliability, uncertainty and risk. In this paper the authors explain a novel methodology for risk quantification and for ranking the critical items to prioritize maintenance actions on the basis of condition-based risk and reliability centered maintenance (CBRRCM). The critical items are identified through criticality analysis of the RPN values of the items of a system, and the maintenance significant precipitating factors (MSPF) of the items are evaluated. The criticality of risk is assessed using three risk coefficients. The likelihood risk coefficient treats the probability as a fuzzy number. The abstract risk coefficient deduces risk influenced by uncertainty and sensitivity, besides other factors. The third risk coefficient is called the hazardous risk coefficient, which covers anticipated hazards that may occur in the future; its risk is deduced from criteria of consequences on safety, environment, maintenance and economic risks, with corresponding costs for consequences. The characteristic values of all three risk coefficients are obtained with a particular test. With a few more tests on the system, the values may change significantly within the controlling range of each coefficient, hence random-number simulation is used to obtain one distinctive value for each coefficient. The risk coefficients are statistically added to obtain the final risk coefficient of each critical item, and the final rankings of critical items are then estimated. The prioritization in ranking of critical items using the developed mathematical model for risk assessment will be useful in optimizing financial losses and the timing of maintenance actions.
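The two-stage ranking idea in this abstract (RPN screening of critical items, then combination of three simulated risk coefficients) can be sketched as follows. The items, RPN cutoff, and coefficient ranges are hypothetical placeholders; the paper's actual fuzzy and hazard criteria are far richer than this illustration.

```python
# Illustrative CBRRCM-style ranking sketch: RPN screening followed by
# combination of three risk coefficients whose characteristic values
# are stabilized by random-number simulation. All names, scores, and
# ranges are hypothetical assumptions.
import random

random.seed(42)

items = {
    # item: (occurrence, severity, detectability) for RPN screening
    "bearing": (7, 8, 5),
    "gearbox": (5, 9, 4),
    "coupling": (3, 4, 6),
}

def rpn(o, s, d):
    return o * s * d

# Screen critical items by an assumed RPN cutoff of 100.
critical = {k: v for k, v in items.items() if rpn(*v) >= 100}

def simulate_coefficient(low, high, n=10_000):
    # Characteristic value taken as the mean of simulated draws within
    # the coefficient's controlling range.
    return sum(random.uniform(low, high) for _ in range(n)) / n

final_risk = {}
for item in critical:
    likelihood = simulate_coefficient(0.2, 0.6)   # fuzzy-probability range
    abstract = simulate_coefficient(0.1, 0.3)     # uncertainty/sensitivity
    hazardous = simulate_coefficient(0.1, 0.4)    # anticipated hazards
    final_risk[item] = likelihood + abstract + hazardous  # statistical sum

ranking = sorted(final_risk, key=final_risk.get, reverse=True)
print("maintenance priority:", ranking)
```

The simulation step matters because a single test gives only one draw of each coefficient; averaging over many draws inside the controlling range yields the "distinctive value" the abstract refers to.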
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. 
The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
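The ranking step of a process FMEA like the one proposed above reduces to scoring each failure mode 1-10 for occurrence (O), severity (S), and detectability (D), and sorting by RPN = O x S x D. The failure modes and scores below are invented for illustration and are not taken from the guideline's rating table.

```python
# Minimal biopharmaceutical-process FMEA ranking sketch.
# Failure modes and 1-10 scores are illustrative assumptions.

failure_modes = [
    # (process parameter, occurrence, severity, detectability)
    ("bioreactor pH excursion",      4, 8, 3),
    ("chromatography buffer mix-up", 2, 9, 2),
    ("filter integrity failure",     3, 7, 6),
    ("cold-chain deviation",         5, 6, 4),
]

# Risk priority number: RPN = O * S * D; higher means more attention.
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)

for name, o, s, d in ranked:
    print(f"RPN={o*s*d:4d}  {name}")
```

Note how a poorly detectable failure (the filter case) outranks a more severe but easily detected one; this weighting of detectability is exactly why FMEA scales must be tailored to biopharmaceutical processes, where some failures only surface in final-product testing.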
Methodology for Collision Risk Assessment of an Airspace Flow Corridor Concept
NASA Astrophysics Data System (ADS)
Zhang, Yimin
This dissertation presents a methodology to estimate the collision risk associated with a future air-transportation concept called the flow corridor. The flow corridor is a Next Generation Air Transportation System (NextGen) concept to reduce congestion and increase throughput in en-route airspace. The flow corridor has the potential to increase throughput by reducing the controller workload required to manage aircraft outside the corridor and by reducing separation of aircraft within the corridor. The analysis in this dissertation is a starting point for the safety analysis required by the Federal Aviation Administration (FAA) to eventually approve and implement the corridor concept. This dissertation develops a hybrid risk analysis methodology that combines Monte Carlo simulation with dynamic event tree analysis. The analysis captures the unique characteristics of the flow corridor concept, including self-separation within the corridor, lane change maneuvers, speed adjustments, and the automated separation assurance system. Monte Carlo simulation is used to model the movement of aircraft in the flow corridor and to identify precursor events that might lead to a collision. Since these precursor events are not rare, standard Monte Carlo simulation can be used to estimate these occurrence rates. Dynamic event trees are then used to model the subsequent series of events that may lead to a collision. When two aircraft are on course for a near-mid-air collision (NMAC), the on-board automated separation assurance system provides a series of safety layers to prevent the impending NMAC or collision. Dynamic event trees are used to evaluate the potential failures of these layers in order to estimate the rare-event collision probabilities. The results show that the throughput can be increased by reducing separation to 2 nautical miles while maintaining the current level of safety.
A sensitivity analysis shows that the most critical parameters in the model related to the overall collision probability are the minimum separation, the probability that both flights fail to respond to the traffic collision avoidance system, the probability that an NMAC results in a collision, the failure probability of the automatic dependent surveillance-broadcast (ADS-B) receiver, and the conflict detection probability.
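The hybrid structure described in this dissertation abstract can be sketched in a few lines: plain Monte Carlo estimates the (non-rare) precursor rate, and an event tree of conditional safety-layer failures multiplies that rate down to a rare-event collision probability. Every probability below is a hypothetical placeholder, not a value from the dissertation.

```python
# Hedged sketch of the hybrid Monte Carlo / event-tree approach.
# All rates and layer probabilities are illustrative assumptions.
import random

random.seed(0)

n_pairs = 1_000_000
p_precursor_pair = 2e-3   # assumed chance a flight pair drifts toward an NMAC

# Monte Carlo over flight pairs for the precursor rate (not rare, so
# plain sampling converges without variance-reduction tricks).
precursors = sum(random.random() < p_precursor_pair for _ in range(n_pairs))
precursor_rate = precursors / n_pairs

# Event-tree safety layers (conditional failure probabilities):
p_sep_assurance_fails = 1e-3   # automated separation assurance fails
p_tcas_fails = 1e-2            # both crews fail to respond to TCAS
p_nmac_to_collision = 0.1      # geometric chance an NMAC becomes a collision

# Rare-event collision probability: precursor rate times the product of
# the layer failure probabilities along the event-tree branch.
p_collision = (precursor_rate * p_sep_assurance_fails
               * p_tcas_fails * p_nmac_to_collision)
print(f"collision probability per flight pair ~ {p_collision:.2e}")
```

Splitting the problem this way avoids the impractical number of Monte Carlo samples that directly simulating a ~1e-9 event would require, which is the motivation for the hybrid methodology.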
[Evaluation of medication risk in pregnant women: methodology of evaluation and risk management].
Eléfant, E; Sainte-Croix, A
1997-01-01
This round table discussion was devoted to the description of the tools currently available for the evaluation of drug risks and their management during pregnancy. Five topics were submitted for discussion: pre-clinical data, methodological tools, the benefit/risk ratio before prescription, teratogenic or fetal risk evaluation, and legal comments.
Validating the Octave Allegro Information Systems Risk Assessment Methodology: A Case Study
ERIC Educational Resources Information Center
Keating, Corland G.
2014-01-01
An information system (IS) risk assessment is an important part of any successful security management strategy. Risk assessments help organizations to identify mission-critical IS assets and prioritize risk mitigation efforts. Many risk assessment methodologies, however, are complex and can only be completed successfully by highly qualified and…
A risk assessment methodology for critical transportation infrastructure.
DOT National Transportation Integrated Search
2002-01-01
Infrastructure protection typifies a problem of risk assessment and management in a large-scale system. This study offers a methodological framework to identify, prioritize, assess, and manage risks. It includes the following major considerations: (1...
Shur, P Z; Zaĭtseva, N V; Alekseev, V B; Shliapnikov, D M
2015-01-01
In accordance with international documents in the field of occupational safety and hygiene, the assessment and minimization of occupational risks is a key instrument for maintaining workers' health, and the instrument for its implementation is the methodology of occupational risk analysis. In the Russian Federation, the preconditions for a system of assessment and management of occupational risks have been established. In line with ILO Conventions, the target of national (state) policy in the field of occupational safety can be defined as the prevention of accidents and injuries to health arising out of or in the course of work, by minimizing the causes of hazards inherent in the working environment, so far as is reasonably practicable. The global trend of using the methodology of assessment and management of occupational risks to the life and health of citizens requires the improvement of national policy in the field of occupational hygiene and safety. Achieving an acceptable level of occupational risk can be considered one of the main tasks in the formation of national policy in this field.
NASA Technical Reports Server (NTRS)
1974-01-01
The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.
Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang
2015-02-01
To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic reviews and meta-analyses, and clinical practice guidelines. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies.
For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted, since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (for example, those for nested case-control studies and case reports) need updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools remain objective and that assessment bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
New methodologies for multi-scale time-variant reliability analysis of complex lifeline networks
NASA Astrophysics Data System (ADS)
Kurtz, Nolan Scot
The cost of maintaining existing civil infrastructure is enormous. Since the livelihood of the public depends on such infrastructure, its state must be managed appropriately using quantitative approaches. Practitioners must consider not only which components are most fragile to hazards, e.g. seismicity, storm surge, hurricane winds, etc., but also how they participate at the network level, using network analysis. Focusing on particularly damaged components does not necessarily increase network functionality, which is what matters most to the people who depend on the infrastructure. Several network analyses, e.g. S-RDA, LP-bounds, and crude MCS, and performance metrics, e.g. disconnection bounds and component importance, are available for such purposes. Because these networks already exist, their evolution in time is also important: if networks are close to chloride sources, deterioration may be a major issue, and information from field inspections may also have large impacts on quantitative models. To address such issues, hazard risk analysis methodologies for deteriorating networks subjected to seismicity, i.e. earthquakes, were developed analytically. A bridge component model was constructed for these methodologies; the bridge fragilities, constructed from data, required a deeper level of analysis, as they were specific to particular structures. Furthermore, network-level effects of chloride-induced deterioration were investigated. Depending on how mathematical models incorporate new information, many approaches are available, such as Bayesian model updating. To make such procedures more flexible, an adaptive importance sampling scheme was created for structural reliability problems; the method handles many kinds of system and component problems with single or multiple important regions of the limit state function. These and previously developed analysis methodologies were found to be strongly sensitive to network size.
Special network topologies may be more or less computationally difficult, and the resolution of the network also has large effects. To take advantage of certain topologies, network hierarchical structures with super-link representation have been used in the literature to increase computational efficiency by analyzing smaller, densely connected networks; however, such structures were based on user input and at times subjective. To address this, the algorithms must be automated and reliable. These hierarchical structures may also reveal the structure of the network itself. The risk analysis methodology has been extended to larger networks using such automated hierarchical structures. Component importance is the principal objective of such network analysis; however, it may indicate only which bridges to inspect or repair first, and little else. High correlations negatively influence such component importance measures, and a regional perspective is not appropriately modelled. To obtain a more regional view, group importance measures based on hierarchical structures have been created; such structures may also be used to devise regional inspection and repair approaches. Using these analytical, quantitative risk approaches, the next generation of decision makers may make optimal component-level and regional decisions using information from both network function and the further effects of infrastructure deterioration.
Pakvasa, Mitali Atul; Saroha, Vivek; Patel, Ravi Mangal
2018-06-01
Caffeine reduces the risk of bronchopulmonary dysplasia (BPD). Optimizing caffeine use could increase its therapeutic benefit. We performed a systematic review and random-effects meta-analysis of studies comparing different timings of initiation and doses of caffeine on the risk of BPD. Earlier initiation, compared to later, was associated with a decreased risk of BPD (5 observational studies; n = 63,049, adjusted OR 0.69; 95% CI 0.64-0.75, GRADE: low quality). High-dose caffeine, compared to standard-dose, was associated with a decreased risk of BPD (3 randomized trials, n = 432, OR 0.65; 95% CI 0.43-0.97; GRADE: low quality). Higher quality evidence is needed to guide optimal caffeine use. Copyright © 2018 Elsevier Inc. All rights reserved.
45 CFR 153.320 - Federally certified risk adjustment methodology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.320 - Federally certified risk adjustment methodology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.320 - Federally certified risk adjustment methodology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false Federally certified risk adjustment methodology. 153.320 Section 153.320 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
Horigan, V; De Nardi, M; Simons, R R L; Bertolini, S; Crescio, M I; Estrada-Peña, A; Léger, A; Maurella, C; Ru, G; Schuppers, M; Stärk, K D C; Adkin, A
2018-05-01
We present a novel approach of using multi-criteria pathogen prioritisation methodology as a basis for selecting the most appropriate case studies for a generic risk assessment framework. The approach uses selective criteria to rank exotic animal health pathogens according to the likelihood of introduction and the impact of an outbreak if it occurred in the European Union (EU). Pathogens were evaluated on the basis of their impact on production at the EU level and on international trade. A subsequent analysis included criteria relevant to quantitative risk assessment case study selection, such as the availability of data for parameterisation, the need for further research, and the desire for the case studies to cover different routes of transmission. The demonstrated framework is flexible, with the ability to adjust both the criteria and their weightings to the user's requirements; a web-based tool has been developed using RStudio's Shiny software to facilitate this. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
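The weighted multi-criteria ranking described above can be sketched in a few lines. This is an illustrative toy only: the pathogen names, criterion names, scores, and weights below are all hypothetical, not values from the paper.

```python
def rank_pathogens(scores, weights):
    """Rank items by the weighted sum of their normalised criterion scores."""
    ranked = sorted(
        ((sum(s[c] * weights[c] for c in weights), name)
         for name, s in scores.items()),
        reverse=True,
    )
    return [(name, round(total, 3)) for total, name in ranked]

# Hypothetical criterion scores in [0, 1] and user-chosen weights summing to 1.
scores = {
    "pathogen_A": {"introduction": 0.8, "production_impact": 0.6, "trade_impact": 0.9},
    "pathogen_B": {"introduction": 0.4, "production_impact": 0.9, "trade_impact": 0.5},
}
weights = {"introduction": 0.5, "production_impact": 0.3, "trade_impact": 0.2}
print(rank_pathogens(scores, weights))
```

Adjusting the weights, as the framework allows, simply reorders the list according to the user's priorities.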
NASA Astrophysics Data System (ADS)
Cioca, Ionel-Lucian; Moraru, Roland Iosif
2012-10-01
In order to meet statutory requirements concerning workers' health and safety, mine managers in the Valea Jiului coal basin in Romania must address the potential for underground fires and explosions and their impact on the workforce and the mine ventilation systems. Highlighting the need for a unified and systematic approach to these specific risks, the authors develop a general framework for fire/explosion risk assessment in gassy mines, based on quantifying the likelihood of occurrence and the gravity of the consequences of such undesired events and employing the root-cause analysis method. It is emphasized that even a small fire should be regarded as a major hazard from the point of view of explosion initiation, should a combustible atmosphere arise. The developed methodology for assessing underground fire and explosion risks is based on known underground explosion hazards, fire engineering principles, and fire test criteria for potentially combustible materials used in mines.
Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis
NASA Technical Reports Server (NTRS)
Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.
2016-01-01
Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from exercising this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from, or upgraded to, a serious risk designation on the basis of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
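The core idea, propagating input uncertainty into a distribution of Pc values rather than a point estimate, can be sketched with a toy Monte Carlo model. This is not CARA's actual algorithm: the two-dimensional geometry, the `pc_2d` helper, the miss distance, sigma, and the lognormal covariance-scale factor are all illustrative assumptions.

```python
import math
import random

random.seed(0)

def pc_2d(miss, sigma, hbr, n=4000):
    """Toy 2-D collision probability: fraction of Gaussian relative-position
    samples (centred on the miss distance) that fall inside the hard-body radius."""
    hits = sum(
        1 for _ in range(n)
        if math.hypot(random.gauss(miss, sigma), random.gauss(0.0, sigma)) < hbr
    )
    return hits / n

# Treat the covariance as uncertain: sample a lognormal scale factor on sigma
# (an illustrative choice) and propagate it, yielding a distribution of Pc
# values instead of a single number.
pc_samples = sorted(
    pc_2d(50.0, 20.0 * math.exp(random.gauss(0.0, 0.3)), 5.0)
    for _ in range(80)
)
print("median Pc:", pc_samples[40], "max Pc:", pc_samples[-1])
```

Risk posture can then be judged against the whole distribution (e.g. its upper percentiles) rather than a single Pc value, which is how consideration of Pc uncertainty can upgrade or downgrade an event.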
Sun, F; Chen, J; Tong, Q; Zeng, S
2007-01-01
Management of drinking water safety is changing towards an integrated risk assessment and risk management approach that includes all processes in a water supply system from catchment to consumers. However, given the large number of water supply systems in China and the cost of implementing such a risk assessment procedure, there is a necessity to first conduct a strategic screening analysis at a national level. An integrated methodology of risk assessment and screening analysis is thus proposed to evaluate drinking water safety of a conventional water supply system. The violation probability, indicating drinking water safety, is estimated at different locations of a water supply system in terms of permanganate index, ammonia nitrogen, turbidity, residual chlorine and trihalomethanes. Critical parameters with respect to drinking water safety are then identified, based on which an index system is developed to prioritize conventional water supply systems in implementing a detailed risk assessment procedure. The evaluation results are represented as graphic check matrices for the concerned hazards in drinking water, from which the vulnerability of a conventional water supply system is characterized.
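The violation probability at a given location can be illustrated as the fraction of simulated water-quality values exceeding the relevant standard. This is a hedged sketch: the parameter (turbidity), its simulated distribution, and the 2.0 NTU limit are example values, not those used in the study.

```python
import random

random.seed(1)

def violation_probability(samples, limit):
    """Fraction of simulated parameter values exceeding the standard."""
    return sum(1 for v in samples if v > limit) / len(samples)

# e.g. hypothetical simulated turbidity (NTU) at the consumer tap
turbidity = [random.lognormvariate(0.0, 0.5) for _ in range(10000)]
print(round(violation_probability(turbidity, limit=2.0), 3))
```

Computing this for each parameter (permanganate index, ammonia nitrogen, turbidity, residual chlorine, trihalomethanes) at each location of the system yields the kind of check matrix the abstract describes.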
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
NASA Astrophysics Data System (ADS)
Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG to stop operating and, furthermore, can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of the HRSG. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for the process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The semi-quantitative assessment per API 581 thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The risk-based approach was evaluated with the aim of reducing risk by optimizing risk assessment activities.
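Semi-quantitative RBI expresses risk as a cell in a probability/consequence matrix, such as the "4C" and "3C" designations above. The sketch below is a simplified, hypothetical mapping from matrix cell to risk band, not the actual API 581 table; the banding rule is invented for illustration.

```python
# Illustrative 5x5 risk matrix in the spirit of semi-quantitative RBI:
# probability category 1 (lowest) .. 5, consequence category 'A' (lowest) .. 'E'.
LEVELS = {1: "low", 2: "medium", 3: "medium-high", 4: "high"}

def risk_level(prob_cat, cons_cat):
    """Map a (probability, consequence) cell to a hypothetical risk band."""
    score = prob_cat + (ord(cons_cat) - ord("A"))  # combined score, 1..9
    if score <= 3:
        band = 1
    elif score <= 5:
        band = 2
    elif score <= 7:
        band = 3
    else:
        band = 4
    return f"{prob_cat}{cons_cat}: {LEVELS[band]}"

for cell in [(4, "C"), (3, "C")]:
    print(risk_level(*cell))
```

Under this example banding, 4C lands in medium-high and 3C in medium, consistent with the equipment ratings reported in the abstract.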
Dehnavieh, Reza; Ebrahimipour, Hossein; Molavi-Taleghani, Yasamin; Vafaee-Najar, Ali; Hekmat, Somayeh Noori; Esmailzdeh, Hamid
2015-01-01
Introduction: The pediatric emergency department is considered a high-risk area, and blood transfusion is a unique clinical measure; this study therefore performed a proactive risk assessment of the blood transfusion process in the Pediatric Emergency of the Qaem education-treatment center in Mashhad, using the Healthcare Failure Mode and Effects Analysis (HFMEA) methodology. Methodology: This cross-sectional study analyzed the failure modes and effects of the blood transfusion process using a mixed quantitative-qualitative method. The proactive HFMEA was used to identify and analyze potential failures of the process. The information in the HFMEA forms was collected after reaching a consensus of the expert panel's views via interviews and focus group discussion sessions. Results: A total of 77 failure modes were identified for the 24 sub-processes of the 8 processes of blood transfusion. Of these, 13 failure modes were identified as non-acceptable risks (a hazard score above 8) and were transferred to the decision tree. Root causes of the high-risk modes were discussed in cause-and-effect meetings and were classified according to the UK National Health Service (NHS) approved classification model. Action types were classified as acceptance (11.6%), control (74.2%) and elimination (14.2%). Recommendations were placed in 7 categories using TRIZ ("Theory of Inventive Problem Solving"). Conclusion: Re-engineering the process for the required changes, standardizing and updating the blood transfusion procedure, root cause analysis of catastrophic blood transfusion events, patient identification bracelets, training classes and educational pamphlets to raise personnel awareness, and monthly meetings of the transfusion medicine committee have all been placed on the working agenda of the pediatric emergency department as executive strategies. PMID:25560332
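HFMEA scores each failure mode as severity times probability, and the abstract's "hazard score above 8" threshold routes a mode to the decision tree. The sketch below illustrates that scoring step; the failure-mode names and their severity/probability ratings are made up for the example.

```python
def hazard_score(severity, probability):
    """HFMEA hazard score: severity (1-4) times probability (1-4)."""
    return severity * probability

def needs_decision_tree(severity, probability, threshold=8):
    """Non-acceptable risk per the abstract: hazard score above 8."""
    return hazard_score(severity, probability) > threshold

# Hypothetical failure modes as (name, severity, probability)
failure_modes = [("wrong blood unit issued", 4, 3), ("label smudged", 2, 2)]
flagged = [name for name, s, p in failure_modes if needs_decision_tree(s, p)]
print(flagged)
```

Only the first mode (score 12) exceeds the threshold and would proceed to the decision tree; the second (score 4) would not.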
NASA Astrophysics Data System (ADS)
Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.
1997-12-01
Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost- effective manner. 
This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of potential damage caused by flooding or by interpolating real damage data. For flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be attributed to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds that link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
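The hazard-indicator classification step can be sketched as follows. The choice of HI = depth × velocity and the threshold values are illustrative assumptions, and that very subjectivity in the thresholds is what the abstract's statistical methodology sets out to quantify.

```python
def hazard_level(depth_m, velocity_ms, thresholds=(0.2, 0.5, 1.0)):
    """Classify a hazard indicator HI = depth * velocity against
    user-defined thresholds (values here are purely illustrative)."""
    hi = depth_m * velocity_ms
    low, med, high = thresholds
    if hi < low:
        return "negligible"
    if hi < med:
        return "low"
    if hi < high:
        return "medium"
    return "high"

print(hazard_level(0.4, 0.5))  # HI = 0.2
print(hazard_level(1.2, 1.0))  # HI = 1.2
```

Re-running such a classification with perturbed model outputs (depths, velocities) and perturbed thresholds is one way to expose how sensitive the assigned hazard level is to both sources of uncertainty.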
Estimation of dose-response models for discrete and continuous data in weed science
USDA-ARS?s Scientific Manuscript database
Dose-response analysis is widely used in biological sciences and has application to a variety of risk assessment, bioassay, and calibration problems. In weed science, dose-response methodologies have typically relied on least squares estimation under an assumption of normality. Advances in computati...
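A common dose-response form in weed science is the log-logistic curve, which can be fit by least squares as the truncated abstract describes. The sketch below is generic, not the manuscript's estimator: it generates synthetic data from a log-logistic model and recovers the parameters with a crude grid search minimising the squared error.

```python
import math

def log_logistic(x, b, d, e50):
    """Three-parameter log-logistic: upper limit d, slope b, ED50 e50."""
    return d / (1.0 + math.exp(b * (math.log(x) - math.log(e50))))

doses = [0.1, 1.0, 10.0, 100.0]
# Synthetic observations from known parameters (b=1.5, d=100, e50=5).
observed = [log_logistic(x, 1.5, 100.0, 5.0) for x in doses]

# Crude grid search over (b, e50) with d fixed, minimising least squares.
best = min(
    (sum((log_logistic(x, b, 100.0, e) - y) ** 2 for x, y in zip(doses, observed)), b, e)
    for b in (0.5, 1.0, 1.5, 2.0)
    for e in (1.0, 5.0, 10.0)
)
print("best (sse, b, e50):", best)
```

In practice a nonlinear least-squares optimiser would replace the grid search, and, per the abstract, the normality assumption behind least squares is exactly what newer methods for discrete data relax.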
Screening Methodologies to Support Risk and Technology Reviews (RTR): A Case Study Analysis
The Clean Air Act establishes a two-stage regulatory process for addressing emissions of hazardous air pollutants (HAPs) from stationary sources. In the first stage, the Act requires the EPA to develop technology-based standards for categories of industrial sources. We have lar...
76 FR 15301 - Pacific Fishery Management Council; Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-21
... Council Direction for 2011 Management Measures for Analysis 5. Essential Fish Habitat Review 6. Adoption... Review 3. Periodic Essential Fish Habitat Review Process 4. Formation of Risk Pools under the Trawl... Service Report 2. Exempted Fishing Permit for 2011 Aerial Survey 3. CPS Survey Methodology D. Habitat...
Methods for a study of Anticipatory and Preventive multidisciplinary Team Care in a family practice.
Dahrouge, Simone; Hogg, William; Lemelin, Jacques; Liddy, Clare; Legault, Frances
2010-02-01
BACKGROUND: To examine the methodology used to evaluate whether focusing the work of nurse practitioners and a pharmacist on frail and at-risk patients would improve the quality of care for such patients. Evaluation of the methodology of a randomized controlled trial, including analysis of quantitative and qualitative data over time and analysis of cost-effectiveness. A single practice in a rural area near Ottawa, Ont. A total of 241 frail patients, aged 50 years and older, at risk of experiencing adverse health outcomes. At-risk patients were randomly assigned to receive Anticipatory and Preventive Team Care (from their family physicians, 1 of 3 nurse practitioners, and a pharmacist) or usual care. The principal outcome for the study was the quality of care for chronic disease management. Secondary outcomes included other quality of care measures and evaluation of the program process and its cost-effectiveness. This article examines the effectiveness of the methodology used. Quantitative data from surveys, administrative databases, and medical records were supplemented with qualitative information from interviews, focus groups, work logs, and study notes. Three factors limit our ability to fully demonstrate the potential effects of this team structure: for reasons outside our control, the intervention duration was shorter than intended; the practice's physical layout did not facilitate interactions between the care providers; and contamination of the intervention effect into the control arm cannot be excluded. The study used a randomized design, relied on a multifaceted approach to evaluating its effects, and used several sources of data. TRIAL REGISTRATION NUMBER: NCT00238836 (CONSORT).
Czolowski, Eliza D.; Santoro, Renee L.; Srebotnjak, Tanja
2017-01-01
Background: Higher risk of exposure to environmental health hazards near oil and gas wells has spurred interest in quantifying populations that live in proximity to oil and gas development. The available studies on this topic lack consistent methodology and ignore aspects of oil and gas development of value to public health–relevant assessment and decision-making. Objectives: We aim to present a methodological framework for oil and gas development proximity studies grounded in an understanding of hydrocarbon geology and development techniques. Methods: We geospatially overlay locations of active oil and gas wells in the conterminous United States and Census data to estimate the population living in proximity to hydrocarbon development at the national and state levels. We compare our methods and findings with existing proximity studies. Results: Nationally, we estimate that 17.6 million people live within 1,600m (∼1 mi) of at least one active oil and/or gas well. Three of the eight studies overestimate populations at risk from actively producing oil and gas wells by including wells without evidence of production or drilling completion and/or using inappropriate population allocation methods. The remaining five studies, by omitting conventional wells in regions dominated by historical conventional development, significantly underestimate populations at risk. Conclusions: The well inventory guidelines we present provide an improved methodology for hydrocarbon proximity studies by acknowledging the importance of both conventional and unconventional well counts as well as the relative exposure risks associated with different primary production categories (e.g., oil, wet gas, dry gas) and developmental stages of wells. https://doi.org/10.1289/EHP1535 PMID:28858829
Bow-tie diagrams for risk management in anaesthesia.
Culwick, M D; Merry, A F; Clarke, D M; Taraporewalla, K J; Gibbs, N M
2016-11-01
Bow-tie analysis is a risk analysis and management tool that has been readily adopted into routine practice in many high reliability industries such as engineering, aviation and emergency services. However, it has received little exposure so far in healthcare. Nevertheless, its simplicity, versatility, and pictorial display may have benefits for the analysis of a range of healthcare risks, including complex and multiple risks and their interactions. Bow-tie diagrams are a combination of a fault tree and an event tree, which when combined take the shape of a bow tie. Central to bow-tie methodology is the concept of an undesired or 'Top Event', which occurs if a hazard progresses past all prevention controls. Top Events may also occasionally occur idiosyncratically. Irrespective of the cause of a Top Event, mitigation and recovery controls may influence the outcome. Hence the relationship of hazard to outcome can be viewed in one diagram along with possible causal sequences or accident trajectories. Potential uses for bow-tie diagrams in anaesthesia risk management include improved understanding of anaesthesia hazards and risks, pre-emptive identification of absent or inadequate hazard controls, investigation of clinical incidents, teaching anaesthesia risk management, and demonstrating risk management strategies to third parties when required.
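A bow-tie can be represented as a simple data structure: threats with prevention controls on the left of the Top Event, consequences with mitigation and recovery controls on the right. The anaesthesia content below is a made-up example, not one from the article.

```python
# Minimal bow-tie sketch: hypothetical anaesthesia Top Event with
# prevention controls (left side) and mitigation controls (right side).
bow_tie = {
    "top_event": "unrecognised oesophageal intubation",
    "threats": {
        "difficult airway": ["airway assessment", "videolaryngoscopy"],
        "equipment failure": ["pre-use machine check"],
    },
    "consequences": {
        "hypoxic injury": ["capnography monitoring", "re-intubation drill"],
    },
}

def missing_controls(bt):
    """Flag accident trajectories with no controls on either side of the Top Event."""
    gaps = [t for t, controls in bt["threats"].items() if not controls]
    gaps += [c for c, controls in bt["consequences"].items() if not controls]
    return gaps

print(missing_controls(bow_tie))
```

Walking the structure like this supports one of the uses named above: pre-emptively identifying absent or inadequate hazard controls before an incident occurs.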
45 CFR 153.510 - Risk corridors establishment and payment methodology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... methodology. 153.510 Section 153.510 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE AFFORDABLE CARE ACT Health Insurance Issuer Standards Related to the Risk Corridors Program § 153...
45 CFR 153.510 - Risk corridors establishment and payment methodology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... methodology. 153.510 Section 153.510 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE AFFORDABLE CARE ACT Health Insurance Issuer Standards Related to the Risk Corridors Program § 153...
MULTI-MEDIA MICROBIOLOGICAL RISK ASSESSMENT METHODOLOGY FOR MUNICIPAL WASTEWATER SLUDGES
In order to reduce the risk of municipal sludge to acceptable levels, the U.S. EPA has undertaken a regulatory program based on risk assessment and risk management. The key to such a program is the development of a methodology which allows the regulatory agency to quantify the re...
U.K. Foot and Mouth Disease: A Systemic Risk Assessment of Existing Controls.
Delgado, João; Pollard, Simon; Pearn, Kerry; Snary, Emma L; Black, Edgar; Prpich, George; Longhurst, Phil
2017-09-01
This article details a systemic analysis of the controls in place and possible interventions available to further reduce the risk of a foot and mouth disease (FMD) outbreak in the United Kingdom. Using a research-based network analysis tool, we identify vulnerabilities within the multibarrier control system and their corresponding critical control points (CCPs). CCPs represent opportunities for active intervention that produce the greatest improvement to United Kingdom's resilience to future FMD outbreaks. Using an adapted 'features, events, and processes' (FEPs) methodology and network analysis, our results suggest that movements of animals and goods associated with legal activities significantly influence the system's behavior due to their higher frequency and ability to combine and create scenarios of exposure similar in origin to the U.K. FMD outbreaks of 1967/8 and 2001. The systemic risk assessment highlights areas outside of disease control that are relevant to disease spread. Further, it proves to be a powerful tool for demonstrating the need for implementing disease controls that have not previously been part of the system. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Health effects of risk-assessment categories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, C.F.; Rybicka, K.; Knutson, A.
Environmental and occupational health effects associated with exposures to various chemicals are a subject of increasing concern. One recently developed methodology for assessing the health impacts of various chemical compounds involves the classification of similar chemicals into risk-assessment categories (RACs). This report reviews documented human health effects for a broad range of pollutants, classified by RACs. It complements other studies that have estimated human health effects by RAC based on analysis and extrapolation of data from animal research.
Few, Roger; Lake, Iain; Hunter, Paul R; Tran, Pham Gia; Thien, Vu Trong
2009-12-21
Understanding how risks to human health change as a result of seasonal variations in environmental conditions is likely to become of increasing importance in the context of climatic change, especially in lower-income countries. A multi-disciplinary approach can be a useful tool for improving understanding, particularly in situations where existing data resources are limited but the environmental health implications of seasonal hazards may be high. This short article describes a multi-disciplinary approach combining analysis of changes in levels of environmental contamination, seasonal variations in disease incidence and a social scientific analysis of health behaviour. The methodology was field-tested in a peri-urban environment in the Mekong Delta, Vietnam, where poor households face alternate seasonal extremes in the local environment as the water level in the Delta changes from flood to dry season. Low-income households in the research sites rely on river water for domestic uses, including provision of drinking water, and it is commonly perceived that the seasonal changes alter risk from diarrhoeal diseases and other diseases associated with contamination of water. The discussion focuses on the implementation of the methodology in the field, and draws lessons from the research process that can help in refining and developing the approach for application in other locations where seasonal dynamics of disease risk may have important consequences for public health.
Modeling the Near-Term Risk of Climate Uncertainty: Interdependencies among the U.S. States
NASA Astrophysics Data System (ADS)
Lowry, T. S.; Backus, G.; Warren, D.
2010-12-01
Decisions made to address climate change must start with an understanding of the risk of an uncertain future to human systems, which in turn means understanding both the consequence as well as the probability of a climate induced impact occurring. In other words, addressing climate change is an exercise in risk-informed policy making, which implies that there is no single correct answer or even a way to be certain about a single answer; the uncertainty in future climate conditions will always be present and must be taken as a working-condition for decision making. In order to better understand the implications of uncertainty on risk and to provide a near-term rationale for policy interventions, this study estimates the impacts from responses to climate change on U.S. state- and national-level economic activity by employing a risk-assessment methodology for evaluating uncertain future climatic conditions. Using the results from the Intergovernmental Panel on Climate Change’s (IPCC) Fourth Assessment Report (AR4) as a proxy for climate uncertainty, changes in hydrology over the next 40 years were mapped and then modeled to determine the physical consequences on economic activity and to perform a detailed 70-industry analysis of the economic impacts among the interacting lower-48 states. The analysis determines industry-level effects, employment impacts at the state level, interstate population migration, consequences to personal income, and ramifications for the U.S. trade balance. The conclusions show that the average risk of damage to the U.S. economy from climate change is on the order of $1 trillion over the next 40 years, with losses in employment equivalent to nearly 7 million full-time jobs. Further analysis shows that an increase in uncertainty raises this risk. This paper will present the methodology behind the approach, a summary of the underlying models, as well as the path forward for improving the approach.
Ambler, Graeme K; Gohel, Manjit S; Mitchell, David C; Loftus, Ian M; Boyle, Jonathan R
2015-01-01
Accurate adjustment of surgical outcome data for risk is vital in an era of surgeon-level reporting. Current risk prediction models for abdominal aortic aneurysm (AAA) repair are suboptimal. We aimed to develop a reliable risk model for in-hospital mortality after intervention for AAA, using rigorous contemporary statistical techniques to handle missing data. Using data collected during a 15-month period in the United Kingdom National Vascular Database, we applied multiple imputation methodology together with stepwise model selection to generate preoperative and perioperative models of in-hospital mortality after AAA repair, using two thirds of the available data. Model performance was then assessed on the remaining third of the data by receiver operating characteristic curve analysis and compared with existing risk prediction models. Model calibration was assessed by Hosmer-Lemeshow analysis. A total of 8088 AAA repair operations were recorded in the National Vascular Database during the study period, of which 5870 (72.6%) were elective procedures. Both preoperative and perioperative models showed excellent discrimination, with areas under the receiver operating characteristic curve of .89 and .92, respectively. This was significantly better than any of the existing models (area under the receiver operating characteristic curve for best comparator model, .84 and .88; P < .001 and P = .001, respectively). Discrimination remained excellent when only elective procedures were considered. There was no evidence of miscalibration by Hosmer-Lemeshow analysis. We have developed accurate models to assess risk of in-hospital mortality after AAA repair. These models were carefully developed with rigorous statistical methodology and significantly outperform existing methods for both elective cases and overall AAA mortality. These models will be invaluable for both preoperative patient counseling and accurate risk adjustment of published outcome data. 
Copyright © 2015 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
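The headline numbers in this abstract (areas under the receiver operating characteristic curve of .89 and .92) come from a standard discrimination statistic that is easy to reproduce: the AUC equals the probability that a randomly chosen patient who died was assigned a higher predicted risk than a randomly chosen survivor. A minimal stdlib-Python sketch with invented scores, not the National Vascular Database data:

```python
def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney pairwise interpretation:
    the fraction of (positive, negative) pairs ranked correctly,
    counting ties as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical predicted in-hospital mortality risks
died = [0.90, 0.80, 0.60]            # model scores for patients who died
survived = [0.70, 0.40, 0.30, 0.20]  # model scores for survivors
print(roc_auc(died, survived))       # 11 of 12 pairs correct -> ~0.917
```

An AUC of 0.5 corresponds to chance discrimination, so the models' .89 and .92 indicate near-perfect ranking of deaths above survivors.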
Groundwater pollution risk assessment. Application to different carbonate aquifers in south Spain
NASA Astrophysics Data System (ADS)
Jimenez Madrid, A.; Martinez Navarrete, C.; Carrasco Cantos, F.
2009-04-01
Water protection has been considered one of the most important environmental goals in European politics since the 2000/60/EC Water Framework Directive came into force in 2000, and more specifically since 2006 with the 2006/118/EC Directive on groundwater protection. As one of the necessary requirements for groundwater protection, a pollution risk assessment has been carried out through analysis of both the map of existing hazardous human activities and the intrinsic aquifer vulnerability map, applying the methodologies proposed by COST Action 620 to an experimental study site in southern Spain containing different carbonate aquifers, which supply eight towns of 2000 to 2500 inhabitants. To generate both maps it was necessary to conduct a field inventory on a 1:10,000 topographic base map, followed by Geographic Information System (GIS) processing. The resulting maps show a clear spatial distribution of both pollution risk and intrinsic vulnerability for the carbonate aquifers studied. As a final result, a map of the intensity of groundwater pollution risk is presented, representing an important basis for the development of a proper methodology for protecting groundwater resources intended for human consumption. Keywords: Hazard, Vulnerability, Risk, GIS, Protection
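The risk map described above is, at bottom, a cell-by-cell combination of a hazard index with an intrinsic vulnerability class. A toy sketch of that overlay logic; the thresholds and class encodings below are illustrative assumptions, not the actual COST Action 620 tables:

```python
def risk_class(hazard_index, vulnerability_class):
    """Combine a hazard index (0-100, higher = more hazardous activity)
    with a vulnerability class (1 = very high ... 5 = very low) into a
    qualitative risk class. Thresholds are illustrative only."""
    high_hazard = hazard_index >= 60
    high_vuln = vulnerability_class <= 2
    if high_hazard and high_vuln:
        return "high"
    if high_hazard or high_vuln:
        return "moderate"
    return "low"

# Overlay two small example rasters cell by cell, as a GIS would
hazard = [[80, 20], [55, 90]]
vuln = [[1, 4], [3, 2]]
risk = [[risk_class(h, v) for h, v in zip(hr, vr)]
        for hr, vr in zip(hazard, vuln)]
```

In a real workflow the two input rasters would come from the GIS layers built over the topographic base map.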
Coupling Post-Event and Prospective Analyses for El Niño-related Risk Reduction in Peru
NASA Astrophysics Data System (ADS)
French, Adam; Keating, Adriana; Mechler, Reinhard; Szoenyi, Michael; Cisneros, Abel; Chuquisengo, Orlando; Etienne, Emilie; Ferradas, Pedro
2017-04-01
Analyses in the wake of natural disasters play an important role in identifying how ex ante risk reduction and ex post hazard response activities have both succeeded and fallen short in specific contexts, thereby contributing to recommendations for improving such measures in the future. Event analyses have particular relevance in settings where disasters are likely to reoccur, and especially where recurrence intervals are short. This paper applies the Post Event Review Capability (PERC) methodology to the context of frequently reoccurring El Niño Southern Oscillation (ENSO) events in the country of Peru, where over the last several decades ENSO impacts have generated high levels of damage and economic loss. Rather than analyzing the impacts of a single event, this study builds upon the existing PERC methodology by combining empirical event analysis with a critical examination of risk reduction and adaptation measures implemented both prior to and following several ENSO events in the late 20th and early 21st centuries. Additionally, the paper explores linking the empirical findings regarding the uptake and outcomes of particular risk reduction and adaptation strategies to a prospective, scenario-based approach for projecting risk several decades into the future.
Ecological Risk Assessment with MCDM of Some Invasive Alien Plants in China
NASA Astrophysics Data System (ADS)
Xie, Guowen; Chen, Weiguang; Lin, Meizhen; Zheng, Yanling; Guo, Peiguo; Zheng, Yisheng
Alien plant invasion is an urgent global issue that threatens ecosystem health and sustainable development. The study of its ecological risk assessment (ERA) can help prevent and reduce invasion risk more effectively. Based on the theory of ERA and the analytic hierarchy process (AHP) method of multi-criteria decision-making (MCDM), and through analysis of the characteristics and processes of alien plant invasion, this paper discusses methodologies for the ERA of alien plant invasion. The assessment procedure consists of risk source analysis, receptor analysis, exposure and hazard assessment, integral assessment, and countermeasures for risk management. The indicator system for risk source assessment, as well as the indices and formulas applied to measure ecological loss and risk, were established, and a method for comprehensively assessing the ecological risk of alien plant invasion was worked out. The ecological risk analysis of 9 representative invasive alien plants in China shows that the ecological risk of Erigeron annuus, Ageratum conyzoides, Alternanthera philoxeroides and Mikania micrantha is highest (grades 1-2), that of Oxalis corymbosa and Wedelia chinensis comes next (grade 3), while Mirabilis jalapa, Pilea microphylla and Calendula officinalis rank lowest (grade 4). Risk management strategies are put forward on this basis.
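The AHP step at the heart of this assessment turns a pairwise comparison matrix into criterion weights. A small sketch using the row geometric-mean approximation to the principal eigenvector; the matrix and criterion names are hypothetical, not taken from the paper:

```python
import math

def ahp_weights(pairwise):
    """Approximate the AHP priority vector by the row geometric-mean
    method, a standard stand-in for the principal eigenvector of a
    reciprocal pairwise comparison matrix."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3-criterion comparison on Saaty's 1-9 scale:
# spread ability vs. ecological impact vs. control difficulty
M = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
w = ahp_weights(M)   # roughly [0.64, 0.26, 0.10]
```

The resulting weights would then multiply the exposure and hazard scores of each species to yield the integral risk grade.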
Assessment of seismic risk in Tashkent, Uzbekistan and Bishkek, Kyrgyz Republic
Erdik, M.; Rashidov, T.; Safak, E.; Turdukulov, A.
2005-01-01
The impact of earthquakes in urban centers prone to disastrous earthquakes necessitates the analysis of associated risk for rational formulation of contingency plans and mitigation strategies. In urban centers the seismic risk is best quantified and portrayed through the preparation of 'Earthquake Damage and Loss Scenarios'. The components of such scenarios are the assessment of the hazard, inventories and the vulnerabilities of elements at risk. For the development of earthquake risk scenarios in Tashkent, Uzbekistan and Bishkek, Kyrgyzstan, an approach based on spectral displacements is utilized. This paper presents the important features of a comprehensive study, highlights the methodology, discusses the results and provides insights into future developments. © 2005 Elsevier Ltd. All rights reserved.
76 FR 81998 - Methodology for Low Power/Shutdown Fire PRA
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-29
... NUCLEAR REGULATORY COMMISSION [NRC-2011-0295] Methodology for Low Power/Shutdown Fire PRA AGENCY..., ``Methodology for Low Power/Shutdown Fire PRA--Draft Report for Comment.'' DATES: Submit comments by March 01... risk assessment (PRA) method for quantitatively analyzing fire risk in commercial nuclear power plants...
Improving the Accuracy of Estimation of Climate Extremes
NASA Astrophysics Data System (ADS)
Zolina, Olga; Detemmerman, Valery; Trenberth, Kevin E.
2010-12-01
Workshop on Metrics and Methodologies of Estimation of Extreme Climate Events; Paris, France, 27-29 September 2010; Climate projections point toward more frequent and intense weather and climate extremes such as heat waves, droughts, and floods, in a warmer climate. These projections, together with recent extreme climate events, including flooding in Pakistan and the heat wave and wildfires in Russia, highlight the need for improved risk assessments to help decision makers and the public. But accurate analysis and prediction of risk of extreme climate events require new methodologies and information from diverse disciplines. A recent workshop sponsored by the World Climate Research Programme (WCRP) and hosted at United Nations Educational, Scientific and Cultural Organization (UNESCO) headquarters in France brought together, for the first time, a unique mix of climatologists, statisticians, meteorologists, oceanographers, social scientists, and risk managers (such as those from insurance companies) who sought ways to improve scientists' ability to characterize and predict climate extremes in a changing climate.
Demougeot-Renard, Helene; De Fouquet, Chantal
2004-10-01
Assessing the volume of soil requiring remediation and the accuracy of this assessment constitutes an essential step in polluted site management. If this remediation volume is not properly assessed, misclassification may lead both to environmental risks (polluted soils may not be remediated) and financial risks (unexpected discovery of polluted soils may generate additional remediation costs). To minimize such risks, this paper proposes a geostatistical methodology based on stochastic simulations that allows the remediation volume and the uncertainty to be assessed using investigation data. The methodology thoroughly reproduces the conditions in which the soils are classified and extracted at the remediation stage. The validity of the approach is tested by applying it to the data collected during the investigation phase of a former lead smelting works and comparing the results with the volume actually remediated. This real remediated volume comprised all remediation units classified as polluted after systematic sampling and analysis during the clean-up stage. The volume estimated from the 75 samples collected during site investigation slightly overestimates (5.3% relative error) the remediated volume deduced from the 212 remediation units. Furthermore, the real volume falls within the range of uncertainty predicted by the proposed methodology.
Zhang, Kejiang; Achari, Gopal; Pei, Yuansheng
2010-10-01
Different types of uncertain information (linguistic, probabilistic, and possibilistic) exist in site characterization. Their representation and propagation significantly influence the management of contaminated sites. In the absence of a framework with which to properly represent and integrate these quantitative and qualitative inputs, decision makers cannot fully take advantage of the available and necessary information to identify all the plausible alternatives. A systematic methodology was developed in the present work to incorporate linguistic, probabilistic, and possibilistic information into the Preference Ranking Organization METHod for Enrichment Evaluation (PROMETHEE), a subgroup of Multi-Criteria Decision Analysis (MCDA) methods, for ranking contaminated sites. The identification of criteria based on the paradigm of comparative risk assessment provides a rationale for risk-based prioritization. Uncertain linguistic, probabilistic, and possibilistic information identified in characterizing contaminated sites can be represented, according to its nature, as numerical values, intervals, probability distributions, fuzzy sets or possibility distributions, and linguistic variables. These different kinds of representation are first transformed into a 2-tuple linguistic representation domain, in which the propagation of hybrid uncertainties is then carried out. This methodology makes direct use of the original site information as far as possible. The case study shows that this systematic methodology provides more reasonable results. © 2010 SETAC.
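Once the hybrid inputs have been mapped into a common domain, the PROMETHEE ranking itself reduces to pairwise preference aggregation. A bare-bones PROMETHEE II sketch with the 'usual' (0/1) preference function; the site scores and weights are invented, and the paper's 2-tuple linguistic machinery is omitted:

```python
def promethee_ii(scores, weights):
    """Net outranking flow with the 'usual' preference function:
    a criterion contributes its full weight whenever one alternative
    strictly beats another on it. Higher score = better on every
    criterion; weights are assumed to sum to 1."""
    n = len(scores)
    phi = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # aggregated preference of alternative a over alternative b
            pi_ab = sum(w for w, sa, sb in zip(weights, scores[a], scores[b])
                        if sa > sb)
            phi[a] += pi_ab
            phi[b] -= pi_ab
    return [p / (n - 1) for p in phi]

# Hypothetical contaminated-site scores on 3 criteria (higher = worse risk,
# so sites are prioritized for remediation by descending net flow)
sites = [[0.9, 0.4, 0.7],   # site A
         [0.5, 0.8, 0.6],   # site B
         [0.2, 0.3, 0.9]]   # site C
flows = promethee_ii(sites, [0.5, 0.3, 0.2])  # A ranks first
```

Net flows always sum to zero, which gives a quick sanity check on any implementation.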
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kammerer, Annie
Department of Energy (DOE) nuclear facilities must comply with DOE Order 420.1C Facility Safety, which requires that all such facilities review their natural phenomena hazards (NPH) assessments no less frequently than every ten years. The Order points the reader to Standard DOE-STD-1020-2012. In addition to providing a discussion of the applicable evaluation criteria, the Standard references other documents, including ANSI/ANS-2.29-2008 and NUREG-2117. These documents provide supporting criteria and approaches for evaluating the need to update an existing probabilistic seismic hazard analysis (PSHA). All of the documents are consistent at a high level regarding the general conceptual criteria that should be considered. However, none of the documents provides step-by-step detailed guidance on the required or recommended approach for evaluating the significance of new information and determining whether or not an existing PSHA should be updated. Further, all of the conceptual approaches and criteria given in these documents deal with changes that may have occurred in the knowledge base that might impact the inputs to the PSHA, the calculated hazard itself, or the technical basis for the hazard inputs. Given that the DOE Order is aimed at achieving and assuring the safety of nuclear facilities, which is a function not only of the level of the seismic hazard but also the capacity of the facility to withstand vibratory ground motions, the inclusion of risk information in the evaluation process would appear to be both prudent and in line with the objectives of the Order. The purpose of this white paper is to describe a risk-informed methodology for evaluating the need for an update of an existing PSHA consistent with the DOE Order.
While the development of the proposed methodology was undertaken as a result of assessments for specific SDC-3 facilities at Idaho National Laboratory (INL), and it is expected that the application at INL will provide a demonstration of the methodology, there is potential for general applicability to other facilities across the DOE complex. As such, both a general methodology and a specific approach intended for INL are described in this document. The general methodology proposed in this white paper is referred to as the "seismic hazard periodic review methodology," or SHPRM. It presents a graded approach for SDC-3, SDC-4 and SDC-5 facilities that can be applied in any risk-informed regulatory environment once risk objectives appropriate for the framework are developed. While the methodology was developed for seismic hazard considerations, it can also be directly applied to other types of natural hazards.
Application of the API/NPRA SVA methodology to transportation security issues.
Moore, David A
2006-03-17
Security vulnerability analysis (SVA) is becoming more prevalent as chemical process security becomes a greater concern. The American Petroleum Institute (API) and the National Petrochemical & Refiners Association (NPRA) published a guideline for conducting SVAs of petroleum and petrochemical facilities in May 2003. In 2004, the same organizations enhanced the guidelines by adding the ability to evaluate transportation security risks (pipeline, truck, and rail). The importance of including transportation and value-chain security, in addition to fixed-facility security, in an SVA is that these issues may be critically important to understanding the total risk of the operation. Most SVAs done using the API/NPRA and other SVA methods have centered on the fixed facility and the operations within the plant fence. Transportation interfaces alone are normally studied as part of the facility SVA; the entire transportation route impacts and value-chain disruption are not commonly considered. Particularly from a national, regional, or local infrastructure analysis standpoint, understanding the interdependencies is critical to the risk assessment. Transportation risks may include weaponization of the asset by direct attack en route, sabotage, or a Trojan-horse-style attack into a facility. The risks differ in the level of access control and the degree of public exposure, as well as in the dynamic nature of the assets. The public exposures along the transportation route need to be carefully considered. Risks may be mitigated by one of many strategies, including internment, staging, prioritization, conscription, or prohibition, as well as by administrative security measures and technology for monitoring and isolating the assets. This paper illustrates how these risks can be analyzed with the API/NPRA SVA methodology. Examples are given of a pipeline operation; other examples are found in the guidelines.
White, Andrew A; Wright, Seth W; Blanco, Roberto; Lemonds, Brent; Sisco, Janice; Bledsoe, Sandy; Irwin, Cindy; Isenhour, Jennifer; Pichert, James W
2004-10-01
Identifying the etiologies of adverse outcomes is an important first step in improving patient safety and reducing malpractice risks. However, relatively little is known about the causes of emergency department-related adverse outcomes. The objective was to describe a method for identification of common causes of adverse outcomes in an emergency department. This methodology potentially can suggest ways to improve care and might provide a model for identification of factors associated with adverse outcomes. This was a retrospective analysis of 74 consecutive files opened by a malpractice insurer between 1995 and 2000. Each risk-management file was analyzed to identify potential causes of adverse outcomes. The main outcomes were rater-assigned codes for alleged problems with care (e.g., failures of communication or problems related to diagnosis). About 50% of cases were related to injuries or abdominal complaints. A contributing cause was found in 92% of cases, and most had more than one contributing cause. The most frequent contributing categories included failure to diagnose (45%), supervision problems (31%), communication problems (30%), patient behavior (24%), administrative problems (20%), and documentation (20%). Specific relating factors within these categories, such as lack of timely resident supervision and failure to follow policies and procedures, were identified. This project documented that an aggregate analysis of risk-management files has the potential to identify shared causes related to real or perceived adverse outcomes. Several potentially correctable systems problems were identified using this methodology. These simple, descriptive management tools may be useful in identifying issues for problem solving and can be easily learned by physicians and managers.
Ren, Chong; McGrath, Colman; Yang, Yanqi
2015-09-01
To assess the effectiveness of diode low-level laser therapy (LLLT) for orthodontic pain control, a systematic and extensive electronic search for randomised controlled trials (RCTs) investigating the effects of diode LLLT on orthodontic pain prior to November 2014 was performed using the Cochrane Library (Issue 9, 2014), PubMed (1997), EMBASE (1947) and Web of Science (1956). The Cochrane tool for risk of bias evaluation was used to assess the bias risk in the chosen data. A meta-analysis was conducted using RevMan 5.3. Of the 186 results, 14 RCTs, with a total of 659 participants from 11 countries, were included. Except for three studies assessed as having a 'moderate risk of bias', the RCTs were rated as having a 'high risk of bias'. The methodological weaknesses were mainly due to 'blinding' and 'allocation concealment'. The meta-analysis showed that diode LLLT significantly reduced orthodontic pain by 39 % in comparison with placebo groups (P = 0.02). Diode LLLT was shown to significantly reduce the maximum pain intensity among parallel-design studies (P = 0.003 versus placebo groups; P = 0.000 versus control groups). However, no significant effects were shown for split-mouth-design studies (P = 0.38 versus placebo groups). It was concluded that the use of diode LLLT for orthodontic pain appears promising. However, due to methodological weaknesses, there was insufficient evidence to support or refute LLLT's effectiveness. RCTs with better designs and appropriate sample power are required to provide stronger evidence for diode LLLT's clinical applications.
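The pooled estimates reported by tools such as RevMan rest on inverse-variance weighting. A fixed-effect sketch of that arithmetic with made-up study effects, not the 14 RCTs analysed here:

```python
import math

def fixed_effect_pool(effects, std_errors):
    """Inverse-variance fixed-effect pooling: each study is weighted
    by 1/SE^2; returns the pooled effect and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical standardized mean differences in pain scores (LLLT vs placebo);
# negative values favour the laser group
effects = [-0.50, -0.30]
ses = [0.10, 0.20]
est, se = fixed_effect_pool(effects, ses)   # est = -0.46, se ~ 0.089
```

Larger, more precise studies dominate the pooled value, which is why trial quality and bias risk matter so much to the conclusion.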
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helms, J.
2017-02-10
The US energy sector is vulnerable to multiple hazards including both natural disasters and malicious attacks from an intelligent adversary. The question that utility owners, operators and regulators face is how to prioritize their investments to mitigate the risks from a hazard that can have the most impact on the asset of interest. In order to be able to understand their risk landscape and develop a prioritized mitigation strategy, they must quantify risk in a consistent way across all hazards their asset is facing. Without being able to quantitatively measure risk, it is not possible to defensibly prioritize security investments or evaluate trade-offs between security and functionality. Development of a methodology that will consistently measure and quantify risk across different hazards is needed.
NASA Technical Reports Server (NTRS)
Frigm, Ryan C.; Levi, Joshua A.; Mantziaras, Dimitrios C.
2010-01-01
An operational Conjunction Assessment Risk Analysis (CARA) concept is the real-time process of assessing risk posed by close approaches and reacting to those risks if necessary. The most effective way to completely mitigate conjunction risk is to perform an avoidance maneuver. The NASA Goddard Space Flight Center has implemented a routine CARA process since 2005. Over this period, considerable experience has been gained and many lessons have been learned. This paper identifies and presents these experiences as general concepts in the description of the Conjunction Assessment, Flight Dynamics, and Flight Operations methodologies and processes. These general concepts will be tied together and will be exemplified through a case study of an actual high risk conjunction event for the Aura mission.
Subsea release of oil from a riser: an ecological risk assessment.
Nazir, Muddassir; Khan, Faisal; Amyotte, Paul; Sadiq, Rehan
2008-10-01
This study illustrates a newly developed methodology, as a part of the U.S. EPA ecological risk assessment (ERA) framework, to predict exposure concentrations in a marine environment due to underwater release of oil and gas. It combines the hydrodynamics of underwater blowout, weathering algorithms, and multimedia fate and transport to measure the exposure concentration. Naphthalene and methane are used as surrogate compounds for oil and gas, respectively. Uncertainties are accounted for in multimedia input parameters in the analysis. The 95th percentile of the exposure concentration (EC(95%)) is taken as the representative exposure concentration for the risk estimation. A bootstrapping method is utilized to characterize EC(95%) and associated uncertainty. The toxicity data of 19 species available in the literature are used to calculate the 5th percentile of the predicted no observed effect concentration (PNEC(5%)) by employing the bootstrapping method. The risk is characterized by transforming the risk quotient (RQ), which is the ratio of EC(95%) to PNEC(5%), into a cumulative risk distribution. This article describes a probabilistic basis for the ERA, which is essential from risk management and decision-making viewpoints. Two case studies of underwater oil and gas mixture release, and oil release with no gaseous mixture are used to show the systematic implementation of the methodology, elements of ERA, and the probabilistic method in assessing and characterizing the risk.
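The probabilistic core of this methodology, bootstrapping the 95th-percentile exposure concentration and dividing by the 5th-percentile no-effect concentration, can be sketched in stdlib Python. All concentration values below are invented, not the naphthalene case-study data:

```python
import random

def percentile(values, q):
    """Linear-interpolation percentile, q in [0, 1]."""
    s = sorted(values)
    k = (len(s) - 1) * q
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

def bootstrap_percentile(values, q, n_boot=2000, seed=7):
    """Mean of the q-th percentile over bootstrap resamples,
    characterizing the statistic and its sampling uncertainty."""
    rng = random.Random(seed)
    reps = [percentile([rng.choice(values) for _ in values], q)
            for _ in range(n_boot)]
    return sum(reps) / n_boot

# Hypothetical exposure concentrations (ug/L) and species toxicity data
exposure = [1.2, 0.8, 2.5, 1.9, 3.1, 0.6, 2.2, 1.4]
toxicity = [4.0, 6.5, 3.2, 9.0, 5.1, 7.7, 2.8, 4.4, 6.0, 3.9]
ec95 = bootstrap_percentile(exposure, 0.95)
pnec5 = bootstrap_percentile(toxicity, 0.05)
risk_quotient = ec95 / pnec5   # RQ > 1 suggests potential ecological risk
```

Keeping the full set of bootstrap replicates, rather than only their mean, would also yield the cumulative risk distribution the abstract describes.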
Application of risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, Emmanouil; Palogos, Ioannis
2017-04-01
A common cost-benefit analysis approach, novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined is a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. Such an application, in such detail, is new in this field. The results indicate that probability uncertainty is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers (stakeholders) examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause within an audit interval. In contrast to practices that assess the effect of each proposed action separately considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits. The tool is deployed as a web service for easier stakeholder access.
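The choice compared here, build the reservoir or keep risking over-pumping fines, can be framed as picking the action with minimum expected cost under an uncertain violation probability. A toy sketch with invented costs; the paper's parabolic fine schedule and Bayesian updating are not reproduced:

```python
def expected_cost(fixed_cost, violation_loss, p_violation):
    """Expected total cost of an action given a violation probability."""
    return fixed_cost + p_violation * violation_loss

def best_action(actions, p_violation):
    """Pick the action with minimum expected cost.
    actions: {name: (fixed_cost, loss_if_violation)}"""
    return min(actions,
               key=lambda a: expected_cost(*actions[a], p_violation))

# Hypothetical figures: the reservoir is expensive but removes violation risk
actions = {
    "build_reservoir": (5_000_000, 0),
    "status_quo": (0, 12_000_000),
}
print(best_action(actions, p_violation=0.6))  # -> build_reservoir (5.0M vs 7.2M)
```

Sweeping `p_violation` across its plausible range shows exactly the sensitivity the abstract reports: the optimal decision flips as the assumed probability changes.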
A Practical Risk Assessment Methodology for Safety-Critical Train Control Systems
DOT National Transportation Integrated Search
2009-07-01
This project proposes a Practical Risk Assessment Methodology (PRAM) for analyzing railroad accident data and assessing the risk and benefit of safety-critical train control systems. This report documents in simple steps the algorithms and data input...
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks, such as those attributable to manufacturing and assembly, which often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood in the absence of system-level test or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data, and provides a set of guidelines that may be useful for arriving at a more realistic quantification of risk prior to its acceptance by a program.
Using Risk Assessment Methodologies to Meet Management Objectives
NASA Technical Reports Server (NTRS)
DeMott, D. L.
2015-01-01
Current decision making involves numerous possible combinations of technology elements, safety and health issues, operational aspects, and process considerations to satisfy program goals. Identifying potential risk considerations as part of the management decision-making process provides additional tools for making more informed management decisions. Adapting and using risk assessment methodologies can generate new perspectives on risk and safety concerns that are not immediately apparent. Safety and operational risks can be identified, and final decisions can balance these considerations against cost and schedule risks. Additional assessments can also show the likelihood of event occurrence and event consequence, providing a more informed basis for decision making as well as cost-effective mitigation strategies. Methodologies available for performing risk assessments range from qualitative identification of risk potential to detailed assessments in which quantitative probabilities are calculated. The methodology used should be based on factors that include: 1) the type of industry and industry standards; 2) tasks, tools, and environment; 3) the type and availability of data; and 4) industry views and requirements regarding risk and reliability. Risk assessments are a tool for decision makers to understand potential consequences and be in a position to reduce, mitigate, or eliminate costly mistakes or catastrophic failures.
Risk assessment of maintenance operations: the analysis of performing task and accident mechanism.
Carrillo-Castrillo, Jesús A; Rubio-Romero, Juan Carlos; Guadix, Jose; Onieva, Luis
2015-01-01
Maintenance operations cover a great number of occupations. Most small and medium-sized enterprises lack the appropriate information to conduct risk assessments of maintenance operations. The objective of this research is to provide a method, based on the concepts of task and accident mechanism, for an initial risk assessment that takes into consideration the prevalence and severity of the maintenance accidents reported. Data were gathered from 11,190 reported accidents in maintenance operations in the manufacturing sector of Andalusia from 2003 to 2012. Using a semi-quantitative methodology, likelihood and severity were evaluated based on the actual distribution of accident mechanisms in each of the tasks. Accident mechanisms and tasks were identified using the variables included in the European Statistics on Accidents at Work methodology. As the main result, the estimated risk of the most frequent accident mechanisms identified for each of the analysed tasks is low; the only medium-risk accident mechanisms are lifting or pushing with physical stress on the musculoskeletal system in carrying tasks, and impacts against objects after slipping or stumbling in movement tasks. The prioritisation of public preventive actions for the accident mechanisms with a higher estimated risk is highly recommended.
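The semi-quantitative idea of deriving likelihood and severity classes from the observed distribution of accidents per task and mechanism can be sketched roughly as below. The counts, severity shares, class thresholds and risk-level cut-offs are all hypothetical, not values from the Andalusian data:

```python
# Hypothetical accident counts and share of serious outcomes
# per (task, accident mechanism) cell.
records = {
    ("carrying", "physical stress"):   {"count": 820, "serious_share": 0.040},
    ("movement", "impact after slip"): {"count": 640, "serious_share": 0.025},
    ("tool use", "cut by object"):     {"count": 310, "serious_share": 0.008},
}

def likelihood_class(count):
    """Likelihood class from observed prevalence (assumed thresholds)."""
    return 3 if count >= 500 else 2 if count >= 100 else 1

def severity_class(serious_share):
    """Severity class from the share of serious accidents (assumed thresholds)."""
    return 3 if serious_share >= 0.05 else 2 if serious_share >= 0.01 else 1

def risk_level(cell):
    """Semi-quantitative risk = likelihood class x severity class."""
    score = likelihood_class(cell["count"]) * severity_class(cell["serious_share"])
    return "high" if score >= 7 else "medium" if score >= 4 else "low"

for key, cell in records.items():
    print(key, "->", risk_level(cell))
```

With these toy thresholds the frequent-but-moderate cells land in "medium" and the rare, low-severity cell in "low", mirroring the pattern reported in the abstract.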
Early Detection for Dengue Using Local Indicator of Spatial Association (LISA) Analysis
Parra-Amaya, Mayra Elizabeth; Puerta-Yepes, María Eugenia; Lizarralde-Bejarano, Diana Paola; Arboleda-Sánchez, Sair
2016-01-01
Dengue is a viral disease caused by a flavivirus that is transmitted by mosquitoes of the genus Aedes. There is currently no specific treatment or commercial vaccine for its control and prevention; therefore, mosquito population control is the only alternative for preventing the occurrence of dengue. For this reason, entomological surveillance is recommended by the World Health Organization (WHO) to measure dengue risk in endemic areas; however, several works have shown that the current methodology (aedic indices) is not sufficient for predicting dengue. In this work, we modified indices proposed for epidemic periods. The raw value of the epidemiological wave could be useful for detecting risk in epidemic periods; however, risk can only be detected if analyses incorporate the maximum epidemiological wave. Risk classification was performed according to the Local Indicators of Spatial Association (LISA) methodology. The modified indices were analyzed using several hypothetical scenarios to evaluate their sensitivity. We found that the modified indices could detect spatial and differential risks in epidemic and endemic years, which makes them a useful tool for the early detection of a dengue outbreak. In conclusion, the modified indices could predict risk at the spatio-temporal level in endemic years and could be incorporated into surveillance activities in endemic places. PMID:28933396
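The LISA idea behind the risk classification can be sketched as a local Moran's I with row-standardized weights over a toy adjacency structure. The incidence values and neighbourhood graph below are invented for illustration only:

```python
import statistics

# Hypothetical dengue incidence per neighbourhood and contiguity structure.
incidence = {"A": 12.0, "B": 15.0, "C": 3.0, "D": 2.5, "E": 14.0}
neighbors = {"A": ["B", "E"], "B": ["A", "C", "E"], "C": ["B", "D"],
             "D": ["C"], "E": ["A", "B"]}

# Standardize the incidence values.
mean = statistics.fmean(incidence.values())
sd = statistics.pstdev(incidence.values())
z = {k: (v - mean) / sd for k, v in incidence.items()}

def local_moran(i):
    """Local Moran's I_i = z_i * (row-standardized spatial lag of z)."""
    lag = statistics.fmean(z[j] for j in neighbors[i])
    return z[i] * lag

for area in incidence:
    I = local_moran(area)
    if I > 0:  # value resembles its neighbours: a spatial cluster
        label = "high-high" if z[area] > 0 else "low-low"
    else:      # value unlike its neighbours: a spatial outlier
        label = "outlier"
    print(area, round(I, 2), label)
```

High-high clusters are the candidate hotspots for early intervention; a real analysis would also test each I_i for significance (e.g. by permutation).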
Sleep disturbances as an evidence-based suicide risk factor.
Bernert, Rebecca A; Kim, Joanne S; Iwata, Naomi G; Perlis, Michael L
2015-03-01
Increasing research indicates that sleep disturbances may confer increased risk for suicidal behaviors, including suicidal ideation, suicide attempts, and death by suicide. Despite increased investigation, a number of methodological problems present important limitations to the validity and generalizability of findings in this area, which warrant additional focus. To evaluate and delineate sleep disturbances as an evidence-based suicide risk factor, a systematic review of the extant literature was conducted with methodological considerations as a central focus. The following methodologic criteria were required for inclusion: the report (1) evaluated an index of sleep disturbance; (2) examined an outcome measure for suicidal behavior; (3) adjusted for presence of a depression diagnosis or depression severity, as a covariate; and (4) represented an original investigation as opposed to a chart review. Reports meeting inclusion criteria were further classified and reviewed according to: study design and timeframe; sample type and size; sleep disturbance, suicide risk, and depression covariate assessment measure(s); and presence of positive versus negative findings. Based on keyword search, the following search engines were used: PubMed and PsycINFO. Search criteria generated N = 82 articles representing original investigations focused on sleep disturbances and suicide outcomes. Of these, N = 18 met inclusion criteria for review based on systematic analysis. Of the reports identified, N = 18 evaluated insomnia or poor sleep quality symptoms, whereas N = 8 assessed nightmares in association with suicide risk. Despite considerable differences in study designs, samples, and assessment techniques, the comparison of such reports indicates preliminary, converging evidence for sleep disturbances as an empirical risk factor for suicidal behaviors, while highlighting important, future directions for increased investigation.
Clausen, J L; Georgian, T; Gardner, K H; Douglas, T A
2018-01-01
This study compares conventional grab sampling to incremental sampling methodology (ISM) for characterizing metal contamination at a military small-arms range. Grab sample results had large variances, positively skewed non-normal distributions, extreme outliers, and poor agreement between duplicate samples, even when samples were co-located within tens of centimeters of each other. The extreme outliers strongly influenced the grab sample means for the primary contaminants lead (Pb) and antimony (Sb). In contrast, median and mean metal concentrations were similar for the ISM samples. ISM significantly reduced the measurement uncertainty of estimates of the mean, increasing data quality (e.g., for environmental risk assessments) with fewer samples (e.g., decreasing total project costs). Based on Monte Carlo resampling simulations, grab sampling resulted in highly variable means and upper confidence limits of the mean relative to ISM.
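The Monte Carlo comparison can be illustrated with a toy simulation: a skewed site distribution with rare extreme outliers, grab means of a few discrete samples versus ISM replicates that composite many increments. The distribution parameters, sample counts and increment numbers are assumptions, not the study's values:

```python
import random
import statistics

rng = random.Random(7)

def site_concentration():
    """Hypothetical Pb concentration (mg/kg): skewed, with rare extreme outliers."""
    base = rng.lognormvariate(4.0, 1.0)
    return base + (5000.0 if rng.random() < 0.01 else 0.0)

def grab_mean(n=9):
    """Mean of a few discrete grab samples."""
    return statistics.mean(site_concentration() for _ in range(n))

def ism_mean(increments=30, replicates=3):
    """Each ISM replicate physically composites many increments before analysis."""
    reps = [statistics.mean(site_concentration() for _ in range(increments))
            for _ in range(replicates)]
    return statistics.mean(reps)

# Resample each design many times and compare the spread of the estimated means.
grab_means = [grab_mean() for _ in range(1000)]
ism_means  = [ism_mean() for _ in range(1000)]

print("spread of grab means:", round(statistics.stdev(grab_means), 1))
print("spread of ISM means: ", round(statistics.stdev(ism_means), 1))
```

The ISM means average over roughly ten times as many increments, so their spread (and hence the upper confidence limit of the mean) is far smaller, which is the pattern the study reports.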
Ferreira, Christiane Alves; Loureiro, Carlos Alfredo Salles; Saconato, Humberto; Atallah, Alvaro Nagib
2011-03-01
Well-conducted randomized controlled trials (RCTs) represent the highest level of evidence when the research question relates to the effect of therapeutic or preventive interventions. However, the degree of control over bias between RCTs presents great variability between studies. For this reason, with the increasing interest in and production of systematic reviews and meta-analyses, it has been necessary to develop methodology supported by empirical evidence, so as to encourage and enhance the production of valid RCTs with low risk of bias. The aim here was to conduct a methodological analysis within the field of dentistry, regarding the risk of bias in open-access RCTs available in the Lilacs (Literatura Latino-Americana e do Caribe em Ciências da Saúde) database. This was a methodology study conducted at Universidade Federal de São Paulo (Unifesp) that assessed the risk of bias in RCTs, using the following dimensions: allocation sequence generation, allocation concealment, blinding, and data on incomplete outcomes. Out of the 4,503 articles classified, only 10 studies (0.22%) were considered to be true RCTs and, of these, only a single study was classified as presenting low risk of bias. The items that the authors of these RCTs most frequently controlled for were blinding and data on incomplete outcomes. The effective presence of bias seriously weakened the reliability of the results from the dental studies evaluated, such that they would be of little use for clinicians and administrators as support for decision-making processes.
Landslide Risk: Economic Valuation in The North-Eastern Zone of Medellin City
NASA Astrophysics Data System (ADS)
Vega, Johnny Alexander; Hidalgo, César Augusto; Johana Marín, Nini
2017-10-01
Natural disasters of a geodynamic nature can cause enormous economic and human losses. The economic costs of a landslide disaster include relocation of communities and physical repair of urban infrastructure. However, quantitative risk analyses generally do not take into account the indirect economic consequences of such an event. A probabilistic methodology is proposed that considers several hazard and vulnerability scenarios to measure the magnitude of the landslide and to quantify the economic costs. With this approach, it is possible to carry out a quantitative evaluation of landslide risk, allowing the economic losses from a potential disaster to be calculated in an objective, standardized and reproducible way, taking into account the uncertainty of the building costs in the study zone. The possibility of comparing different scenarios facilitates the urban planning process, the optimization of interventions to reduce risk to acceptable levels, and the assessment of economic losses according to the magnitude of the damage. For the development and explanation of the proposed methodology, a simple case study is presented, located in the north-eastern zone of the city of Medellín. This area has particular geomorphological characteristics and is also characterized by the presence of several buildings in poor structural condition. The methodology yields an estimate of the probable economic losses from earthquake-induced landslides; the estimate obtained shows that structural intervention of the buildings reduces the total landslide risk by about 21%.
Jozi, S A; Shafiee, M; MoradiMajd, N; Saffarian, S
2012-11-01
This study uses an integrated Shannon's Entropy-TOPSIS methodology for environmental risk assessment of the Helleh protected area in Iran. First, based on field visits, interviews with natives of the area, and investigation of the environment of the study area, the existing risks in the region were identified; the Delphi method was then applied for their final identification. Analysis and prioritization of the risks of the Helleh area were performed with the multi-criteria decision-making methods of Shannon's Entropy and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS). Risks were assessed against three criteria: severity, probability of occurrence, and vulnerability. Twenty-six risks were identified and grouped into two categories, natural events and environmental risks; the environmental risks were classified into four groups: physicochemical, biological, socio-economic, and cultural. Results showed that the construction of the Rayis-Ali-Delvari Dam at the upper part of the study area threatens the wetland: water supply for the dam, 75 km away from the area, holds the first priority among risk-generating factors with a closeness coefficient of 0.9999. Among the workable managerial solutions suggested for controlling the risks are stopping the pumping of water from the wetland and enforcing the hunting season length and the permissible type and number of hunted animals in the area.
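The TOPSIS ranking step (vector normalization, weighting, distance to the ideal and anti-ideal points, closeness coefficient) can be sketched compactly. The risk names, 1-9 scores and the "Shannon-entropy" weights below are hypothetical:

```python
import math

# Hypothetical risk scores: (severity, probability, vulnerability), each 1-9.
risks = {
    "dam water supply": (8, 7, 9),
    "overgrazing":      (5, 6, 4),
    "illegal hunting":  (6, 5, 5),
}
weights = (0.5, 0.3, 0.2)  # assumed entropy-derived criterion weights

# Vector-normalize each criterion column, then apply the weights.
names = list(risks)
cols = list(zip(*risks.values()))
norms = [math.sqrt(sum(x * x for x in col)) for col in cols]
matrix = {n: [w * x / s for x, s, w in zip(risks[n], norms, weights)] for n in names}

# Ideal / anti-ideal points (all criteria treated as "larger = higher priority").
ideal = [max(col) for col in zip(*matrix.values())]
anti  = [min(col) for col in zip(*matrix.values())]

def closeness(row):
    """Relative closeness to the ideal solution, in [0, 1]."""
    d_plus = math.dist(row, ideal)
    d_minus = math.dist(row, anti)
    return d_minus / (d_plus + d_minus)

ranking = sorted(names, key=lambda n: closeness(matrix[n]), reverse=True)
print(ranking)  # highest-priority risk first
```

A risk that dominates on every criterion gets a closeness coefficient of 1.0, which is how a near-unity value such as the 0.9999 reported for the dam arises.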
An overview of safety assessment, regulation, and control of hazardous material use at NREL
NASA Astrophysics Data System (ADS)
Nelson, B. P.; Crandall, R. S.; Moskowitz, P. D.; Fthenakis, V. M.
1992-12-01
This paper summarizes the methodology we use to ensure the safe use of hazardous materials at the National Renewable Energy Laboratory (NREL). First, we analyze the processes and the materials used in those processes to identify the hazards presented. Then we study federal, state, and local regulations and apply the relevant requirements to our operations. When necessary, we generate internal safety documents to consolidate this information. We design research operations and support systems to conform to these requirements. Before we construct the systems, we perform a semiquantitative risk analysis on likely accident scenarios. All scenarios presenting an unacceptable risk require system or procedural modifications to reduce the risk. Following these modifications, we repeat the risk analysis to ensure that the respective accident scenarios present an acceptable risk. Once all risks are acceptable, we conduct an operational readiness review (ORR). A management-appointed panel performs the ORR ensuring compliance with all relevant requirements. After successful completion of the ORR, operations can begin.
2009 Space Shuttle Probabilistic Risk Assessment Overview
NASA Technical Reports Server (NTRS)
Hamlin, Teri L.; Canga, Michael A.; Boyer, Roger L.; Thigpen, Eric B.
2010-01-01
Loss of a Space Shuttle during flight has severe consequences, including loss of a significant national asset; loss of national confidence and pride; and, most importantly, loss of human life. The Shuttle Probabilistic Risk Assessment (SPRA) is used to identify risk contributors and their significance, thus assisting management in determining how to reduce risk. In 2006, an overview of the SPRA Iteration 2.1 was presented at PSAM 8 [1]. Like all successful PRAs, the SPRA is a living PRA and has undergone revisions since PSAM 8. The latest revision to the SPRA is Iteration 3.1, and it will not be the last as the Shuttle program progresses and more is learned. This paper discusses the SPRA scope, overall methodology, and results, and provides risk insights. The scope, assumptions, uncertainties, and limitations of this assessment provide risk-informed perspective to aid management's decision-making process. In addition, this paper compares the Iteration 3.1 analysis and results to the Iteration 2.1 analysis and results presented at PSAM 8.
Risk assessment associated to possible concrete degradation of a near surface disposal facility
NASA Astrophysics Data System (ADS)
Capra, B.; Billard, Y.; Wacquier, W.; Gens, R.
2013-07-01
This article outlines a risk analysis of possible concrete degradation performed in the framework of the preparation of the Safety Report of ONDRAF/NIRAS, the Belgian Agency for Radioactive Waste and Enriched Fissile Materials, for the construction and operation of a near surface disposal facility of category A waste - short-lived low and intermediate level waste - in Dessel. The main degradation mechanism considered is the carbonation of different concrete components over different periods (from the building phase up to 2000 years), which induces corrosion of the rebars. A dedicated methodology mixing risk analysis and numerical modeling of concrete carbonation has been developed to assess the critical risks of the disposal facility at different periods. According to the results obtained, risk mapping was used to assess the impact of carbonation of concrete on the different components at the different stages. The most important risk is related to an extreme situation with complete removal of the earth cover and side embankment.
Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno
2006-03-31
In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper presents the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key point in risk assessment and serves as the basis for the whole risk quantification. The first result of the work is the construction of a methodology for the identification of major accident hazards (MIMAH), carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called the methodology for the identification of reference accident scenarios (MIRAS), takes into account the influence of safety systems on both the frequencies and possible consequences of accidents, leading to more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called the "risk matrix", which crosses the frequency and the consequences of accidents. This paper presents both methodologies and an application to an ethylene oxide storage facility.
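The risk-matrix selection of reference accident scenarios (crossing a frequency class with a consequence class) can be sketched as follows; the scenario data, class thresholds and the set of matrix cells requiring detailed study are illustrative assumptions, not ARAMIS values:

```python
# Hypothetical scenarios: (frequency in events/yr, consequence class C1..C4).
scenarios = {
    "small leak, ignited":      (1e-3, "C2"),
    "vessel rupture, VCE":      (1e-6, "C4"),
    "pool fire at loading bay": (1e-4, "C3"),
}

# Frequency classes, ordered from most to least frequent (assumed thresholds).
FREQ_CLASSES = [(1e-2, "F4"), (1e-4, "F3"), (1e-6, "F2"), (0.0, "F1")]

def frequency_class(freq):
    for threshold, label in FREQ_CLASSES:
        if freq >= threshold:
            return label

# Matrix cells judged severe enough to warrant a detailed reference scenario.
REFERENCE_CELLS = {("F4", "C2"), ("F4", "C3"), ("F4", "C4"),
                   ("F3", "C3"), ("F3", "C4"), ("F2", "C4")}

for name, (freq, cons) in scenarios.items():
    cell = (frequency_class(freq), cons)
    tag = "reference scenario" if cell in REFERENCE_CELLS else "screened out"
    print(f"{name}: {cell} -> {tag}")
```

The diagonal shape of `REFERENCE_CELLS` encodes the usual trade-off: rare events qualify only if their consequences are severe, frequent events qualify at lower consequence levels.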
Varieties of second modernity: the cosmopolitan turn in social and political theory and research.
Beck, Ulrich; Grande, Edgar
2010-09-01
The theme of this special issue is the necessity of a cosmopolitan turn in social and political theory. The question at the heart of this introductory chapter takes the challenge of 'methodological cosmopolitanism', already addressed in a Special Issue on Cosmopolitan Sociology in this journal (Beck and Sznaider 2006), an important step further: How can social and political theory be opened up, theoretically as well as methodologically and normatively, to a historically new, entangled Modernity which threatens its own foundations? How can it account for the fundamental fragility, the mutability of societal dynamics (of unintended side effects, domination and power), shaped by the globalization of capital and risks at the beginning of the twenty-first century? What theoretical and methodological problems arise and how can they be addressed in empirical research? In the following, we will develop this 'cosmopolitan turn' in four steps: firstly, we present the major conceptual tools for a theory of cosmopolitan modernities; secondly, we de-construct Western modernity by using examples taken from research on individualization and risk; thirdly, we address the key problem of methodological cosmopolitanism, namely the problem of defining the appropriate unit of analysis; and finally, we discuss normative questions, perspectives, and dilemmas of a theory of cosmopolitan modernities, in particular problems of political agency and prospects of political realization.
Flood risk analysis model in the village of St. George/Danube Delta
NASA Astrophysics Data System (ADS)
Armas, I.; Dumitrascu, S.; Nistoran, D.
2009-04-01
River deltas may have been cradles for prehistoric civilizations (Day et al. 2007) and still represent favoured areas for human habitats on the basis of their high productivity, biodiversity and favourable economic conditions for river transport (Giosan and Bhattacharya 2005). At the same time, these regions are defined by their high vulnerability to environmental changes, being extremely susceptible to natural disasters, especially floods. The Danube Delta, with an area of 5640 km2, is the largest ecosystem of the European humid zones. Its state reflects environmental conditions at both local and regional levels via liquid and solid parameters, and it has to ensure the water supply for the local economy and communities. Flooding of the delta is important for the dynamics of the entire natural system. Floods sustain both alluvial processes and the water supply to deltaic lakes. In addition, flooding frequency is important in flushing the deltaic lake system water, ensuring a normal evolution of both terrestrial and aquatic ecosystems. For human communities, on the other hand, floods are perceived as a risk factor, entailing material damage, human victims and psychological stress. From the perspective of risk assessment research, every populated place faces a certain risk from disaster, the size of which depends on the specific location, existing hazards, vulnerability and the number of elements at risk. Although natural hazards are currently a main subject of interest on a global scale, a unitary methodological approach has yet to be developed. In the general context of hazard analysis, more emphasis needs to be placed on risk analysis proper; in most cases it focuses only on an assessment of the probable material damage resulting from a specific risk scenario. Taking these matters into consideration, the aim of this study is to develop an efficient flood risk assessment methodology based on the example of the village of St. George in the Danube Delta.
The study area is situated at the mouth of the St. George river branch, which underwent a series of interventions that shortened it by 31 km (1984-1988). As a direct result, the mean water velocity increased, along with both the liquid and solid discharges. This is only one example of the human activity, beginning in the second half of the last century, that altered the hydrological system of the Danube Delta to make better use of the natural resources offered by the delta. The study is structured in two stages: the analysis of the hydrological hazard, together with the simulation of a series of flood scenarios at various flows, and the risk analysis, expressed as a calculation of the material damage. For the hazard study, the methodology was based on the analysis of water depth and velocity maps produced for various flow scenarios, complemented by correlating flood risk maps with satellite images, cadastral plans and field data using GIS functions. In addition, the field investigations conducted in September 2008 focused on collecting the data necessary for the assessment of the buildings. The observations synthesizing the features of each construction included in the analysis were stored in ArcGIS as an attribute table. This information provides the indicators used in the analysis of the vulnerability of the residences: number of floors, height, construction type, infrastructure and price per property. The analysis offered a detailed view of the area, pointing out not only the sectors affected by floods but also the problems occurring at the more detailed level of individual residences. In addition, the cartographic material also plays an important part in the development of a proper public awareness strategy.
A Framework for the Next Generation of Risk Science
Krewski, Daniel; Andersen, Melvin E.; Paoli, Gregory M.; Chiu, Weihsueh A.; Al-Zoughool, Mustafa; Croteau, Maxine C.; Burgoon, Lyle D.; Cote, Ila
2014-01-01
Objectives: In 2011, the U.S. Environmental Protection Agency initiated the NexGen project to develop a new paradigm for the next generation of risk science. Methods: The NexGen framework was built on three cornerstones: the availability of new data on toxicity pathways made possible by fundamental advances in basic biology and toxicological science, the incorporation of a population health perspective that recognizes that most adverse health outcomes involve multiple determinants, and a renewed focus on new risk assessment methodologies designed to better inform risk management decision making. Results: The NexGen framework has three phases. Phase I (objectives) focuses on problem formulation and scoping, taking into account the risk context and the range of available risk management decision-making options. Phase II (risk assessment) seeks to identify critical toxicity pathway perturbations using new toxicity testing tools and technologies, and to better characterize risks and uncertainties using advanced risk assessment methodologies. Phase III (risk management) involves the development of evidence-based population health risk management strategies of a regulatory, economic, advisory, community-based, or technological nature, using sound principles of risk management decision making. Conclusions: Analysis of a series of case study prototypes indicated that many aspects of the NexGen framework are already beginning to be adopted in practice. Citation: Krewski D, Westphal M, Andersen ME, Paoli GM, Chiu WA, Al-Zoughool M, Croteau MC, Burgoon LD, Cote I. 2014. A framework for the next generation of risk science. Environ Health Perspect 122:796–805; http://dx.doi.org/10.1289/ehp.1307260 PMID:24727499
Antimicrobial Resistance in the Environment.
Waseem, Hassan; Williams, Maggie R; Stedtfeld, Robert D; Hashsham, Syed A
2017-10-01
This review summarizes selected publications of 2016, with emphasis on the occurrence and treatment of antibiotic resistance genes and bacteria in the aquatic environment and in wastewater and drinking water treatment plants. The review focuses on fate, modeling, risk assessment, and data analysis methodologies for characterizing abundance. After a brief introduction, the review is divided into the following four sections: i) Occurrence of antimicrobial resistance (AMR) in the Environment, ii) Treatment Technologies for AMR, iii) Modeling of Fate, Risk, and Environmental Impact of AMR, and iv) ARG Databases and Pipelines.
NASA Astrophysics Data System (ADS)
Schinke, R.; Neubert, M.; Hennersdorf, J.; Stodolny, U.; Sommer, T.; Naumann, T.
2012-09-01
The analysis and management of flood risk commonly focus on surface water floods, because these are often associated with high economic losses due to damage to buildings and settlements. Rising groundwater, as a secondary effect of these floods, induces additional damage, particularly in the basements of buildings. These losses mostly remain underestimated, because they are difficult to assess, especially for the entire building stock of flood-prone urban areas. For this purpose an appropriate methodology has been developed, leading to a groundwater damage simulation model named GRUWAD. The overall methodology combines various engineering and geoinformatic methods to calculate the major damage processes caused by high groundwater levels. It considers a classification of buildings by building type, synthetic depth-damage functions for groundwater inundation, and the results of a groundwater-flow model. The modular structure of the procedure can be adapted to the required level of detail; hence, the model allows damage calculations from the local to the regional scale. Among other uses, it can prepare risk maps, support ex-ante analysis of future risks, and simulate the effects of mitigation measures. The model is therefore a versatile tool for determining urban resilience with respect to high groundwater levels.
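The combination of a building-type classification with synthetic depth-damage functions can be sketched as a simple piecewise-linear lookup; the building types, curve points and inventory below are invented for illustration, not GRUWAD data:

```python
# Hypothetical synthetic depth-damage curves per building type:
# groundwater depth above the basement floor (m) -> damage (EUR).
CURVES = {
    "prewar_masonry": [(0.0, 0), (0.5, 4000), (1.0, 9000), (2.0, 15000)],
    "slab_concrete":  [(0.0, 0), (0.5, 2500), (1.0, 6000), (2.0, 11000)],
}

def damage(building_type, depth):
    """Piecewise-linear interpolation on the synthetic depth-damage curve."""
    pts = CURVES[building_type]
    if depth <= pts[0][0]:
        return pts[0][1]
    for (d0, c0), (d1, c1) in zip(pts, pts[1:]):
        if depth <= d1:
            return c0 + (c1 - c0) * (depth - d0) / (d1 - d0)
    return pts[-1][1]  # beyond the last point, damage saturates

# Aggregate over a (hypothetical) building inventory with modeled groundwater
# levels, e.g. taken from a groundwater-flow model per building.
inventory = [("prewar_masonry", 0.8), ("slab_concrete", 1.2), ("prewar_masonry", 0.2)]
total = sum(damage(t, d) for t, d in inventory)
print(f"total basement damage: {total:.0f} EUR")
```

Summing such per-building estimates over an entire flood-prone district is what allows the model to scale from local to regional damage calculations.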
Biosecurity Risk Assessment Methodology (BioRAM) v. 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
CASKEY, SUSAN; GAUDIOSO, JENNIFER; SALERNO, REYNOLDS
The Sandia National Laboratories International Biological Threat Reduction Department (SNL/IBTR) has an ongoing mission to enhance biosecurity assessment methodologies, tools, and guidance. These will aid labs seeking to implement biosecurity as advocated in the recently released WHO Biorisk Management: Laboratory Biosecurity Guidance. BioRAM 2.0 is a software tool initially developed through the SNL LDRD process and designed to complement the "Laboratory Biosecurity Risk Handbook" written by Ren Salerno and Jennifer Gaudioso, which defines biosecurity risk assessment methodologies.
Chung, Ka-Fai; Chan, Man-Sum; Lam, Ying-Yin; Lai, Cindy Sin-Yee; Yeung, Wing-Fai
2017-06-01
Insufficient sleep among students is a major school health problem. School-based sleep education programs, tailored to reach large numbers of students, may be one of the solutions. A systematic review and meta-analysis was conducted to summarize the effectiveness and current status of such programs. Electronic databases were searched up until May 2015. Randomized controlled trials of school-based sleep interventions among 10- to 19-year-old students with an outcome on total sleep duration were included. The methodological quality of the studies was assessed using the Cochrane risk-of-bias assessment. Seven studies were included, involving 1876 students receiving sleep education programs and 2483 attending classes as usual. Four weekly 50-minute sleep education classes were most commonly provided. Methodological quality was only moderate, with a high or an uncertain risk of bias in several domains. Compared to classes as usual, sleep education programs produced significantly longer weekday and weekend total sleep time and better mood among students at immediate post-treatment, but the improvements were not maintained at follow-up. Although limited by the small number of studies and their methodological shortcomings, the preliminary data show that school-based sleep education programs produce short-term benefits. Future studies should explore integrating sleep education with delayed school start times or other more effective approaches. © 2017, American School Health Association.
Fragility Analysis of Concrete Gravity Dams
NASA Astrophysics Data System (ADS)
Tekie, Paulos B.; Ellingwood, Bruce R.
2002-09-01
Concrete gravity dams are an important part of the nation's infrastructure. Many dams have been in service for over 50 years, during which time important advances in the methodologies for evaluating natural phenomena hazards have caused the design-basis events to be revised upwards, in some cases significantly. Many existing dams fail to meet these revised safety criteria, and structural rehabilitation to meet the newly revised criteria may be costly and difficult. A probabilistic safety analysis (PSA) provides a rational safety assessment and decision-making tool for managing the various sources of uncertainty that may impact dam performance. Fragility analysis, which depicts the uncertainty in the safety margin above specified hazard levels, is a fundamental tool in a PSA. This study presents a methodology for developing fragilities of concrete gravity dams to assess their performance against hydrologic and seismic hazards. Models of varying degrees of complexity and sophistication were considered and compared. The methodology is illustrated using the Bluestone Dam on the New River in West Virginia, which was designed in the late 1930s. The hydrologic fragilities showed that the Bluestone Dam is unlikely to become unstable at the revised probable maximum flood (PMF), but significant cracking at the heel of the dam is likely. On the other hand, the seismic fragility analysis indicated that sliding is likely if the dam were subjected to a maximum credible earthquake (MCE), and that tensile cracking at the neck of the dam is likely at this level of seismic excitation. Probabilities of relatively severe limit states appear to be only marginally affected by extremely rare events (e.g., the PMF and MCE). Moreover, the risks posed by extreme floods and earthquakes were not balanced for the Bluestone Dam, with the seismic hazard posing a relatively higher risk.
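A lognormal fragility curve of the kind used in such analyses, giving the probability of reaching a limit state as a function of the hazard demand, can be sketched as follows. The median capacity and logarithmic standard deviation for the sliding limit state are hypothetical, not values from the Bluestone study:

```python
import math

def fragility(demand, median_capacity, beta):
    """Lognormal fragility: P(limit state | demand)
    = Phi(ln(demand / median_capacity) / beta)."""
    z = math.log(demand / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Hypothetical sliding limit state: median capacity 0.6 g PGA, beta = 0.4
# (beta lumps aleatory and epistemic uncertainty in the safety margin).
for pga in (0.2, 0.4, 0.6, 0.8):
    print(f"PGA {pga:.1f} g -> P(sliding) = {fragility(pga, 0.6, 0.4):.2f}")
```

By construction the curve passes through 0.5 at the median capacity; convolving such curves with the site's hazard curve yields the annual limit-state probabilities used in the PSA.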
Angerville, Ruth; Perrodin, Yves; Bazin, Christine; Emmanuel, Evens
2013-01-01
Discharges of Combined Sewer Overflows (CSOs) into periurban rivers present risks for the aquatic ecosystems concerned. In this work, a specific ecotoxicological risk assessment methodology has been developed as a management tool for municipalities equipped with CSOs. This methodology comprises a detailed description of the spatio-temporal system involved, the choice of ecological targets to be preserved, and the performance of bioassays adapted to each compartment of the river receiving CSOs. Once formulated, this methodology was applied to a river flowing through the outskirts of the city of Lyon in France. The results obtained for the scenario studied showed a moderate risk for organisms of the water column and a major risk for organisms of the benthic and hyporheic zones of the river. The methodology enabled identifying the critical points of the spatio-temporal systems studied, and then making proposals for improving the management of CSOs. PMID:23812025
Barili, Fabio; Freemantle, Nick; Folliguet, Thierry; Muneretto, Claudio; De Bonis, Michele; Czerny, Martin; Obadia, Jean Francois; Al-Attar, Nawwar; Bonaros, Nikolaos; Kluin, Jolanda; Lorusso, Roberto; Punjabi, Prakash; Sadaba, Rafael; Suwalski, Piotr; Benedetto, Umberto; Böning, Andreas; Falk, Volkmar; Sousa-Uva, Miguel; Kappetein, Pieter A; Menicanti, Lorenzo
2017-06-01
The PARTNER group recently published a comparison between the latest generation SAPIEN 3 transcatheter aortic valve implantation (TAVI) system (Edwards Lifesciences, Irvine, CA, USA) and surgical aortic valve replacement (SAVR) in intermediate-risk patients, apparently demonstrating superiority of TAVI and suggesting that TAVI might be the preferred treatment method in this risk class of patients. Nonetheless, assessment of the non-randomized methodology used in this comparison reveals challenges that should be addressed in order to elucidate the validity of the results. The study by Thourani and colleagues showed several major methodological concerns: suboptimal methods in propensity score analysis with evident misspecification of the propensity scores (PS; no adjustment for the most significantly different covariates: left ventricular ejection fraction, moderate-severe mitral regurgitation and associated procedures); use of PS quintiles rather than matching; inference on unadjusted Kaplan-Meier curves, although the authors themselves correctly noted the need for balancing-score adjustment for confounding factors in order to obtain unbiased estimates of the treatment effect; evidence of poor fit; and lack of data on valve-related death. These methodological flaws invalidate direct comparison between treatments and cannot support the authors' conclusions that TAVI with SAPIEN 3 in intermediate-risk patients is superior to surgery and might be the preferred treatment alternative to surgery. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Management of contaminated marine marketable resources after oil and HNS spills in Europe.
Cunha, Isabel; Neuparth, Teresa; Moreira, Susana; Santos, Miguel M; Reis-Henriques, Maria Armanda
2014-03-15
Different risk evaluation approaches have been used to face oil and hazardous and noxious substances (HNS) spills all over the world. To minimize health risks and mitigate economic losses due to a long-term ban on the sale of sea products after a spill, it is essential to preemptively set risk evaluation criteria and standard methodologies based on previous experience and appropriate scientifically sound criteria. Standard methodologies are analyzed and proposed in order to improve the definition of criteria for reintegrating previously contaminated marine marketable resources into the commercialization chain in Europe. The criteria used in former spills for closing fisheries and harvesting, and for lifting those bans, are analyzed. European legislation was identified regarding food sampling, food chemical analysis and maximum levels of contaminants allowed in seafood, which ought to be incorporated in the standard methodologies for the evaluation of the decision criteria defined for oil and HNS spills in Europe. A decision flowchart is proposed that opens the current decision criteria to new material that may be incorporated in the decision process. Decision criteria are discussed and compared among countries and incidents. An a priori definition of risk criteria and an elaboration of action plans are proposed to speed up actions that will lead to prompt final decisions. These decisions, based on the best available scientific data and leading to the lifting or imposition of bans on economic activity, will tend to be better understood and respected by citizens. Copyright © 2014 Elsevier Ltd. All rights reserved.
Romualdo, Priscilla Coutinho; de Oliveira, Katharina Morant Holanda; Nemezio, Mariana Alencar; Küchler, Erika Calvano; Silva, Raquel Assed Bezerra; Nelson-Filho, Paulo; Silva, Lea Assed Bezerra
2017-12-01
The aim of this study was to evaluate if apical negative pressure (ANP) irrigation prevents the apical extrusion of debris and irrigant compared with conventional needle irrigation through a systematic review and meta-analysis. A computer search of dental literature was performed using four different databases. A combination of the terms 'apical negative pressure', 'endovac', 'apical extrusion', 'extrusion' and 'endodontics' was used. Studies that used extracted human teeth with a mature apex and that evaluated the apical extrusion of debris and/or irrigating solution were included. After an evaluation of the full studies according to the eligibility criteria, eight studies were critically analysed and subjected to quality assessment and risk of bias. Only four studies that evaluated extrusion of irrigant were considered as having high methodological quality and were subjected to a meta-analysis. Studies evaluating extrusion of debris did not have sufficient methodological quality to be subjected to the meta-analysis. The forest plot indicated that ANP irrigation prevents the risk of irrigant extrusion compared with conventional irrigation (OR 0.07 [95%CI 0.02-0.20]; P < 0.00001). This systematic review and meta-analysis showed that ANP prevents the apical extrusion of irrigant. There is no evidence if this type of irrigation prevents the extrusion of debris. © 2017 Australian Society of Endodontology Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
1998-10-01
The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while they prepare for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential of post-closure criticality, the consequences of a criticality excursion, and the probability frequency for post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
DOT National Transportation Integrated Search
1981-02-01
The report develops a set of operational definitions for three unsafe driving actions (UDAs): speeding, following too closely, and driving left of center. The definitions flow from a methodological development and from an analysis of the literature a...
The Effect of Political Stability on Public Education Quality
ERIC Educational Resources Information Center
Nir, Adam E.; Kafle, Bhojraj Sharma
2013-01-01
Purpose: The purpose of this paper is to provide a preliminary analysis to evaluate the implications of political stability for educational quality, evident in the survival rate measure. Design/methodology/approach: Secondary analyses were conducted for data drawn from the Political Risk Service Report, the World Bank Report, the United Nations…
ERIC Educational Resources Information Center
Rosique-Blasco, Mario; Madrid-Guijarro, Antonia; García-Pérez-de-Lema, Domingo
2016-01-01
Purpose: The purpose of this paper is to explore how entrepreneurial skills (such as creativity, proactivity and risk tolerance) and socio-cultural factors (such as role model and businessman image) affect secondary education students' propensity towards entrepreneurial options in their future careers. Design/methodology/approach: A sample of…
USDA-ARS?s Scientific Manuscript database
The primary goal of this study was to evaluate the efficacy of stochastic dominance and stochastic efficiency with respect to a function (SERF) methodology for ranking conventional and conservation tillage systems using 14 years (1990-2003) of economic budget data collected from 36 plots at the Iowa...
Revista de Investigacion Educativa, 1998 (Journal of Educational Research, 1998).
ERIC Educational Resources Information Center
Revista de Investigacion Educativa, 1998
1998-01-01
Articles in this volume focus on the following: specialized research; methodological challenges; establishment of a categorization system for sociometric analysis and its application in the multicultural classroom; a case study of factors to prevent school failure in children at risk; the KeyMatch-R scale (study of a curriculum-related diagnostic…
Software for Probabilistic Risk Reduction
NASA Technical Reports Server (NTRS)
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
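The "best suite of process steps within budget" optimization described above is, at its core, a constrained selection problem. A minimal sketch under invented numbers (action names, costs, and expected risk reductions are all hypothetical, and the real DDP tool models requirements and mitigation effects far more richly):

```python
from itertools import combinations

# Hypothetical mitigation actions: (name, cost, expected risk reduction).
actions = [
    ("code review",     3.0, 5.0),
    ("fault injection", 4.0, 6.0),
    ("formal spec",     6.0, 9.0),
    ("extra testing",   2.0, 2.5),
]
BUDGET = 8.0

def best_plan(actions, budget):
    """Exhaustively pick the subset of actions that maximizes total
    expected risk reduction while its total cost stays within budget."""
    best, best_gain = (), 0.0
    for r in range(len(actions) + 1):
        for subset in combinations(actions, r):
            cost = sum(a[1] for a in subset)
            gain = sum(a[2] for a in subset)
            if cost <= budget and gain > best_gain:
                best, best_gain = subset, gain
    return [a[0] for a in best], best_gain

plan, gain = best_plan(actions, BUDGET)
print(plan, gain)
```

Exhaustive search is only viable for small action sets; real planning tools use heuristic or evolutionary search over much larger spaces.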
Automating Risk Analysis of Software Design Models
Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
A data protection scheme for a remote vital signs monitoring healthcare service.
Gritzalis, D; Lambrinoudakis, C
2000-01-01
Personal and medical data processed by Healthcare Information Systems must be protected against unauthorized access, modification and withholding. Security measures should be selected to provide the required level of protection in a cost-efficient manner. This is only feasible if specific characteristics of the information system are examined on the basis of a risk analysis methodology. This paper presents the results of a risk analysis, based on the CRAMM methodology, for a healthcare organization offering a patient home-monitoring service through the transmission of vital signs, focusing on the identified security needs and the proposed countermeasures. The architectural and functional models of this service were utilized for identifying and valuating the system assets, the associated threats and vulnerabilities, as well as for assessing the impact on the patients and on the service provider, should the security of any of these assets be affected. A set of adequate organizational, administrative and technical countermeasures is described for the remote vital signs monitoring service, thus providing the healthcare organization with a data protection framework that can be utilized for the development of its own security plan.
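Methods in the CRAMM family derive a qualitative risk level for each asset from its valuation and the assessed threat and vulnerability levels. The sketch below is a heavily simplified illustration (the scales, multiplication rule, band thresholds, and asset names are all invented; CRAMM itself uses its own lookup matrices):

```python
def risk_level(asset_value, threat, vulnerability):
    """Combine a 1-10 asset valuation with 1-5 threat and 1-5 vulnerability
    levels into a qualitative risk band (illustrative banding only)."""
    score = asset_value * threat * vulnerability
    if score >= 100:
        return "high"
    if score >= 30:
        return "medium"
    return "low"

# Hypothetical assets of a vital-signs home-monitoring service.
assets = {
    "patient vital-signs records": (9, 4, 3),
    "transmission link":           (6, 3, 2),
    "monitoring workstation":      (4, 2, 2),
}
for name, (value, threat, vuln) in assets.items():
    print(f"{name}: {risk_level(value, threat, vuln)}")
```

The resulting band per asset is what drives countermeasure selection: high-band assets justify stronger (and costlier) controls.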
Eutrophication risk assessment in coastal embayments using simple statistical models.
Arhonditsis, G; Eleftheriadou, M; Karydis, M; Tsirtsis, G
2003-09-01
A statistical methodology is proposed for assessing the risk of eutrophication in marine coastal embayments. The procedure followed was the development of regression models relating the levels of chlorophyll a (Chl) with the concentration of the limiting nutrient--usually nitrogen--and the renewal rate of the systems. The method was applied in the Gulf of Gera, Island of Lesvos, Aegean Sea and a surrogate for renewal rate was created using the Canberra metric as a measure of the resemblance between the Gulf and the oligotrophic waters of the open sea in terms of their physical, chemical and biological properties. The Chl-total dissolved nitrogen-renewal rate regression model was the most significant, accounting for 60% of the variation observed in Chl. Predicted distributions of Chl for various combinations of the independent variables, based on Bayesian analysis of the models, enabled comparison of the outcomes of specific scenarios of interest as well as further analysis of the system dynamics. The present statistical approach can be used as a methodological tool for testing the resilience of coastal ecosystems under alternative managerial schemes and levels of exogenous nutrient loading.
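The core of the approach above is an ordinary least-squares regression of Chl on the limiting nutrient and the renewal-rate surrogate. A self-contained sketch on synthetic data (the numbers are invented, not the Gulf of Gera measurements):

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y,
    solved by Gaussian elimination. Each row of X already includes a
    leading 1 for the intercept."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
         for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for i in range(p):                      # forward elimination
        for j in range(i + 1, p):
            f = A[j][i] / A[i][i]
            for k in range(i, p):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    coef = [0.0] * p                        # back substitution
    for i in reversed(range(p)):
        rest = sum(A[i][k] * coef[k] for k in range(i + 1, p))
        coef[i] = (b[i] - rest) / A[i][i]
    return coef

# Synthetic data: chlorophyll a rising with total dissolved nitrogen (TDN)
# and falling with renewal rate (values invented for illustration).
tdn     = [5, 8, 12, 15, 20, 25]
renewal = [0.9, 0.7, 0.6, 0.4, 0.3, 0.2]
chl     = [0.5, 1.1, 1.8, 2.6, 3.5, 4.4]
X = [[1.0, t, r] for t, r in zip(tdn, renewal)]
b0, b_tdn, b_renew = fit_linear(X, chl)
print(f"Chl = {b0:.2f} + {b_tdn:.2f}*TDN + {b_renew:.2f}*renewal")
```

In practice one would also report R² (the paper cites 60% of Chl variation explained) and, as the authors do, propagate parameter uncertainty through a Bayesian analysis rather than a point fit.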
Tolaymat, Thabet; El Badawy, Amro; Sequeira, Reynold; Genaidy, Ash
2015-11-15
There is an urgent need for broad and integrated studies that address the risks of engineered nanomaterials (ENMs) along the different endpoints of the society, environment, and economy (SEE) complex adaptive system. This article presents an integrated science-based methodology to assess the potential risks of engineered nanomaterials. To achieve the study objective, two major tasks are accomplished, knowledge synthesis and algorithmic computational methodology. The knowledge synthesis task is designed to capture "what is known" and to outline the gaps in knowledge from ENMs risk perspective. The algorithmic computational methodology is geared toward the provision of decisions and an understanding of the risks of ENMs along different endpoints for the constituents of the SEE complex adaptive system. The approach presented herein allows for addressing the formidable task of assessing the implications and risks of exposure to ENMs, with the long term goal to build a decision-support system to guide key stakeholders in the SEE system towards building sustainable ENMs and nano-enabled products. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gore, Bryan F.; Blackburn, Tyrone R.; Heasler, Patrick G.
2001-01-19
The objective of this report is to compare the benefits and costs of modifications proposed for intake gate closure systems at four hydroelectric stations on the Lower Snake and Upper Columbia Rivers in the Walla Walla District that are unable to meet the COE 10-minute closure rule due to the installation of fish screens. The primary benefit of the proposed modifications is to reduce the risk of damage to the station and environs when emergency intake gate closure is required. Consequently, this report presents the results and methodology of an extensive risk analysis performed to assess the reliability of powerhouse systems and the costs and timing of potential damages resulting from events requiring emergency intake gate closure. As part of this analysis, the level of protection provided by the nitrogen emergency closure system was also evaluated. The nitrogen system was the basis for the original recommendation to partially disable the intake gate systems. The risk analysis quantifies this protection level.
NASA Astrophysics Data System (ADS)
Liu, Hu-Chen; Liu, Long; Li, Ping
2014-10-01
Failure mode and effects analysis (FMEA) has shown its effectiveness in examining potential failures in products, processes, designs or services and has been extensively used for safety and reliability analysis in a wide range of industries. However, its approach to prioritise failure modes through a crisp risk priority number (RPN) has been criticised as having several shortcomings. The aim of this paper is to develop an efficient and comprehensive risk assessment methodology using an intuitionistic fuzzy hybrid weighted Euclidean distance (IFHWED) operator to overcome the limitations and improve the effectiveness of the traditional FMEA. The diversified and uncertain assessments given by FMEA team members are treated as linguistic terms expressed in intuitionistic fuzzy numbers (IFNs). An intuitionistic fuzzy weighted averaging (IFWA) operator is used to aggregate the FMEA team members' individual assessments into a group assessment. The IFHWED operator is applied thereafter to the prioritisation and selection of failure modes. Particularly, both subjective and objective weights of risk factors are considered during the risk evaluation process. Finally, a numerical example of risk assessment is given to illustrate the proposed method.
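For context, the traditional crisp RPN that this paper sets out to improve is simply the product of severity, occurrence, and detection scores, each rated 1-10. The failure modes and scores below are hypothetical; the sketch shows why ties and the equal weighting of the three factors are common criticisms.

```python
# Hypothetical failure modes: (severity, occurrence, detection), each 1-10.
failure_modes = {
    "seal leakage":      (7, 4, 3),
    "sensor drift":      (5, 6, 6),
    "weld crack":        (9, 2, 5),
    "controller freeze": (8, 3, 4),
}

# Classic crisp FMEA prioritisation: rank by RPN = S * O * D.
ranked = sorted(failure_modes.items(),
                key=lambda kv: kv[1][0] * kv[1][1] * kv[1][2],
                reverse=True)
for name, (s, o, d) in ranked:
    print(f"{name:18s} RPN = {s * o * d}")
```

Note how a moderate-severity mode can outrank a high-severity one purely through the multiplication; approaches such as the IFHWED operator address this by weighting risk factors and handling assessment uncertainty explicitly.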
Systematic review and meta-analysis of glyphosate exposure and risk of lymphohematopoietic cancers
Chang, Ellen T.; Delzell, Elizabeth
2016-01-01
ABSTRACT This systematic review and meta-analysis rigorously examines the relationship between glyphosate exposure and risk of lymphohematopoietic cancer (LHC) including NHL, Hodgkin lymphoma (HL), multiple myeloma (MM), and leukemia. Meta-relative risks (meta-RRs) were positive and marginally statistically significant for the association between any versus no use of glyphosate and risk of NHL (meta-RR = 1.3, 95% confidence interval (CI) = 1.0–1.6, based on six studies) and MM (meta-RR = 1.4, 95% CI = 1.0–1.9; four studies). Associations were statistically null for HL (meta-RR = 1.1, 95% CI = 0.7–1.6; two studies), leukemia (meta-RR = 1.0, 95% CI = 0.6–1.5; three studies), and NHL subtypes except B-cell lymphoma (two studies each). Bias and confounding may account for observed associations. Meta-analysis is constrained by few studies and a crude exposure metric, while the overall body of literature is methodologically limited and findings are not strong or consistent. Thus, a causal relationship has not been established between glyphosate exposure and risk of any type of LHC. PMID:27015139
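Meta-relative risks of this kind are typically obtained by inverse-variance pooling of log relative risks, with each study's standard error recovered from its confidence interval. A sketch under that standard assumption (the study numbers below are invented, not the studies pooled in this review):

```python
from math import exp, log

def pool_fixed(rrs_with_ci):
    """Fixed-effect inverse-variance pooling of relative risks.
    Each entry is (RR, lower 95% CI, upper 95% CI); the standard error
    of log RR is recovered from the CI width on the log scale."""
    num = den = 0.0
    for rr, lo, hi in rrs_with_ci:
        se = (log(hi) - log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # weight = inverse variance
        num += w * log(rr)
        den += w
    return exp(num / den)

# Hypothetical study results, for illustration only.
studies = [(1.2, 0.8, 1.8), (1.5, 1.0, 2.2), (1.1, 0.7, 1.7)]
print(f"meta-RR = {pool_fixed(studies):.2f}")
```

The pooled estimate always lies within the range of the study estimates, with narrower-CI studies pulling it harder; a random-effects variant would additionally widen weights by the between-study variance.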
NASA Astrophysics Data System (ADS)
Vaynshtok, Natalia
2017-10-01
The article presents the development of a methodology for construction compliance monitoring in the crediting of investment projects for road construction. A work-scope analysis of construction audit was conducted, and an algorithm of financial audit for credited investment projects was developed. Furthermore, the possible pitfalls and abuses of counterparties were investigated, and recommendations were given allowing the bank to receive objective and independent information on the progress of the project in real time. This mechanism is useful to the bank for insuring against possible risks and for the targeted and rational use of credit funds.
Finite element analysis of container ship's cargo hold using ANSYS and POSEIDON software
NASA Astrophysics Data System (ADS)
Tanny, Tania Tamiz; Akter, Naznin; Amin, Osman Md.
2017-12-01
Nowadays ship structural analysis has become an integral part of preliminary ship design, providing further support for the development and detail design of ship structures. Structural analyses of container ships' cargo holds are carried out to balance their safety and capacity, as those ships are exposed to a high risk of structural damage during a voyage. Two different design methodologies have been considered for the structural analysis of a container ship's cargo hold. One is a rule-based methodology and the other is a more conventional software-based analysis. The rule-based analysis is done with DNV-GL's software POSEIDON and the conventional package-based analysis is done with the ANSYS structural module. Both methods have been applied to analyze some of the mechanical properties of the model, such as total deformation, stress-strain distribution, von Mises stress and fatigue, following different design bases and approaches, to provide guidance for further improvements in ship structural design.
Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams
2012-03-01
CB.Hep-1 monoclonal antibody (mAb) is used for the manufacture of a recombinant Hepatitis B vaccine, which is included in a worldwide vaccination program against Hepatitis B disease. The use of this mAb as an immunoligand has been incorporated into one of the most efficient steps of the active pharmaceutical ingredient purification process. Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. FMEA was successfully used to assess the risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The severity and occurrence analysis showed that very high-severity risks accounted for 31.0-38.7% of all risks, and the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Number, in descending order, was: transgenic plants (2636), ascites (2577), transgenic animals (2046) and hollow fiber bioreactors (1654), which also corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
Risk assessment for the mercury polluted site near a pesticide plant in Changsha, Hunan, China.
Dong, Haochen; Lin, Zhijia; Wan, Xiang; Feng, Liu
2017-02-01
The distribution characteristics of mercury fractions at the site near a pesticide plant were investigated, with total mercury concentrations ranging from 0.0250 to 44.3 mg kg(-1). The mercury bound to organic matter and residual mercury were the main fractions, and the most mobile fractions accounted for only 5.9%-9.7%, indicating a relatively low degree of potential risk. The relationships between mercury fractions and soil physicochemical properties were analysed. The results demonstrated that organic matter was one of the most important factors in soil fraction distribution, and both OM and soil pH appeared to have a significant influence on the Fe/Mn-oxide-bound mercury. Together with the methodology of partial correlation analysis, the concept and model of delayed geochemical hazard (DGH) was introduced to reveal the potential transformation paths and chain reactions among different mercury fractions and therefore to gain a better understanding of risk development. The results showed that the site may be classified as a low-risk site of mercury DGH with a probability of 10.5%, but it showed a tendency toward DGH development due to the low critical points of DGH burst. In summary, this study provides a methodology for site risk assessment in terms of static risk and risk development. Copyright © 2016 Elsevier Ltd. All rights reserved.
Stubbs, Brendon; Stubbs, Jean; Gnanaraj, Solomon Donald; Soundy, Andrew
2016-01-01
Depressive symptomology is now widely recognized as a key risk factor for falls. The evidence regarding the impact of major depressive disorder (MDD) on falls is unclear. A systematic review and exploratory meta-analysis was undertaken to explore the relationship between MDD and falls. Major electronic databases were searched from inception until April 2015. Studies that defined MDD and measured falls prospectively in older adults (≥60 years) were included. Studies relying on depressive symptomology alone were excluded. The methodological quality of included articles was assessed and study findings were synthesized using an exploratory meta-analysis. From a potential 415 articles, only three studies met the inclusion criteria. These included 976 unique older adults, with mean ages ranging from ≥65 to 83 years. The methodological quality of the included studies was satisfactory. None of the included studies had as its primary aim the investigation of the relationship between MDD and falls. The exploratory meta-analysis demonstrated that older adults with MDD are at increased risk of falling compared to non-depressed older adults (odds ratio (OR) 4.0, 95% CI 2.0-8.1, I(2) = 60%, n = 976). There is a paucity of research considering falls in older adults with MDD. Our results demonstrate that the odds of falling appear to be greater among people with MDD (OR 4.0) than in previous meta-analyses that have only considered subthreshold depressive symptoms. Given the distinct nature and challenges of MDD, more research is required to better understand the falls risk in this group.
NASA Technical Reports Server (NTRS)
1991-01-01
The technical effort and computer code enhancements performed during the sixth year of the Probabilistic Structural Analysis Methods program are summarized. Various capabilities are described to probabilistically combine structural response and structural resistance to compute component reliability. A library of structural resistance models is implemented in the Numerical Evaluations of Stochastic Structures Under Stress (NESSUS) code that included fatigue, fracture, creep, multi-factor interaction, and other important effects. In addition, a user interface was developed for user-defined resistance models. An accurate and efficient reliability method was developed and was successfully implemented in the NESSUS code to compute component reliability based on user-selected response and resistance models. A risk module was developed to compute component risk with respect to cost, performance, or user-defined criteria. The new component risk assessment capabilities were validated and demonstrated using several examples. Various supporting methodologies were also developed in support of component risk assessment.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky
Corrosion is one of the most common problems in power plants. The heat recovery steam generator (HRSG) is a high-risk item of equipment in the power plant. Corrosion damage can force the HRSG, and hence the power plant, to stop operating; furthermore, it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 have been used for risk analysis in HRSG 1. Using this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study relating to risk analysis of the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each HRSG component was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The results of the risk assessment using the semi-quantitative method of standard API 581 place the existing equipment at medium risk. In fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
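Rankings such as "4C" combine a numeric probability-of-failure category with a lettered consequence category on a 5x5 risk matrix. The banding below is an illustrative simplification, not the matrix defined in API 581 itself:

```python
def risk_rank(prob, cons):
    """Map a probability category (1-5) and a consequence category
    ('A'-'E') to a qualitative risk band (illustrative banding only)."""
    score = prob + (ord(cons) - ord("A") + 1)   # combined score, 2..10
    if score >= 9:
        return "high"
    if score >= 7:
        return "medium-high"
    if score >= 5:
        return "medium"
    return "low"

# The HRSG components and matrix cells reported in the abstract.
for item, cell in [("HP superheater", (4, "C")),
                   ("HP evaporator",  (4, "C")),
                   ("HP economizer",  (3, "C"))]:
    print(f"{item}: {cell[0]}{cell[1]} -> {risk_rank(*cell)}")
```

With this banding, 4C lands in medium-high and 3C in medium, matching the levels quoted in the abstract; inspection intervals are then prioritized by band.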
A methodology for post-mainshock probabilistic assessment of building collapse risk
Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.
2011-01-01
This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
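The risk integral described above, which couples (i) a ground motion hazard curve with (ii) a structural fragility curve, can be approximated numerically by summing failure probability over hazard-rate increments. The power-law hazard and lognormal fragility parameters below are hypothetical placeholders, not values from the paper:

```python
import math

def lognormal_cdf(x, median, beta):
    """Lognormal CDF, the usual parametric form for seismic fragility."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def collapse_rate(hazard, fragility, ims):
    """Numerically couple a hazard curve (annual exceedance rate of the
    intensity measure, IM) with a fragility curve (P[failure | IM])."""
    rate = 0.0
    for lo, hi in zip(ims, ims[1:]):
        mid = 0.5 * (lo + hi)
        d_lambda = hazard(lo) - hazard(hi)  # rate of events in this IM bin
        rate += fragility(mid) * d_lambda
    return rate

# Hypothetical power-law hazard and lognormal fragility (illustrative only).
hazard = lambda im: 1e-4 * im ** -2.5
fragility = lambda im: lognormal_cdf(im, median=1.0, beta=0.4)
ims = [0.05 * i for i in range(1, 200)]
print(f"annual failure rate: {collapse_rate(hazard, fragility, ims):.2e}")
```

In the post-mainshock setting of the paper, the hazard term would be replaced by a time-decaying aftershock rate and the fragility by a damage-conditioned curve; the numerical coupling itself is unchanged.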
Assessing the risk profiles of potentially sensitive populations requires a 'tool chest' of methodological approaches to adequately characterize and evaluate these populations. At present, there is an extensive body of literature on methodologies that apply to the evaluation of...
SURVEY OF METHODOLOGIES FOR DEVELOPING MEDIA SCREENING VALUES FOR ECOLOGICAL RISK ASSESSMENT
Barron, Mace G. and Steve Wharton. Submitted. Survey of Methodologies for Developing Media Screening Values for Ecological Risk Assessment. Environ. Toxicol. Chem. 44 p. (ERL,GB 1200).
Concurrent with the increase in the number of ecological risk assessments over the past...
Sport Injuries Sustained by Athletes with Disability: A Systematic Review.
Weiler, Richard; Van Mechelen, Willem; Fuller, Colin; Verhagen, Evert
2016-08-01
Fifteen percent of the world's population live with disability, and many of these individuals choose to play sport. There are barriers to sport participation for athletes with disability, and sports injuries can greatly impact daily life, which makes sports injury prevention additionally important. The purpose of this review is to systematically review the definitions, methodologies and injury rates in disability sport, which should assist the future identification of risk factors and the development of injury prevention strategies. A secondary aim is to highlight the most pressing issues for improving the quality of injury epidemiology research for disability sport. A search of NICE, AMED, British Nursing Index, CINAHL, EMBASE and Medline was conducted to identify all publications up to 16 June 2015. From 489 potentially relevant articles and reference searching, a total of 15 studies were included. Wide study sample heterogeneity prevented data pooling and meta-analysis. Results demonstrated an evolving field of epidemiology, but with wide differences in sports injury definition and with studies focused on short competitions. Background data were generally sparse; there was minimal exposure analysis, and no analysis of injury severity, all of which made comparison of injury risk and injury severity difficult. There is an urgent need for consensus on sports injury definition and methodology in disability sports. The quality of studies is variable, with inconsistent sports injury definitions, methodologies and injury rates, which prevents comparison, conclusions and the development of injury prevention strategies. The authors highlight the most pressing issues for improving the quality of injury epidemiology research for disability sport.
SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin
A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through the utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.
Bowel preparation for elective procedures in children: a systematic review and meta-analysis
Karlsen, Fiona; Isaji, Sahira; Teck, Guan-Ong
2017-01-01
Objective Reviews have investigated bowel preparation for colonoscopy, but not for surgery. They are also often limited to patients up to 16 years of age, despite many paediatric gastroenterologists caring for older patients. We carried out a systematic review investigating the optimum bowel preparation agents for all indications in children and young people. Design A Cochrane-format systematic review of randomised controlled trials (RCTs). Data extraction and assessment of methodological quality were performed independently by two reviewers. Methodological quality was assessed using the Cochrane risk of bias tool. Patients Young people requiring bowel preparation for any elective procedure, as defined by the primary studies. Interventions RCTs comparing bowel preparation with placebo or other interventions. Main outcome measures Adequacy of bowel preparation, tolerability and adverse events. Results The search yielded 2124 results and 15 randomised controlled studies (n=1435), but heterogeneity limited synthesis. Meta-analysis of two studies comparing polyethylene glycol (PEG) with sodium phosphate showed no difference in the quality of bowel preparation (risk ratio (RR) 1.27 (95% CI 0.66 to 2.44)). Two studies comparing sodium picosulfate/magnesium citrate with PEG found no difference in bowel preparation but a significantly higher number of patients needing nasogastric tube insertion with the polyethylene glycol-electrolyte lavage solution (RR 0.04 (95% CI 0.01 to 0.18); 45 of 117 in the PEG group vs 2 of 121 in the sodium picosulfate group). Meta-analysis of three studies (n=241) found no difference between PEG and sennosides (RR 0.73 (95% CI 0.31 to 1.71)). Conclusions The evidence base is clinically heterogeneous and methodologically at risk of bias. There is evidence that all regimens are equally effective; however, sodium picosulfate was better tolerated than PEG. Future research is needed with all agents and should consider safety and tolerability as well as efficacy.
PMID:29637141
Ibrahim, Shewkar E; Sayed, Tarek; Ismail, Karim
2012-11-01
Several earlier studies have noted the shortcomings of existing geometric design guides, which provide deterministic standards. In these standards, the safety margin of the design output is generally unknown, and there is little knowledge of the safety implications of deviating from the standards. To mitigate these shortcomings, probabilistic geometric design has been advocated, where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a mechanism for risk measurement to evaluate the safety impact of deviations from design standards. This paper applies reliability analysis to optimizing the safety of highway cross-sections. The paper presents an original methodology to select a suitable combination of cross-section elements with restricted sight distance that results in reduced collisions and consistent risk levels. The purpose of this optimization method is to provide designers with a proactive approach to the design of cross-section elements in order to (i) minimize the risk associated with restricted sight distance, (ii) balance the risk across the two carriageways of the highway, and (iii) reduce the expected collision frequency. A case study involving nine cross-sections that are part of two major highway developments in British Columbia, Canada, is presented. The results showed that an additional reduction in collisions can be realized by incorporating the reliability component, P(nc) (denoting the probability of non-compliance), in the optimization process. The proposed approach results in reduced and consistent risk levels for both travel directions in addition to further collision reductions. Copyright © 2012 Elsevier Ltd. All rights reserved.
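The probability of non-compliance P(nc) used in the optimization above can be estimated by plain Monte Carlo: sample the random design inputs, compute the sight distance each driver demands, and count how often it exceeds the distance the cross-section supplies. The stopping-sight-distance model and every parameter distribution below are illustrative assumptions, not those of the study:

```python
import random

def prob_noncompliance(supplied_sd: float, n: int = 100_000, seed: int = 1) -> float:
    """Monte Carlo estimate of P(nc): the probability that the demanded
    sight distance exceeds the supplied sight distance (metres).
    All distributions are hypothetical placeholders."""
    random.seed(seed)
    fails = 0
    for _ in range(n):
        speed = random.gauss(90.0, 10.0) / 3.6        # m/s, operating speed
        t_pr = random.lognormvariate(0.9, 0.3)        # s, perception-reaction time
        decel = max(random.gauss(3.4, 0.6), 0.5)      # m/s^2, braking deceleration
        demand = speed * t_pr + speed ** 2 / (2.0 * decel)
        if demand > supplied_sd:
            fails += 1
    return fails / n

# A tighter cross-section (less supplied sight distance) yields a higher P(nc).
print(prob_noncompliance(120.0))
print(prob_noncompliance(300.0))
```

In the paper's setting, P(nc) values like these would feed the optimization that balances risk between the two carriageways; here they only illustrate the reliability computation itself.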
NASA Astrophysics Data System (ADS)
Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc
2018-04-01
Major hazard prevention is a key challenge, particularly because it relies on information communicated to the public. In France, preventive information is notably provided by way of local regulatory documents. Unfortunately, the law imposes only a few requirements concerning their content; one can therefore question the impact on the general population of the way the document is actually produced. The purpose of our work is thus to propose an analytical methodology to evaluate the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied in this paper to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). The DICRIM has to be produced by mayors and addressed to the public to provide information on the major hazards affecting their municipalities. An analysis of the law compliance of the document is carried out through the identification of regulatory detection elements. These are applied to a database of 30 DICRIMs. This analysis leads to a discussion on points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here, its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define the failures and to identify detection elements. This permits the evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used to build, in future work, a decision support model for the municipalities (or specialised consulting firms) in charge of drawing up the documents.
Risk analysis of a biomass combustion process using MOSAR and FMEA methods.
Thivel, P-X; Bultel, Y; Delpech, F
2008-02-28
Thermal and chemical conversion processes that convert sewage sludge, pasty waste and other pre-processed waste into energy are increasingly common, for economic and ecological reasons. Fluidized bed combustion is currently one of the most promising methods of energy conversion, since it burns biomass very efficiently and produces only very small quantities of sulphur and nitrogen oxides. The hazards associated with biomass combustion processes are fire, explosion and poisoning from the combustion gases (CO, etc.). The risk analysis presented in this paper uses the MADS-MOSAR methodology, applied to a semi-industrial pilot plant comprising a fluidization column, a conventional cyclone, two natural gas burners and a continuous supply of biomass. The methodology uses a generic approach, with an initial macroscopic stage in which hazard sources are identified, scenarios for undesired events are recognized and ranked using a Severity × Probability grid, and safety barriers are suggested. A microscopic stage then analyzes in detail the major risks identified during the first stage. This analysis may use various tools, such as HAZOP, FMEA, etc.; our analysis is based on FMEA. Using MOSAR, we were able to identify five subsystems: the reactor (fluidized bed and centrifuge), the fuel and biomass supply lines, the operator and the environment. When we drew up scenarios based on these subsystems, we found that malfunction of the gas supply burners was a common trigger in many of them. Our subsequent microscopic analysis therefore focused on the burners, looking at the ways they fail and at the effects and criticality of those failures (FMEA). We were thus able to identify a number of critical elements, such as the incoming gas lines and the ignition electrode.
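The FMEA criticality assessment mentioned above is commonly quantified with a Risk Priority Number (RPN), the product of severity, occurrence and detection scores on 1-10 scales. A minimal sketch, with hypothetical burner failure modes and scores that are not taken from the paper:

```python
def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Risk Priority Number used in FMEA to rank failure modes
    (each score on a 1-10 scale; higher means worse)."""
    return severity * occurrence * detection

# Hypothetical failure modes for the gas supply burners (illustrative scores).
failure_modes = {
    "ignition electrode fouling": (7, 6, 5),
    "gas supply valve stuck closed": (8, 3, 4),
    "flame detector false reading": (9, 2, 3),
}

ranked = sorted(failure_modes.items(), key=lambda kv: -rpn(*kv[1]))
for name, scores in ranked:
    print(f"{name}: RPN = {rpn(*scores)}")
```

The highest-RPN modes are the ones the microscopic analysis would prioritize for corrective action or added safety barriers.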
Sailaukhanuly, Yerbolat; Zhakupbekova, Arai; Amutova, Farida; Carlsen, Lars
2013-01-01
Knowledge of the environmental behavior of chemicals is a fundamental part of the risk assessment process. The present paper discusses various methods of ranking a series of persistent organic pollutants (POPs) according to their persistence, bioaccumulation and toxicity (PBT) characteristics. Traditionally, ranking has been done as an absolute (total) ranking applying various multicriteria data analysis methods such as simple additive ranking (SAR) or rankings based on various utility functions (UFs). An attractive alternative to these ranking methodologies appears to be partial order ranking (POR). The present paper compares the SAR, UF and POR ranking methods. Significant discrepancies between the rankings are noted, and it is concluded that partial order ranking, as a method free of pre-assumptions concerning possible relations between the single parameters, appears to be the most attractive ranking methodology. In addition to the initial ranking, partial order methodology offers a wide variety of analytical tools to elucidate the interplay between the objects to be ranked and the ranking parameters. The present study also includes an analysis of the relative importance of the single P, B and T parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
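Partial order ranking rests on a simple dominance test: one chemical precedes another only if it scores at least as high on every criterion (and strictly higher on at least one); otherwise the pair is left incomparable rather than forced into a total order. A minimal sketch with hypothetical (P, B, T) scores:

```python
def dominates(a: tuple, b: tuple) -> bool:
    """a precedes b in the partial order if a is at least as hazardous on
    every criterion and strictly more hazardous on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Hypothetical (P, B, T) scores for four chemicals (illustrative values only).
chems = {"c1": (3, 2, 3), "c2": (2, 2, 1), "c3": (1, 3, 2), "c4": (1, 1, 1)}

comparable = {(i, j) for i in chems for j in chems
              if i != j and dominates(chems[i], chems[j])}
incomparable = [(i, j) for i in chems for j in chems
                if i < j and not dominates(chems[i], chems[j])
                and not dominates(chems[j], chems[i])]

print(sorted(comparable))
print(incomparable)
```

The incomparable pairs are exactly the information a total-order method such as SAR would discard by summing the criteria into a single score.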
IT Operational Risk Measurement Model Based on Internal Loss Data of Banks
NASA Astrophysics Data System (ADS)
Hao, Xiaoling
Business operation of banks relies increasingly on information technology (IT), and the most important role of IT is to guarantee the operational continuity of business processes. IT risk management therefore needs to be seen from the perspective of operational continuity. Traditional IT risk studies have focused on IT asset-based risk analysis and qualitative, risk-matrix-based evaluation. In practice, the IT risk management of the banking industry is still limited to the IT department and is not integrated into business risk management, so the two departments work in isolation. This paper presents an improved methodology for dealing with IT operational risk. It adopts a quantitative measurement method based on internal business loss data about IT events and uses Monte Carlo simulation to predict potential losses. We establish the correlation between IT resources and business processes so that IT and business risk management can work synergistically.
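A common way to realize such a quantitative, internal-loss-data-based measurement is the loss distribution approach: fit a frequency distribution and a severity distribution to the recorded IT-event losses, then Monte Carlo-simulate annual aggregate losses. The Poisson/lognormal choice and all parameter values below are illustrative assumptions, not the paper's model:

```python
import math
import random

def poisson(lam: float) -> int:
    """Knuth's Poisson sampler (the stdlib random module has no built-in)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def annual_loss_distribution(lam, mu, sigma, years=20_000, seed=7):
    """Simulate annual aggregate IT-event losses: Poisson event count per
    year, lognormal loss per event. In practice lam, mu and sigma would be
    fitted to the bank's internal loss data (values here are illustrative)."""
    random.seed(seed)
    totals = []
    for _ in range(years):
        n = poisson(lam)
        totals.append(sum(random.lognormvariate(mu, sigma) for _ in range(n)))
    totals.sort()
    return totals

losses = annual_loss_distribution(lam=12, mu=9.0, sigma=1.2)
expected = sum(losses) / len(losses)
var_999 = losses[int(0.999 * len(losses))]  # 99.9% quantile, Basel-style
print(f"expected annual loss: {expected:,.0f}")
print(f"99.9% VaR:            {var_999:,.0f}")
```

The gap between the expected loss and the 99.9% quantile is what makes the quantitative view more informative than a risk-matrix rating: the tail, not the average, drives the capital and continuity planning.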
A multi-disciplinary approach for the structural monitoring of Cultural Heritages in a seismic area
NASA Astrophysics Data System (ADS)
Fabrizia Buongiorno, Maria; Musacchio, Massimo; Guerra, Ignazio; Porco, Giacinto; Stramondo, Salvatore; Casula, Giuseppe; Caserta, Arrigo; Speranza, Fabio; Doumaz, Fawzi; Giovanna Bianchi, Maria; Luzi, Guido; Ilaria Pannaccione Apa, Maria; Montuori, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Gervasi, Anna; Bonali, Elena; Romano, Dolores; Falcone, Sergio; La Piana, Carmelo
2014-05-01
In recent years, the concepts of seismic risk vulnerability and structural health monitoring have become very important topics in the field of both structural and civil engineering for the identification of appropriate risk indicators and risk assessment methodologies in Cultural Heritage monitoring. The latter, which includes objects, buildings and sites with historical, architectural and/or engineering relevance, concerns the management, preservation and maintenance of the heritages within their surrounding environmental context, in response to climate changes and natural hazards (e.g. seismic, volcanic, landslide and flooding hazards). Within such a framework, the complexity and the great number of variables to be considered require a multi-disciplinary approach including strategies, methodologies and tools able to provide effective monitoring of Cultural Heritages from both scientific and operational viewpoints. Based on this rationale, in this study, an advanced, technological and operationally-oriented approach is presented and tested, which enables measuring and monitoring the Cultural Heritage conservation state and the geophysical/geological setting of the area, in order to mitigate the seismic risk of the historical public goods at different spatial scales*. The integration of classical geophysical methods with new emerging sensing techniques enables a multi-depth, multi-resolution, and multi-scale monitoring in both space and time. An integrated system of methodologies, instrumentation and data-processing approaches for non-destructive Cultural Heritage investigations is proposed, which concerns, in detail, the analysis of seismogenetic sources, the geological-geotechnical setting of the area and the evaluation of site seismic effects, proximal remote sensing techniques (e.g. terrestrial laser scanner, ground-based radar systems, thermal cameras), high-resolution aerial and satellite-based remote sensing methodologies (e.g. 
aeromagnetic surveys, synthetic aperture radar, optical, multispectral and panchromatic measurements), static and dynamic structural health monitoring analysis (e.g. screening tests with georadar, sonic instruments, sclerometers and optic fibers). The final purpose of the proposed approach is the development of an investigation methodology for short- and long-term Cultural Heritages preservation in response to seismic stress, which has specific features of scalability, modularity and exportability for every possible monitoring configuration. Moreover, it allows gathering useful information to furnish guidelines for Institution and local Administration to plan consolidation actions and therefore prevention activity. Some preliminary results will be presented for the test site of Calabria Region, where some architectural heritages have been properly selected as case studies for monitoring purposes. *The present work is supported and funded by Ministero dell'Università, dell'Istruzione e della Ricerca (MIUR) under the research project PON01-02710 "MASSIMO" - "Monitoraggio in Area Sismica di Sistemi Monumentali".
Pan, Xin; Lopez-Olivo, Maria A; Song, Juhee; Pratt, Gregory; Suarez-Almazor, Maria E
2017-01-01
Objectives We appraised the methodological and reporting quality of randomised controlled clinical trials (RCTs) evaluating the efficacy and safety of Chinese herbal medicine (CHM) in patients with rheumatoid arthritis (RA). Design For this systematic review, electronic databases were searched from inception until June 2015. The search was limited to humans and non-case report studies, but was not limited by language, year of publication or type of publication. Two independent reviewers selected RCTs, evaluating CHM in RA (herbals and decoctions). Descriptive statistics were used to report on risk of bias and their adherence to reporting standards. Multivariable logistic regression analysis was performed to determine study characteristics associated with high or unclear risk of bias. Results Out of 2342 unique citations, we selected 119 RCTs including 18 919 patients: 10 108 patients received CHM alone and 6550 received one of 11 treatment combinations. A high risk of bias was observed across all domains: 21% had a high risk for selection bias (11% from sequence generation and 30% from allocation concealment), 85% for performance bias, 89% for detection bias, 4% for attrition bias and 40% for reporting bias. In multivariable analysis, fewer authors were associated with selection bias (allocation concealment), performance bias and attrition bias, and earlier year of publication and funding source not reported or disclosed were associated with selection bias (sequence generation). Studies published in non-English language were associated with reporting bias. Poor adherence to recommended reporting standards (<60% of the studies not providing sufficient information) was observed in 11 of the 23 sections evaluated. Limitations Study quality and data extraction were performed by one reviewer and cross-checked by a second reviewer. Translation to English was performed by one reviewer in 85% of the included studies. 
Conclusions Studies evaluating CHM often fail to meet expected methodological criteria, and high-quality evidence is lacking. PMID:28249848
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
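The abstract does not give the formula behind the reliability impact indicator, so the sketch below is one plausible formulation, not the NASA technique itself: treat each identified residual risk as an independent series reliability element applied to the baseline, and express the impact as the fractional reduction in system reliability.

```python
def reliability_impact(baseline: float, residual_risk_reliabilities) -> dict:
    """Hypothetical formulation: multiply the baseline reliability by the
    reliability associated with each residual risk (series assumption),
    then report the fractional drop from the baseline."""
    adjusted = baseline
    for r in residual_risk_reliabilities:
        adjusted *= r
    return {"adjusted": adjusted, "impact": (baseline - adjusted) / baseline}

# Illustrative numbers: a 0.98 baseline with two residual risks.
out = reliability_impact(0.98, [0.999, 0.995])
print(out)
```

Under this assumed model, a larger impact value flags a mission whose residual risks erode more of the baseline reliability, which matches the indicator's stated purpose as a quantitative acceptance measure.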
Risk as an attribute in discrete choice experiments: a systematic review of the literature.
Harrison, Mark; Rigby, Dan; Vass, Caroline; Flynn, Terry; Louviere, Jordan; Payne, Katherine
2014-01-01
Discrete choice experiments (DCEs) are used to elicit preferences of current and future patients and healthcare professionals about how they value different aspects of healthcare. Risk is an integral part of most healthcare decisions. Despite the use of risk attributes in DCEs consistently being highlighted as an area for further research, current methods of incorporating risk attributes in DCEs have not been reviewed explicitly. This study aimed to systematically identify published healthcare DCEs that incorporated a risk attribute, summarise and appraise methods used to present and analyse risk attributes, and recommend best practice regarding including, analysing and transparently reporting the methodology supporting risk attributes in future DCEs. The Web of Science, MEDLINE, EMBASE, PsycINFO and Econlit databases were searched on 18 April 2013 for DCEs that included a risk attribute published since 1995, and on 23 April 2013 to identify studies assessing risk communication in the general (non-DCE) health literature. Healthcare-related DCEs with a risk attribute mentioned or suggested in the title/abstract were obtained and retained in the final review if a risk attribute meeting our definition was included. Extracted data were tabulated and critically appraised to summarise the quality of reporting, and the format, presentation and interpretation of the risk attribute were summarised. This review identified 117 healthcare DCEs that incorporated at least one risk attribute. Whilst there was some evidence of good practice incorporated into the presentation of risk attributes, little evidence was found that developing methods and recommendations from other disciplines about effective methods and validation of risk communication were systematically applied to DCEs. 
In general, the reviewed DCE studies did not thoroughly report the methodology supporting the explanation of risk in training materials, the impact of framing risk, or exploring the validity of risk communication. The primary limitation of this review was that the methods underlying presentation, format and analysis of risk attributes could only be appraised to the extent that they were reported. Improvements in reporting and transparency of risk presentation from conception to the analysis of DCEs are needed. To define best practice, further research is needed to test how the process of communicating risk affects the way in which people value risk attributes in DCEs.
NASA Astrophysics Data System (ADS)
Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta
2014-05-01
Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where the problem is exacerbated by the lack of appropriate groundwater resources management and by the serious potential impacts of projected climate change. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.
77 FR 53059 - Risk-Based Capital Guidelines: Market Risk
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-30
...The Office of the Comptroller of the Currency (OCC), Board of Governors of the Federal Reserve System (Board), and Federal Deposit Insurance Corporation (FDIC) are revising their market risk capital rules to better capture positions for which the market risk capital rules are appropriate; reduce procyclicality; enhance the rules' sensitivity to risks that are not adequately captured under current methodologies; and increase transparency through enhanced disclosures. The final rule does not include all of the methodologies adopted by the Basel Committee on Banking Supervision for calculating the standardized specific risk capital requirements for debt and securitization positions due to their reliance on credit ratings, which is impermissible under the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010. Instead, the final rule includes alternative methodologies for calculating standardized specific risk capital requirements for debt and securitization positions.
Gambler Risk Perception: A Mental Model and Grounded Theory Analysis.
Spurrier, Michael; Blaszczynski, Alexander; Rhodes, Paul
2015-09-01
Few studies have investigated how gamblers perceive risk or the role of risk perception in disordered gambling. The purpose of the current study therefore was to obtain data on lay gamblers' beliefs on these variables and their effects on decision-making, behaviour, and disordered gambling aetiology. Fifteen regular lay gamblers (non-problem/low risk, moderate risk and problem gamblers) completed a semi-structured interview following mental models and grounded theory methodologies. Gambler interview data was compared to an expert 'map' of risk-perception, to identify comparative gaps or differences associated with harmful or safe gambling. Systematic overlapping processes of data gathering and analysis were used to iteratively extend, saturate, test for exception, and verify concepts and themes emerging from the data. The preliminary findings suggested that gambler accounts supported the presence of expert conceptual constructs, and to some degree the role of risk perception in protecting against or increasing vulnerability to harm and disordered gambling. Gambler accounts of causality, meaning, motivation, and strategy were highly idiosyncratic, and often contained content inconsistent with measures of disordered gambling. Disordered gambling appears heavily influenced by relative underestimation of risk and overvaluation of gambling, based on explicit and implicit analysis, and deliberate, innate, contextual, and learned processing evaluations and biases.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls, and these sources often dominate component-level reliability or the probability of failure. While the consequence of failure is often well understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. The approach provides a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
Nanotechnology risk perceptions and communication: emerging technologies, emerging challenges.
Pidgeon, Nick; Harthorn, Barbara; Satterfield, Terre
2011-11-01
Nanotechnology involves the fabrication, manipulation, and control of materials at the atomic level and may also bring novel uncertainties and risks. Potential parallels with other controversial technologies mean there is a need to develop a comprehensive understanding of processes of public perception of nanotechnology uncertainties, risks, and benefits, alongside related communication issues. Study of perceptions, at so early a stage in the development trajectory of a technology, is probably unique in the risk perception and communication field. As such it also brings new methodological and conceptual challenges. These include: dealing with the inherent diversity of the nanotechnology field itself; the unfamiliar and intangible nature of the concept, with few analogies to anchor mental models or risk perceptions; and the ethical and value questions underlying many nanotechnology debates. Utilizing the lens of social amplification of risk, and drawing upon the various contributions to this special issue of Risk Analysis on Nanotechnology Risk Perceptions and Communication, nanotechnology may at present be an attenuated hazard. The generic idea of "upstream public engagement" for emerging technologies such as nanotechnology is also discussed, alongside its importance for future work with emerging technologies in the risk communication field. © 2011 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.
2014-05-01
This paper presents a GIS-based methodology to estimate damage produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damage. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to each hazard and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies of the built environment and is complemented with an analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (north of Tenerife Island), which is likely to be affected by volcanic phenomena in case of eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
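The damage-assessment step above (a qualitative rating for each hazard-vulnerability combination, combined across phenomena) can be sketched as a simple lookup table. The rating scale, hazard levels, vulnerability classes, and max-rule combination below are illustrative assumptions, not those of the VORIS-based tool.

```python
# Qualitative damage assessment as a hazard x vulnerability lookup,
# mirroring the GIS overlay step. Scales and class names are assumed.

HAZARD_LEVELS = ["low", "moderate", "high"]
VULN_CLASSES = ["A", "B", "C"]  # A = least vulnerable

# Damage rating from 0 (none) to 4 (destruction); one matrix per phenomenon.
DAMAGE_MATRIX = {
    ("low", "A"): 0, ("low", "B"): 1, ("low", "C"): 2,
    ("moderate", "A"): 1, ("moderate", "B"): 2, ("moderate", "C"): 3,
    ("high", "A"): 2, ("high", "B"): 3, ("high", "C"): 4,
}

def damage_rating(hazard: str, vulnerability: str) -> int:
    """Return the qualitative damage rating for one exposed element."""
    return DAMAGE_MATRIX[(hazard, vulnerability)]

def worst_case(ratings_per_phenomenon):
    """Combine ratings over several hazardous phenomena (max rule assumed)."""
    return max(ratings_per_phenomenon)

# One building exposed to two phenomena (e.g., ashfall and lava flow).
building = [damage_rating("high", "B"), damage_rating("low", "C")]
print(worst_case(building))  # 3 under these assumed matrices
```

In a real GIS implementation the lookup is applied per raster cell or per exposed feature; the table-driven core is the same.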
NASA Astrophysics Data System (ADS)
Stephenson, V.; D'Ayala, D.
2013-10-01
The recent increase in frequency and severity of flooding in the UK has led to a shift in the perception of risk associated with flood hazards. This has extended to the conservation community, and the risks posed to historic structures that suffer from flooding are particularly concerning for those charged with preserving and maintaining such buildings. In order to fully appraise the risks in a manner appropriate to the complex issue of preservation, a new methodology is proposed that studies the nature of vulnerability of such structures, and places it in the context of risk assessment, accounting for the vulnerable object and the subsequent exposure of that object to flood hazards. The testing of the methodology is carried out using three urban case studies and the results of the survey analysis provide key findings and guidance on the development of fragility curves for historic structures exposed to flooding. This occurs through appraisal of key vulnerability indicators related to building form, structural and fabric integrity, and preservation of architectural and archaeological values. This in turn facilitates the production of strategies for mitigating and managing the losses threatened by such extreme climate events.
Psychometric evaluation of the Moral Distress Risk Scale: A methodological study.
Schaefer, Rafaela; Zoboli, Elma Lcp; Vieira, Margarida M
2017-01-01
Moral distress is a kind of suffering that nurses may experience when they act in ways that they consider inconsistent with moral values, leading to a perceived compromise of moral integrity. Consequences are mostly negative and include physical and psychological symptoms, in addition to organizational implications. The objective was to psychometrically test the Moral Distress Risk Scale. A methodological study was conducted. Data were submitted to exploratory factor analysis using the SPSS statistical program. Participants and research context: In total, 268 nurses from hospitals and primary healthcare settings participated in this research from March to June of 2016. Ethical considerations: This research has ethics committee approval. The Moral Distress Risk Scale is composed of 7 factors and 30 items; it shows evidence of acceptable reliability and validity, with a Cronbach's α = 0.913, a total variance explained of 59%, a Kaiser-Meyer-Olkin measure of 0.896, and a significant Bartlett's test (p < 0.001). Concern about moral distress should extend beyond acute care settings, and a tool that helps clarify critical points in other healthcare contexts may add value to the moral distress discourse. The psychometric results indicate that the Moral Distress Risk Scale can be applied in different healthcare contexts.
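The reported reliability figure (Cronbach's α = 0.913) comes from the standard internal-consistency formula α = k/(k-1) · (1 - Σ item variances / variance of total score). A minimal stdlib sketch, using made-up item scores rather than the study's data:

```python
# Cronbach's alpha for a k-item scale, computed from raw item scores.
from statistics import pvariance

def cronbach_alpha(scores):
    """scores: list of respondents, each a list of k item scores."""
    k = len(scores[0])
    items = list(zip(*scores))                  # one tuple per item
    item_vars = [pvariance(it) for it in items] # population variances
    totals = [sum(r) for r in scores]           # total score per respondent
    total_var = pvariance(totals)
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Illustrative data: 4 respondents x 3 items (not the study's data).
data = [[3, 4, 3], [2, 2, 1], [4, 5, 5], [3, 3, 4]]
print(round(cronbach_alpha(data), 3))
```

Values above roughly 0.9, as in the study, indicate high internal consistency of the scale items.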
[Road traffic injuries among youth: measuring the impact of an educational intervention].
Hidalgo-Solórzano, Elisa; Híjar, Martha; Mora-Flores, Gerardo; Treviño-Siller, Sandra; Inclán-Valadez, Cristina
2008-01-01
To analyze the impact of an educational intervention intended to increase knowledge of the causes and risk factors associated with road traffic injuries among youth in the city of Cuernavaca. A quasi-experimental study was conducted with students aged 16 to 19 in colleges and universities in the city of Cuernavaca. The educational intervention included radio spots, banners, pamphlets, posters and cards. Impact was measured as changes in knowledge about speed, alcohol and seat belt use, using factor analysis methodologies. A significant change in the level of knowledge (p < 0.001) was observed in 700 students from 16 institutions. Educational interventions represent an initial strategy for changing knowledge and population behaviours. The present study offers an appropriate methodology to measure short-term changes in knowledge about risk factors associated with a significant problem affecting Mexican youth.
Davis, Jennifer C; Bryan, Stirling; Marra, Carlo A; Hsiung, Ging-Yuek R; Liu-Ambrose, Teresa
2015-10-01
Cognitive decline is one of the most prominent healthcare issues of the 21st century. Within the context of combating cognitive decline through behavioural interventions, physical activity is a promising approach. There is a dearth of health economic data in the area of behavioural interventions for dementia prevention. Yet, economic evaluations are essential for providing information to policy makers for resource allocation. It is essential we first address population- and intervention-specific methodological challenges prior to building a larger evidence base. We use a cost-utility analysis conducted alongside the exercise for cognition and everyday living (EXCEL) study to illustrate methodological challenges specific to assessing the cost-effectiveness of behavioural interventions aimed at older adults at risk of cognitive decline. A cost-utility analysis conducted concurrently with a 6-month, three-arm randomised controlled trial (ie, the EXCEL study) was used as an example to identify and discuss methodological challenges. Both the aerobic training and resistance training interventions were less costly than twice-weekly balance and tone classes. In critically evaluating the economic evaluation of the EXCEL study, we identified four category-specific challenges: (1) analysing costs; (2) assessing quality-adjusted life-years; (3) incomplete data; and (4) 'intervention' activities of the control group. Resistance training and aerobic training resulted in healthcare cost savings and were as effective as balance and tone classes after only 6 months of intervention. To ensure this population is treated fairly in terms of claims on resources, we first need to identify areas for methodological improvement. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A Risk-Based Approach for Aerothermal/TPS Analysis and Testing
NASA Technical Reports Server (NTRS)
Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak
2007-01-01
The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
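The Monte-Carlo uncertainty-propagation step described above can be sketched with a stand-in stagnation-point heating correlation of Sutton-Graves form, q = k * sqrt(rho / rn) * V^3. The nominal values and the 10% one-sigma uncertainties below are illustrative assumptions, not mission data:

```python
# Monte Carlo propagation of input uncertainty through a simple
# convective-heating model. All nominal values and uncertainty
# magnitudes are assumed for illustration only.
import math
import random

random.seed(1)

K_NOM = 1.7415e-4   # Sutton-Graves-type constant (SI), treated as uncertain
RHO_NOM = 3.0e-4    # freestream density, kg/m^3 (assumed)
V = 7500.0          # velocity, m/s (assumed)
RN = 1.0            # effective nose radius, m (assumed)

def sample_heating():
    # Propagate assumed 10% (1-sigma) model-constant and density uncertainty.
    k = random.gauss(K_NOM, 0.10 * K_NOM)
    rho = random.gauss(RHO_NOM, 0.10 * RHO_NOM)
    return k * math.sqrt(rho / RN) * V**3   # heat flux, W/m^2

samples = sorted(sample_heating() for _ in range(20000))
mean_q = sum(samples) / len(samples)
q95 = samples[int(0.95 * len(samples))]     # 95th-percentile heating
print(f"mean {mean_q/1e4:.1f} W/cm^2, 95th pct {q95/1e4:.1f} W/cm^2")
```

The spread between the mean and an upper percentile is what drives margin policy; a sensitivity analysis would vary one input at a time to find the dominant contributor.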
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting the cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottom-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As the product design progresses and matures, the lower-level, more detailed cost drivers can be revisited and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
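The AHP component mentioned above derives criteria weights from pairwise comparisons. A minimal sketch using the geometric-mean prioritization method; the three criteria and the comparison matrix below are assumed for illustration, not taken from the NASA GRC/Boeing model:

```python
# Analytic hierarchy process: derive a priority vector from a
# pairwise-comparison matrix via the geometric-mean method.
import math

# Illustrative comparison matrix (Saaty 1-9 scale) for three criteria:
# cost-risk, performance, schedule. Entry A[i][j] = importance of i over j.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

def ahp_weights(a):
    """Priority vector: normalized geometric means of the matrix rows."""
    gm = [math.prod(row) ** (1 / len(row)) for row in a]
    s = sum(gm)
    return [g / s for g in gm]

w = ahp_weights(A)
print([round(x, 3) for x in w])
```

The resulting weights can then score design alternatives against cost-risk, performance, and schedule in a trade study; a full AHP implementation would also check the matrix's consistency ratio.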
A non-Gaussian approach to risk measures
NASA Astrophysics Data System (ADS)
Bormetti, Giacomo; Cisana, Enrica; Montagna, Guido; Nicrosini, Oreste
2007-03-01
Reliable calculation of financial risk requires that the fat-tailed nature of price changes be included in risk measures. To this end, a non-Gaussian approach to financial risk management is presented, modelling the power-law tails of the returns distribution in terms of a Student-t distribution. Non-Gaussian closed-form solutions for value-at-risk and expected shortfall are obtained, and the standard formulae known in the literature under the normality assumption are recovered as a special case. The implications of the approach for risk management are demonstrated through an empirical analysis of financial time series from the Italian stock market and in comparison with the results of the most widely used procedures of quantitative finance. Particular attention is paid to quantifying the size of the errors affecting the market risk measures obtained according to different methodologies, by employing a bootstrap technique.
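The paper derives closed-form Student-t results; the qualitative effect can be reproduced with a stdlib-only Monte Carlo sketch contrasting fat-tailed and Gaussian loss models. The degrees of freedom (nu = 3, left unscaled) and sample size are assumptions for illustration:

```python
# Empirical value-at-risk and expected shortfall under Gaussian vs
# Student-t losses, showing how fat tails inflate both measures.
import random

random.seed(7)

def student_t(nu):
    """Draw from Student-t as Z / sqrt(V/nu), with V ~ chi-square(nu)."""
    z = random.gauss(0.0, 1.0)
    v = random.gammavariate(nu / 2.0, 2.0)   # chi-square(nu) variate
    return z / (v / nu) ** 0.5

def var_es(losses, level=0.99):
    """Empirical VaR and expected shortfall of a loss sample."""
    s = sorted(losses)
    i = int(level * len(s))
    var = s[i]
    es = sum(s[i:]) / len(s[i:])             # mean loss beyond VaR
    return var, es

n = 200_000
normal_losses = [random.gauss(0.0, 1.0) for _ in range(n)]
t_losses = [student_t(3.0) for _ in range(n)]

var_n, es_n = var_es(normal_losses)
var_t, es_t = var_es(t_losses)
print(f"99% VaR: normal {var_n:.2f}, Student-t {var_t:.2f}")
print(f"99% ES : normal {es_n:.2f}, Student-t {es_t:.2f}")
```

As in the paper, the Gaussian assumption visibly understates tail risk: at the 99% level the Student-t VaR and expected shortfall are substantially larger than their normal counterparts.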
Risk-based Methodology for Validation of Pharmaceutical Batch Processes.
Wiles, Frederick
2013-01-01
In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. 
Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to address this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired, based on risk assessment and calculation of process capability.
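The capability calculation at the heart of the methodology can be sketched as follows. The Ppk formula is standard; the mapping from PFMECA risk level to a required Ppk, the specification limits, and the batch data are all assumptions for illustration:

```python
# Process capability (Ppk) against a risk-dependent acceptance target.
from statistics import mean, stdev

def ppk(data, lsl, usl):
    """Overall process capability index Ppk = min(USL-m, m-LSL) / 3s."""
    m, s = mean(data), stdev(data)
    return min(usl - m, m - lsl) / (3 * s)

# Hypothetical mapping of PFMECA risk level to the Ppk needed to close PPQ.
PPK_TARGET = {"high": 1.67, "medium": 1.33, "low": 1.00}

def ppq_milestone_met(data, lsl, usl, risk_level):
    """Has the Stage 2 (PPQ) capability milestone been reached?"""
    return ppk(data, lsl, usl) >= PPK_TARGET[risk_level]

# Illustrative assay results (% label claim) against 95.0-105.0 limits.
batch = [99.8, 100.1, 99.6, 100.4, 99.9, 100.2, 100.0, 99.7, 100.3, 99.9]
print(round(ppk(batch, 95.0, 105.0), 2),
      ppq_milestone_met(batch, 95.0, 105.0, "high"))
```

In the full methodology the confidence and coverage levels chosen from PFMECA also set sample sizes, and control charts are run alongside the capability statistics to demonstrate statistical control.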
Harold, P D; de Souza, A S; Louchart, P; Russell, D; Brunt, H
2014-11-01
Hazardous and noxious chemicals are increasingly being transported by sea. Current estimates indicate that some 2000 hazardous and noxious substances (HNS) are carried regularly by sea, with a bulk trade of 165 million tonnes per year worldwide. Over 100 incidents involving HNS have been reported in EU waters. Incidents occurring in a port or coastal area can have potential and actual public health implications. A methodology has been developed for prioritisation of HNS based upon potential public health risks. The work, undertaken for the Atlantic Region Pollution Response programme (ARCOPOL), aims to provide information for incident planning and preparedness. HNS were assessed using conventional methodology based upon acute toxicity, behaviour and reactivity. Tonnage was used as a proxy for likelihood, although other factors such as shipping frequency and local navigation may also contribute. Analysis of 350 individual HNS identified the highest priority HNS as being those that present an inhalation risk. Limitations were identified around obtaining accurate data on HNS handled at a local and regional level, due to a lack of port records and also political and commercial confidentiality issues. To account for this, the project also developed a software tool capable of combining chemical data from the study with user-defined shipping data, to be used by operators to produce area-specific prioritisations. In conclusion, a risk prioritisation matrix has been developed to assess the acute risks to public health from the transportation of HNS. Its potential use in emergency planning and preparedness is discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
Schlesinger, Sabrina; Sonntag, Svenja R.
2016-01-01
Background A growing number of studies linked elevated concentrations of circulating asymmetric (ADMA) and symmetric (SDMA) dimethylarginine to mortality and cardiovascular disease (CVD) events. To summarize the evidence, we conducted a systematic review and quantified associations of ADMA and SDMA with the risks of all-cause mortality and incident CVD in meta-analyses accounting for different populations and methodological approaches of the studies. Methods Relevant studies were identified in PubMed until February 2015. We used random effect models to obtain summary relative risks (RR) and 95% confidence intervals (95%CIs), comparing top versus bottom tertiles. Dose-response relations were assessed by restricted cubic spline regression models and potential non-linearity was evaluated using a likelihood ratio test. Heterogeneity between subgroups was assessed by meta-regression analysis. Results For ADMA, 34 studies (total n = 32,428) investigating associations with all-cause mortality (events = 5,035) and 30 studies (total n = 30,624) investigating the association with incident CVD (events = 3,396) were included. The summary RRs (95%CI) for all-cause mortality were 1.52 (1.37–1.68) and for CVD 1.33 (1.22–1.45), comparing high versus low ADMA concentrations. Slight differences were observed across study populations and methodological approaches, with the strongest association of ADMA being reported with all-cause mortality in critically ill patients. For SDMA, 17 studies (total n = 18,163) were included for all-cause mortality (events = 2,903), and 13 studies (total n = 16,807) for CVD (events = 1,534). High vs. low levels of SDMA were associated with increased risk of all-cause mortality [summary RR (95%CI): 1.31 (1.18–1.46)] and CVD [summary RR (95%CI): 1.36 (1.10–1.68)]. The strongest associations were observed in general population samples. 
Conclusions The dimethylarginines ADMA and SDMA are independent risk markers for all-cause mortality and CVD across different populations and methodological approaches. PMID:27812151
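The random-effects pooling used in such meta-analyses is commonly the DerSimonian-Laird estimator: fixed-effect weights give a heterogeneity statistic Q, from which a between-study variance tau² is estimated and folded back into the weights. A stdlib sketch with made-up study results (not the review's data):

```python
# DerSimonian-Laird random-effects pooling of log relative risks.
import math

def dersimonian_laird(log_rr, se):
    """Return (pooled RR, 95% CI, tau^2) for study log RRs and their SEs."""
    w = [1 / s**2 for s in se]                        # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                     # between-study variance
    w_re = [1 / (s**2 + tau2) for s in se]            # random-effects weights
    est = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
    se_est = (1 / sum(w_re)) ** 0.5
    rr = math.exp(est)
    ci = (math.exp(est - 1.96 * se_est), math.exp(est + 1.96 * se_est))
    return rr, ci, tau2

# Illustrative studies: (RR, SE of log RR) - invented, not the review's data.
studies = [(1.40, 0.10), (1.65, 0.15), (1.30, 0.20), (1.80, 0.12)]
log_rr = [math.log(r) for r, _ in studies]
se = [s for _, s in studies]
rr, ci, tau2 = dersimonian_laird(log_rr, se)
print(f"pooled RR {rr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f}), tau^2 {tau2:.4f}")
```

Pooling is done on the log scale because log RRs are approximately normal; the subgroup comparisons in the review correspond to running this pooling within strata and testing the difference by meta-regression.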
Risk assessment for physical and cyber attacks on critical infrastructures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Bryan J.; Sholander, Peter E.; Phelan, James M.
2005-08-01
Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies. Existing risk assessment methodologies consider physical security and cyber security separately. As such, they do not accurately model attacks that involve defeating both physical protection and cyber protection elements (e.g., hackers turning off alarm systems prior to forced entry). This paper presents a risk assessment methodology that accounts for both physical and cyber security. It also preserves the traditional security paradigm of detect, delay and respond, while accounting for the possibility that a facility may be able to recover from or mitigate the results of a successful attack before serious consequences occur. The methodology provides a means for ranking those assets most at risk from malevolent attacks. Because the methodology is automated, the analyst can also play 'what if' with mitigation measures to gain a better understanding of how to best expend resources towards securing the facilities. It is simple enough to be applied to large infrastructure facilities without developing highly complicated models. Finally, it is applicable to facilities with extensive security as well as those that are less well protected.
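A common quantitative core for such asset rankings is risk = P(attack) x P(protection fails) x consequence. The sketch below treats the physical and cyber layers as independent and assumes an attack succeeds only if it defeats both layers; the asset list and all probabilities are invented for illustration and are not from the paper:

```python
# Rank assets by risk, combining physical and cyber protection layers.

def risk_score(p_attack, pe_physical, pe_cyber, consequence):
    """Risk = P(attack) x P(all protection layers fail) x consequence.

    Assumes an attack succeeds only if it defeats both the physical and
    the cyber protection layer, and that the layers are independent.
    """
    p_protection_fails = (1 - pe_physical) * (1 - pe_cyber)
    return p_attack * p_protection_fails * consequence

# Illustrative assets: (name, P_attack, physical eff., cyber eff., consequence).
assets = [
    ("substation", 0.30, 0.90, 0.50, 80),
    ("control room", 0.10, 0.95, 0.95, 100),
    ("pump house", 0.20, 0.60, 0.20, 40),
]

ranked = sorted(assets, key=lambda a: risk_score(*a[1:]), reverse=True)
for name, *params in ranked:
    print(f"{name:12s} risk={risk_score(*params):.2f}")
```

A "what if" exercise then amounts to raising a layer's effectiveness (a proposed mitigation) and re-ranking, which shows where security spending buys the largest risk reduction.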
Disordered Gambling Prevalence: Methodological Innovations in a General Danish Population Survey.
Harrison, Glenn W; Jessen, Lasse J; Lau, Morten I; Ross, Don
2018-03-01
We study Danish adult gambling behavior with an emphasis on discovering patterns relevant to public health forecasting and economic welfare assessment of policy. Methodological innovations include measurement of formative in addition to reflective constructs, estimation of prospective risk for developing gambling disorder rather than risk of being falsely negatively diagnosed, analysis with attention to sample weights and correction for sample selection bias, estimation of the impact of trigger questions on prevalence estimates and sample characteristics, and distinguishing between total and marginal effects of risk-indicating factors. The most significant novelty in our design is that nobody was excluded on the basis of their response to a 'trigger' or 'gateway' question about previous gambling history. Our sample consists of 8405 adult Danes. We administered the Focal Adult Gambling Screen to all subjects and estimate prospective risk for disordered gambling. We find that 87.6% of the population is indicated for no detectable risk, 5.4% is indicated for early risk, 1.7% is indicated for intermediate risk, 2.6% is indicated for advanced risk, and 2.6% is indicated for disordered gambling. Correcting for sample weights and controlling for sample selection has a significant effect on prevalence rates. Although these estimates of the 'at risk' fraction of the population are significantly higher than conventionally reported, we infer a significant decrease in overall prevalence rates of detectable risk with these corrections, since gambling behavior is positively correlated with the decision to participate in gambling surveys. We also find that imposing a threshold gambling history leads to underestimation of the prevalence of gambling problems.
Azoulay, Laurent; Suissa, Samy
2017-05-01
Recent randomized trials have compared the newer antidiabetic agents to treatments involving sulfonylureas, drugs that some observational studies, with conflicting results, have associated with increased cardiovascular risks and mortality. We reviewed the methodology of these observational studies by searching MEDLINE from inception to December 2015 for all studies of the association between sulfonylureas and cardiovascular events or mortality. Each study was appraised with respect to the comparator, the outcome, and study design-related sources of bias. A meta-regression analysis was used to evaluate heterogeneity. A total of 19 studies were identified, of which six had no major design-related biases. Sulfonylureas were associated with an increased risk of cardiovascular events and mortality in five of these studies (relative risks 1.16-1.55). Overall, the 19 studies resulted in 36 relative risks, as some studies assessed multiple outcomes or comparators. Of the 36 analyses, metformin was the comparator in 27 (75%) and death was the outcome in 24 (67%). The relative risk was higher by 13% when the comparator was metformin, by 20% when death was the outcome, and by 7% when the studies had design-related biases. The lowest predicted relative risk was for studies with no major bias, a comparator other than metformin, and a cardiovascular outcome (1.06 [95% CI 0.92-1.23]), whereas the highest was for studies with bias, a metformin comparator, and a mortality outcome (1.53 [95% CI 1.43-1.65]). In summary, sulfonylureas were associated with an increased risk of cardiovascular events and mortality in the majority of studies with no major design-related biases. Among studies with important biases, the association varied significantly with respect to the comparator, the outcome, and the type of bias. With the introduction of new antidiabetic drugs, the use of appropriate design and analytical tools will provide a more accurate cardiovascular safety assessment in the real-world setting. 
© 2017 by the American Diabetes Association.
NASA Technical Reports Server (NTRS)
Dickinson, William B.
1995-01-01
An Earth Sciences Data and Information System (ESDIS) Project Management Plan (PMP) is prepared. An ESDIS Project Systems Engineering Management Plan (SEMP) consistent with the developed PMP is also prepared. ESDIS and related EOS program requirements developments, management and analysis processes are evaluated. Opportunities to improve the effectiveness of these processes and program/project responsiveness to requirements are identified. Overall ESDIS cost estimation processes are evaluated, and recommendations to improve cost estimating and modeling techniques are developed. ESDIS schedules and scheduling tools are evaluated. Risk assessment, risk mitigation strategies and approaches, and use of risk information in management decision-making are addressed.
A technological approach to studying motor planning ability in children at high risk for ASD.
Taffoni, F; Focaroli, V; Keller, F; Iverson, J M
2014-01-01
In this work we propose a new method to study the development of motor planning abilities in children and, in particular, in children at high risk for ASD. Although several atypical motor signs have been found in children with ASD, no specific markers enabling early assessment of risk have yet been identified. In this work, we discuss the problem posed by objective and quantitative behavioral analysis in a non-structured environment. After an initial description of the main constraints imposed by the ecological approach, a technological and methodological solution to these issues is presented. Preliminary results on 12 children are reported and briefly discussed.
Methodological issues underlying multiple decrement life table analysis.
Mode, C J; Avery, R C; Littman, G S; Potter, R G
1977-02-01
In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.
Haneda, Kiyofumi; Umeda, Tokuo; Koyama, Tadashi; Harauchi, Hajime; Inamura, Kiyonari
2002-01-01
The target of our study is to establish a methodology for analyzing levels of security requirements, for searching for suitable security measures, and for optimizing the distribution of security across every portion of medical practice. Quantitative expression is introduced wherever possible to allow easy follow-up of security procedures and easy evaluation of security outcomes. Results of system analysis by fault tree analysis (FTA) clarified that subdividing system elements in detail contributes to much more accurate analysis. Such subdivided composition factors depended strongly on the behavior of staff, interactive terminal devices, kinds of service, and routes of the network. In conclusion, we established methods to analyze levels of security requirements for each medical information system employing FTA, with basic events for each composition factor and combinations of basic events. Methods for searching for suitable security measures were found: namely, risk factors for each basic event, the number of elements for each composition factor, and candidate security measure elements. A method to optimize the security measures for each medical information system is proposed, namely, figuring out the optimum distribution of risk factors in terms of basic events and comparing it between medical information systems.
A framework for quantifying net benefits of alternative prognostic models.
Rapsomaniki, Eleni; White, Ian R; Wood, Angela M; Thompson, Simon G
2012-01-30
New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) of the treatment decisions they support, assuming a set of predetermined clinical treatment guidelines. The change in net benefit is more clinically interpretable than changes in traditional measures and can be used in full health economic evaluations of prognostic models used for screening and allocating risk reduction interventions. We extend previous work in this area by quantifying net benefits in life years, thus linking prognostic performance to health economic measures; by taking full account of the occurrence of events over time; and by considering estimation and cross-validation in a multiple-study setting. The method is illustrated in the context of cardiovascular disease risk prediction using an individual participant data meta-analysis. We estimate the number of cardiovascular-disease-free life years gained when statin treatment is allocated based on a risk prediction model with five established risk factors instead of a model with just age, gender and region. We explore methodological issues associated with the multistudy design and show that cost-effectiveness comparisons based on the proposed methodology are robust against a range of modelling assumptions, including adjusting for competing risks. Copyright © 2011 John Wiley & Sons, Ltd.
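The net-benefit comparison can be sketched with a toy cohort: treat everyone whose predicted risk exceeds a guideline threshold, credit life years in proportion to the true risk averted, and charge a fixed treatment cost in life-year equivalents. All numbers below (cohort, threshold, gain per unit risk, cost) are invented for illustration and do not reproduce the paper's estimation or cross-validation machinery:

```python
# Compare two prognostic models by the net benefit (in life years) of the
# treatment decisions they support under a treat-if-risk>=threshold rule.

# Each record: (true_risk, predicted_risk_basic, predicted_risk_extended).
COHORT = [
    (0.05, 0.25, 0.06),
    (0.30, 0.15, 0.28),
    (0.40, 0.35, 0.42),
    (0.10, 0.05, 0.12),
    (0.25, 0.10, 0.22),
]
THRESHOLD = 0.20       # guideline treatment threshold (assumed)
GAIN_PER_RISK = 2.0    # life years gained per unit of true risk (assumed)
TREAT_COST = 0.1       # treatment cost in life-year equivalents (assumed)

def net_benefit(model_index):
    """Sum gains minus costs over everyone the model sends to treatment."""
    nb = 0.0
    for person in COHORT:
        if person[model_index] >= THRESHOLD:
            nb += GAIN_PER_RISK * person[0] - TREAT_COST
    return nb

print(net_benefit(1), net_benefit(2))  # basic vs extended risk model
```

Because the extended model's predictions track true risk more closely, it allocates treatment to more of the genuinely high-risk people and achieves the larger net benefit, which is the quantity the framework proposes to compare across models.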
Zhang, Chao; Jia, Pengli; Yu, Liu; Xu, Chang
2018-05-01
Dose-response meta-analysis (DRMA) is widely applied to investigate the dose-specific relationship between independent and dependent variables. Such methods have been in use for over 30 years and are increasingly employed in healthcare and clinical decision-making. In this article, we give an overview of the methodology used in DRMA, summarizing the commonly used regression models and pooling methods, and use an example to illustrate how to conduct a DRMA with these methods. Five regression models (linear, piecewise, natural polynomial, fractional polynomial, and restricted cubic spline regression) are illustrated for fitting the dose-response relationship, and two pooling approaches, the one-stage and the two-stage approach, are illustrated for pooling the dose-response relationship across studies. The example showed similar results among these models. Several dose-response meta-analysis methods can be used for investigating the relationship between exposure level and the risk of an outcome; however, the methodology of DRMA still needs to be improved. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
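The two-stage approach mentioned above can be sketched in a few lines: stage 1 fits a dose-response slope within each study, and stage 2 pools the slopes with inverse-variance weights. The study data below are invented, and a linear model stands in for the richer spline models the abstract also discusses.

```python
# Minimal sketch of a two-stage DRMA, under invented data: stage 1 fits a
# linear dose-response slope (log relative risk vs dose) per study; stage 2
# pools the slopes with fixed-effect inverse-variance weights.

def linear_slope(doses, log_rr):
    """OLS slope of log relative risk on dose, with its sampling variance."""
    n = len(doses)
    mx, my = sum(doses) / n, sum(log_rr) / n
    sxx = sum((x - mx) ** 2 for x in doses)
    slope = sum((x - mx) * (y - my) for x, y in zip(doses, log_rr)) / sxx
    resid = [y - my - slope * (x - mx) for x, y in zip(doses, log_rr)]
    var = sum(r * r for r in resid) / (n - 2) / sxx
    return slope, var

def pool_fixed_effect(estimates):
    """Inverse-variance weighted mean of (slope, variance) pairs."""
    weights = [1.0 / v for _, v in estimates]
    return sum(w * b for w, (b, _) in zip(weights, estimates)) / sum(weights)

# Hypothetical studies: (dose levels, log relative risks at those levels).
studies = [([0, 5, 10, 20], [0.00, 0.10, 0.18, 0.42]),
           ([0, 8, 16], [0.00, 0.12, 0.30])]
pooled = pool_fixed_effect([linear_slope(d, y) for d, y in studies])
```

A one-stage approach would instead fit all dose-level data in a single hierarchical model; in practice correlation between estimates within a study also has to be accounted for, which this sketch omits.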
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
The following study proposes and tests an integrated methodology involving Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) for the assessment of specific aspects of robotic surgery involving safety, process and technology. The integrated methodology applies specific techniques from HTA together with the most typical models from reliability engineering, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 for urology and 16 for general surgery. The main outcomes refer to the comparative evaluation between robotic, laparoscopic and open surgery. Risk analysis and mitigation interventions come from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study seems to confirm the more favorable trend of robotic surgical times in comparison with the open technique, as well as the clinical benefits of robotics in urology. The situation is more complex for general surgery, where the only directly measured clinical benefit of robotics is a lower blood transfusion rate.
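The FMECA step can be illustrated with a conventional Risk Priority Number ranking (RPN = severity x occurrence x detection, each scored on a 1-10 scale). The failure modes and scores below are hypothetical examples, not those of the study.

```python
# Hedged FMECA sketch: rank hypothetical robotic-surgery failure modes by
# Risk Priority Number (RPN = severity * occurrence * detection, 1-10 scales).

failure_modes = [
    # (failure mode, severity, occurrence, detection)
    ("instrument arm collision", 7, 3, 4),
    ("loss of video feed",       8, 2, 2),
    ("console power failure",    9, 1, 3),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN={s * o * d}")
```

Mitigation effort is then directed at the highest-RPN modes first; criticality analysis proper additionally weights occurrence by mission phase and failure-effect probability.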
Macro-economic assessment of flood risk in Italy under current and future climate
NASA Astrophysics Data System (ADS)
Carrera, Lorenzo; Koks, Elco; Mysiak, Jaroslav; Aerts, Jeroen; Standardi, Gabriele
2014-05-01
This paper explores an integrated methodology for assessing direct and indirect costs of fluvial flooding to estimate current and future fluvial flood risk in Italy. Our methodology combines a Geographic Information System spatial approach with a general economic equilibrium approach using a downscaled, modified version of a Computable General Equilibrium model at NUTS2 scale. Given the level of uncertainty in the behavior of disaster-affected economies, the simulation considers a wide range of business recovery periods. We calculate expected annual losses for each NUTS2 region, and exceedance probability curves to determine probable maximum losses. Given a certain acceptable level of risk, we describe the conditions of flood protection and business recovery periods under which losses are contained within this limit. Because of the difference between direct costs, which are an overestimation of stock losses, and indirect costs, which represent the macro-economic effects, our results have different policy meanings. While the former are relevant for post-disaster recovery, the latter are more relevant for public policy issues, particularly for cost-benefit analysis and resilience assessment.
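The two risk metrics named above both derive from a loss-exceedance curve: expected annual loss is the area under the curve, and probable maximum loss is the loss at a chosen low exceedance probability. A minimal sketch, with invented points, integrates the curve by trapezoids.

```python
# Sketch of expected annual loss (EAL) as the trapezoidal area under a
# loss-exceedance curve. Curve points are invented, not from the paper.

def expected_annual_loss(curve):
    """curve: list of (annual exceedance probability, loss) pairs sorted by
    decreasing probability; returns the trapezoidal area under the curve."""
    eal = 0.0
    for (p1, l1), (p2, l2) in zip(curve, curve[1:]):
        eal += (p1 - p2) * (l1 + l2) / 2.0
    return eal

# Hypothetical return periods of 10/50/100/500 years, losses in million EUR.
curve = [(0.10, 50.0), (0.02, 400.0), (0.01, 900.0), (0.002, 3000.0)]
eal = expected_annual_loss(curve)
```

The probable maximum loss at, say, the 1-in-500-year level is simply read off the curve (3000 in this toy example).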
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria, and quality was assessed using the critical appraisal tool AMSTAR. Regression analyses were performed to determine factors associated with a higher quality score. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality, while systematic reviews were found to be of less than fair quality. Factors associated with higher quality scores in the multivariable analysis were the inclusion of randomized controlled trials as primary studies, performing a meta-analysis, and applying a recommended guideline for establishing a systematic review protocol and/or reporting. Based on AMSTAR, systematic reviews and meta-analyses may carry a high risk of bias if used to inform decision-making. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses before using them to inform evidence-based medicine, and that researchers adhere to the methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang
2015-01-01
PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicle risk. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. 
the metric) is the probability that an item will perform its intended function(s) for a specified mission profile. In general, the reliability metric can be calculated through the analyses using reliability demonstration and reliability prediction methodologies. Reliability analysis is very critical for understanding component failure mechanisms and in identifying reliability critical design and process drivers. The following sections discuss the PRA process and reliability engineering in detail and provide an application where reliability analysis and PRA were jointly used in a complementary manner to support a Space Shuttle flight risk assessment.
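The fault-tree building block of PRA can be sketched with independent basic events combined through AND/OR gates; the tree structure and probabilities below are illustrative, not drawn from any NASA study.

```python
# Minimal fault-tree sketch: combine independent basic-event probabilities
# through AND/OR gates to quantify a top event. Numbers are invented.

def gate_and(*probs):
    """P(all input events occur), assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def gate_or(*probs):
    """P(at least one input event occurs), assuming independence."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: top event = (sensor fails OR software fault) AND valve stuck.
p_top = gate_and(gate_or(1e-3, 5e-4), 2e-2)
```

In a full PRA, such fault trees hang off event-tree branch points, and the basic-event probabilities come from reliability data rather than assumed constants.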
Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design
NASA Astrophysics Data System (ADS)
Iqbal, Liaquat Ullah
An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) Tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes such as the aerodynamic panel code called CMARC and sizing and constraint analysis codes, thus providing the multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of the structured design methods such as the Quality Function Deployment (QFD) and the Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects such as external geometry definition, internal layout, and coloring scheme early in the design process. The performance and safety risks involved with the new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of the novel concepts such as the blended (BWB) and the hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for the performance gains in aerodynamics and ascertain risks of structural failure due to different pressure distribution in the fuselage as compared with the tube and wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. 
This helps achieve better designs with reduced risk in less time and at lower cost. The approach is shown to eliminate the traditional boundary between the conceptual and the preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples of the validation and utilization of the Multidisciplinary Design and Optimization (MDO) Tool are presented, using missions for Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).
Use of evidential reasoning and AHP to assess regional industrial safety
Chen, Zhichao; Chen, Tao; Qu, Zhuohua; Yang, Zaili; Ji, Xuewei; Zhou, Yi; Zhang, Hui
2018-01-01
China’s fast economic growth contributes to the rapid development of its urbanization process, but has also brought with it a series of industrial accidents, which often cause loss of life and damage to property and the environment, thus requiring the associated risk analysis and safety control measures to be implemented in advance. However, the incompleteness of historical failure data before the occurrence of accidents makes it difficult to use traditional risk analysis approaches, such as probabilistic risk analysis, in many cases. This paper aims to develop a new methodology capable of assessing regional industrial safety (RIS) in an uncertain environment. A hierarchical structure for modelling the risks influencing RIS is first constructed. A hybrid of evidential reasoning (ER) and the Analytical Hierarchy Process (AHP) is then used to assess the risks in a complementary way, in which AHP is used to evaluate the weight of each risk factor and ER is employed to synthesise the safety evaluations of the investigated region(s) against the risk factors from the bottom to the top level of the hierarchy. The successful application of the hybrid approach in a real case analysis of RIS in several major districts of Beijing (capital of China) demonstrates its feasibility and provides risk analysts and safety engineers with useful insights on effective solutions for comprehensive risk assessment of RIS in metropolitan cities. The contribution of this paper lies in the findings on the comparison of risk levels of RIS in different regions against various risk factors, so that best practices from the good performer(s) can be used to improve the safety of the others. PMID:29795593
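A minimal sketch of the hybrid described above: AHP weights are derived from a pairwise comparison matrix via the common geometric-mean approximation, and a simple weighted average stands in for the full recursive ER synthesis, which is considerably more involved. The comparison matrix and district scores are invented.

```python
import math

# AHP step (geometric-mean approximation to the principal eigenvector):
# derive risk-factor weights from a pairwise comparison matrix (invented).

def ahp_weights(matrix):
    """Row geometric means of a pairwise comparison matrix, normalised to sum 1."""
    gms = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w = ahp_weights(pairwise)

# Stand-in for the ER synthesis (the full recursive ER algorithm combines
# belief distributions): weighted average of each district's safety scores.
scores = {"district A": [0.8, 0.6, 0.9], "district B": [0.5, 0.7, 0.4]}
overall = {d: sum(wi * si for wi, si in zip(w, s)) for d, s in scores.items()}
```

The ranking of districts by `overall` is what supports the cross-region comparison the paper describes; ER additionally propagates the uncertainty attached to each assessment grade.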
Probability of Failure Analysis Standards and Guidelines for Expendable Launch Vehicles
NASA Astrophysics Data System (ADS)
Wilde, Paul D.; Morse, Elisabeth L.; Rosati, Paul; Cather, Corey
2013-09-01
Recognizing the central importance of probability of failure estimates to ensuring public safety for launches, the Federal Aviation Administration (FAA), Office of Commercial Space Transportation (AST), the National Aeronautics and Space Administration (NASA), and U.S. Air Force (USAF), through the Common Standards Working Group (CSWG), developed a guide for conducting valid probability of failure (POF) analyses for expendable launch vehicles (ELV), with an emphasis on POF analysis for new ELVs. A probability of failure analysis for an ELV produces estimates of the likelihood of occurrence of potentially hazardous events, which are critical inputs to launch risk analysis of debris, toxic, or explosive hazards. This guide is intended to document a framework for POF analyses commonly accepted in the US, and should be useful to anyone who performs or evaluates launch risk analyses for new ELVs. The CSWG guidelines provide performance standards and definitions of key terms, and are being revised to address allocation to flight times and vehicle response modes. The POF performance standard allows a launch operator to employ alternative, potentially innovative methodologies so long as the results satisfy the performance standard. Current POF analysis practice at US ranges includes multiple methodologies described in the guidelines as accepted methods, but not necessarily the only methods available to demonstrate compliance with the performance standard. The guidelines include illustrative examples for each POF analysis method, which are intended to illustrate an acceptable level of fidelity for ELV POF analyses used to ensure public safety. The focus is on providing guiding principles rather than "recipe lists." Independent reviews of these guidelines were performed to assess their logic, completeness, accuracy, self-consistency, consistency with risk analysis practices, use of available information, and ease of applicability. 
The independent reviews confirmed the general validity of the performance standard approach and suggested potential updates to improve the accuracy of each of the example methods, especially to address reliability growth.
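One commonly used ingredient of POF analysis for a new vehicle is a Bayesian update of a prior failure probability with accumulating flight history. The sketch below uses a beta-binomial model; the prior parameters and flight counts are assumptions for illustration, not values from the guidelines.

```python
# Hedged sketch of Bayesian POF estimation for a new launch vehicle:
# update a Beta prior on the per-flight failure probability with observed
# flight outcomes. Prior and counts are invented.

def posterior_mean_pof(prior_a, prior_b, failures, flights):
    """Mean of Beta(prior_a + failures, prior_b + (flights - failures))."""
    return (prior_a + failures) / (prior_a + prior_b + flights)

# Diffuse Jeffreys-style prior reflecting new-vehicle uncertainty,
# then 2 failures observed in the first 10 flights.
pof = posterior_mean_pof(0.5, 0.5, 2, 10)
```

As flights accumulate, the estimate is dominated by the observed failure rate, which is one simple way to represent reliability growth in a demonstrably conservative manner.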
NASA Astrophysics Data System (ADS)
Shinzato, Takashi
2016-12-01
The portfolio optimization problem in which the variances of the return rates of assets are not identical is analyzed in this paper using the methodology of statistical mechanical informatics, specifically, replica analysis. We defined two characteristic quantities of an optimal portfolio, namely, minimal investment risk and investment concentration, in order to solve the portfolio optimization problem and analytically determined their asymptotic behavior using replica analysis. Numerical experiments were also performed, and a comparison between the results of our simulation and those obtained via replica analysis validated our proposed method.
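The two quantities defined above can be checked numerically without replica analysis in the simple case of independent returns: under a budget constraint sum(w) = N, the variance-minimising weights are proportional to the inverse variances. The normalisations below are illustrative and need not match the paper's conventions.

```python
# Numerical sketch (not replica analysis): variance-minimising portfolio for
# independent assets with distinct variances under the budget constraint
# sum(w) = N. Normalisations of risk and concentration are illustrative.

def optimal_portfolio(variances):
    N = len(variances)
    inv = [1.0 / v for v in variances]
    scale = N / sum(inv)
    w = [scale * i for i in inv]                 # Lagrange solution: w_i ~ 1/var_i
    risk = sum(wi**2 * vi for wi, vi in zip(w, variances)) / (2 * N)
    concentration = sum(wi**2 for wi in w) / N   # >= 1, equality iff equal weights
    return w, risk, concentration

w, risk, q = optimal_portfolio([0.5, 1.0, 2.0])
```

Unequal variances push the optimum away from equal weighting, so the concentration q exceeds 1, which is the qualitative effect the replica calculation quantifies in the large-N limit.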
Foundation stones for a real socio-environmental integration in projects' impact assessments
NASA Astrophysics Data System (ADS)
Andres Dominguez-Gomez, J.
2015-04-01
In the last twenty years, both the increase in academic production and the expansion of professional involvement in Environmental Impact Assessment (EIA) and Social Impact Assessment (SIA) have evidenced growing scientific and business interest in risk and impact analysis. However, this growth has not brought with it a parallel progress in addressing their main shortcomings: insufficient integration of environmental and social features into development project analyses and, in cases where the social aspects are considered, technical-methodological failings in their diagnosis and assessment. It is clear that these weaknesses carry with them substantial threats to the sustainability (social, environmental and economic) of schemes which impact on the environment, and in consequence, to the local contexts where they are carried out and to the delicate balance of the global ecosystem. This paper argues that, in a sociological context of growing complexity, four foundation stones are required to underpin research methodologies (for both diagnosis and assessment) in the socio-environmental risks of development projects: a theoretical foundation in actor-network theory; an ethical grounding in values which are internationally recognized though not always carried through into practice; a (new) epistemological-scientific base; and a methodological foundation in social participation.
Taccolini Manzoni, Ana Carolina; Bastos de Oliveira, Naiane Teixeira; Nunes Cabral, Cristina Maria; Aquaroni Ricci, Natalia
2018-02-05
The aim of this systematic review was to investigate the role of the therapeutic alliance in pain relief in patients with musculoskeletal disorders treated by physiotherapy. Manual and database searches (Medline, Embase, ISI Web of Knowledge, CINAHL, PEDro, Lilacs, Cochrane Library, and PsycINFO) were performed with no restrictions on language or publication date. We included prospective studies with samples of patients undergoing physiotherapy for musculoskeletal conditions, with at least one measure of therapeutic alliance and pain as an outcome. Methodological quality was assessed by the Methodological Index for Nonrandomized Studies and the Cochrane tool for risk of bias. Six articles from four studies were included out of the 936 manuscripts identified. All studies used samples composed of patients with chronic low back pain. Two studies applied therapeutic alliance incentive measures during treatment and reported significant improvement in pain. The remaining studies, without alliance incentives, showed divergence regarding the relationship between the therapeutic alliance and pain. The methodological quality analysis indicated a low risk of bias in the included studies. Overall, there is a lack of studies on the therapeutic alliance in musculoskeletal physiotherapy, and existing studies fail to provide evidence of a strong relationship between the therapeutic alliance and pain relief.
Saltaji, Humam; Armijo-Olivo, Susan; Cummings, Greta G.; Amin, Maryam; Flores-Mir, Carlos
2017-01-01
Objectives To examine the risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions and the development of these aspects over time. Methods We included 540 randomized clinical trials from 64 selected systematic reviews. We extracted, in duplicate, details from each of the selected randomized clinical trials with respect to publication and trial characteristics, reporting and methodologic characteristics, and Cochrane risk of bias domains. We analyzed data using logistic regression and Chi-square statistics. Results Sequence generation was assessed to be inadequate (at unclear or high risk of bias) in 68% (n = 367) of the trials, while allocation concealment was inadequate in the majority of trials (n = 464; 85.9%). Blinding of participants and blinding of the outcome assessment were judged to be inadequate in 28.5% (n = 154) and 40.5% (n = 219) of the trials, respectively. A sample size calculation before the initiation of the study was not performed/reported in 79.1% (n = 427) of the trials, while the sample size was assessed as adequate in only 17.6% (n = 95) of the trials. Two thirds of the trials were not described as double blinded (n = 358; 66.3%), while the method of blinding was appropriate in 53% (n = 286) of the trials. We identified a significant decrease over time (1955–2013) in the proportion of trials assessed as having inadequately addressed methodological quality items (P < 0.05) in 30 out of the 40 quality criteria, or as being inadequate (at high or unclear risk of bias) in five domains of the Cochrane risk of bias tool: sequence generation, allocation concealment, incomplete outcome data, other sources of bias, and overall risk of bias. 
Conclusions The risks of bias, risks of random errors, reporting quality, and methodological quality of randomized clinical trials of oral health interventions have improved over time; however, further efforts that contribute to the development of more stringent methodology and detailed reporting of trials are still needed. PMID:29272315
NASA Astrophysics Data System (ADS)
de La Cal, E. A.; Fernández, E. M.; Quiroga, R.; Villar, J. R.; Sedano, J.
In previous works, a methodology was defined based on the design of a genetic algorithm, GAP, and an incremental training technique adapted to learning series of stock market values. The GAP technique is a fusion of GP and GA. The GAP algorithm implements an automatic search for crisp trading rules, taking as training objectives both the optimization of the return obtained and the minimization of the assumed risk. Applying the proposed methodology, rules have been obtained for an eight-year period of the S&P500 index. The adjustment achieved in the return-risk relation generated rules whose returns in the testing period were far superior to those obtained with habitual methodologies, and even clearly superior to Buy&Hold. This work shows that the proposed methodology is valid for different assets in a different market from that of the previous work.
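The evaluation half of the methodology (not the GAP rule discovery itself) can be sketched by scoring one crisp rule against Buy&Hold on a toy price series; the prices and the moving-average rule below are invented for illustration.

```python
# Toy evaluation of one crisp trading rule against Buy&Hold. Prices are
# invented; the GAP search that discovers rules is not reproduced here.

def rule_return(prices, window=3):
    """Hold over (t-1, t] only when the price at t-1 exceeds the trailing
    moving average of the `window` most recent prices (including itself)."""
    capital = 1.0
    for t in range(window, len(prices)):
        ma = sum(prices[t - window:t]) / window
        if prices[t - 1] > ma:
            capital *= prices[t] / prices[t - 1]
    return capital - 1.0

prices = [100, 102, 101, 105, 107, 104, 108, 111]
r = rule_return(prices)
buy_hold = prices[-1] / prices[0] - 1.0
```

In the GAP setting, both the return of the rule and a risk measure (e.g. drawdown or return variance) would enter the fitness function as joint objectives.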
Making the Hubble Space Telescope servicing mission safe
NASA Technical Reports Server (NTRS)
Bahr, N. J.; Depalo, S. V.
1992-01-01
The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.; Budnitz, Robert J.
If Carbon dioxide Capture and Storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, by defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well. 
Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar and provides an analysis both of the annual probabilities of accident sequences of concern and of their consequences, and, crucially, the methodology provides insights into what measures might be taken to mitigate those accident sequences identified as of concern. Mitigating strategies could address reducing the likelihood of an accident sequence of concern, or reducing the consequences, or some combination. The methodology elucidates both local and integrated risks along the pipeline or at the well, providing information useful to decision makers at various levels, including local (e.g., property owners and town councils), regional (e.g., county and state representatives), and national (federal regulators and corporate proponents).
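The local-versus-integrated risk distinction described above can be sketched by summing per-segment annual rupture frequency times consequence along the pipeline; the segment lengths, frequencies, and consequence values below are invented assumptions.

```python
# Sketch of spatially distributed pipeline risk: per-segment local risk and
# the integrated risk along the whole line. All numbers are illustrative.

segments = [
    # (length_km, ruptures per km-year, expected fatalities given rupture here)
    (40.0, 1e-5, 0.0),    # remote terrain: releases disperse harmlessly
    (5.0,  1e-5, 2.0),    # segment passing near a town
    (15.0, 2e-5, 0.5),    # road crossings: higher third-party damage rate
]

# Local risk per segment: (annual rupture frequency, expected fatalities/year).
local_risk = [(l * f, l * f * c) for l, f, c in segments]
integrated = sum(r for _, r in local_risk)   # expected fatalities per year
```

The local values identify where mitigation (rerouting, thicker wall, surveillance) buys the most risk reduction, while the integrated value supports the societal-risk judgement at the regional and national levels.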
Frenette, Anne Julie; Bouchard, Josée; Bernier, Pascaline; Charbonneau, Annie; Nguyen, Long Thanh; Rioux, Jean-Philippe; Troyanov, Stéphan; Williamson, David R
2014-11-14
The risk of acute kidney injury (AKI) with the use of albumin-containing fluids compared to starches in the surgical intensive care setting remains uncertain. We evaluated the adjusted risk of AKI associated with colloids following cardiac surgery. We performed a retrospective cohort study of patients undergoing on-pump cardiac surgery in a tertiary care center from 2008 to 2010. We assessed crystalloid and colloid administration until 36 hours after surgery. AKI was defined by the RIFLE (risk, injury, failure, loss and end-stage kidney disease) risk and Acute Kidney Injury Network (AKIN) stage 1 serum creatinine criterion within 96 hours after surgery. Our cohort included 984 patients with a baseline glomerular filtration rate of 72 ± 19 ml/min/1.73 m(2). Twenty-three percent had a reduced left ventricular ejection fraction (LVEF), thirty-one percent were diabetics and twenty-three percent underwent heart valve surgery. The incidence of AKI was 5.3% based on RIFLE risk and 12.0% based on the AKIN criterion. AKI was associated with a reduced LVEF, diuretic use, anemia, heart valve surgery, duration of extracorporeal circulation, hemodynamic instability and the use of albumin, pentastarch 10% and transfusions. There was an important dose-dependent AKI risk associated with the administration of albumin, which also paralleled a higher prevalence of concomitant risk factors for AKI. To address any indication bias, we derived a propensity score predicting the likelihood to receive albumin and matched 141 cases to 141 controls with a similar risk profile. In this analysis, albumin was associated with an increased AKI risk (RIFLE risk: 12% versus 5%, P = 0.03; AKIN stage 1: 28% versus 13%, P = 0.002). We repeated this methodology in patients without postoperative hemodynamic instability and still identified an association between the use of albumin and AKI. 
Albumin administration was associated with a dose-dependent risk of AKI and remained significant using a propensity score methodology. Future studies should address the safety of albumin-containing fluids on kidney function in patients undergoing cardiac surgery.
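The propensity-matched comparison described above can be illustrated with a small sketch. This is not the study's actual procedure or data: it shows greedy 1:1 nearest-neighbour matching on hypothetical propensity scores, with an assumed caliper of 0.05.

```python
def match_pairs(cases, controls, caliper=0.05):
    """Greedy 1:1 matching of (id, propensity_score) pairs within a caliper."""
    pairs, available = [], dict(controls)
    for case_id, score in cases:
        # pick the unmatched control with the closest propensity score
        best = min(available.items(), key=lambda c: abs(c[1] - score), default=None)
        if best and abs(best[1] - score) <= caliper:
            pairs.append((case_id, best[0]))
            del available[best[0]]  # each control is used at most once
    return pairs

cases = [("A", 0.62), ("B", 0.35)]       # albumin-exposed (hypothetical)
controls = [("x", 0.60), ("y", 0.36), ("z", 0.90)]
print(match_pairs(cases, controls))  # [('A', 'x'), ('B', 'y')]
```

Control "z" is left unmatched because no case has a score within the caliper, which is how matching trades sample size for comparability.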
da Silva, Tatiana Pastorello Pereira; Moreira, Josino Costa; Peres, Frederico
2012-02-01
This article seeks to characterize the risks related to the use of pesticides in dairy production, in terms of legislation, health, and the risk perception of the workers involved in this activity. It is based on a methodological articulation that included: a) a systematic review of the reference literature on the research topic; b) analysis of related legislation (veterinary products and pesticides); c) identification of risks in the use of veterinary products formulated with active ingredients listed as pesticides; and d) risk perception analysis of a group of dairy production workers. Results indicate a situation of particular interest to public health. Among dairy production workers, the invisibility of the risks associated with handling pesticides for veterinary use increases their exposure and is related to several health problems, especially for women. This same invisibility leads to neglect of the prohibition period between pesticide use and consumption of other products. Part of the problem may be associated with the non-classification of pesticides for veterinary use as 'pesticides' (they are classified as veterinary products), which highlights the importance and urgency of discussing the theme.
Cyber / Physical Security Vulnerability Assessment Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
MacDonald, Douglas G.; Simpkins, Bret E.
Abstract Both the physical protection and cyber security domains offer solutions for the discovery of vulnerabilities through the use of various assessment processes and software tools. Each vulnerability assessment (VA) methodology provides the ability to identify and categorize vulnerabilities, and quantifies the risks within its own area of expertise. Neither approach fully represents the true potential security risk to a site and/or a facility, nor comprehensively assesses the overall security posture. The technical approach to solving this problem was to identify methodologies and processes that blend the physical and cyber security assessments, and to develop tools to accurately quantify the unaccounted-for risk. SMEs from both the physical and the cyber security domains developed the blending methodologies, and cross-trained each other on the various aspects of the physical and cyber security assessment processes. A local critical infrastructure entity volunteered to host a proof-of-concept physical/cyber security assessment, and the lessons learned have been leveraged by this effort. The four potential modes of attack an adversary can use in approaching a target are: the Physical Only Attack, the Cyber Only Attack, the Physical Enabled Cyber Attack, and the Cyber Enabled Physical Attack. The Physical Only and Cyber Only pathway analyses are the two most widely analyzed attack modes. The pathway from an off-site location to the desired target location is dissected to ensure adversarial activity can be detected and neutralized by the protection strategy prior to completion of a predefined task. This methodology typically explores a one-way attack from the public space (or common area) inward towards the target. The Physical Enabled Cyber Attack and the Cyber Enabled Physical Attack are much more intricate.
Both scenarios involve beginning in one domain to effect change in the other, then backing outward to take advantage of the reduced system effectiveness before penetrating further into the defenses. The proper identification and assessment of the overlapping areas (and the interaction between them) in the VA process is necessary to accurately assess the true risk.
Cluster Randomised Trials in Cochrane Reviews: Evaluation of Methodological and Reporting Practice.
Richardson, Marty; Garner, Paul; Donegan, Sarah
2016-01-01
Systematic reviews can include cluster-randomised controlled trials (C-RCTs), which require different analysis compared with standard individual-randomised controlled trials. However, it is not known whether review authors follow the methodological and reporting guidance when including these trials. The aim of this study was to assess the methodological and reporting practice of Cochrane reviews that included C-RCTs against criteria developed from existing guidance. Criteria were developed, based on methodological literature and personal experience supervising review production and quality. Criteria were grouped into four themes: identifying, reporting, assessing risk of bias, and analysing C-RCTs. The Cochrane Database of Systematic Reviews was searched (2nd December 2013), and the 50 most recent reviews that included C-RCTs were retrieved. Each review was then assessed using the criteria. The 50 reviews we identified were published by 26 Cochrane Review Groups between June 2013 and November 2013. For identifying C-RCTs, only 56% identified that C-RCTs were eligible for inclusion in the review in the eligibility criteria. For reporting C-RCTs, only eight (24%) of the 33 reviews reported the method of cluster adjustment for their included C-RCTs. For assessing risk of bias, only one review assessed all five C-RCT-specific risk-of-bias criteria. For analysing C-RCTs, of the 27 reviews that presented unadjusted data, only nine (33%) provided a warning that confidence intervals may be artificially narrow. Of the 34 reviews that reported data from unadjusted C-RCTs, only 13 (38%) excluded the unadjusted results from the meta-analyses. The methodological and reporting practices in Cochrane reviews incorporating C-RCTs could be greatly improved, particularly with regard to analyses. Criteria developed as part of the current study could be used by review authors or editors to identify errors and improve the quality of published systematic reviews incorporating C-RCTs.
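The cluster adjustment the review checks for is commonly carried out by inflating variances (or, equivalently, deflating sample sizes) by the design effect, DEFF = 1 + (m − 1) × ICC, where m is the average cluster size and ICC the intracluster correlation. A minimal sketch with invented numbers:

```python
def effective_sample_size(n, avg_cluster_size, icc):
    """Effective n after dividing by the design effect DEFF = 1 + (m - 1) * ICC."""
    deff = 1 + (avg_cluster_size - 1) * icc
    return n / deff

# 800 participants in clusters of ~20 with an assumed ICC of 0.05
# contribute information roughly equivalent to ~410 independent individuals.
print(round(effective_sample_size(800, 20, 0.05)))  # 410
```

Meta-analysing the unadjusted n = 800 instead of the effective n is what produces the artificially narrow confidence intervals the review warns about.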
A balanced hazard ratio for risk group evaluation from survival data.
Branders, Samuel; Dupont, Pierre
2015-07-30
Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high or low risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated with it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. Those issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve those issues. This new performance metric keeps an intuitive interpretation and is just as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic markers. Copyright © 2015 John Wiley & Sons, Ltd.
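For context, the quantity under discussion can be illustrated with a crude person-time estimate. This is a simplified textbook stand-in (events per person-year in each group), not the authors' balanced estimator, and the numbers are invented:

```python
def crude_hazard_ratio(events_high, time_high, events_low, time_low):
    """Ratio of event rates (events / person-years) in high vs low risk groups."""
    return (events_high / time_high) / (events_low / time_low)

# 30 deaths over 400 person-years in the high-risk group
# vs 10 deaths over 500 person-years in the low-risk group
print(round(crude_hazard_ratio(30, 400, 10, 500), 2))  # 3.75
```

The sensitivity the paper criticises is visible here: shifting a handful of subjects between groups changes all four inputs at once, and very unbalanced groups can make the ratio large while the p-value stays unimpressive.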
Chari, Ramya; Burke, Thomas A.; White, Ronald H.; Fox, Mary A.
2012-01-01
Susceptibility to chemical toxins has not been adequately addressed in risk assessment methodologies. As a result, environmental policies may fail to meet their fundamental goal of protecting the public from harm. This study examines how characterization of risk may change when susceptibility is explicitly considered in policy development; in particular we examine the process used by the U.S. Environmental Protection Agency (EPA) to set a National Ambient Air Quality Standard (NAAQS) for lead. To determine a NAAQS, EPA estimated air lead-related decreases in child neurocognitive function through a combination of multiple data elements including concentration-response (CR) functions. In this article, we present alternative scenarios for determining a lead NAAQS using CR functions developed in populations more susceptible to lead toxicity due to socioeconomic disadvantage. The use of CR functions developed in susceptible groups resulted in cognitive decrements greater than original EPA estimates. EPA’s analysis suggested that a standard level of 0.15 µg/m3 would fulfill decision criteria, but by incorporating susceptibility we found that options for the standard could reasonably be extended to lower levels. The use of data developed in susceptible populations would result in the selection of a more protective NAAQS under the same decision framework applied by EPA. Results are used to frame discussion regarding why cumulative risk assessment methodologies are needed to help inform policy development. PMID:22690184
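The concentration-response step can be illustrated schematically: an air lead concentration is converted to an attributable blood lead level and then to a cognitive decrement via a linear CR slope. All parameters below (air-to-blood ratio, slope) are hypothetical placeholders, not EPA's values:

```python
def iq_decrement(air_pb_ugm3, air_to_blood=7.0, cr_slope_iq_per_ugdl=2.0):
    """IQ points lost under a linear CR function (hypothetical parameters)."""
    blood_pb = air_pb_ugm3 * air_to_blood          # µg/dL attributable to air
    return blood_pb * cr_slope_iq_per_ugdl

# Decrement implied at a 0.15 µg/m3 standard under these assumed parameters
print(round(iq_decrement(0.15), 2))  # 2.1
```

The paper's point maps onto the slope parameter: a steeper CR function, estimated in a susceptible subpopulation, yields a larger decrement at the same air concentration and hence pushes the acceptable standard lower.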
Comparative Risk Analysis for Metropolitan Solid Waste Management Systems
NASA Astrophysics Data System (ADS)
Chang, Ni-Bin; Wang, S. F.
1996-01-01
Conventional solid waste management planning usually focuses on economic optimization, in which the related environmental impacts or risks are rarely considered. The purpose of this paper is to illustrate the methodology of how optimization concepts and techniques can be applied to structure and solve risk management problems such that the impacts of air pollution, leachate, traffic congestion, and noise increments can be regulated in the long-term planning of metropolitan solid waste management systems. Management alternatives are sequentially evaluated by adding several environmental risk control constraints stepwise in an attempt to improve the management strategies and reduce the risk impacts in the long run. Statistics associated with those risk control mechanisms are presented as well. Siting, routing, and financial decision making in such solid waste management systems can also be achieved with respect to various resource limitations and disposal requirements.
ERIC Educational Resources Information Center
Pack, Robert P.; Browne, Dorothy; Wallander, Jan L.
1998-01-01
Health risk behaviors (substance use, violence, suicide, and car safety) of 194 African American urban adolescents with mild mental retardation were measured using either a confidential individual interview or an anonymous group survey. The survey methodology resulted in disclosure of more risk behaviors than the interview methodology. Elevated…
Q methodology, risk training and quality management.
McKeown, M; Hinks, M; Stowell-Smith, M; Mercer, D; Forster, J
1999-01-01
The results of a Q methodological study of professional understandings of the notion of risk in mental health services within the UK are discussed in relation to the relevance for staff training and quality assurance. The study attempted to access the diversity of understandings of risk issues amongst a multi-professional group of staff (n = 60) attending inter-agency risk training workshops in 1998. Q methodology is presented as both an appropriate means for such inquiry and as a novel experiential technique for training purposes. A tentative argument is advanced that the qualitative accounts generated by Q research could assist in systematic reviews of quality, complementing the singularly quantitative approaches typically represented in the audit process.
Alcohol Intake and Risk of Thyroid Cancer: A Meta-Analysis of Observational Studies.
Hong, Seung-Hee; Myung, Seung-Kwon; Kim, Hyeon Suk
2017-04-01
The purpose of this study was to assess whether alcohol intake is associated with the risk of thyroid cancer through a meta-analysis of observational studies. We searched PubMed and EMBASE in June 2015 to locate eligible studies. We included observational studies such as cross-sectional studies, case-control studies, and cohort studies reporting odds ratios (ORs) or relative risks (RRs) with 95% confidence intervals (CIs). We included 33 observational studies with two cross-sectional studies, 20 case-control studies, and 11 cohort studies, which involved a total of 7,725 thyroid cancer patients and 3,113,679 participants without thyroid cancer in the final analysis. In the fixed-effect model meta-analysis of all 33 studies, we found that alcohol intake was consistently associated with a decreased risk of thyroid cancer (OR or RR, 0.74; 95% CI, 0.67 to 0.83; I² = 38.6%). In the subgroup meta-analysis by type of study, alcohol intake also decreased the risk of thyroid cancer in both case-control studies (OR, 0.77; 95% CI, 0.65 to 0.92; I² = 29.5%; n = 20) and cohort studies (RR, 0.70; 95% CI, 0.60 to 0.82; I² = 0%; n = 11). Moreover, subgroup meta-analyses by type of thyroid cancer, gender, amount of alcohol consumed, and methodological quality of study showed that alcohol intake was significantly associated with a decreased risk of thyroid cancer. The current meta-analysis of observational studies found that, unlike most other types of cancer, alcohol intake decreased the risk of thyroid cancer.
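A fixed-effect (inverse-variance) pooling of the kind reported can be sketched as follows. The two inputs reuse the subgroup estimates quoted above purely as an example; the implementation is a generic textbook version working on the log scale, not the authors' software, and standard errors are back-calculated from the 95% CIs.

```python
import math

def pool_fixed_effect(estimates):
    """Pool (OR, ci_low, ci_high) tuples; returns pooled OR and its 95% CI."""
    weights, weighted = [], []
    for or_, lo, hi in estimates:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE recovered from the CI
        w = 1 / se ** 2                                  # inverse-variance weight
        weights.append(w)
        weighted.append(w * log_or)
    pooled = sum(weighted) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return tuple(math.exp(x) for x in (pooled, lo, hi))

# Example: combine the case-control and cohort subgroup estimates quoted above
or_pooled, lo, hi = pool_fixed_effect([(0.77, 0.65, 0.92), (0.70, 0.60, 0.82)])
print(round(or_pooled, 2), round(lo, 2), round(hi, 2))  # 0.73 0.65 0.82
```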
Framework for managing mycotoxin risks in the food industry.
Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie
2014-12-01
We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.
Risk-based design of process plants with regard to domino effects and land use planning.
Khakzad, Nima; Reniers, Genserik
2015-12-15
Land use planning (LUP) as an effective and crucial safety measure has widely been employed by safety experts and decision makers to mitigate off-site risks posed by major accidents. Accordingly, the concept of LUP in chemical plants has traditionally been considered from two perspectives: (i) land developments around existing chemical plants considering potential off-site risks posed by major accidents and (ii) development of existing chemical plants considering nearby land developments and the level of additional off-site risk those land developments would be exposed to. However, the attempts made to design chemical plants with regard to LUP requirements have been few, and most have neglected the role of domino effects in risk analysis of major accidents. To overcome the limitations of previous work, first, we developed a Bayesian network methodology to calculate both the on-site and off-site risks of major accidents while taking domino effects into account. Second, we combined the results of the risk analysis with the Analytic Hierarchy Process to design an optimal layout for which the levels of on-site and off-site risk would be minimal. Copyright © 2015 Elsevier B.V. All rights reserved.
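The AHP step of ranking candidate layouts can be sketched with the common geometric-mean approximation of the priority vector. The pairwise judgments below are invented for illustration, not taken from the paper:

```python
import math

def ahp_weights(matrix):
    """Row geometric means of a reciprocal pairwise comparison matrix,
    normalised to sum to 1 (the standard approximate AHP priority vector)."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical judgments: layout A is 3x preferable to B and 5x preferable
# to C with respect to combined on-site/off-site risk; reciprocals fill
# the lower triangle.
m = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
print([round(w, 3) for w in ahp_weights(m)])
```

The layout with the largest weight would be selected; in a fuller treatment the weights are multiplied through a criteria hierarchy rather than a single matrix.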
2013-01-01
Background Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. Methods We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. Results In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. Conclusions There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. 
Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed. PMID:24044807
Armijo-Olivo, Susan; Fuentes, Jorge; Ospina, Maria; Saltaji, Humam; Hartling, Lisa
2013-09-17
Assessing the risk of bias of randomized controlled trials (RCTs) is crucial to understand how biases affect treatment effect estimates. A number of tools have been developed to evaluate risk of bias of RCTs; however, it is unknown how these tools compare to each other in the items included. The main objective of this study was to describe which individual items are included in RCT quality tools used in general health and physical therapy (PT) research, and how these items compare to those of the Cochrane Risk of Bias (RoB) tool. We used comprehensive literature searches and a systematic approach to identify tools that evaluated the methodological quality or risk of bias of RCTs in general health and PT research. We extracted individual items from all quality tools. We calculated the frequency of quality items used across tools and compared them to those in the RoB tool. Comparisons were made between general health and PT quality tools using Chi-squared tests. In addition to the RoB tool, 26 quality tools were identified, with 19 being used in general health and seven in PT research. The total number of quality items included in general health research tools was 130, compared with 48 items across PT tools and seven items in the RoB tool. The most frequently included items in general health research tools (14/19, 74%) were inclusion and exclusion criteria, and appropriate statistical analysis. In contrast, the most frequent items included in PT tools (86%, 6/7) were: baseline comparability, blinding of investigator/assessor, and use of intention-to-treat analysis. Key items of the RoB tool (sequence generation and allocation concealment) were included in 71% (5/7) of PT tools, and 63% (12/19) and 37% (7/19) of general health research tools, respectively. There is extensive item variation across tools that evaluate the risk of bias of RCTs in health research. Results call for an in-depth analysis of items that should be used to assess risk of bias of RCTs. 
Further empirical evidence on the use of individual items and the psychometric properties of risk of bias tools is needed.
Culvenor, Adam G; Ruhdorfer, Anja; Juhl, Carsten; Eckstein, Felix; Øiestad, Britt Elin
2017-05-01
To perform a systematic review and meta-analysis on the association between knee extensor strength and the risk of structural, symptomatic, or functional deterioration in individuals with or at risk of knee osteoarthritis (KOA). We systematically identified and methodologically appraised all longitudinal studies (≥1-year followup) reporting an association between knee extensor strength and structural (tibiofemoral, patellofemoral), symptomatic (self-reported, knee replacement), or functional (subjective, objective) decline in individuals with or at risk of radiographic or symptomatic KOA. Results were pooled for each of the above associations using meta-analysis, or if necessary, summarized according to a best-evidence synthesis. Fifteen studies were included, evaluating >8,000 participants (51% female), with a followup time between 1.5 and 8 years. Meta-analysis revealed that lower knee extensor strength was associated with an increased risk of symptomatic (Western Ontario and McMaster Universities Osteoarthritis Index [WOMAC] pain: odds ratio [OR] 1.35, 95% confidence interval [95% CI] 1.10-1.67) and functional decline (WOMAC function: OR 1.38, 95% CI 1.00-1.89, and chair-stand task: OR 1.03, 95% CI 1.03-1.04), but not increased risk of radiographic tibiofemoral joint space narrowing (JSN) (OR 1.15, 95% CI 0.84-1.56). No trend in risk was observed for KOA status (present versus absent). Best-evidence synthesis showed inconclusive evidence for lower knee extensor strength being associated with increased risk of patellofemoral deterioration. Meta-analysis showed that lower knee extensor strength is associated with an increased risk of symptomatic and functional deterioration, but not tibiofemoral JSN. The risk of patellofemoral deterioration in the presence of knee extensor strength deficits is inconclusive. © 2016, American College of Rheumatology.
Methodology of risk assessment of loss of water resources due to climate changes
NASA Astrophysics Data System (ADS)
Israfilov, Yusif; Israfilov, Rauf; Guliyev, Hatam; Afandiyev, Galib
2016-04-01
For the sustainable development and rational management of the water resources of the Republic of Azerbaijan, it is essential to forecast their changes under different climate-change scenarios and to assess the possible risks of losing portions of those resources. The major part of the Azerbaijani territory lies in an arid climate, and the vast majority of water is used in national economic production. Optimal use of conditioned groundwater and surface water is of great strategic importance for the country's economy given the overall shortage of water resources. Low annual precipitation, high evaporation, and complex natural and hydrogeological conditions hinder the sustainable formation of conditioned ground and surface water resources. In addition, fresh water resources are not equally distributed throughout the Azerbaijani territory. The lack of an overall water balance creates tension in the rational use of fresh water resources in various sectors of the national economy, especially agriculture, and consequently in the food security of the republic. Moreover, the republic's fresh water resources depend directly on climatic factors: 75-85% of the stratum-pore groundwater resources of the piedmont plains and the fracture-vein water of the mountain regions are formed by the infiltration of rainfall and condensate water. Changes in climate parameters entail changes in the hydrological cycle of the hydrosphere and, as a rule, are reflected in these resources. Forecasting changes in water resources under different climate-change scenarios with regional mathematical models allowed estimating the extent of this relationship and improving the quality of decisions.
At the same time, additional data are needed for assessing and managing the risk of water resource reduction: for detailed analysis, for forecasting the quantitative and qualitative parameters of the resources, and for optimizing water use. In this regard, we have developed a risk assessment methodology that includes statistical fuzzy analysis of the "probability-consequences" relationship and classification of probabilities and consequences by degree of severity and risk. The methodology makes the results practically usable and provides effective support for sustainable development, for reducing the risk attached to optimal use of the republic's water resources and, consequently, for the national strategy of economic development.
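The "probability-consequences" classification can be sketched as a simple risk matrix lookup; the class scales, thresholds, and labels here are assumptions for illustration, not the authors' fuzzy calibration:

```python
def risk_level(probability, severity):
    """Map integer probability and consequence-severity classes
    (1 = lowest .. 4 = highest) to a qualitative risk level."""
    score = probability * severity
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Frequent/severe, occasional/moderate, and rare/minor loss scenarios
print(risk_level(4, 4), risk_level(2, 3), risk_level(1, 2))  # high medium low
```

A fuzzy version would replace the crisp class boundaries with membership functions, so a scenario near a boundary contributes partially to two adjacent risk levels.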
Derailment-based Fault Tree Analysis on Risk Management of Railway Turnout Systems
NASA Astrophysics Data System (ADS)
Dindar, Serdar; Kaewunruen, Sakdirat; An, Min; Gigante-Barrera, Ángel
2017-10-01
Railway turnouts are fundamental mechanical infrastructure that allows rolling stock to divert from one direction to another. Because they comprise a large number of engineering subsystems, e.g. track, signalling, and earthworks, these subsystems carry high hazard potential through various kinds of failure mechanisms, any of which could be the cause of a catastrophic event. A derailment, one of the undesirable events in railway operation, often results, albeit rarely, in damage to rolling stock and railway infrastructure and in service disruption, and has the potential to cause casualties and even loss of life. It is therefore important that a well-designed risk analysis be performed to create awareness of hazards and to identify which parts of the system may be at risk. This study focuses on all types of environment-based failures arising from the numerous contributing factors recorded in official accident reports. The risk analysis is designed to help industry minimise the occurrence of accidents at railway turnouts. The methodology of the study relies on accurate assessment of derailment likelihood and is based on a statistical, multiple-factor-integrated accident rate analysis. The study establishes product risks and faults and shows the impact of potential processes by means of Boolean algebra.
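Under the usual independence assumption, the Boolean-algebra step of a fault tree reduces to simple gate arithmetic: AND gates multiply probabilities, OR gates combine complements. A minimal sketch with invented events and probabilities (not figures from the study):

```python
def and_gate(*probs):
    """All inputs must fail: product of probabilities (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """Any input fails: 1 minus the product of survival probabilities."""
    p = 1.0
    for q in probs:
        p *= (1 - q)
    return 1 - p

# Hypothetical tree: derailment <- (wheel climb OR gauge spread) AND
# failure of the detection/protection layer
p_top = and_gate(or_gate(0.01, 0.02), 0.1)
print(round(p_top, 5))  # 0.00298
```

Real fault trees are evaluated via minimal cut sets rather than direct gate-by-gate multiplication when basic events repeat, but the gate arithmetic is the same.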
2011-01-01
Background The analysis of risk for the population residing and/or working in contaminated areas raises the topic of commuting. In fact, especially in contaminated areas, commuting groups are likely to be subject to lower exposure than residents. Only very recently environmental epidemiology has started considering the role of commuting as a differential source of exposure in contaminated areas. In order to improve the categorization of groups, this paper applies a gravitational model to the analysis of residential risk for workers in the Gela petrochemical complex, which began life in the early 60s in the municipality of Gela (Sicily, Italy) and is the main source of industrial pollution in the local area. Results A logistic regression model is implemented to measure the capacity of Gela "central location" to attract commuting flows from other sites. Drawing from gravity models, the proposed methodology: a) defines the probability of finding commuters from municipalities outside Gela as a function of the origin's "economic mass" and of its distance from each destination; b) establishes "commuting thresholds" relative to the origin's mass. The analysis includes 367 out of the 390 Sicilian municipalities. Results are applied to define "commuters" and "residents" within the cohort of petrochemical workers. The study population is composed of 5,627 workers. Different categories of residence in Gela are compared calculating Mortality Rate Ratios for lung cancer through a Poisson regression model, controlling for age and calendar period. The mobility model correctly classifies almost 90% of observations. Its application to the mortality analysis confirms a major risk for lung cancer associated with residence in Gela. Conclusions Commuting is a critical aspect of the health-environment relationship in contaminated areas. 
The proposed methodology can be replicated to different contexts when residential information is lacking or unreliable; however, a careful consideration of the territorial characteristics ("insularity" and its impact on transportation time and costs, in our case) is suggested when specifying the area of application for the mobility analysis. PMID:21272299
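The gravity-model idea above, that the log-odds of finding commuters from an origin rise with its economic mass and fall with its distance from the destination, can be sketched as a logistic function of the two logged predictors. The coefficients below are invented, not the fitted ones:

```python
import math

def commuting_probability(mass, distance_km, b0=-2.0, b_mass=0.8, b_dist=-1.2):
    """Logistic probability of observing commuters from an origin,
    driven by log(economic mass) and log(distance) (coefficients assumed)."""
    logit = b0 + b_mass * math.log(mass) + b_dist * math.log(distance_km)
    return 1 / (1 + math.exp(-logit))

# A large nearby town vs a small distant one (hypothetical values)
p_near = commuting_probability(mass=50, distance_km=20)
p_far = commuting_probability(mass=5, distance_km=150)
print(round(p_near, 3), round(p_far, 3))
```

Thresholding these probabilities is what lets the cohort be split into "commuters" and "residents" when individual residential histories are unreliable.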
Life-table methods for detecting age-risk factor interactions in long-term follow-up studies.
Logue, E E; Wing, S
1986-01-01
Methodological investigation has suggested that age-risk factor interactions should be more evident in age of experience life tables than in follow-up time tables, due to the mixing of ages of experience over follow-up time in groups defined by age at initial examination. To illustrate the two approaches, age modification of the effect of total cholesterol on ischemic heart disease mortality in two long-term follow-up studies was investigated. Follow-up time life table analysis of 116 deaths over 20 years in one study was more consistent with a uniform relative risk due to cholesterol, while age of experience life table analysis was more consistent with a monotonic negative age interaction. In a second follow-up study (160 deaths over 24 years), there was no evidence of a monotonic negative age-cholesterol interaction by either method. It was concluded that age-specific life table analysis should be used when age-risk factor interactions are considered, but that both approaches yield almost identical results in the absence of age interaction. The choice of the more appropriate life-table analysis should ultimately be guided by the nature of the age or time phenomena of scientific interest.
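The distinction between the two table types rests on how person-time is attributed: a follow-up-time table fixes groups by age at entry, whereas an age-of-experience table credits each person-year and each event to the attained-age band in which it occurs. A sketch of the age-of-experience bookkeeping, with invented subjects:

```python
def age_specific_rates(subjects, bands):
    """subjects: (age_at_entry, follow_up_years, age_at_event_or_None) tuples.
    Returns {band: (events, person_years)} for half-open bands (lo, hi)."""
    table = {b: [0, 0.0] for b in bands}
    for entry, fu, event_age in subjects:
        for lo, hi in bands:
            # person-years this subject contributes while aged [lo, hi)
            years = max(0.0, min(entry + fu, hi) - max(entry, lo))
            table[(lo, hi)][1] += years
            if event_age is not None and lo <= event_age < hi:
                table[(lo, hi)][0] += 1
    return {b: (e, py) for b, (e, py) in table.items()}

# Three hypothetical subjects; the first dies at 62, the third at 58
subjects = [(45, 20, 62), (55, 10, None), (50, 15, 58)]
bands = [(40, 50), (50, 60), (60, 70)]
print(age_specific_rates(subjects, bands))
```

Dividing events by person-years per band gives the age-specific rates whose pattern across bands reveals (or rules out) an age-risk factor interaction.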
The Trans-Pacific Partnership: Is It Everything We Feared for Health?
Labonté, Ronald; Schram, Ashley; Ruckert, Arne
2016-04-17
Negotiations surrounding the Trans-Pacific Partnership (TPP) trade and investment agreement have recently concluded. Although trade and investment agreements, part of a broader shift to global economic integration, have been argued to be vital to improved economic growth, health, and general welfare, these agreements have increasingly come under scrutiny for their direct and indirect health impacts. We conducted a prospective health impact analysis to identify and assess a selected array of potential health risks of the TPP. We adapted the standard protocol for Health impact assessments (HIAs) (screening, scoping, and appraisal) to our aim of assessing potential health risks of trade and investment policy, and selected a health impact review methodology. This methodology is used to create a summary estimation of the most significant impacts on health of a broad policy or cluster of policies, such as a comprehensive trade and investment agreement. Our analysis shows that there are a number of potentially serious health risks associated with the TPP, and details a range of policy implications for the health sector. Of particular focus are the potential implications of changes to intellectual property rights (IPRs), sanitary and phytosanitary measures (SPS), technical barriers to trade (TBT), investor-state dispute settlement (ISDS), and regulatory coherence provisions on a range of issues, including access to medicines and health services, tobacco and alcohol control, diet-related health, and domestic health policy-making. We provide a list of policy recommendations to mitigate potential health risks associated with the TPP, and suggest that broad public consultations, including on the health risks of trade and investment agreements, should be part of all trade negotiations. © 2016 by Kerman University of Medical Sciences
Pediatric Cancer Survivorship Research: Experience of the Childhood Cancer Survivor Study
Leisenring, Wendy M.; Mertens, Ann C.; Armstrong, Gregory T.; Stovall, Marilyn A.; Neglia, Joseph P.; Lanctot, Jennifer Q.; Boice, John D.; Whitton, John A.; Yasui, Yutaka
2009-01-01
The Childhood Cancer Survivor Study (CCSS) is a comprehensive multicenter study designed to quantify and better understand the effects of pediatric cancer and its treatment on later health, including behavioral and sociodemographic outcomes. The CCSS investigators have published more than 100 articles in the scientific literature related to the study. As with any large cohort study, high standards for methodologic approaches are imperative for valid and generalizable results. In this article we describe methodological issues of study design, exposure assessment, outcome validation, and statistical analysis. Methods for handling missing data, intrafamily correlation, and competing risks analysis are addressed, each with particular relevance to pediatric cancer survivorship research. Our goal in this article is to provide a resource and reference for other researchers working in the area of long-term cancer survivorship. PMID:19364957
David, Helena Maria Scherlowski Leal; Caufield, Catherine
2005-01-01
This exploratory study aimed to investigate factors related to the use of licit and illicit drugs and to workplace violence in a group of women from the popular classes in the city of Rio de Janeiro. A descriptive and analytic quantitative approach was used, as well as a qualitative approach through in-depth interviews with women who had suffered or were suffering workplace violence, analyzed with the collective subject discourse methodology. The results showed sociodemographic and work situations that can be considered possible risk factors for drug consumption and workplace violence. The qualitative analysis shows how this group perceives the phenomena of drug use and workplace violence, expanding comprehension of these issues and providing conceptual and methodological elements for further studies on this subject.
Barzyk, Timothy M.; Wilson, Sacoby; Wilson, Anthony
2015-01-01
Community, state, and federal approaches to conventional and cumulative risk assessment (CRA) were described and compared to assess their similarities and differences and to develop recommendations for a consistent CRA approach, acceptable at each level as a rigorous scientific methodology and including partnership formation and solution development as necessary practices. Community, state, and federal examples were described and then summarized based on their adherence to the CRA principles of: (1) planning, scoping, and problem formulation; (2) risk analysis and ranking; and (3) risk characterization, interpretation, and management. While each application shared the common goal of protecting human health and the environment, the applications adopted different approaches to achieve it. For a specific project-level analysis of a particular place or instance this may be acceptable, but to ensure long-term applicability and transferability to other projects, recommendations for developing a consistent approach to CRA are provided. This approach would draw from best practices, risk assessment and decision analysis sciences, and historical lessons learned to provide results in a manner understandable to and accepted by all entities. It is intended to provide a common ground around which to develop CRA methods and approaches that can be followed at all levels. PMID:25918910
Combining operational models and data into a dynamic vessel risk assessment tool for coastal regions
NASA Astrophysics Data System (ADS)
Fernandes, R.; Braunschweig, F.; Lourenço, F.; Neves, R.
2015-07-01
Technological evolution in computational capacity, data acquisition systems, numerical modelling, and operational oceanography is creating opportunities to design and build holistic approaches and complex tools for newer and more efficient management (planning, prevention, and response) of coastal water pollution risk events. A combined methodology has been developed to dynamically estimate time- and space-variable shoreline risk levels from ships, integrating numerical metocean forecasts and oil spill simulations with vessel tracking from automatic identification systems (AIS). The risk rating combines the likelihood of an oil spill from a vessel navigating in the study area (the Portuguese continental shelf) with the assessed consequences to the shoreline. The spill likelihood is based on dynamic marine weather conditions and statistical information from previous accidents. The shoreline consequences reflect the virtual spilled oil amount reaching the shoreline and the shoreline's environmental and socio-economic vulnerabilities. The oil reaching the shoreline is quantified with an oil spill fate and behaviour model running multiple virtual spills from vessels over time. Shoreline risks can be computed in real time or from previously obtained data. Results show that the estimated risk is properly sensitive to dynamic metocean conditions and to oil transport behaviour. Integrating metocean and oil spill models with coastal vulnerability and AIS data in the quantification of risk enhances maritime situational awareness and decision support, providing a more realistic approach to the assessment of shoreline impacts.
Risk assessment from historical data can help identify typical risk patterns and "hot spots" or support sensitivity analyses for specific conditions, whereas real-time risk levels can be used to prioritize individual ships and geographical areas, position tugs strategically, and implement dynamic risk-based vessel traffic monitoring.
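The core of the rating described above, spill likelihood multiplied by shoreline consequence, reduces to a few lines. This is a minimal sketch only; the class, field names, and numbers are hypothetical placeholders, not values or code from the tool:

```python
from dataclasses import dataclass

@dataclass
class Vessel:
    name: str
    spill_probability: float      # per-time-step likelihood from weather + accident statistics
    virtual_spill_tonnes: float   # oil reaching the shoreline per the fate/behaviour model

def shoreline_risk(vessel: Vessel, vulnerability: float) -> float:
    """Risk rating = likelihood x consequence, where the consequence
    scales the virtual spilled amount by a 0-1 shoreline vulnerability
    index (environmental plus socio-economic sensitivity)."""
    consequence = vessel.virtual_spill_tonnes * vulnerability
    return vessel.spill_probability * consequence

tanker = Vessel("tanker_A", spill_probability=1e-4, virtual_spill_tonnes=500.0)
print(shoreline_risk(tanker, vulnerability=0.8))
```

Recomputing such a rating each time step with fresh metocean forecasts and AIS positions is what gives the dynamic, vessel-by-vessel picture the abstract describes.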
Development of the Methodology Needed to Quantify Risks to Groundwater at CO2 Storage Sites
NASA Astrophysics Data System (ADS)
Brown, C. F.; Birkholzer, J. T.; Carroll, S.; Hakala, A.; Keating, E. H.; Lopano, C. L.; Newell, D. L.; Spycher, N.
2011-12-01
The National Risk Assessment Partnership (NRAP) is an effort that harnesses capabilities across five U.S. Department of Energy (DOE) national laboratories into a mission-focused platform to develop a defensible, science-based quantitative methodology for determining risk profiles at CO2 storage sites. NRAP is conducting risk and uncertainty analysis in the areas of reservoir performance, natural leakage pathways, wellbore integrity, groundwater protection, monitoring, and systems level modeling. The mission of NRAP is "to provide the scientific underpinning for risk assessment with respect to the long-term storage of CO2, including assessment of residual risk associated with a site post-closure." Additionally, NRAP will develop a strategic, risk-based monitoring protocol, such that monitoring at all stages of a project effectively minimizes uncertainty in the predicted behavior of the site, thereby increasing confidence in storage integrity. NRAP's research focus in the area of groundwater protection is divided into three main tasks: 1) development of quantitative risk profiles for potential groundwater impacts; 2) filling key science gaps in developing those risk profiles; and 3) field-based confirmation. Within these three tasks, researchers are engaged in collaborative studies to determine metrics to identify system perturbation and their associated risk factors. Reservoir simulations are being performed to understand/predict consequences of hypothetical leakage scenarios, from which reduced order models are being developed to feed risk profile development. Both laboratory-based experiments and reactive transport modeling studies provide estimates of geochemical impacts over a broad range of leakage scenarios. This presentation will provide an overview of the research objectives within NRAP's groundwater protection focus area, as well as select accomplishments achieved to date.
Aschberger, Karin; Micheletti, Christian; Sokull-Klüttgen, Birgit; Christensen, Frans M
2011-08-01
Production volumes and the use of engineered nanomaterials in many innovative products are continuously increasing; however, little is known about their potential risk to the environment and human health. We reviewed publicly available hazard and exposure data for both the environment and human health and attempted a basic risk assessment appraisal for four types of nanomaterials: fullerenes, carbon nanotubes, metals, and metal oxides (ENRHES project 2009(1)). This paper presents a summary of the results of the basic environmental and human health risk assessments of these case studies, highlighting the cross-cutting issues and conclusions about fate and behaviour, exposure, hazard, and methodological considerations. The risk assessment methodology underlying our case studies was that of a regulatory risk assessment under REACH (ECHA, 2008(2)), with modifications to adapt to the limited available data. Where possible, environmental no-effect concentrations and human no-effect levels were derived from relevant studies by applying assessment factors in line with the REACH guidance, and were compared with available exposure data to discuss possible risks. When the data did not allow a quantitative assessment, the risk was assessed qualitatively, e.g. for the environment by evaluating the information in the literature to describe the potential to enter the environment and to reach ecological targets. Results indicate that the main risk to the environment is expected from metals and metal oxides, especially for algae and Daphnia, due to exposure to both particles and ions. The main risks to human health may arise from chronic occupational inhalation exposure, especially during activities with high particle release and uncontrolled exposure. The information on consumer and environmental exposure of humans is too scarce to attempt a quantitative risk characterisation.
It is recognised that the currently available database for both hazard and exposure is limited, and that there are high uncertainties in any conclusion on a possible risk. The results should therefore not be used for any regulatory decision making. Likewise, it is recognised that the REACH guidance was developed without considering the specific behaviour and mode of action of nanomaterials, and further work is required both in the generation of data and in the development of methodologies. Copyright © 2011 Elsevier Ltd. All rights reserved.
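The quantitative branch of the REACH-style appraisal described above (derive a no-effect concentration by applying an assessment factor, then compare it with exposure) can be sketched as follows. The function names and all numeric values are illustrative assumptions, not data or code from the review:

```python
def pnec(no_effect_value: float, assessment_factor: float) -> float:
    """Predicted no-effect concentration: the lowest reliable no-effect
    value (e.g. an algae or Daphnia NOEC) divided by a REACH-style
    assessment factor covering extrapolation uncertainty."""
    return no_effect_value / assessment_factor

def risk_characterisation_ratio(exposure: float, pnec_value: float) -> float:
    """RCR = exposure / PNEC; a ratio above 1 flags a potential risk."""
    return exposure / pnec_value

# Hypothetical numbers: a 0.5 mg/L NOEC with an assessment factor of 50,
# compared against a predicted environmental concentration of 0.02 mg/L.
p = pnec(0.5, 50)
rcr = risk_characterisation_ratio(0.02, p)
print(p, rcr)  # an RCR above 1 would flag a potential risk
```

When exposure data are missing, as the abstract notes for consumer exposure, the ratio cannot be formed at all, which is why those cases fall back to a qualitative assessment.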
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-27
... number of studies have shown a correlation between higher job risk and higher wages, suggesting that... & Aldy (2003) conducted an analysis of studies that use a willingness-to-pay methodology to estimate the...). Although MSHA is using the Viscusi & Aldy (2003) study as the basis for monetizing the expected benefits of...