Application of Risk Assessment Tools in the Continuous Risk Management (CRM) Process
NASA Technical Reports Server (NTRS)
Ray, Paul S.
2002-01-01
Marshall Space Flight Center (MSFC) of the National Aeronautics and Space Administration (NASA) is currently implementing the Continuous Risk Management (CRM) Program developed by Carnegie Mellon University and recommended by NASA as the Risk Management (RM) implementation approach. The four most frequently used risk assessment tools at the center are: (a) Failure Modes and Effects Analysis (FMEA), (b) Hazard Analysis (HA), (c) Fault Tree Analysis (FTA), and (d) Probabilistic Risk Analysis (PRA). Some guidelines exist for selecting risk assessment tools during the formulation phase of a project, but there is little guidance on how to apply these tools within the CRM process. Yet the way safety and risk assessment tools are used makes a significant difference in the effectiveness of the risk management function. Decisions about which events to include in an analysis, and to what level of detail the analysis should be carried, strongly affect the effectiveness of a risk management program. The choice of analysis tool also depends on the phase of a project: at the initial phase, when little hardware data is available, a standard FMEA cannot be applied and a functional FMEA may be appropriate instead. This study attempted to provide directives to alleviate the difficulty of applying FTA, PRA, and FMEA in the CRM process. Hazard Analysis was excluded from the scope of the study due to the short duration of the summer research project.
Revealing the underlying drivers of disaster risk: a global analysis
NASA Astrophysics Data System (ADS)
Peduzzi, Pascal
2017-04-01
Disaster events are perfect examples of compound events. Disaster risk lies at the intersection of several independent components such as hazard, exposure and vulnerability. Understanding the weight of each component requires extensive standardisation. Here, I show how footprints of past disastrous events were generated using GIS modelling techniques and used for extracting population and economic exposures based on distribution models. Using past event losses, it was possible to identify and quantify a wide range of socio-politico-economic drivers associated with human vulnerability. The analysis was applied to about nine thousand individual past disastrous events covering earthquakes, floods and tropical cyclones. Using a multiple regression analysis on these individual events it was possible to quantify each risk component and assess how vulnerability is influenced by various hazard intensities. The results show that hazard intensity, exposure, poverty, governance as well as other underlying factors (e.g. remoteness) can explain the magnitude of past disasters. Analysis was also performed to highlight the role of future trends in population and climate change, and how these may impact exposure to tropical cyclones in the future. GIS models combined with statistical multiple regression analysis provided a powerful methodology to identify, quantify and model disaster risk, taking into account its various components. The same methodology can be applied to various types of risk at local to global scales. This method was applied and developed for the Global Risk Analysis of the Global Assessment Report on Disaster Risk Reduction (GAR). It was first applied to mortality risk in GAR 2009 and GAR 2011.
New models ranging from global asset exposure to global flood hazard were also recently developed to improve the resolution of the risk analysis, and were applied through the CAPRA software to provide probabilistic economic risk assessments such as Average Annual Losses (AAL) and Probable Maximum Losses (PML) in GAR 2013 and GAR 2015. In parallel, similar methodologies were developed to highlight the role of ecosystems for Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR). New developments may include slow-onset hazards (e.g. soil degradation and droughts) and natech hazards (by intersecting with georeferenced critical infrastructures). The various global hazard, exposure and risk models can be visualized and downloaded through the PREVIEW Global Risk Data Platform.
Risk-benefit analysis and public policy: a bibliography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, E.M.; Van Horn, A.J.
1976-11-01
Risk-benefit analysis has been implicitly practiced whenever decision-makers are confronted with decisions involving risks to life, health, or the environment. Various methodologies have been developed to evaluate relevant criteria and to aid in assessing the impacts of alternative projects. Among these has been cost-benefit analysis, which has been widely used for project evaluation. However, in many cases it has been difficult to assign dollar costs to those criteria involving risks and benefits which are not now assigned explicit monetary values in our economic system. Hence, risk-benefit analysis has evolved to become more than merely an extension of cost-benefit analysis, and many methods have been applied to examine the trade-offs between risks and benefits. In addition, new scientific and statistical techniques have been developed for assessing current and future risks. The 950 references included in this bibliography are meant to suggest the breadth of those methodologies which have been applied to decisions involving risk.
Risk analysis and its link with standards of the World Organisation for Animal Health.
Sugiura, K; Murray, N
2011-04-01
Among the agreements included in the treaty that created the World Trade Organization (WTO) in January 1995 is the Agreement on the Application of Sanitary and Phytosanitary Measures (SPS Agreement) that sets out the basic rules for food safety and animal and plant health standards. The SPS Agreement designates the World Organisation for Animal Health (OIE) as the organisation responsible for developing international standards for animal health and zoonoses. The SPS Agreement requires that the sanitary measures that WTO members apply should be based on science and encourages them to either apply measures based on the OIE standards or, if they choose to adopt a higher level of protection than that provided by these standards, apply measures based on a science-based risk assessment. The OIE also provides a procedural framework for risk analysis for its Member Countries to use. Despite the inevitable challenges that arise in carrying out a risk analysis of the international trade in animals and animal products, the OIE risk analysis framework provides a structured approach that facilitates the identification, assessment, management and communication of these risks.
2013-06-30
QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004...assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and...www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
A case study using the PrOACT-URL and BRAT frameworks for structured benefit risk assessment.
Nixon, Richard; Dierig, Christoph; Mt-Isa, Shahrul; Stöckert, Isabelle; Tong, Thaison; Kuhls, Silvia; Hodgson, Gemma; Pears, John; Waddingham, Ed; Hockley, Kimberley; Thomson, Andrew
2016-01-01
While benefit-risk assessment is a key component of the drug development and maintenance process, it is often described only in narrative form. In contrast, structured benefit-risk assessment builds on established ideas from decision analysis and comprises a qualitative framework and a quantitative methodology. We compare two such frameworks, applying multi-criteria decision analysis (MCDA) within the PrOACT-URL framework and weighted net clinical benefit (wNCB) within the BRAT framework. These are applied to a case study of natalizumab for the treatment of relapsing-remitting multiple sclerosis. We focus on the practical considerations of applying these methods and give recommendations for visual presentation of results. In the case study, we found structured benefit-risk analysis to be a useful tool for structuring, quantifying, and communicating the relative benefit and safety profiles of drugs in a transparent, rational and consistent way. The two frameworks were similar. MCDA is a generic and flexible methodology that can be used to perform a structured benefit-risk assessment in any common context. wNCB is a special case of MCDA and is shown to be equivalent to an extension of the number needed to treat (NNT) principle. It is simpler to apply and understand than MCDA and can be applied when all outcomes are measured on a binary scale. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Putting problem formulation at the forefront of GMO risk analysis.
Tepfer, Mark; Racovita, Monica; Craig, Wendy
2013-01-01
When applying risk assessment and the broader process of risk analysis to decisions regarding the dissemination of genetically modified organisms (GMOs), the process has a tendency to become remarkably complex. Further, as greater numbers of countries consider authorising the large-scale dissemination of GMOs, and as GMOs with more complex traits reach late stages of development, there has been increasing concern about the burden posed by the complexity of risk analysis. We present here an improved approach for GMO risk analysis that gives a central role to problem formulation. Further, the risk analysis strategy has been clarified and simplified in order to make rigorously scientific risk assessment and risk analysis more broadly accessible to diverse stakeholder groups.
Risk analysis of gravity dam instability using credibility theory Monte Carlo simulation model.
Xin, Cao; Chongshi, Gu
2016-01-01
Risk analysis of gravity dam stability involves complicated uncertainty in many design parameters and measured data. Stability failure risk ratio described jointly by probability and possibility has deficiency in characterization of influence of fuzzy factors and representation of the likelihood of risk occurrence in practical engineering. In this article, credibility theory is applied into stability failure risk analysis of gravity dam. Stability of gravity dam is viewed as a hybrid event considering both fuzziness and randomness of failure criterion, design parameters and measured data. Credibility distribution function is conducted as a novel way to represent uncertainty of influence factors of gravity dam stability. And combining with Monte Carlo simulation, corresponding calculation method and procedure are proposed. Based on a dam section, a detailed application of the modeling approach on risk calculation of both dam foundation and double sliding surfaces is provided. The results show that, the present method is feasible to be applied on analysis of stability failure risk for gravity dams. The risk assessment obtained can reflect influence of both sorts of uncertainty, and is suitable as an index value.
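The Monte Carlo component of such an approach can be illustrated with a purely random (non-fuzzy) sketch: sample the uncertain shear-strength parameters, compute a sliding safety factor, and estimate the failure risk ratio as the fraction of samples with a safety factor below one. All distributions and load values below are hypothetical, and the credibility (fuzzy) layer of the authors' hybrid model is omitted.

```python
import random

def sliding_safety_factor(friction_coeff, cohesion, base_area,
                          normal_force, shear_force):
    """Safety factor for sliding along the dam-foundation interface:
    resisting force (friction + cohesion) over the driving shear force."""
    resisting = friction_coeff * normal_force + cohesion * base_area
    return resisting / shear_force

def monte_carlo_failure_risk(n_samples=100_000, seed=42):
    """Estimate P(safety factor < 1) by sampling uncertain parameters."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        f = rng.gauss(1.1, 0.15)   # friction coefficient (hypothetical mean/sd)
        c = rng.gauss(0.9, 0.25)   # cohesion, MPa (hypothetical)
        sf = sliding_safety_factor(f, c, base_area=60.0,
                                   normal_force=500.0, shear_force=580.0)
        failures += sf < 1.0
    return failures / n_samples
```

In the full credibility-theory model, the sampled parameters would additionally carry fuzzy membership functions, and the output would be a credibility-weighted risk ratio rather than a plain frequency.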
Probabilistic Exposure Analysis for Chemical Risk Characterization
Bogen, Kenneth T.; Cullen, Alison C.; Frey, H. Christopher; Price, Paul S.
2009-01-01
This paper summarizes the state of the science of probabilistic exposure assessment (PEA) as applied to chemical risk characterization. Current probabilistic risk analysis methods applied to PEA are reviewed. PEA within the context of risk-based decision making is discussed, including probabilistic treatment of related uncertainty, interindividual heterogeneity, and other sources of variability. Key examples of recent experience gained in assessing human exposures to chemicals in the environment, and other applications to chemical risk characterization and assessment, are presented. It is concluded that, although improvements continue to be made, existing methods suffice for effective application of PEA to support quantitative analyses of the risk of chemically induced toxicity that play an increasing role in key decision-making objectives involving health protection, triage, civil justice, and criminal justice. Different types of information required to apply PEA to these different decision contexts are identified, and specific PEA methods are highlighted that are best suited to exposure assessment in these separate contexts. PMID:19223660
New ventures require accurate risk analyses and adjustments.
Eastaugh, S R
2000-01-01
For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.
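The portfolio arithmetic referred to above can be sketched in a few lines: the expected return of a mix of service lines is the weighted mean of their returns, while the portfolio standard deviation depends on the covariances, so combining imperfectly correlated units lowers overall risk. The figures below are hypothetical.

```python
import math

def portfolio_stats(expected_returns, weights, cov):
    """Expected return and standard deviation of a weighted portfolio of
    service lines (classic mean-variance arithmetic)."""
    n = len(weights)
    mu = sum(w * r for w, r in zip(weights, expected_returns))
    var = sum(weights[i] * weights[j] * cov[i][j]
              for i in range(n) for j in range(n))
    return mu, math.sqrt(var)

# Hypothetical example: two service units in a 60/40 mix
mu, sigma = portfolio_stats(
    expected_returns=[0.08, 0.12],
    weights=[0.6, 0.4],
    cov=[[0.0004, 0.0001],
         [0.0001, 0.0016]],
)
```

Because the covariance term is small, the portfolio standard deviation comes out below the weighted average of the individual standard deviations, which is the diversification effect the abstract alludes to.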
Chen, Stephanie C; Kim, Scott Y H
2016-01-01
Background/Aims: Standard of care pragmatic clinical trials (SCPCTs) that compare treatments already in use could improve care and reduce cost, but there is considerable debate about the research risks of SCPCTs and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. Methods: We developed a formal risk-benefit analysis framework for SCPCTs and then applied it to key provisions of the U.S. federal regulations. Results: Our formal framework for SCPCT risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a SCPCT, the allocation ratios of treatments inside and outside a SCPCT, and the significance of some participants receiving a different treatment inside a SCPCT than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to SCPCTs. Conclusions: Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of SCPCTs and can be used to clarify the implications for informed consent. PMID:27365010
Use-related risk analysis for medical devices based on improved FMEA.
Liu, Long; Shuai, Ma; Wang, Zhu; Li, Ping
2012-01-01
In order to effectively analyze and control the use-related risk of medical devices, quantitative methodologies must be applied. Failure Mode and Effects Analysis (FMEA) is a proactive technique for error detection and risk reduction. In this article, an improved FMEA based on fuzzy mathematics and grey relational theory is developed to better carry out use-related risk analysis for medical devices. As an example, the analysis process using this improved FMEA method is described for a particular medical device (a C-arm X-ray machine).
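A minimal sketch of the grey relational part of such an improved FMEA (the fuzzy-mathematics layer is omitted, and the failure modes and scores below are hypothetical): each failure mode's severity, occurrence, and detection scores are compared against a worst-case reference series, and modes are prioritized by their grey relational grade instead of the conventional severity × occurrence × detection RPN.

```python
def grey_relational_ranking(modes, rho=0.5, weights=(1/3, 1/3, 1/3)):
    """Rank failure modes by grey relational grade against a worst-case
    reference series. `modes` maps a failure-mode name to its
    (severity, occurrence, detection) scores on a 1-10 scale."""
    names = list(modes)
    # Reference series: the worst observed value of each factor
    ref = [max(m[k] for m in modes.values()) for k in range(3)]
    deltas = {n: [abs(modes[n][k] - ref[k]) for k in range(3)] for n in names}
    dmax = max(d for ds in deltas.values() for d in ds)
    dmin = min(d for ds in deltas.values() for d in ds)
    grades = {}
    for n in names:
        # Grey relational coefficient per factor, then weighted grade
        coeffs = [(dmin + rho * dmax) / (deltas[n][k] + rho * dmax)
                  for k in range(3)]
        grades[n] = sum(w * c for w, c in zip(weights, coeffs))
    return sorted(names, key=grades.get, reverse=True), grades

# Hypothetical failure modes for an X-ray machine interface
ranking, grades = grey_relational_ranking({
    "wrong dose entry": (9, 4, 6),
    "cable wear":       (6, 7, 3),
    "display glare":    (3, 5, 2),
})
```

A higher grade means the mode sits closer to the worst-case series across all three factors, so it gets higher priority; unlike the multiplicative RPN, ties and rank reversals from a single extreme score are damped by the distinguishing coefficient `rho`.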
2013-06-01
measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the...quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost...Kindle version]. Retrieved from Amazon.com 83 Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review
NASA Astrophysics Data System (ADS)
Krechowicz, Maria
2017-10-01
Nowadays, one of the characteristic features of the construction industry is the increased complexity of a growing number of projects. Almost every construction project is unique, with its project-specific purpose, its own structural complexity, owner's expectations, ground conditions unique to a particular location, and its own dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high. Project complexity drivers pose many vulnerabilities to the successful completion of such projects. This paper discusses the process of effective risk management in complex construction projects in which renewable energy sources were used, using the example of the realization phase of the ENERGIS teaching-laboratory building, from the point of view of DORBUD S.A., its general contractor. The paper suggests a new approach to risk management for complex construction projects in which renewable energy sources are applied. The risk management process was divided into six stages: gathering information; identification of the top critical project risks resulting from project complexity; construction of a fault tree for each top critical risk; logical analysis of the fault tree; quantitative risk assessment applying fuzzy logic; and development of a risk response strategy. A new methodology for qualitative and quantitative risk assessment of top critical risks in complex construction projects was developed. Risk assessment was carried out by applying fuzzy fault tree analysis to the example of one top critical risk. Applying fuzzy set theory to the proposed model reduced uncertainty and avoided the difficulty, common in expert risk assessment, of eliciting crisp values for basic-event probabilities, while still producing a risk score for each unwanted event.
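A minimal sketch of fuzzy fault tree arithmetic, assuming basic-event probabilities elicited as triangular fuzzy numbers (low, mode, high); the events and values below are hypothetical, not those of the ENERGIS project. Because the AND/OR gate formulas are monotone increasing in each input probability, the gates can be evaluated componentwise on the triangles.

```python
def fuzzy_and(*events):
    """AND gate: all inputs must occur, so probabilities multiply.
    Events are triangular fuzzy numbers (low, mode, high)."""
    out = [1.0, 1.0, 1.0]
    for e in events:
        out = [o * p for o, p in zip(out, e)]
    return tuple(out)

def fuzzy_or(*events):
    """OR gate: the output event occurs if any input occurs,
    i.e. 1 - product of the complements."""
    out = [1.0, 1.0, 1.0]
    for e in events:
        out = [o * (1.0 - p) for o, p in zip(out, e)]
    return tuple(1.0 - o for o in out)

# Hypothetical basic events for a delay-type top risk (expert-elicited ranges)
ground_survey_error = (0.01, 0.03, 0.08)
subcontractor_delay = (0.05, 0.10, 0.20)
design_change       = (0.02, 0.05, 0.10)

# Top event: (survey error AND design change) OR subcontractor delay
top = fuzzy_or(fuzzy_and(ground_survey_error, design_change),
               subcontractor_delay)
```

The resulting triangle for the top event can then be defuzzified (e.g. by its centroid) to yield the single risk score mentioned in the abstract.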
Evaluation of Cardiovascular Risk Scores Applied to NASA's Astronaut Corps
NASA Technical Reports Server (NTRS)
Jain, I.; Charvat, J. M.; VanBaalen, M.; Lee, L.; Wear, M. L.
2014-01-01
In an effort to improve cardiovascular disease (CVD) risk prediction, this analysis evaluates and compares the applicability of multiple CVD risk scores to the NASA Astronaut Corps, a population that is extremely healthy at selection.
Human Reliability Analysis in Support of Risk Assessment for Positive Train Control
DOT National Transportation Integrated Search
2003-06-01
This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...
76 FR 32933 - International Standard-Setting Activities
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-07
... or re-evaluation by JECFA. Proposed amendments to the Risk Analysis Principles for CCRVDF for comments and consideration at the next session. Proposed revision of Risk Analysis Principles Applied by... the Classification of Foods and Animal Feeds: Tree Nuts, Herbs and Spices. Draft Principle and...
Eckermann, Simon; Coory, Michael; Willan, Andrew R
2011-02-01
Economic analysis and assessment of net clinical benefit often require estimation of the absolute risk difference (ARD) for binary outcomes (e.g. survival, response, disease progression), given baseline epidemiological risk in a jurisdiction of interest and trial evidence of treatment effects. Typically, the assumption is made that relative treatment effects are constant across baseline risk, in which case relative risk (RR) or odds ratios (OR) can be applied to estimate ARD. The objective of this article is to establish whether such use of RR or OR allows consistent estimates of ARD. ARD is calculated under alternative framings of effects (e.g. mortality vs survival) applying standard methods for translating evidence with RR and OR. For RR, the RR is applied to baseline risk in the jurisdiction to estimate treatment risk; for OR, the baseline risk is converted to odds, the OR applied, and the resulting treatment odds converted back to risk. ARD is shown to be consistently estimated with OR, but changes with the framing of effects under RR wherever there is a treatment effect and epidemiological risk differs from trial risk. Additionally, in indirect comparisons, ARD is shown to be consistently estimated with OR, while calculation with RR allows alternative framings of effects to change the direction, let alone the extent, of the estimated ARD. OR ensures consistent calculation of ARD in translating evidence from trial settings and across trials in direct and indirect comparisons, avoiding the inconsistencies and associated biases that arise with RR under alternative outcome framings. These findings are critical for consistently translating evidence to inform economic analysis and assessment of net clinical benefit, as translation of evidence is proposed precisely where the advantages of OR over RR arise.
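The two translation methods described above can be sketched directly; the trial and baseline risks below are hypothetical. Reframing the outcome (mortality vs survival) leaves the OR-based ARD unchanged but shifts the RR-based ARD whenever the jurisdiction's baseline risk differs from the trial risk.

```python
def ard_via_rr(baseline_risk, trial_control_risk, trial_treat_risk):
    """Apply the trial relative risk to the jurisdiction's baseline risk."""
    rr = trial_treat_risk / trial_control_risk
    return baseline_risk - baseline_risk * rr

def ard_via_or(baseline_risk, trial_control_risk, trial_treat_risk):
    """Convert baseline risk to odds, apply the trial odds ratio,
    convert the treated odds back to a risk."""
    oratio = (trial_treat_risk / (1 - trial_treat_risk)) / \
             (trial_control_risk / (1 - trial_control_risk))
    odds = baseline_risk / (1 - baseline_risk) * oratio
    return baseline_risk - odds / (1 + odds)

# Trial: mortality 0.4 (control) vs 0.3 (treated); jurisdiction baseline 0.2.
# Frame the same effect as mortality reduction and as survival gain:
ard_rr_death = ard_via_rr(0.2, 0.4, 0.3)
ard_rr_surv  = -ard_via_rr(0.8, 0.6, 0.7)  # survival framing, sign restored
ard_or_death = ard_via_or(0.2, 0.4, 0.3)
ard_or_surv  = -ard_via_or(0.8, 0.6, 0.7)
```

Here the OR route gives the same ARD under both framings, while the RR route gives 0.05 under the mortality framing and about 0.13 under the survival framing, illustrating the inconsistency the article analyzes.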
Chen, Stephanie C; Kim, Scott Yh
2016-12-01
Standard of care pragmatic clinical trials that compare treatments already in use could improve care and reduce costs, but there is considerable debate about the research risks of standard of care pragmatic clinical trials and how to apply informed consent regulations to such trials. We sought to develop a framework integrating the insights from opposing sides of the debate. We developed a formal risk-benefit analysis framework for standard of care pragmatic clinical trials and then applied it to key provisions of the US federal regulations. Our formal framework for standard of care pragmatic clinical trial risk-benefit analysis takes into account three key considerations: the ex ante estimates of risks and benefits of the treatments to be compared in a standard of care pragmatic clinical trial, the allocation ratios of treatments inside and outside such a trial, and the significance of some participants receiving a different treatment inside a trial than outside the trial. The framework provides practical guidance on how the research ethics regulations on informed consent should be applied to standard of care pragmatic clinical trials. Our proposed formal model makes explicit the relationship between the concepts used by opposing sides of the debate about the research risks of standard of care pragmatic clinical trials and can be used to clarify the implications for informed consent. © The Author(s) 2016.
Applying Uncertainty Analysis to a Risk Assessment for the Pesticide Permethrin
We discuss the application of methods of uncertainty analysis from our previous poster to the problem of a risk assessment for exposure to the food-use pesticide permethrin resulting from residential pesticide crack and crevice application. Exposures are simulated by the SHEDS (S...
Estimation of value at risk and conditional value at risk using normal mixture distributions model
NASA Astrophysics Data System (ADS)
Kamaruzzaman, Zetty Ain; Isa, Zaidi
2013-04-01
The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010 using a two-component univariate normal mixture distributions model. First, we present the application of the model in empirical finance, where we fit it to real data. Second, we present its application in risk analysis, where we use it to evaluate VaR and CVaR, with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, as it can capture the stylized facts of non-normality and leptokurtosis in the returns distribution.
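A sketch of the two steps, using simulation in place of maximum-likelihood fitting and hypothetical regime parameters: draw returns from a two-component normal mixture, then read off VaR as a lower quantile of the return distribution and CVaR as the mean return in the tail below it.

```python
import random

def simulate_mixture(n, p, mu1, sd1, mu2, sd2, seed=7):
    """Draw returns from a two-component normal mixture: with probability p
    a 'calm' regime N(mu1, sd1), otherwise a 'turbulent' N(mu2, sd2)."""
    rng = random.Random(seed)
    return [rng.gauss(mu1, sd1) if rng.random() < p else rng.gauss(mu2, sd2)
            for _ in range(n)]

def var_cvar(returns, alpha=0.05):
    """VaR: the empirical alpha-quantile of returns (a loss threshold).
    CVaR: the expected return conditional on falling at or below the VaR."""
    xs = sorted(returns)
    k = max(1, int(alpha * len(xs)))
    var = xs[k - 1]
    tail = xs[:k]
    return var, sum(tail) / len(tail)

# Hypothetical regimes: 90% calm, 10% turbulent with fat left tail
returns = simulate_mixture(50_000, p=0.9,
                           mu1=0.01, sd1=0.03, mu2=-0.02, sd2=0.10)
var5, cvar5 = var_cvar(returns, alpha=0.05)
```

The mixture's turbulent component fattens the left tail, which is exactly why a single normal tends to understate both VaR and CVaR relative to the mixture fit.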
NASA Technical Reports Server (NTRS)
Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.
1995-01-01
Volume 5 is Appendix C, Auxiliary Shuttle Risk Analyses, and contains the following reports: Probabilistic Risk Assessment of Space Shuttle Phase 1 - Space Shuttle Catastrophic Failure Frequency Final Report; Risk Analysis Applied to the Space Shuttle Main Engine - Demonstration Project for the Main Combustion Chamber Risk Assessment; An Investigation of the Risk Implications of Space Shuttle Solid Rocket Booster Chamber Pressure Excursions; Safety of the Thermal Protection System of the Space Shuttle Orbiter - Quantitative Analysis and Organizational Factors; Space Shuttle Main Propulsion Pressurization System Probabilistic Risk Assessment, Final Report; and Space Shuttle Probabilistic Risk Assessment Proof-of-Concept Study - Auxiliary Power Unit and Hydraulic Power Unit Analysis Report.
Annalaura, Carducci; Giulia, Davini; Stefano, Ceccanti
2013-01-01
Risk analysis is widely used in the pharmaceutical industry to manage production processes, validation activities, training, and other activities. Several methods of risk analysis are available (for example, failure mode and effects analysis, fault tree analysis), and one or more should be chosen and adapted to the specific field where they will be applied. Among the methods available, hazard analysis and critical control points (HACCP) is a methodology that has been applied since the 1960s, and whose areas of application have expanded over time from food to the pharmaceutical industry. It can be easily and successfully applied to several processes because its main feature is the identification, assessment, and control of hazards. It can also be integrated with other tools, such as the fishbone diagram and flowcharting. The aim of this article is to show how HACCP can be used to manage an analytical process, to propose how to conduct the necessary steps, and to provide the data templates necessary to document the process in line with current good manufacturing practices. In the quality control process, risk analysis is a useful tool for enhancing the uniformity of technical choices and their documented rationale. Accordingly, it allows for more effective and economical laboratory management, is capable of increasing the reliability of analytical results, and enables auditors and authorities to better understand the choices that have been made. In particular, the article shows how HACCP can be used to manage bacterial endotoxins testing and other analytical processes in a formal, clear, and detailed manner.
Phung, Dung; Connell, Des; Rutherford, Shannon; Chu, Cordia
2017-06-01
A systematic review (SR) and meta-analysis alone cannot provide the endpoint answer for a chemical risk assessment (CRA). The objective of this study was to apply SR and meta-regression (MR) analysis to address this limitation, using a case study of cardiovascular risk from arsenic exposure in Vietnam. Published studies were searched in PubMed using the keywords of arsenic exposure and cardiovascular diseases (CVD). Random-effects meta-regression was applied to model the linear relationship between arsenic concentration in water and risk of CVD, and the no-observable-adverse-effect level (NOAEL) was then identified from the regression function. The probabilistic risk assessment (PRA) technique was applied to characterize the risk of CVD due to arsenic exposure by estimating the overlapping coefficient between the dose-response and exposure distribution curves. Risks were evaluated for groundwater, treated water, and drinking water. A total of 8 high-quality studies for dose-response and 12 studies for exposure data were included in the final analyses. The results of MR suggested a NOAEL of 50 μg/L and a guideline of 5 μg/L for arsenic in water, half the NOAEL and guideline values recommended by previous studies and authorities. The results of PRA indicated that the proportion of the observed exposure level with excess CVD risk was 52% for groundwater, 24% for treated water, and 10% for drinking water in Vietnam. The study found that systematic review combined with meta-regression can be considered an ideal approach to chemical risk assessment because of its ability to provide the answer to the endpoint question of a CRA. Copyright © 2017 Elsevier Ltd. All rights reserved.
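The overlap step of the PRA can be sketched as follows, with hypothetical lognormal parameters standing in for the fitted exposure and dose-response distributions: the overlapping coefficient is the integral of the pointwise minimum of the two densities, so identical distributions give 1 and disjoint ones give 0.

```python
import math

def lognorm_pdf(x, mu, sigma):
    """Density of a lognormal distribution (mu, sigma on the log scale)."""
    if x <= 0:
        return 0.0
    return math.exp(-(math.log(x) - mu) ** 2 / (2 * sigma ** 2)) / \
           (x * sigma * math.sqrt(2 * math.pi))

def overlap_coefficient(pdf_a, pdf_b, lo=1e-6, hi=500.0, n=20_000):
    """Overlapping coefficient: the integral of min(f, g) over the support,
    approximated with a midpoint rule on a uniform grid."""
    h = (hi - lo) / n
    return sum(min(pdf_a(lo + (i + 0.5) * h), pdf_b(lo + (i + 0.5) * h))
               for i in range(n)) * h

# Hypothetical parameters (ug/L on the log scale): an exposure distribution
# and a dose-response-derived risk distribution of arsenic concentrations
exposure = lambda x: lognorm_pdf(x, mu=3.0, sigma=1.0)
doseresp = lambda x: lognorm_pdf(x, mu=4.0, sigma=0.8)
ovl = overlap_coefficient(exposure, doseresp)
```

In the study's setting, a larger overlap between the exposure curve and the dose-response curve corresponds to a larger share of the population exposed at concentrations carrying excess CVD risk.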
Configurations of Common Childhood Psychosocial Risk Factors
ERIC Educational Resources Information Center
Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian
2009-01-01
Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…
Evaluating the risks of clinical research: direct comparative analysis.
Rid, Annette; Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S; Wendler, David
2014-09-01
Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed "risks of daily life" standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. This study employed conceptual and normative analysis together with an illustrative example. Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the "risks of daily life" standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Direct comparative analysis is a systematic method for applying the "risks of daily life" standard for minimal risk to research procedures that pose the same types of risk as daily life activities. It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks.
Evaluating the Risks of Clinical Research: Direct Comparative Analysis
Abdoler, Emily; Roberson-Nay, Roxann; Pine, Daniel S.; Wendler, David
2014-01-01
Objectives: Many guidelines and regulations allow children and adolescents to be enrolled in research without the prospect of clinical benefit when it poses minimal risk. However, few systematic methods exist to determine when research risks are minimal. This situation has led to significant variation in minimal risk judgments, raising concern that some children are not being adequately protected. To address this concern, we describe a new method for implementing the widely endorsed “risks of daily life” standard for minimal risk. This standard defines research risks as minimal when they do not exceed the risks posed by daily life activities or routine examinations. Methods: This study employed conceptual and normative analysis together with an illustrative example. Results: Different risks are composed of the same basic elements: type, likelihood, and magnitude of harm. Hence, one can compare the risks of research and the risks of daily life by comparing the respective basic elements with each other. We use this insight to develop a systematic method, direct comparative analysis, for implementing the “risks of daily life” standard for minimal risk. The method offers a way of evaluating research procedures that pose the same types of risk as daily life activities, such as the risk of experiencing anxiety, stress, or other psychological harm. We thus illustrate how direct comparative analysis can be applied in practice by using it to evaluate whether the anxiety induced by a respiratory CO2 challenge poses minimal or greater than minimal risks in children and adolescents. Conclusions: Direct comparative analysis is a systematic method for applying the “risks of daily life” standard for minimal risk to research procedures that pose the same types of risk as daily life activities.
It thereby offers a method to protect children and adolescents in research, while ensuring that important studies are not blocked because of unwarranted concerns about research risks. PMID:25210944
Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.
Haimes, Yacov Y
2018-01-01
The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of the I-I characterizing systems, manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decision-makers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.
Risk and Reliability of Infrastructure Asset Management Workshop
2006-08-01
of assets within the portfolio for use in Risk and Reliability analysis ... US Army Corps of Engineers assesses its Civil Works infrastructure and applies risk and reliability in the management of that infrastructure. The ... the Corps must complete assessments across its portfolio of major assets before risk management can be used in decision making. Effective risk
Development of Improved Caprock Integrity and Risk Assessment Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bruno, Michael
GeoMechanics Technologies has completed a geomechanical caprock integrity analysis and risk assessment study funded through the US Department of Energy. The project included: a detailed review of historical caprock integrity problems experienced in the natural gas storage industry; a theoretical description and documentation of caprock integrity issues; advanced coupled transport flow modelling and geomechanical simulation of three large-scale potential geologic sequestration sites to estimate geomechanical effects from CO₂ injection; development of a quantitative risk and decision analysis tool to assess caprock integrity risks; and, ultimately, the development of recommendations and guidelines for caprock characterization and CO₂ injection operating practices. Historical data from gas storage operations and CO₂ sequestration projects suggest that leakage and containment incident risks are on the order of 10⁻¹ to 10⁻², which is higher risk than some previous studies have suggested for CO₂. Geomechanical analysis, as described herein, can be applied to quantify risks and to provide operating guidelines to reduce risks. The risk assessment tool developed for this project has been applied to five areas: the Wilmington Graben offshore Southern California, Kevin Dome in Montana, the Louden Field in Illinois, the Sleipner CO₂ sequestration operation in the North Sea, and the In Salah CO₂ sequestration operation in North Africa. Of these five, the Wilmington Graben area represents the highest relative risk while the Kevin Dome area represents the lowest relative risk.
Background Information and User’s Guide for MIL-F-9490
1975-01-01
requirements, although different analysis results will apply to each requirement. Basic differences between the two reliability requirements are: MIL-F-8785B...provides the rationale for establishing such limits. The specific risk analysis comprises the same data which formed the average risk analysis, except...statistical analysis will be based on statistical data taken using limited exposure times of components and equipment. The exposure times and resulting
Crawford, E D; Batuello, J T; Snow, P; Gamito, E J; McLeod, D G; Partin, A W; Stone, N; Montie, J; Stock, R; Lynch, J; Brandt, J
2000-05-01
The current study assesses artificial intelligence methods to identify prostate carcinoma patients at low risk for lymph node spread. If patients can be assigned accurately to a low risk group, unnecessary lymph node dissections can be avoided, thereby reducing morbidity and costs. A rule-derivation technology for simple decision-tree analysis was trained and validated using patient data from a large database (4,133 patients) to derive low risk cutoff values for Gleason sum and prostate specific antigen (PSA) level. An empiric analysis was used to derive a low risk cutoff value for clinical TNM stage. These cutoff values then were applied to 2 additional, smaller databases (227 and 330 patients, respectively) from separate institutions. The decision-tree protocol derived cutoff values of ≤6 for Gleason sum and ≤10.6 ng/mL for PSA. The empiric analysis yielded a clinical TNM stage low risk cutoff value of ≤T2a. When these cutoff values were applied to the larger database, 44% of patients were classified as being at low risk for lymph node metastases (0.8% false-negative rate). When the same cutoff values were applied to the smaller databases, between 11 and 43% of patients were classified as low risk with a false-negative rate of between 0.0 and 0.7%. The results of the current study indicate that a population of prostate carcinoma patients at low risk for lymph node metastases can be identified accurately using a simple decision algorithm that considers preoperative PSA, Gleason sum, and clinical TNM stage. The risk of lymph node metastases in these patients is ≤1%; therefore, pelvic lymph node dissection may be avoided safely. The implications of these findings in surgical and nonsurgical treatment are significant.
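The reported decision rule is simple enough to state directly. A minimal sketch, assuming a conventional ordering of clinical T stages for the "≤T2a" comparison (the stage list and function name below are illustrative assumptions, not taken from the study):

```python
# Hypothetical sketch of the reported low-risk rule: a patient is classified
# as low risk for lymph node metastases when Gleason sum <= 6,
# PSA <= 10.6 ng/mL, and clinical TNM stage <= T2a.

# Assumed ordering of clinical T stages for the "<= T2a" comparison.
STAGE_ORDER = ["T1a", "T1b", "T1c", "T2a", "T2b", "T2c", "T3a", "T3b"]

def is_low_risk(gleason_sum: int, psa_ng_ml: float, stage: str) -> bool:
    """Apply the study's derived cutoff values to one patient record."""
    return (
        gleason_sum <= 6
        and psa_ng_ml <= 10.6
        and STAGE_ORDER.index(stage) <= STAGE_ORDER.index("T2a")
    )

print(is_low_risk(6, 8.2, "T1c"))   # meets all three cutoffs
print(is_low_risk(7, 8.2, "T1c"))   # Gleason sum exceeds the cutoff
```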
Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro
2007-01-01
To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, and the nuclear and aeronautics industries, etc., were collected. These documents were in the format of published books or papers, standards, technical guides and company procedures collected throughout industry. From the collected documents, 112 documents were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.
NASA Astrophysics Data System (ADS)
Dentoni, Marta; Deidda, Roberto; Paniconi, Claudio; Marrocu, Marino; Lecca, Giuditta
2014-05-01
Seawater intrusion (SWI) has become a major threat to coastal freshwater resources, particularly in the Mediterranean basin, where the problem is exacerbated by the lack of appropriate groundwater resources management and by serious potential impacts from projected climate change. A proper analysis and risk assessment that includes climate scenarios is essential for the design of water management measures to mitigate the environmental and socio-economic impacts of SWI. In this study a methodology for SWI risk analysis in coastal aquifers is developed and applied to the Gaza Strip coastal aquifer in Palestine. The method is based on the origin-pathway-target model, evaluating the final value of SWI risk by applying the overlay principle to the hazard map (representing the origin of SWI), the vulnerability map (representing the pathway of groundwater flow) and the elements map (representing the target of SWI). Results indicate the important role of groundwater simulation in SWI risk assessment and illustrate how mitigation measures can be developed according to predefined criteria to arrive at quantifiable expected benefits. Keywords: Climate change, coastal aquifer, seawater intrusion, risk analysis, simulation/optimization model. Acknowledgements. The study is partially funded by the project "Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB)", FP7-ENV-2009-1, GA 244151.
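The overlay step in the origin-pathway-target model can be illustrated with a toy computation. The sketch below assumes a multiplicative, cell-wise overlay of normalized maps; the combination rule and all values are illustrative assumptions, since the abstract states only that the overlay principle is applied to the three maps:

```python
# Toy illustration of the origin-pathway-target overlay: risk is taken here
# as the cell-wise product of normalized hazard, vulnerability, and element
# maps (a common choice; the paper's exact combination rule is not stated).
hazard        = [[0.2, 0.8], [0.5, 0.9]]   # origin: SWI hazard map
vulnerability = [[0.4, 0.7], [0.6, 0.3]]   # pathway: groundwater vulnerability map
elements      = [[1.0, 0.5], [0.2, 0.8]]   # target: elements at risk map

risk = [
    [h * v * e for h, v, e in zip(h_row, v_row, e_row)]
    for h_row, v_row, e_row in zip(hazard, vulnerability, elements)
]
print(risk)  # cell-wise SWI risk map
```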
INDICATORS OF RISK: AN ANALYSIS APPROACH FOR IMPROVED RIVER MANAGEMENT
A risk index is an approach to measuring the level of risk to the plants and/or animals (biota) in a certain area using water and habitat quality information. A new technique for developing risk indices was applied to data collected from Mid-Atlantic streams of the U.S. during 1...
Risk Driven Outcome-Based Command and Control (C2) Assessment
2000-01-01
shaping the risk ranking scores into more interpretable and statistically sound risk measures. Regression analysis was applied to determine what...Architecture Framework Implementation, AFCEA Coursebook 503J, February 8-11, 2000, San Diego, California. [Morgan and Henrion, 1990] M. Granger Morgan and
Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)
NASA Technical Reports Server (NTRS)
Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete
2017-01-01
The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as, hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend their PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.
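The probability combination at the heart of a fault-tree-based PRA can be sketched in a few lines. All basic-event probabilities and contributor names below are made-up placeholders, not values or structure from NASA's DPS model:

```python
# Illustrative fault-tree gates for independent basic events (not NASA's
# actual DPS model; contributors and probabilities are placeholders).

def or_gate(probs):
    """P(at least one event occurs), assuming independence."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all events occur), assuming independence."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical contributors to loss of position:
p_thrusters = or_gate([1e-3, 1e-3, 1e-3])  # any one thruster string fails
p_reference = and_gate([1e-2, 1e-2])       # both position references fail
p_operator  = 5e-4                         # operator error

p_loss_of_position = or_gate([p_thrusters, p_reference, p_operator])
print(f"P(loss of position) ~ {p_loss_of_position:.2e}")
```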
Beregovykh, V V; Spitskiy, O R
2014-01-01
A risk-based approach is used to examine the impact of different factors on the quality of medicinal products in technology transfer. A general diagram is offered for risk analysis execution in technology transfer from pharmaceutical development to production. When transferring technology to full-scale commercial production it is necessary to investigate and simulate the production process beforehand under the new, real conditions. The manufacturing process is the core factor for risk analysis, having the most impact on the quality attributes of a medicinal product. Further important factors are linked to the materials and products to be handled and to manufacturing environmental conditions such as premises, equipment and personnel. The use of the risk-based approach in the design of a multipurpose production facility for medicinal products is shown, where the quantitative risk analysis tool RAMM (Risk Analysis and Mitigation Matrix) was applied.
A risk analysis approach applied to field surveillance in utility meters in legal metrology
NASA Astrophysics Data System (ADS)
Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.
2018-03-01
Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in-service. Utility meters represent the majority of measuring instruments produced by notified bodies due to self-verification in Brazil. They play a major role in the economy, since electricity, gas and water are the main inputs to industries in their production processes. Thus, to optimize the resources allocated to control these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.
Dynamical Analysis of Stock Market Instability by Cross-correlation Matrix
NASA Astrophysics Data System (ADS)
Takaishi, Tetsuya
2016-08-01
We study stock market instability by using cross-correlations constructed from the return time series of 366 stocks traded on the Tokyo Stock Exchange from January 5, 1998 to December 30, 2013. To investigate the dynamical evolution of the cross-correlations, cross-correlation matrices are calculated with a rolling window of 400 days. To quantify the volatile market stages where the potential risk is high, we apply principal components analysis and measure the cumulative risk fraction (CRF), which is the system variance associated with the first few principal components. From the CRF, we detected three volatile market stages corresponding to the bankruptcy of Lehman Brothers, the 2011 Tohoku Region Pacific Coast Earthquake, and the FRB QE3 reduction observation in the study period. We further apply random matrix theory for the risk analysis and find that the first eigenvector is more equally de-localized when the market is volatile.
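The CRF computation described above can be sketched as follows, using random placeholder returns in place of the Tokyo Stock Exchange data (window length 400 matches the abstract; the number of stocks here is arbitrary):

```python
import numpy as np

# Sketch of the cumulative risk fraction (CRF): the share of total variance
# captured by the first k principal components (largest eigenvalues) of the
# cross-correlation matrix of stock returns. Returns are random placeholders.
rng = np.random.default_rng(0)
returns = rng.standard_normal((400, 20))       # 400-day window, 20 stocks
corr = np.corrcoef(returns, rowvar=False)      # 20 x 20 cross-correlation matrix

eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # eigenvalues, descending

def crf(eigvals, k):
    """Cumulative risk fraction from the first k principal components."""
    return eigvals[:k].sum() / eigvals.sum()

print(round(crf(eigvals, 3), 3))  # variance share of the first 3 components
```

A high CRF for the first few components signals a strongly co-moving, and hence volatile, market stage.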
Approach to proliferation risk assessment based on multiple objective analysis framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrianov, A.; Kuptsov, I.; Studgorodok 1, Obninsk, Kaluga region, 249030
2013-07-01
The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.
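One elementary building block of such a multi-criteria assessment is a weighted aggregation of normalized indicators per scenario. The sketch below is a simplification under assumed indicator names, weights, and values; the article's multi-objective framework is considerably richer:

```python
# Hedged sketch of a weighted-sum multi-criteria scoring step. Indicators,
# weights, and scenario values are illustrative assumptions only.
weights = {
    "material_attractiveness": 0.5,
    "technical_difficulty":    0.3,
    "detectability":           0.2,
}

def attractiveness(indicators):
    """Aggregate normalized indicator values (0-1) into a single score."""
    return sum(weights[name] * value for name, value in indicators.items())

scenario_heu = {"material_attractiveness": 0.9, "technical_difficulty": 0.4, "detectability": 0.6}
scenario_pu  = {"material_attractiveness": 0.7, "technical_difficulty": 0.6, "detectability": 0.3}

print(round(attractiveness(scenario_heu), 3))
print(round(attractiveness(scenario_pu), 3))
```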
2015-11-05
impact analyses) satisfactorily encompasses the fundamentals of environmental health risk and can be applied to all mobile and stationary equipment...regulations. This paper does not seek to justify the EPA MHB approach, but explains the fundamentals and describes how the MHB concept can be...
Risk assessment of vector-borne diseases for public health governance.
Sedda, L; Morley, D W; Braks, M A H; De Simone, L; Benz, D; Rogers, D J
2014-12-01
In the context of public health, risk governance (or risk analysis) is a framework for the assessment and subsequent management and/or control of the danger posed by an identified disease threat. Generic frameworks in which to carry out risk assessment have been developed by various agencies. These include monitoring, data collection, statistical analysis and dissemination. Due to the inherent complexity of disease systems, however, the generic approach must be modified for individual, disease-specific risk assessment frameworks. The analysis was based on a review of the current risk assessments of vector-borne diseases adopted by the main public health organisations (OIE, WHO, ECDC, FAO, CDC, etc.), drawing on literature, legislation and statistical assessment of the risk analysis frameworks. This review outlines the need for the development of a general public health risk assessment method for vector-borne diseases, in order to guarantee that sufficient information is gathered to apply robust models of risk assessment. Stochastic (especially spatial) methods, often in Bayesian frameworks, are now gaining prominence in standard risk assessment procedures because of their ability to assess model uncertainties accurately. Risk assessment needs to be addressed quantitatively wherever possible, and submitted with its quality assessment, in order to enable successful public health measures to be adopted. In terms of current practice, often a series of different models and analyses are applied to the same problem, with results and outcomes that are difficult to compare because of the unknown model and data uncertainties. Therefore, the risk assessment areas in need of further research are identified in this article. Copyright © 2014 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Adversarial risk analysis with incomplete information: a level-k approach.
Rothschild, Casey; McLay, Laura; Guikema, Seth
2012-07-01
This article proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple defend-attack model in which the defender's countermeasures are revealed with a probability less than one to the attacker before he decides on how or whether to attack. © 2011 Society for Risk Analysis.
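The level-k recursion can be illustrated with a toy defend-attack game. Payoff values and action names below are invented for illustration, and the article's partial revelation of the defender's countermeasures is omitted here:

```python
# Toy level-k sketch of a defend-attack game. loss[d][a] is the defender's
# loss when the defender plays d and the attacker plays a (invented numbers).
loss = {
    "harden_A": {"attack_A": 1, "attack_B": 8},
    "harden_B": {"attack_A": 6, "attack_B": 2},
}

def best_defense(attack_dist):
    """Level-(k+1) defender: minimize expected loss against a belief about
    the level-k attacker's (possibly non-strategic) attack distribution."""
    def expected_loss(d):
        return sum(p * loss[d][a] for a, p in attack_dist.items())
    return min(loss, key=expected_loss)

# Level-0 attacker: non-strategic, attacks uniformly at random.
level0_attacker = {"attack_A": 0.5, "attack_B": 0.5}

# Level-1 defender best-responds to the level-0 attacker.
print(best_defense(level0_attacker))
```

Iterating the same best-response step for the attacker against the level-1 defender yields the level-2 attacker, and so on up the level-k hierarchy.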
Hazardous Wastes: A Risk Benefit Framework Applied to Cadmium and Asbestos (1977)
This study develops a decision framework for evaluating hazardous waste standards in terms of social risks and product benefits. The analysis focuses on cadmium and asbestos as examples of land waste disposal problems.
Modeling Finite-Time Failure Probabilities in Risk Analysis Applications.
Dimitrova, Dimitrina S; Kaishev, Vladimir K; Zhao, Shouqi
2015-10-01
In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented. © 2015 Society for Risk Analysis.
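A Monte Carlo approximation of such a finite-time failure probability can be sketched as follows, assuming for illustration that the risk process is a Gaussian random walk with drift; the article instead derives explicit expressions under its own model assumptions:

```python
import random

# Monte Carlo sketch: estimate the probability that a risk process (here an
# assumed Gaussian random walk with drift) reaches a critical risk level
# within a finite time horizon. All parameter values are illustrative.
def failure_probability(level=10.0, horizon=50, drift=0.1, n_sims=20000, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_sims):
        x = 0.0
        for _ in range(horizon):
            x += drift + rng.gauss(0.0, 1.0)
            if x >= level:          # risk process crosses the critical level
                failures += 1
                break
    return failures / n_sims

print(failure_probability())
```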
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.
Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J
2016-12-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work the use of a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks associated with 20 US Food and Drug Administration (FDA) decisions associated with a set of candidate treatments submitted between 2003 and 2015 were analyzed. For benefit analysis, the median overall survival (OS) was used where available. When not available, OS was estimated based on overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on magnitude (or severity) of harm and likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate analysis of systematic uncertainty. FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mossahebi, S; Feigenberg, S; Nichols, E
Purpose: GammaPod™, the first stereotactic radiotherapy device for early stage breast cancer treatment, has been recently installed and commissioned at our institution. A multidisciplinary working group applied the failure mode and effects analysis (FMEA) approach to perform a risk analysis. Methods: FMEA was applied to the GammaPod™ treatment process by: 1) generating process maps for each stage of treatment; 2) identifying potential failure modes and outlining their causes and effects; 3) scoring the potential failure modes using the risk priority number (RPN) system based on the product of severity, frequency of occurrence, and detectability (each ranging 1-10). An RPN higher than 150 was set as the threshold for flagging high-risk failure modes. For these high-risk failure modes, potential quality assurance procedures and risk control techniques have been proposed. A new set of severity, occurrence, and detectability values was re-assessed in the presence of the suggested mitigation strategies. Results: In the single-day image-and-treat workflow, 19, 22, and 27 sub-processes were identified for the stages of simulation, treatment planning, and delivery, respectively. During the simulation stage, 38 potential failure modes were found and scored, in terms of RPN, in the range of 9-392. 34 potential failure modes were analyzed in treatment planning with a score range of 16-200. For the treatment delivery stage, 47 potential failure modes were found with an RPN score range of 16-392. The most critical failure modes consisted of breast-cup pressure loss and incorrect target localization due to patient upper-body alignment inaccuracies. The final RPN scores of these failure modes, based on the recommended actions, were assessed to be below 150. Conclusion: FMEA risk analysis technique was applied to the treatment process of GammaPod™, a new stereotactic radiotherapy technology.
Application of systematic risk analysis methods is projected to lead to improved quality of GammaPod™ treatments. Ying Niu and Cedric Yu are affiliated with Xcision Medical Systems.
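The RPN scoring used in this FMEA workflow reduces to a small computation. Failure-mode names below echo those reported in the abstract, but all severity, occurrence, and detectability scores are illustrative assumptions:

```python
# Sketch of FMEA risk priority number (RPN) scoring as described:
# RPN = severity x occurrence x detectability (each scored 1-10), with
# RPN > 150 flagging a failure mode for mitigation. Scores are invented.
THRESHOLD = 150

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    for score in (severity, occurrence, detectability):
        assert 1 <= score <= 10, "each factor is scored on a 1-10 scale"
    return severity * occurrence * detectability

failure_modes = {
    "breast-cup pressure loss":     rpn(8, 4, 7),
    "patient alignment inaccuracy": rpn(7, 4, 7),
    "wrong plan loaded":            rpn(9, 2, 3),
}

for mode, score in failure_modes.items():
    action = "mitigate" if score > THRESHOLD else "accept"
    print(f"{mode}: RPN={score} -> {action}")
```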
[FMEA applied to the radiotherapy patient care process].
Meyrieux, C; Garcia, R; Pourel, N; Mège, A; Bodez, V
2012-10-01
Failure modes and effects analysis (FMEA) is a risk analysis method used at the Radiotherapy Department of Institute Sainte-Catherine as part of a strategy seeking to continuously improve the quality and security of treatments. The method comprises several steps: definition of the main processes; for each of them, description of every step of prescription, treatment preparation and treatment application; identification of the possible risks, their consequences and their origins; research of existing safety elements which may avoid these risks; and grading of risks to assign a criticality score, resulting in a numerical organisation of the risks. Finally, the impact of proposed corrective actions was estimated by a new grading round. For each process studied, a detailed map of the risks was obtained, facilitating the identification of priority actions to be undertaken. For example, five steps in patient treatment planning had an unacceptable level of risk, 62 a moderate level of risk and 31 an acceptable level of risk. The FMEA method, used in the industrial domain and applied here to health care, is an effective tool for the management of risks in patient care. However, the time and training requirements necessary to implement this method should not be underestimated. Copyright © 2012 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Hubalek, Michael; Miksad, Rebecca; Sroczynski, Gaby; Paulden, Mike; Bundo, Marvin; Stenehjem, David; Brixner, Diana; Krahn, Murray; Siebert, Uwe
2017-10-16
Due to high survival rates and the relatively small benefit of adjuvant therapy, the application of personalized medicine (PM) through risk stratification is particularly beneficial in early breast cancer (BC) to avoid unnecessary harms from treatment. The new 21-gene assay (OncotypeDX, ODX) is a promising prognostic score for risk stratification that can be applied in conjunction with Adjuvant!Online (AO) to guide personalized chemotherapy decisions for early BC patients. Our goal was to evaluate risk-group specific cost effectiveness of adjuvant chemotherapy for women with early stage BC in Austria based on AO and ODX risk stratification. A previously validated discrete event simulation model was applied to a hypothetical cohort of 50-year-old women over a lifetime horizon. We simulated twelve risk groups derived from the joint application of ODX and AO and included respective additional costs. The primary outcomes of interest were life-years gained, quality-adjusted life-years (QALYs), costs and incremental cost-effectiveness (ICER). The robustness of results and decisions derived were tested in sensitivity analyses. A cross-country comparison of results was performed. Chemotherapy is dominated (i.e., less effective and more costly) for patients with 1) low ODX risk independent of AO classification; and 2) low AO risk and intermediate ODX risk. For patients with an intermediate or high AO risk and an intermediate or high ODX risk, the ICER is below 15,000 EUR/QALY (potentially cost effective depending on the willingness-to-pay). Applying the AO risk classification alone would miss risk groups where chemotherapy is dominated and thus should not be considered. These results are sensitive to changes in the probabilities of distant recurrence but not to changes in the costs of chemotherapy or the ODX test. 
Based on our modeling study, chemotherapy is effective and cost effective for Austrian patients with an intermediate or high AO risk and an intermediate or high ODX risk. In other words, low ODX risk suggests chemotherapy should not be considered but low AO risk may benefit from chemotherapy if ODX risk is high. Our analysis suggests that risk-group specific cost-effectiveness analysis, which includes companion prognostic tests are essential in PM.
Relative risk analysis of the use of radiation-emitting medical devices: A preliminary application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, E.D.
This report describes the development of a risk analysis approach for evaluating the use of radiation-emitting medical devices. This effort was performed by Lawrence Livermore National Laboratory for the US Nuclear Regulatory Commission (NRC). The assessment approach has been applied to understand the risks in using the Gamma Knife, a gamma irradiation therapy device. This effort represents an initial step to evaluate the potential role of risk analysis for developing regulations and quality assurance requirements in the use of nuclear medical devices. The risk approach identifies and assesses the most likely risk contributors and their relative importance for the medical system. The approach uses expert screening techniques and relative risk profiling to incorporate the type, quality, and quantity of data available and to present results in an easily understood form.
Multiattribute risk analysis in nuclear emergency management.
Hämäläinen, R P; Lindstedt, M R; Sinkko, K
2000-08-01
Radiation protection authorities have seen a potential for applying multiattribute risk analysis in nuclear emergency management and planning to deal with conflicting objectives, different parties involved, and uncertainties. This type of approach is expected to help in the following areas: to ensure that all relevant attributes are considered in decision making; to enhance communication between the concerned parties, including the public; and to provide a method for explicitly including risk analysis in the process. A multiattribute utility theory analysis was used to select a strategy for protecting the population after a simulated nuclear accident. The value-focused approach and the use of a neutral facilitator were identified as being useful.
Impact of model-based risk analysis for liver surgery planning.
Hansen, C; Zidowitz, S; Preim, B; Stavrou, G; Oldhafer, K J; Hahn, H K
2014-05-01
A model-based risk analysis for oncologic liver surgery was described in previous work (Preim et al. in Proceedings of international symposium on computer assisted radiology and surgery (CARS), Elsevier, Amsterdam, pp. 353–358, 2002; Hansen et al. Int J Comput Assist Radiol Surg 4(5):469–474, 2009). In this paper, we present an evaluation of this method. To assess whether and how the risk analysis facilitates the process of liver surgery planning, an explorative user study with 10 liver experts was conducted. The purpose was to compare and analyze their decision-making. The results of the study show that model-based risk analysis enhances the awareness of surgical risk in the planning stage. Participants preferred smaller resection volumes and agreed more on the safety margins’ width when the risk analysis was available. In addition, time to complete the planning task and confidence of participants were not increased when using the risk analysis. This work shows that the applied model-based risk analysis may influence important planning decisions in liver surgery. It lays a basis for further clinical evaluations and points out important fields for future research.
NASA Astrophysics Data System (ADS)
Jing, Wenjun; Zhao, Yan
2018-02-01
Stability is an important part of geotechnical engineering research. The operating experiences of underground storage caverns in salt rock around the world show that the stability of the caverns is the key problem for safe operation. Currently, the combination of theoretical analysis and numerical simulation is the main method adopted for cavern stability analysis. This paper introduces the concept of risk into the stability analysis of underground geotechnical structures, and studies the instability of underground storage caverns in salt rock from the perspective of risk analysis. Firstly, the definition and classification of cavern instability risk are proposed, and the damage mechanism is analyzed from a mechanical perspective. Then the main evaluating indicators of cavern instability risk are proposed, and an evaluation method for cavern instability risk is put forward. Finally, the established cavern instability risk assessment system is applied to the analysis and prediction of cavern instability risk after 30 years of operation in a proposed storage cavern group in the Huai’an salt mine. This research can provide a useful theoretical base for the safe operation and management of underground storage caverns in salt rock.
The effectiveness of risk management program on pediatric nurses' medication error.
Dehghan-Nayeri, Nahid; Bayat, Fariba; Salehi, Tahmineh; Faghihzadeh, Soghrat
2013-09-01
Medication therapy is one of the most complex and high-risk clinical processes that nurses deal with. Medication error is the most common type of error that brings damage and death to patients, especially pediatric ones. However, these errors are preventable. Identifying and preventing undesirable events leading to medication errors are the main risk management activities. The aim of this study was to investigate the effectiveness of a risk management program on the pediatric nurses' medication error rate. This study is a quasi-experimental one with a comparison group. In this study, 200 nurses were recruited from two main pediatric hospitals in Tehran. In the experimental hospital, we applied the risk management program for a period of 6 months. Nurses of the control hospital followed the hospital's routine schedule. A pre- and post-test was performed to measure the frequency of medication error events. SPSS software, t-tests, and regression analysis were used for data analysis. After the intervention, the medication error rate of nurses at the experimental hospital was significantly lower (P < 0.001) and the error-reporting rate was higher (P < 0.007) compared to before the intervention and also in comparison to the nurses of the control hospital. Based on the results of this study, and taking into account the high-risk nature of the medical environment, applying quality-control programs such as risk management can effectively prevent the occurrence of undesirable hospital events. Nursing managers can reduce the medication error rate by applying risk management programs. However, this program cannot succeed without nurses' cooperation.
Fuzzy Comprehensive Evaluation Method Applied in the Real Estate Investment Risks Research
NASA Astrophysics Data System (ADS)
Zhang, Minli; Yang, Wenpo
Real estate investment is a high-risk, high-return economic activity; the key to real estate analysis is identifying the types of investment risk and effectively preventing each of them. As the financial crisis sweeps the world, the real estate industry also faces enormous risks, and how to evaluate real estate investment risks effectively and correctly has become a concern of many scholars [1]. In this paper, real estate investment risks are summarized and analyzed, comparative analysis methods are discussed, and a fuzzy comprehensive evaluation method is finally presented. The method is not only scientifically sound in theory but also reliable in application; it provides an effective means of real estate investment risk assessment and offers investors guidance on risk factors and forecasts.
Risk Analysis of Earth-Rock Dam Failures Based on Fuzzy Event Tree Method
Fu, Xiao; Gu, Chong-Shi; Su, Huai-Zhi; Qin, Xiang-Nan
2018-01-01
Earth-rock dams make up a large proportion of the dams in China, and their failures can induce great risks. In this paper, the risks associated with earth-rock dam failure are analyzed from two aspects: the probability of dam failure and the resulting loss of life. An event tree analysis method based on fuzzy set theory is proposed to calculate the dam failure probability. The loss of life associated with dam failure is summarized from previous studies and refined to be suitable for Chinese dams. The proposed method and model are applied to a reservoir dam in Jiangxi province. Both engineering and non-engineering measures are proposed to reduce the risk. Risk analysis of dam failure is essential for reducing the probability of dam failure and improving the level of dam risk management. PMID:29710824
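The event-tree arithmetic behind such an analysis can be sketched as follows. The tree structure, branch probabilities, and triangular fuzzy numbers below are invented for illustration (the paper's actual model and data are not reproduced), and fuzzy products and sums use the common vertex-wise approximation for triangular numbers.

```python
# Event-tree sketch with triangular fuzzy probabilities (lo, mid, hi).
# All branch values are hypothetical, not taken from the paper.

def t_mul(a, b):
    # vertex-wise approximation of triangular fuzzy multiplication
    return tuple(x * y for x, y in zip(a, b))

def t_add(a, b):
    return tuple(x + y for x, y in zip(a, b))

# Hypothetical failure paths: initiating event -> breach mechanism
p_flood   = (0.001, 0.002, 0.004)   # annual probability of extreme flood
p_overtop = (0.05, 0.10, 0.20)      # overtopping given flood
p_piping  = (0.01, 0.02, 0.05)      # internal erosion given flood

path1 = t_mul(p_flood, p_overtop)
path2 = t_mul(p_flood, p_piping)
p_failure = t_add(path1, path2)     # fuzzy annual failure probability
print(p_failure)
```

The failure probability is the sum over failure paths of the product of branch probabilities along each path; the fuzzy extension simply carries an uncertainty band through that arithmetic.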
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 as scenarios that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 as having the lowest priority for further analysis.
Understanding growers' decisions to manage invasive pathogens at the farm level.
Breukers, Annemarie; van Asseldonk, Marcel; Bremmer, Johan; Beekman, Volkert
2012-06-01
Globalization causes plant production systems to be increasingly threatened by invasive pests and pathogens. Much research is devoted to supporting the management of these risks. Yet the role of growers' perceptions and behavior in risk management has remained insufficiently analyzed. This article aims to fill this gap by addressing risk management of invasive pathogens from a sociopsychological perspective. An analytical framework based on the Theory of Planned Behavior was used to explain growers' decisions on voluntary risk management measures. Survey information from 303 Dutch horticultural growers was statistically analyzed, including regression and cluster analysis. It appeared that growers were generally willing to apply risk management measures, and that poor risk management was mainly due to perceived barriers, such as high costs and doubts regarding the efficacy of management measures. The management measures applied varied considerably among growers, depending on production sector and farm-specific circumstances. Growers' risk perception was found to play a role in their risk management, although the causal relation remained unclear. These results underscore the need to apply a holistic perspective to farm-level management of invasive pathogen risk, considering the entire package of management measures and accounting for sector- and farm-specific circumstances. Moreover, they demonstrate that invasive pathogen risk management can benefit from a multidisciplinary approach that incorporates growers' perceptions and behavior.
NASA Astrophysics Data System (ADS)
Davis, Adam Christopher
This research develops a new framework for evaluating the occupational risks of exposure to hazardous substances in any setting where As Low As Reasonably Achievable (ALARA) practices are mandated or used. The evaluation is performed by developing a hypothesis-test-based procedure for evaluating the homogeneity of various epidemiological cohorts, and thus the appropriateness of applying aggregate data-pooling techniques to those cohorts. A statistical methodology is then developed as an alternative to aggregate pooling for situations in which individual cohorts show heterogeneity among them and are thus unsuitable for pooled analysis. These methods are then applied to estimate the all-cancer mortality risks incurred by workers at four Department of Energy nuclear weapons laboratories. Both linear no-threshold and dose-bin-averaged risks are calculated, and it is further shown that aggregate analysis tends to overestimate the risks with respect to those calculated by the methods developed in this work. The risk estimates developed in Chapter 2 are, in Chapter 3, applied to assess the risks to workers engaged in americium recovery operations at Los Alamos National Laboratory. The work described in Chapter 3 develops a full radiological protection assessment for the new americium recovery project, including development of exposure cases, creation and modification of MCNP5 models, development of a time-and-motion study, and the final synthesis of all data. This work also develops a new risk-based method of determining whether administrative controls, such as staffing increases, are ALARA-optimized. The EPA's estimate of the value of statistical life is applied to these risk estimates to determine a monetary value for risk.
The rate of change of this "risk value" (marginal risk) is then compared with the rate of change of workers' compensations as additional workers are added to the project to reduce the dose (and therefore, presumably, risk) to each individual.
Polygenic risk score analysis of pathologically confirmed Alzheimer disease.
Escott-Price, Valentina; Myers, Amanda J; Huentelman, Matt; Hardy, John
2017-08-01
Previous estimates of the utility of polygenic risk score analysis for the prediction of Alzheimer disease have given area under the curve (AUC) estimates of <80%. However, these have been based on the genetic analysis of clinical case-control series. Here, we apply the same analytic approaches to a pathological case-control series and show a predictive AUC of 84%. We suggest that this analysis has clinical utility and that there is limited room for further improvement using genetic data. Ann Neurol 2017;82:311-314. © 2017 American Neurological Association.
Risk and value analysis of SETI
NASA Technical Reports Server (NTRS)
Billingham, J.
1990-01-01
This paper attempts to apply a traditional risk and value analysis to the Search for Extraterrestrial Intelligence--SETI. In view of the difficulties of assessing the probability of success, a comparison is made between SETI and a previous search for extraterrestrial life, the biological component of Project Viking. Our application of simple Utility Theory, given some reasonable assumptions, suggests that SETI is at least as worthwhile as the biological experiment on Viking.
GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling
2015-10-18
GEO Collisional Risk Assessment Based on Analysis of NASA-WISE Data and Modeling. Jeremy Murray Krezan1, Samantha Howard1, Phan D. Dao1, Derek...Surka2. 1AFRL Space Vehicles Directorate, 2Applied Technology Associates Incorporated. From December 2009 through 2011 the NASA Wide-Field Infrared...of known debris. The NASA-WISE GEO belt debris population adds potentially thousands of previously uncataloged objects. This paper describes
Serafini, A; Troiano, G; Franceschini, E; Calzoni, P; Nante, N; Scapellato, C
2016-01-01
Risk management is a set of actions to recognize or identify risks and errors and their consequences, and to take steps to counter them. The aim of our study was to apply FMECA (Failure Mode, Effects and Criticality Analysis) to the Activated Protein C resistance (APCR) test in order to detect and avoid mistakes in this process. We created a team, and the process was divided into phases and sub-phases. For each phase we calculated the probability of occurrence (O) of an error, the detectability score (D), and the severity (S). The product of these three indexes yields the RPN (Risk Priority Number). Phases with a higher RPN need corrective actions with a higher priority. The calculation of RPN showed that more than 20 activities have a score higher than 150 and need important preventive actions; 8 have a score between 100 and 150. Only 23 actions obtained an acceptable score, lower than 100. This was one of the first experiences of applying FMECA analysis to a laboratory process, and the first to apply this technique to the identification of factor V Leiden; our results confirm that FMECA can be a simple, powerful and useful tool in risk management that helps to quickly identify criticalities in a laboratory process.
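The RPN arithmetic described above is easy to sketch. The phase names and O/D/S scores below are invented examples, not the study's actual data; the action thresholds follow the cut-offs quoted in the abstract (>150, 100-150, <100).

```python
# Minimal FMECA sketch: RPN = O * D * S. Phase names and scores are
# hypothetical; thresholds mirror the abstract's cut-offs (boundary
# handling at exactly 100/150 is a convention, not specified there).

def rpn(occurrence, detectability, severity):
    return occurrence * detectability * severity

def priority(score):
    if score > 150:
        return "priority corrective action"
    if score >= 100:
        return "corrective action"
    return "acceptable"

phases = {
    "sample labelling": (6, 5, 7),
    "reagent preparation": (3, 4, 8),
    "result transcription": (4, 5, 4),
}
for name, (o, d, s) in phases.items():
    score = rpn(o, d, s)
    print(name, score, priority(score))
```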
Chang, Yen-Hou; Li, Wai-Hou; Chang, Yi; Peng, Chia-Wen; Cheng, Ching-Hsuan; Chang, Wei-Pin; Chuang, Chi-Mu
2016-03-17
In the analysis of survival data for cancer patients, the problem of competing risks is often ignored. Competing risks have been recognized as a special case of time-to-event analysis. The conventional techniques for time-to-event analysis applied in the presence of competing risks often give biased or uninterpretable results. Using a prospectively collected administrative health care database in a single institution, we identified patients diagnosed with stage III or IV primary epithelial ovarian, tubal, and peritoneal cancers with minimal residual disease after primary cytoreductive surgery between 1995 and 2012. Here, we sought to evaluate whether intraperitoneal chemotherapy outperforms intravenous chemotherapy in the presence of competing risks. Unadjusted and multivariable subdistribution hazards models were applied to this database with two types of competing risks (cancer-specific mortality and other-cause mortality) coded to measure the relative effects of intraperitoneal chemotherapy. A total of 1263 patients were recruited as the initial cohort. After propensity score matching, 381 patients in each arm entered into final competing risk analysis. Cumulative incidence estimates for cancer-specific mortality were statistically significantly lower (p = 0.017, Gray test) in patients receiving intraperitoneal chemotherapy (5-year estimates, 34.5%; 95% confidence interval [CI], 29.5-39.6%, and 10-year estimates, 60.7%; 95% CI, 52.2-68.0%) versus intravenous chemotherapy (5-year estimates, 41.3%; 95% CI, 36.2-46.3%, and 10-year estimates, 67.5%, 95% CI, 61.6-72.7%). In subdistribution hazards analysis, for cancer-specific mortality, intraperitoneal chemotherapy outperforms intravenous chemotherapy (Subdistribution hazard ratio, 0.82; 95% CI, 0.70-0.96) after correcting other covariates. 
In conclusion, results from this comparative effectiveness study provide supportive evidence for previously published randomized trials that intraperitoneal chemotherapy outperforms intravenous chemotherapy even after eliminating the confounding of competing risks. We suggest that competing risk analysis should be strongly considered in investigations of cancer patients with medium- to long-term follow-up periods.
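The cumulative incidence estimates quoted above come from standard competing-risks methodology. A minimal sketch of the nonparametric (Aalen-Johansen-type) cumulative incidence function, with invented toy data, is:

```python
# Nonparametric cumulative incidence under competing risks.
# Event codes: 0 = censored, 1 = cause of interest (e.g. cancer-specific
# death), 2 = competing cause. Toy data only; a sketch, not the paper's
# subdistribution hazards model.

def cumulative_incidence(times, events, cause):
    data = sorted(zip(times, events))
    n = len(data)
    surv = 1.0        # overall Kaplan-Meier survival just before t
    cif = 0.0         # cumulative incidence for the cause of interest
    at_risk = n
    out = []
    i = 0
    while i < n:
        t = data[i][0]
        d_cause = sum(1 for tt, ee in data if tt == t and ee == cause)
        d_any   = sum(1 for tt, ee in data if tt == t and ee != 0)
        removed = sum(1 for tt, ee in data if tt == t)
        if d_any:
            cif += surv * d_cause / at_risk   # hazard x overall survival
            surv *= 1 - d_any / at_risk
            out.append((t, cif))
        at_risk -= removed
        i += removed
    return out

times  = [2, 3, 3, 5, 8, 9, 12]
events = [1, 2, 0, 1, 1, 0, 2]
print(cumulative_incidence(times, events, cause=1))
```

Unlike one minus a cause-specific Kaplan-Meier, this estimator properly treats competing events as removing subjects from risk rather than censoring them.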
MacGillivray, Brian H
2017-08-01
In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. 
directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.
Millen, Barbara E; Quatromoni, Paula A; Pencina, Michael; Kimokoti, Ruth; Nam, Byung-H O; Cobain, Sonia; Kozak, Waldemar; Appugliese, Danielle P; Ordovas, Jose; D'Agostino, Ralph B
2005-11-01
To identify the dietary patterns of adult men and examine their relationships with nutrient intake and chronic disease risk over long-term follow-up. Baseline 145-item food frequency questionnaires from 1,666 Framingham Offspring-Spouse cohort men were used to identify comprehensive dietary patterns. Independent 3-day dietary records at baseline and 8 years later provided estimates of subjects' nutrient intake by dietary pattern. Chronic disease risk factor status was compared at baseline and 16-year follow-up across all male dietary patterns. Cluster analysis was applied to food frequency data to identify non-overlapping male dietary patterns. Analysis of covariance and logistic regression were used to compare nutrient intake, summary nutritional risk scores, and chronic disease risk status at baseline and follow-up by male dietary pattern. Five distinct and comprehensive dietary patterns of Framingham Offspring-Spouse men were identified and ordered according to overall nutritional risk: Transition to Heart Healthy, Higher Starch, Average Male, Lower Variety, and Empty Calories. Nutritional risk was high and varied by dietary pattern; key nutrient contrasts were stable over 8-year follow-up. Chronic disease risk also varied by dietary pattern and specific subgroup differences persisted over 16 years, notably rates of overweight/obesity and smoking. Quantitative cluster analysis applied to food frequency questionnaire data identified five distinct, comprehensive, and stable dietary patterns of adult Framingham Offspring-Spouse cohort men. The close associations between the dietary patterns, nutritional risk, and chronic disease profiles of men emphasize the importance of targeted preventive nutrition interventions to promote health in the male population.
Risk Factors of Falls in Community-Dwelling Older Adults: Logistic Regression Tree Analysis
ERIC Educational Resources Information Center
Yamashita, Takashi; Noe, Douglas A.; Bailer, A. John
2012-01-01
Purpose of the Study: A novel logistic regression tree-based method was applied to identify fall risk factors and possible interaction effects of those risk factors. Design and Methods: A nationally representative sample of American older adults aged 65 years and older (N = 9,592) in the Health and Retirement Study 2004 and 2006 modules was used.…
[Risk analysis of brucellosis in the state of Tlaxcala].
García-Juárez, Guillermina; Ramírez-Bribiesca, J Efrén; Hernández-Vázquez, Maricela; Hernández-Calva, Luz Marina; Díaz-Aparicio, Efrén; Orozco-Bolaños, Hermila
2014-01-01
To identify the risk of brucellosis in the state of Tlaxcala, Mexico. A social diagnosis was conducted in the municipalities of Huamantla, Ixtenco and Teacalco, located in the eastern region of the state. The seroprevalence of brucellosis in goats and humans was determined. 46.9% of producers knew of the vaccination programs against brucellosis; 19.7% applied the vaccine and 80.3% did not. Huamantla had the highest seroprevalence of animal brucellosis, at 66.8%; San Jose Teacalco distributes unpasteurized cheeses up to a distance of 270 km, increasing the risk of infection with brucellosis. Ixtenco recorded the highest prevalence of brucellosis in humans, at 1.51%. The municipalities studied present risks of infection and spread of brucellosis.
Willard, Nancy; Chutuape, Kate; Stines, Stephanie; Ellen, Jonathan M
2012-01-01
HIV prevention efforts have expanded beyond individual-level interventions to address structural determinants of risk. Coalitions have been an important vehicle for addressing similar intractable and deeply rooted health-related issues. A root cause analysis process may aid coalitions in identifying fundamental, structural-level contributors to risk and appropriate solutions. For this article, strategic plans for 13 coalitions were analyzed before and after a root cause analysis approach was applied, to determine the plans' potential impact and comprehensiveness. After root cause analysis, strategic plans trended toward targeting policies and practices rather than single-agency programmatic changes. Plans expanded to target multiple sectors and several changes within sectors, to penetrate deeply into a sector or system. Findings suggest that root cause analysis may be a viable tool to assist coalitions in identifying structural determinants and possible solutions for HIV risk.
Suzanne M. Anderson; Wayne G. Landis
2012-01-01
An issue in forestry management has been the integration of a variety of different information into a threat analysis or risk assessment. In this instance, regional scale risk assessment was applied to the Upper Grande Ronde watershed in eastern Oregon to examine the potential of risk assessment for use in the management of broad landscapes. The site was a focus of...
Simulation Assisted Risk Assessment Applied to Launch Vehicle Conceptual Design
NASA Technical Reports Server (NTRS)
Mathias, Donovan L.; Go, Susie; Gee, Ken; Lawrence, Scott
2008-01-01
A simulation-based risk assessment approach is presented and is applied to the analysis of abort during the ascent phase of a space exploration mission. The approach utilizes groupings of launch vehicle failures, referred to as failure bins, which are mapped to corresponding failure environments. Physical models are used to characterize the failure environments in terms of the risk due to blast overpressure, resulting debris field, and the thermal radiation due to a fireball. The resulting risk to the crew is dynamically modeled by combining the likelihood of each failure, the severity of the failure environments as a function of initiator and time of the failure, the robustness of the crew module, and the warning time available due to early detection. The approach is shown to support the launch vehicle design process by characterizing the risk drivers and identifying regions where failure detection would significantly reduce the risk to the crew.
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-01-01
Background: Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. Methods: In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Results: Simulation studies showed that repeated 10-fold cross-validation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Conclusion: Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided. PMID:19036144
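The net-benefit quantity that a decision curve plots can be sketched directly. This assumes the standard formula, net benefit = TP/n - (FP/n) x pt/(1 - pt) at threshold probability pt, with invented predictions and outcomes (not the paper's software or data):

```python
# Net benefit of a prediction model at a threshold probability pt.
# Toy predicted probabilities and binary outcomes; a decision curve is
# this quantity plotted over a grid of pt values.

def net_benefit(probs, outcomes, pt):
    n = len(probs)
    tp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 1)
    fp = sum(1 for p, y in zip(probs, outcomes) if p >= pt and y == 0)
    return tp / n - fp / n * pt / (1 - pt)

probs    = [0.9, 0.8, 0.6, 0.4, 0.3, 0.2, 0.1, 0.05]
outcomes = [1,   1,   0,   1,   0,   0,   0,   0]
for pt in (0.1, 0.3, 0.5):
    print(pt, round(net_benefit(probs, outcomes, pt), 3))
```

The pt/(1 - pt) weight is what encodes preferences: a patient who would accept treatment at a low threshold pt penalizes false positives less.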
Vickers, Andrew J; Cronin, Angel M; Elkin, Elena B; Gonen, Mithat
2008-11-26
Decision curve analysis is a novel method for evaluating diagnostic tests, prediction models and molecular markers. It combines the mathematical simplicity of accuracy measures, such as sensitivity and specificity, with the clinical applicability of decision analytic approaches. Most critically, decision curve analysis can be applied directly to a data set, and does not require the sort of external data on costs, benefits and preferences typically required by traditional decision analytic techniques. In this paper we present several extensions to decision curve analysis including correction for overfit, confidence intervals, application to censored data (including competing risk) and calculation of decision curves directly from predicted probabilities. All of these extensions are based on straightforward methods that have previously been described in the literature for application to analogous statistical techniques. Simulation studies showed that repeated 10-fold crossvalidation provided the best method for correcting a decision curve for overfit. The method for applying decision curves to censored data had little bias and coverage was excellent; for competing risk, decision curves were appropriately affected by the incidence of the competing risk and the association between the competing risk and the predictor of interest. Calculation of decision curves directly from predicted probabilities led to a smoothing of the decision curve. Decision curve analysis can be easily extended to many of the applications common to performance measures for prediction models. Software to implement decision curve analysis is provided.
USDA-ARS's Scientific Manuscript database
Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
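The certainty-equivalent ranking at the core of SERF can be sketched under a negative-exponential utility assumption (a common choice in the SERF literature, not necessarily the one used in this manuscript); the alternatives and their outcome distributions below are invented:

```python
# Certainty equivalent (CE) under negative-exponential utility
# U(x) = -exp(-r*x), for which CE(r) = -(1/r) * ln(mean(exp(-r*x))).
# SERF ranks alternatives by CE over a grid of risk-aversion levels r.
import math

def certainty_equivalent(outcomes, r):
    if r == 0:
        return sum(outcomes) / len(outcomes)  # risk-neutral limit
    return -math.log(sum(math.exp(-r * x) for x in outcomes)
                     / len(outcomes)) / r

alt_a = [120, 80, 100, 60]    # hypothetical net returns, alternative A
alt_b = [150, 20, 110, 40]    # riskier hypothetical alternative B

for r in (0.0, 0.01, 0.05):   # grid over the risk-aversion coefficient
    print(r, round(certainty_equivalent(alt_a, r), 1),
             round(certainty_equivalent(alt_b, r), 1))
```

As risk aversion r grows, the CE of the riskier alternative falls faster, which is how SERF separates risk-efficient alternatives across decision-maker types.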
Project risk management in the construction of high-rise buildings
NASA Astrophysics Data System (ADS)
Titarenko, Boris; Hasnaoui, Amir; Titarenko, Roman; Buzuk, Liliya
2018-03-01
This paper presents project risk management methods that allow better identification of risks in the construction of high-rise buildings and their management throughout the life cycle of the project. One of the project risk management processes is quantitative risk analysis, which usually includes assessing the potential impact of project risks and their probabilities. This paper reviews the most popular methods of risk probability assessment and indicates the advantages of the robust approach over traditional methods. Within the framework of the project risk management model, P. Huber's robust approach is applied and extended to the tasks of regression analysis of project data. The suggested algorithms for assessing the parameters of statistical models yield reliable estimates. Theoretical problems in the development of robust models built on the methodology of minimax estimates are reviewed, and an algorithm for the situation of asymmetric "contamination" is developed.
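A minimal illustration of the Huber-style robust estimation invoked above, reduced to the location case for brevity (the paper applies the approach to regression); the tuning constant k = 1.345 is the conventional choice and the data are invented:

```python
# Huber M-estimate of location via iteratively reweighted averaging.
# Observations inside the band |x - mu| <= k get full weight; outliers
# are down-weighted by k / |x - mu|. Toy data with one gross outlier.

def huber_location(xs, k=1.345, tol=1e-8, max_iter=100):
    mu = sorted(xs)[len(xs) // 2]       # start from (upper) median
    for _ in range(max_iter):
        weights = []
        for x in xs:
            r = abs(x - mu)
            weights.append(1.0 if r <= k else k / r)
        new_mu = sum(w * x for w, x in zip(weights, xs)) / sum(weights)
        if abs(new_mu - mu) < tol:
            break
        mu = new_mu
    return mu

data = [9.8, 10.1, 10.0, 9.9, 10.2, 25.0]
print(round(huber_location(data), 2))   # stays near 10; the mean is 12.5
```

The same reweighting idea generalizes to regression, where residuals rather than deviations from a location are bounded, which is what makes estimates robust to "contaminated" project data.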
Selenium Exposure and Cancer Risk: an Updated Meta-analysis and Meta-regression
Cai, Xianlei; Wang, Chen; Yu, Wanqi; Fan, Wenjie; Wang, Shan; Shen, Ning; Wu, Pengcheng; Li, Xiuyang; Wang, Fudi
2016-01-01
The objective of this study was to investigate the associations between selenium exposure and cancer risk. We identified 69 studies and applied meta-analysis, meta-regression and dose-response analysis to obtain the available evidence. The results indicated that high selenium exposure had a protective effect on cancer risk (pooled OR = 0.78; 95%CI: 0.73–0.83). Linear and nonlinear dose-response analyses indicated that high serum/plasma selenium and toenail selenium were effective for cancer prevention. However, we did not find a protective effect of selenium supplementation. High selenium exposure may have different effects on specific types of cancer. It decreased the risk of breast cancer, lung cancer, esophageal cancer, gastric cancer, and prostate cancer, but it was not associated with colorectal cancer, bladder cancer, and skin cancer. PMID:26786590
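A pooled OR of the kind quoted above is typically obtained by inverse-variance weighting on the log scale. The sketch below assumes a fixed-effect model and invented study-level ORs with 95% CIs (not the abstract's 69 studies, which would also call for a random-effects model):

```python
# Fixed-effect inverse-variance pooling of odds ratios on the log scale.
# Each study is (OR, lower 95% CI, upper 95% CI); the standard error of
# log(OR) is recovered from the CI width as (ln(hi) - ln(lo)) / (2*1.96).
import math

def pool_odds_ratios(studies):
    num = den = 0.0
    for or_, lo, hi in studies:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se**2                  # inverse-variance weight
        num += w * log_or
        den += w
    mean = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(mean - 1.96 * se_pooled),
          math.exp(mean + 1.96 * se_pooled))
    return math.exp(mean), ci

studies = [(0.70, 0.55, 0.90), (0.85, 0.65, 1.10), (0.75, 0.60, 0.95)]
pooled, ci = pool_odds_ratios(studies)
print(round(pooled, 2), tuple(round(x, 2) for x in ci))
```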
Segmented Poincaré plot analysis for risk stratification in patients with dilated cardiomyopathy.
Voss, A; Fischer, C; Schroeder, R; Figulla, H R; Goernig, M
2010-01-01
The prognostic value of heart rate variability in patients with dilated cardiomyopathy (DCM) is limited and does not contribute to risk stratification, although the dynamics of ventricular repolarization differ considerably between DCM patients and healthy subjects. Neither linear nor nonlinear methods of heart rate variability analysis could discriminate between patients at high and low risk of sudden cardiac death. The aim of this study was to analyze the suitability of the newly developed segmented Poincaré plot analysis (SPPA) to enhance risk stratification in DCM. In contrast to the usually applied Poincaré plot analysis, the SPPA retains nonlinear features of the investigated beat-to-beat interval time series. The main features of SPPA are the rotation of the cloud of points and its subsequent variability-dependent segmentation. Significant row and column probabilities were calculated from the segments and led to discrimination (up to p<0.005) between low- and high-risk DCM patients. For the first time, an index from Poincaré plot analysis of heart rate variability was able to contribute to risk stratification in patients suffering from DCM.
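SPPA itself is not reproduced here; as a hedged illustration, the sketch below computes only the classical (unsegmented) Poincaré descriptors SD1/SD2 from a beat-to-beat interval series, i.e. the point cloud that SPPA then rotates and segments. The RR series is invented:

```python
# Classical Poincare plot descriptors from RR intervals (ms).
# The plot scatters x_{n+1} against x_n; SD1 is the dispersion across
# the line of identity (short-term variability), SD2 along it.
import math

def poincare_sd1_sd2(rr):
    diffs = [b - a for a, b in zip(rr, rr[1:])]   # x_{n+1} - x_n
    sums  = [b + a for a, b in zip(rr, rr[1:])]   # x_{n+1} + x_n
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    sd1 = math.sqrt(var(diffs) / 2)   # minor axis of the point cloud
    sd2 = math.sqrt(var(sums) / 2)    # major axis of the point cloud
    return sd1, sd2

rr = [800, 810, 790, 820, 805, 795, 815, 800]    # hypothetical RR series
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 1), round(sd2, 1))
```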
Health Risk-Based Assessment and Management of Heavy Metals-Contaminated Soil Sites in Taiwan
Lai, Hung-Yu; Hseu, Zeng-Yei; Chen, Ting-Chien; Chen, Bo-Ching; Guo, Horng-Yuh; Chen, Zueng-Sang
2010-01-01
Risk-based assessment is a way to evaluate the potential hazards of contaminated sites, based on considering linkages between pollution sources, pathways, and receptors. These linkages can be broken by source reduction, pathway management, and modifying the exposure of receptors. In Taiwan, the Soil and Groundwater Pollution Remediation Act (SGWPR Act) uses one target regulation to evaluate the contamination status of soil and groundwater pollution. More than 600 sites contaminated with heavy metals (HMs) have been remediated, and the costs of this process are always high. Besides using soil remediation techniques to remove contaminants from these sites, the selection of remediation methods that achieve rapid risk reduction is permissible and of increasing interest. This paper discusses soil remediation techniques previously applied to different sites in Taiwan and clarifies the differences in risk assessment before and after soil remediation obtained by applying different risk assessment models. This paper also includes case studies on: (1) food safety risk assessment for brown rice grown in an HMs-contaminated site; (2) a tiered approach to health risk assessment for a contaminated site; (3) risk assessment for phytoremediation techniques applied in HMs-contaminated sites; and (4) soil remediation cost analysis for contaminated sites in Taiwan. PMID:21139851
Acquaintance Rape: Applying Crime Scene Analysis to the Prediction of Sexual Recidivism.
Lehmann, Robert J B; Goodwill, Alasdair M; Hanson, R Karl; Dahle, Klaus-Peter
2016-10-01
The aim of the current study was to enhance the predictive accuracy of risk assessments for sexual offenders by utilizing detailed crime scene analysis (CSA). CSA was conducted on a sample of 247 male acquaintance rapists from Berlin (Germany) using a nonmetric, multidimensional scaling (MDS) Behavioral Thematic Analysis (BTA) approach. The age of the offenders at the time of the index offense ranged from 14 to 64 years (M = 32.3; SD = 11.4). The BTA procedure revealed three behavioral themes of hostility, criminality, and pseudo-intimacy, consistent with previous CSA research on stranger rape. The construct validity of the three themes was demonstrated through correlational analyses with known sexual offending measures and criminal histories. The themes of hostility and pseudo-intimacy were significant predictors of sexual recidivism. In addition, the pseudo-intimacy theme led to a significant increase in the incremental validity of the Static-99 actuarial risk assessment instrument for the prediction of sexual recidivism. The results indicate the potential utility and validity of crime scene behaviors in the applied risk assessment of sexual offenders. © The Author(s) 2015.
Laganà, Pasqualina; Moscato, Umberto; Poscia, Andrea; La Milia, Daniele Ignazio; Boccia, Stefania; Avventuroso, Emanuela; Delia, Santi
2015-01-01
Legionnaires' disease is normally acquired by inhalation of legionellae from a contaminated environmental source. Water systems of large buildings, such as hospitals, are often contaminated with legionellae and therefore represent a potential risk for the hospital population. The aim of this study was to evaluate the potential contamination by Legionella pneumophila (LP) in a large hospital in Italy through georeferential statistical analysis, in order to assess the possible sources of dispersion and, consequently, the risk of exposure for both health care staff and patients. The distribution of LP serogroups 1 and 2-14 was considered in the wards housed on two consecutive floors of the hospital building. On the basis of information provided by 53 bacteriological analyses, a 'random' grid of points was chosen, and spatial geostatistics (kriging) was applied and compared with the results of classical statistical analysis. Over 50% of the examined samples were positive for Legionella pneumophila. LP 1 was isolated in 69% of samples from the ground floor and in 60% of samples from the first floor; LP 2-14 in 36% of samples from the ground floor and 24% from the first. The iso-estimation maps clearly show the most contaminated pipe and the difference in the diffusion of the different L. pneumophila serogroups. This experimental work has demonstrated that geostatistical methods applied to the microbiological analysis of water matrices allow better modeling of the phenomenon under study, greater potential for risk management, and a greater choice of methods of prevention and environmental recovery than classical statistical analysis.
A stable systemic risk ranking in China's banking sector: Based on principal component analysis
NASA Astrophysics Data System (ADS)
Fang, Libing; Xiao, Binqing; Yu, Honghai; You, Qixing
2018-02-01
In this paper, we compare five popular systemic risk rankings and apply a principal component analysis (PCA) model to provide a stable systemic risk ranking for the Chinese banking sector. Our empirical results indicate that the five methods suggest vastly different systemic risk rankings for the same bank, while the combined systemic risk measure based on PCA provides a reliable ranking. Furthermore, according to the factor loadings of the first component, the PCA combined ranking is based mainly on fundamentals rather than market price data. We find that price-based rankings are not as practical as fundamentals-based ones. The PCA combined ranking directly shows the systemic risk contribution of each bank for banking supervision purposes and reminds banks to prepare for and cope with financial crises in advance.
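The idea of combining several risk measures through PCA can be sketched as follows. This is a minimal illustration under assumptions of our own (standardised columns, first principal component by power iteration); the paper's data and exact estimation procedure are not reproduced.

```python
def pca_combined_score(scores):
    """Combine several systemic-risk measures per bank into one score.

    `scores` is a list of rows, one per bank, each holding that bank's
    value under each individual risk measure.  Columns are standardised,
    the leading eigenvector of their covariance matrix is found by power
    iteration, and each bank is projected onto it."""
    n, m = len(scores), len(scores[0])
    # standardise each column (measure) to zero mean, unit variance
    std_cols = []
    for col in zip(*scores):
        mu = sum(col) / n
        sd = (sum((v - mu) ** 2 for v in col) / n) ** 0.5 or 1.0
        std_cols.append([(v - mu) / sd for v in col])
    X = list(zip(*std_cols))
    cov = [[sum(X[k][i] * X[k][j] for k in range(n)) / n
            for j in range(m)] for i in range(m)]
    v = [1.0] * m                       # power iteration for 1st component
    for _ in range(200):
        w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    if sum(v) < 0:                      # fix the sign convention
        v = [-x for x in v]
    return [sum(xi * vi for xi, vi in zip(row, v)) for row in X]
```

Ranking banks by this combined score is what the abstract refers to as the "PCA combined ranking"; the factor loadings correspond to the eigenvector `v`.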
Rocca, Elena; Andersen, Fredrik
2017-08-14
Scientific risk evaluations are constructed from specific evidence, value judgements, and biological background assumptions. The latter are the framework-setting suppositions we apply in order to understand some new phenomenon. That background assumptions co-determine the choice of methodology, data interpretation, and choice of relevant evidence is an uncontroversial claim in modern basic science. Furthermore, it is commonly accepted that, unless explicated, disagreements in background assumptions can lead to misunderstanding as well as miscommunication. Here, we extend the discussion of background assumptions from basic science to the debate over risk assessment of genetically modified (GM) plants. In this realm, while the different political, social and economic values are often mentioned, the identity and role of the background assumptions at play are rarely examined. We use an example from the debate over risk assessment of stacked genetically modified plants (GM stacks), obtained by applying conventional breeding techniques to GM plants. There are two main regulatory practices for GM stacks: (i) regulate as conventional hybrids and (ii) regulate as new GM plants. We analyzed eight papers representative of these positions and found that, in all cases, additional premises are needed to reach the stated conclusions. We suggest that these premises play the role of biological background assumptions and argue that the most effective way toward a unified framework for risk analysis and regulation of GM stacks is to explicate and examine the biological background assumptions of each position. Once explicated, it is possible either to evaluate which background assumptions best reflect contemporary biological knowledge, or to apply Douglas' 'inductive risk' argument.
Donovan, Sarah-Louise; Salmon, Paul M; Lenné, Michael G; Horberry, Tim
2017-10-01
Safety leadership is an important factor in supporting safety in high-risk industries. This article contends that applying systems-thinking methods to examine safety leadership can support improved learning from incidents. A case study analysis was undertaken of a large-scale mining landslide incident in which no injuries or fatalities were incurred. A multi-method approach was adopted, in which the Critical Decision Method, Rasmussen's Risk Management Framework and Accimap method were applied to examine the safety leadership decisions and actions which enabled the safe outcome. The approach enabled Rasmussen's predictions regarding safety and performance to be examined in the safety leadership context, with findings demonstrating the distribution of safety leadership across leader and system levels, and the presence of vertical integration as key to supporting the successful safety outcome. In doing so, the findings also demonstrate the usefulness of applying systems-thinking methods to examine and learn from incidents in terms of what 'went right'. The implications, including future research directions, are discussed. Practitioner Summary: This paper presents a case study analysis, in which systems-thinking methods are applied to the examination of safety leadership decisions and actions during a large-scale mining landslide incident. The findings establish safety leadership as a systems phenomenon, and furthermore, demonstrate the usefulness of applying systems-thinking methods to learn from incidents in terms of what 'went right'. Implications, including future research directions, are discussed.
Tommasino, Francesco; Durante, Marco; D'Avino, Vittoria; Liuzzi, Raffaele; Conson, Manuel; Farace, Paolo; Palma, Giuseppe; Schwarz, Marco; Cella, Laura; Pacelli, Roberto
2017-05-01
Proton beam therapy represents a promising modality for left-side breast cancer (BC) treatment, but concerns have been raised about skin toxicity and poor cosmesis. The aim of this study is to apply a skin normal tissue complication probability (NTCP) model to intensity modulated proton therapy (IMPT) optimization in left-side BC. Ten left-side BC patients undergoing photon irradiation after breast-conserving surgery were randomly selected from our clinical database. Intensity modulated photon (IMRT) and IMPT plans were calculated with iso-tumor-coverage criteria and according to RTOG 1005 guidelines. Proton plans were computed with and without skin optimization. Published NTCP models were employed to estimate the risk of different toxicity endpoints for the skin, lung, heart, and heart substructures. Acute skin NTCP evaluation suggests a lower toxicity level with IMPT compared to IMRT when the skin is included in the proton optimization strategy (0.1% versus 1.7%, p < 0.001). Dosimetric results show that, at the same level of tumor coverage, IMPT attains significant heart and lung dose sparing compared with IMRT. By NTCP model-based analysis, an overall reduction in the predicted cardiopulmonary toxicity risk can be observed for all IMPT plans compared to IMRT plans: the relative risk reduction from protons varies between 0.1 and 0.7 depending on the toxicity endpoint considered. Our analysis suggests that IMPT might be safely applied without increasing the risk of severe acute radiation-induced skin toxicity. The quantitative risk estimates also support the potential clinical benefits of IMPT for left-side BC irradiation due to the lower risk of cardiac and pulmonary morbidity. On the long term, the applied approach might be relevant for the setup of cost-effectiveness evaluation strategies based on NTCP predictions.
NASA Astrophysics Data System (ADS)
Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann
2017-04-01
In principle, solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However, we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), a decision-analytic framework that trades off the expected welfare loss from climate policy costs against the climate risks from transgressing a climate target. In both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA to include not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for the following three valuation scenarios: temperature-risk-only, precipitation-risk-only, and both risks equally weighted. For now, the Giorgi regions are given equal weight. We find that for regionally differentiated precipitation targets, the usage of SRM will be comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number is reduced by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it only saves 70% to 75% of the welfare loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in the precipitation-risk-only and both-risks scenarios.
It remains to be shown how the inclusion of further risks or different regional weights would change that picture.
Hansson, Helena; Lagerkvist, Carl Johan
2014-06-01
This study integrated risk-benefit analysis with prospect theory with the overall objective of identifying the type of management behavior represented by farmers' choices of mastitis control options (MCOs). Two exploratory factor analyses, based on 163 and 175 Swedish farmers, respectively, highlighted attitudes to MCOs related to: (1) grouping cows and applying milking order to prevent spread of existing infection and (2) working in a precautionary way to prevent mastitis occurring. This was interpreted as being based on (1) reactive management behavior on detection of udder-health problems in individual cows and (2) proactive management behavior to prevent mastitis developing. Farmers' assessments of these MCOs were found to be based on asymmetrical evaluations of risks and benefits, suggesting that farmers' management behavior depends on their individual reference point. In particular, attitudes to MCOs related to grouping cows and applying milking order to prevent the spread of mastitis once infected cows were detected were stronger in the risk domain than in the benefit domain, in accordance with loss aversion. In contrast, attitudes to MCOs related to working in a precautionary way to prevent cows from becoming infected in the first place were stronger in the benefit domain than in the risk domain, in accordance with reverse loss aversion. These findings are of practical importance for farmers and agribusiness and in public health protection work to reduce the current extensive use of antibiotics in dairy herds. © 2013 Society for Risk Analysis.
ERIC Educational Resources Information Center
Denham, Bryan E.
2009-01-01
Grounded conceptually in social cognitive theory, this research examines how personal, behavioral, and environmental factors are associated with risk perceptions of anabolic-androgenic steroids. Ordinal logistic regression and logit log-linear models applied to data gathered from high-school seniors (N = 2,160) in the 2005 Monitoring the Future…
NASA Astrophysics Data System (ADS)
Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha
2016-04-01
This paper presents a method to assess the vulnerability to coastal risks, such as coastal erosion or marine submersion, by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques within a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping considers multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology, and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high, and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at the Monika and Sablette beaches. This approach relies on the efficiency of the GIS tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies for minimizing coastal risks.
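The criteria-weighting step can be sketched as follows. This uses Buckley's geometric-mean method, one common FAHP variant; the abstract does not state which variant the authors used, so the function and its defuzzification rule are illustrative assumptions.

```python
def fahp_weights(matrix):
    """Derive crisp criterion weights from a triangular-fuzzy pairwise
    comparison matrix (Buckley's geometric-mean FAHP variant).

    matrix[i][j] is a triple (l, m, u) comparing criterion i to j."""
    n = len(matrix)
    # fuzzy geometric mean of each row
    geo = []
    for row in matrix:
        l = m = u = 1.0
        for (a, b, c) in row:
            l *= a; m *= b; u *= c
        geo.append((l ** (1 / n), m ** (1 / n), u ** (1 / n)))
    # divide by the fuzzy total (note the reversed bounds)
    tl = sum(g[0] for g in geo)
    tm = sum(g[1] for g in geo)
    tu = sum(g[2] for g in geo)
    fuzzy_w = [(g[0] / tu, g[1] / tm, g[2] / tl) for g in geo]
    # defuzzify by the centroid and renormalise
    crisp = [(l + m + u) / 3 for (l, m, u) in fuzzy_w]
    total = sum(crisp)
    return [c / total for c in crisp]
```

In the paper's setting, the matrix would compare the seven causative factors pairwise; the returned weights are then combined with the GIS factor layers to map vulnerability.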
Some considerations on the definition of risk based on concepts of systems theory and probability.
Andretta, Massimo
2014-07-01
The concept of risk has been applied in many fields of modern science and technology. Despite its successes, there is still no well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity about their scientific bases that could define, in a unique theoretical framework, the general concepts used in the different areas of application. The aim of this article is to suggest another perspective on the definition of risk that could be applied to, and in a certain sense generalize, some of the previously known definitions (at least in the fields of technical and scientific applications). Drawing on my experience of risk assessment in different applied situations (particularly risk estimation for major industrial accidents, and health and ecological risk assessment for contaminated sites), I revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of systems theory and of probability. In this way, I try to frame, in a single, broad, and general theoretical context, some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of the risk assessment disciplines. © 2013 Society for Risk Analysis.
Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?
Strathie, Ailsa; Walker, Guy H
2016-03-01
A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
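The four network metrics named above can be computed as follows. This is a self-contained sketch on a toy action sequence; the real study's event coding and journey data are not reproduced, and the example action names are invented.

```python
from collections import defaultdict

def link_metrics(sequence):
    """Build a link-analysis network from a sequence of driver actions
    (each consecutive pair becomes a directed link) and report the four
    metrics used in the study: number of links, network density,
    diameter, and per-node sociometric status."""
    nodes = sorted(set(sequence))
    links = set(zip(sequence, sequence[1:]))
    n = len(nodes)
    density = len(links) / (n * (n - 1)) if n > 1 else 0.0
    adj = defaultdict(set)
    for a, b in links:
        adj[a].add(b)
    def bfs(src):                      # shortest path lengths from src
        dist, frontier = {src: 0}, [src]
        while frontier:
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        return dist
    # diameter: longest finite shortest path over all sources
    diameter = max(max(bfs(s).values()) for s in nodes)
    # sociometric status: (in-degree + out-degree) / (n - 1)
    status = {v: (sum(1 for _, b in links if b == v) +
                  sum(1 for a, _ in links if a == v)) / (n - 1)
              for v in nodes}
    return len(links), density, diameter, status
```

Comparing these metrics across drivers or journeys is what the abstract proposes as a leading indicator of atypical behavior.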
Vorovencii, Iosif
2017-09-26
Desertification risk affects around 40% of the agricultural land in various regions of Romania. The purpose of this study is to analyse the risk of desertification in the south-west of Romania over the period 1984-2011 using the change vector analysis (CVA) technique and Landsat thematic mapper (TM) satellite images. CVA was applied to combinations of normalised difference vegetation index (NDVI)-albedo, NDVI-bare soil index (BI), and tasselled cap greenness (TCG)-tasselled cap brightness (TCB). The NDVI-albedo combination proved to be the best for assessing desertification risk, with an overall accuracy of 87.67%, identifying a desertification risk on 25.16% of the studied area. The maps were classified into the following classes: desertification risk, re-growing, and persistence. Four degrees of desertification risk and re-growing were used: low, medium, high, and extreme. Using the NDVI-albedo combination, 0.53% of the analysed surface was assessed as having an extreme degree of desertification risk, 3.93% a high degree, 8.72% a medium degree, and 11.98% a low degree. The driving forces behind the risk of desertification are both anthropogenic and climatic. The anthropogenic causes include the destruction of the irrigation system, deforestation, the destruction of the forest shelterbelts, and the fragmentation of agricultural land and its inefficient management. The climatic causes are increasing temperatures, frequent and prolonged droughts, and a decline in the amount of precipitation.
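For a single pixel, CVA in NDVI-albedo space reduces to a change magnitude and direction. The sketch below follows the usual CVA interpretation for desertification studies; the 0.1 magnitude threshold and the class rule are illustrative assumptions, not the thresholds used in the paper.

```python
import math

def change_vector(ndvi_t1, albedo_t1, ndvi_t2, albedo_t2, threshold=0.1):
    """Change vector analysis (CVA) for one pixel in NDVI-albedo space.

    Returns the change magnitude, angle (degrees), and a class label:
    decreasing NDVI with increasing albedo suggests desertification
    risk, the opposite direction suggests re-growing, and a small
    magnitude means persistence."""
    d_ndvi = ndvi_t2 - ndvi_t1
    d_alb = albedo_t2 - albedo_t1
    magnitude = math.hypot(d_ndvi, d_alb)
    angle = math.degrees(math.atan2(d_alb, d_ndvi))
    if magnitude < threshold:
        label = "persistence"
    elif d_ndvi < 0 and d_alb > 0:
        label = "desertification risk"
    else:
        label = "re-growing"
    return magnitude, angle, label
```

Applied per pixel over two Landsat dates, this yields the risk/re-growing/persistence maps the abstract describes; degrees (low to extreme) would then be assigned from magnitude bands.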
A risk-based approach to robotic mission requirements
NASA Technical Reports Server (NTRS)
Dias, William C.; Bourke, Roger D.
1992-01-01
A NASA Risk Team has developed a method for the application of risk management to the definition of robotic mission requirements for the Space Exploration Initiative. These requirements encompass environmental information, infrastructural emplacement in advance, and either technology testing or system/subsystems demonstration. Attention is presently given to a method for step-by-step consideration and analysis of the risk component inherent in mission architecture, followed by a calculation of the subjective risk level. Mitigation strategies are then applied with the same rules, and a comparison is made.
Risk management in the competitive electric power industry
NASA Astrophysics Data System (ADS)
Dahlgren, Robert William
From 1990 to the present day, the electric power industry has experienced dramatic changes worldwide. This recent evolution of the power industry has included the creation and multiple iterations of competitive wholesale markets in many different forms. The creation of these competitive markets has resulted in increased short-term volatility of power prices. Vertically integrated utilities, having emerged from years of regulatory control, now need to perform risk assessment. The goal of this dissertation is to provide background on and details of the evolution of market structures, combined with examples of how to apply price risk assessment techniques such as Value-at-Risk (VaR). In Chapter 1, the history and evolution of three selected regional markets, PJM, California, and England and Wales, are presented. A summary of the commonalities and differences provides an overview of the rate of transformation of the industry in recent years. The broad area of risk management in the power industry is also explored through a state-of-the-art literature survey. In Chapter 2, an illustration of risk assessment applied to power trading is presented. The techniques of Value-at-Risk and Conditional Value-at-Risk (CVaR) are introduced and applied to a common scenario. The advantages and limitations of the techniques are compared through observation of their results on the common example. Volatility in the California power markets is presented in Chapter 3. This analysis explores the California markets in the summer of 2000, including the application of VaR analysis to the extreme volatility observed during this period. In Chapter 4, CVaR is applied to the same California historical data used in Chapter 3. In addition, a novel application of minimizing the risk of a power portfolio by minimizing CVaR is presented. The application relies on recent research into CVaR whereby the portfolio optimization problem can be reduced to a linear programming problem.
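The two measures can be computed from historical profit-and-loss data as follows. This is a minimal historical-simulation sketch, not the dissertation's implementation; the empirical-quantile convention used here is one of several common choices.

```python
def var_cvar(pnl, alpha=0.95):
    """Historical-simulation Value-at-Risk and Conditional VaR.

    `pnl` holds profit-and-loss observations (losses negative).  VaR is
    the loss threshold exceeded with probability 1 - alpha; CVaR is the
    average loss beyond that threshold.  Both are returned as positive
    loss figures, so CVaR >= VaR always holds."""
    losses = sorted(-x for x in pnl)          # losses as positive numbers
    k = min(int(alpha * len(losses)), len(losses) - 1)
    var = losses[k]
    tail = losses[k:]
    cvar = sum(tail) / len(tail)
    return var, cvar
```

Because CVaR averages the whole tail, it is the coherent measure that Chapter 4 minimizes via linear programming, whereas VaR only reports a single quantile.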
Depression and cancer risk: a systematic review and meta-analysis.
Jia, Y; Li, F; Liu, Y F; Zhao, J P; Leng, M M; Chen, L
2017-08-01
To assess the associations between depression and incident cancer risk. Systematic review and meta-analysis. The Cochrane Library, Web of Science, MEDLINE, and PubMed databases were searched to identify studies. The quality of the included studies was assessed using the Newcastle-Ottawa Scale. Risk ratios (RRs) were used to measure effect size. A random-effects model was applied to synthesize the associations between depression and cancer risk. A forest plot was produced to visually assess RRs and 95% confidence intervals (CIs). Heterogeneity across studies was assessed using the I-squared statistic. A funnel plot was generated to assess potential publication bias, and Egger's regression was applied to test the symmetry of the funnel plot. In total, 1,469,179 participants and 89,716 incident cases of cancer from 25 studies were included. Depression was significantly associated with overall cancer risk (RR = 1.15, 95% CI: 1.09-1.22) and with liver cancer (RR = 1.20, 95% CI: 1.01-1.43) and lung cancer (RR = 1.33, 95% CI: 1.04-1.72). Subgroup analysis of studies in North America yielded a significant summary relative risk (RR = 1.30, 95% CI: 1.15-1.48). No significant associations were found for breast, prostate, or colorectal/colon cancer. The average Newcastle-Ottawa score was 7.56 for all included studies. Our findings showed a small, positive association between depression and the overall occurrence risk of cancer, as well as liver and lung cancer risks. However, multinational and larger-sample studies are required to further investigate and support these associations. Moreover, confounding factors such as cigarette smoking and alcohol use/abuse should be considered in future studies. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
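A random-effects pooling of risk ratios can be sketched as follows, using the DerSimonian-Laird estimator (a standard random-effects method; the abstract does not specify which estimator the authors used).

```python
import math

def pooled_rr(rrs, ses):
    """DerSimonian-Laird random-effects pooling of study risk ratios.

    `rrs` are per-study risk ratios and `ses` the standard errors of
    their logarithms.  Returns the pooled RR with its 95% CI."""
    y = [math.log(r) for r in rrs]
    w = [1 / s ** 2 for s in ses]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in ses]     # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = (1 / sum(w_re)) ** 0.5
    return (math.exp(mu),
            math.exp(mu - 1.96 * se),
            math.exp(mu + 1.96 * se))
```

The I-squared statistic mentioned in the abstract is derived from the same Q, as max(0, (Q - df) / Q).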
Navoni, J A; De Pietri, D; Olmos, V; Gimenez, C; Bovi Mitre, G; de Titto, E; Villaamil Lepori, E C
2014-11-15
Arsenic (As) is a ubiquitous element widely distributed in the environment. This metalloid has a proven carcinogenic action in man. The aim of this work was to assess the health risk related to As exposure through drinking water in an Argentinean population, applying spatial analytical techniques in addition to conventional approaches. The study involved 650 inhabitants from the Chaco and Santiago del Estero provinces. Arsenic in drinking water (Asw) and urine (UAs) was measured by hydride generation atomic absorption spectrophotometry. The average daily dose (ADD), hazard quotient (HQ), and carcinogenic risk (CR) were estimated, geo-referenced, and integrated with demographic data in a health composite index (HI) by applying geographic information system (GIS) analysis. Asw covered a wide range of concentrations, from non-detectable (ND) to 2000 μg/L. More than 90% of the population was exposed to As, with UAs levels above the intervention level of 100 μg/g creatinine. The GIS analysis described an expected level of exposure lower than that observed, indicating possible additional source(s) of exposure to inorganic arsenic. In 68% of the locations, the population had an HQ greater than 1, and the CR ranged between 5×10⁻⁵ and 2.1×10⁻². An environmental exposure area defined through ADD geo-referencing provided a baseline scenario for space-time risk assessment. The time of residence, the demographic density, and the health outcomes considered helped characterize the health risk in the region. The geospatial analysis contributed to delimiting and analyzing the changing tendencies of risk in the region, broadening the scope of the results for the decision-making process. Copyright © 2014 Elsevier B.V. All rights reserved.
Applying machine learning to pattern analysis for automated in-design layout optimization
NASA Astrophysics Data System (ADS)
Cain, Jason P.; Fakhry, Moutaz; Pathak, Piyush; Sweis, Jason; Gennari, Frank; Lai, Ya-Chieh
2018-04-01
Building on previous work for cataloging unique topological patterns in an integrated circuit physical design, a new process is defined in which a risk scoring methodology is used to rank patterns based on manufacturing risk. Patterns with high risk are then mapped to functionally equivalent patterns with lower risk. The higher risk patterns are then replaced in the design with their lower risk equivalents. The pattern selection and replacement is fully automated and suitable for use for full-chip designs. Results from 14nm product designs show that the approach can identify and replace risk patterns with quantifiable positive impact on the risk score distribution after replacement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barratt, B.I.P.; Moeed, A.; Malone, L.A.
2006-05-15
An analysis of established biosafety protocols for release into the environment of exotic plants and biological control agents for weeds and arthropod pests has been carried out to determine whether such protocols can be applied to relatively new and emerging technologies intended for the primary production industries, such as transgenic plants. Example case studies are described to indicate the scope of issues considered by regulators who make decisions on new organism releases. No transgenic plants have been released to date in New Zealand, but two field test approvals are described as examples. An analysis of the biosafety protocols has shown that, while many of the risk criteria considered for decision-making by regulators are similar for all new organisms, a case-by-case examination of risks and potential impacts is required in order to fully assess risk. The value of post-release monitoring and validation of decisions made by regulators is emphasised.
Risk analysis of analytical validations by probabilistic modification of FMEA.
Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J
2012-05-01
Risk analysis is a valuable addition to the validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failure. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection, and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequencies while maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a near-infrared (NIR) analytical procedure used for the screening of suspected counterfeit tablets are re-interpreted using this probabilistic modification. With this probabilistic modification of FMEA, the frequency of occurrence of undetected failure modes can be estimated quantitatively for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
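The contrast between the classical RPN and the probabilistic variant can be sketched as follows. The independence assumption in the procedure-level estimate is ours, not stated in the abstract, and all function names are illustrative.

```python
def rpn_traditional(occurrence, detection, severity):
    """Classical FMEA Risk Priority Number from 1-10 categorical scores."""
    return occurrence * detection * severity

def undetected_failure_frequency(p_occurrence, p_nondetection):
    """Probabilistic variant: occurrence and (non-)detection become
    estimated relative frequencies, so the frequency of an undetected
    failure is their product; severity stays categorical and is
    handled separately."""
    return p_occurrence * p_nondetection

def procedure_undetected_frequency(modes):
    """Frequency of at least one undetected failure over the full
    analytical procedure, assuming independent failure modes."""
    p_none = 1.0
    for p_occ, p_nondet in modes:
        p_none *= 1 - p_occ * p_nondet
    return 1 - p_none
```

Unlike the dimensionless RPN, the probabilistic outputs are interpretable frequencies, which is what allows the quantitative procedure-level estimate the abstract highlights.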
Patriarca, Peter A; Van Auken, R Michael; Kebschull, Scott A
2018-01-01
Benefit-risk evaluations of drugs have been conducted since the introduction of modern regulatory systems more than 50 years ago. Such judgments are typically made on the basis of qualitative or semiquantitative approaches, often without the aid of quantitative assessment methods, the latter having often been applied asymmetrically to place emphasis on benefit more so than harm. In an effort to preliminarily evaluate the utility of lives lost or saved, or quality-adjusted life-years (QALY) lost and gained as a means of quantitatively assessing the potential benefits and risks of a new chemical entity, we focused our attention on the unique scenario in which a drug was initially approved based on one set of data, but later withdrawn from the market based on a second set of data. In this analysis, a dimensionless risk to benefit ratio was calculated in each instance, based on the risk and benefit quantified in similar units. The results indicated that FDA decisions to approve the drug corresponded to risk to benefit ratios less than or equal to 0.136, and that decisions to withdraw the drug from the US market corresponded to risk to benefit ratios greater than or equal to 0.092. The probability of FDA approval was then estimated using logistic regression analysis. The results of this analysis indicated that there was a 50% probability of FDA approval if the risk to benefit ratio was 0.121, and that the probability approaches 100% for values much less than 0.121, and the probability approaches 0% for values much greater than 0.121. The large uncertainty in these estimates due to the small sample size and overlapping data may be addressed in the future by applying the methodology to other drugs.
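The reported logistic relationship can be sketched as follows. The 0.121 midpoint (50% approval probability) comes from the abstract; the slope is an assumed value chosen only to reproduce the qualitative behaviour described (probability near 1 well below the midpoint, near 0 well above it), since the fitted coefficient is not given.

```python
import math

def p_approval(risk_benefit_ratio, midpoint=0.121, slope=150.0):
    """Logistic model of approval probability versus the dimensionless
    risk-to-benefit ratio.  P = 0.5 at the midpoint; the slope controls
    how sharply the probability falls as the ratio grows."""
    return 1 / (1 + math.exp(slope * (risk_benefit_ratio - midpoint)))
```

The narrow gap between the approval (≤ 0.136) and withdrawal (≥ 0.092) thresholds in the data is what makes the fitted curve so steep and its coefficients so uncertain.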
NASA Astrophysics Data System (ADS)
Cui, Jia; Hong, Bei; Jiang, Xuepeng; Chen, Qinghua
2017-05-01
To reinforce the correlation analysis of threat factors in risk assessment, a dynamic safety risk assessment method based on particle filtering is proposed, with threat analysis at its core. Based on risk assessment standards, the method selects threat indicators, applies a particle filtering algorithm to calculate the influence weight of each indicator, and determines information system risk levels by combining the results with state estimation theory. To improve the computational efficiency of the particle filtering algorithm, the k-means clustering algorithm is introduced: by clustering all particles and letting each centroid act as the representative of its cluster, the amount of computation is reduced. Empirical results indicate that the method reasonably captures the mutual dependence and influence among risk elements. Under circumstances of limited information, it provides a scientific basis for formulating a risk management control strategy.
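A minimal 1-D sketch of the efficiency idea, assuming a Gaussian particle cloud: the particle set is clustered with k-means and each centroid stands in for its cluster, so the filter works with k representatives instead of the full particle set. All numbers are illustrative.

```python
import random

# Illustrative k-means reduction of a particle set (1-D). Each centroid
# represents its cluster, so downstream updates touch k points, not N.
def kmeans_1d(points, k, iters=20):
    centroids = sorted(random.sample(points, k))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: abs(p - centroids[j]))
            clusters[i].append(p)
        # Keep the old centroid if a cluster ends up empty
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

random.seed(0)
# 500 particles approximating a risk-level posterior (invented numbers)
particles = [random.gauss(3.0, 0.5) for _ in range(500)]
centroids, clusters = kmeans_1d(particles, k=5)
# State estimate from 5 weighted representatives instead of 500 particles
weights = [len(c) / len(particles) for c in clusters]
estimate = sum(w * c for w, c in zip(weights, centroids))
```

Because each centroid is the mean of its cluster, the weighted estimate from the representatives matches the full-set mean while the per-step cost drops from N to k points.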
Education and Labor Market Risk: Understanding the Role of Data Cleaning
ERIC Educational Resources Information Center
Whalley, Alexander
2011-01-01
This paper examines whether conclusions about the relationship between education and labor market risk depend on the use of commonly applied procedures to clean data of extreme values. The analysis uses fifteen years of data from the Panel Study of Income Dynamics to demonstrate that conclusions about the relationship between education and labor…
Monitoring Socio-Demographic Risk: A Cohort Analysis of Families Using Census Micro-Data
ERIC Educational Resources Information Center
Davis, Peter; McPherson, Mervyl; Wheldon, Mark; von Randow, Martin
2012-01-01
We apply cohort techniques to monitor four indicators of socio-demographic risk crucial to family wellbeing; namely, income, employment, education, and housing. The data were derived from New Zealand's five-yearly Census for the period 1981-2006. This allowed us to track birth cohorts of mothers (and their families) over six successive New Zealand…
[Theoretical model study about the application risk of high risk medical equipment].
Shang, Changhao; Yang, Fenghui
2014-11-01
This research establishes a theoretical model for monitoring the risk of high-risk medical equipment at the application site. The site is regarded as a system containing several sub-systems, each consisting of several risk-estimating indicators. After each indicator is quantized, the quantized values are multiplied by their corresponding weights and the products are accumulated, yielding the risk-estimating value of each sub-system. Following the same calculation, the risk-estimating values of the sub-systems are multiplied by their corresponding weights and the products are accumulated. The cumulative sum is the status indicator of the high-risk medical equipment at the application site, reflecting its application risk. The resulting model can dynamically and specifically monitor the application risk of high-risk medical equipment on site.
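The two-level weighted-sum model described above reduces to a few lines of code; the sub-system names, indicator values, and weights below are invented for illustration.

```python
# Sketch of the two-level weighted-sum risk model; all names, indicator
# values, and weights are hypothetical placeholders.
def weighted_sum(values, weights):
    assert abs(sum(weights) - 1.0) < 1e-9  # weights are normalized
    return sum(v * w for v, w in zip(values, weights))

# Each sub-system: (quantized indicator values, indicator weights)
subsystems = {
    "power_supply": ([0.2, 0.5, 0.1], [0.5, 0.3, 0.2]),
    "operator_use": ([0.4, 0.3],      [0.6, 0.4]),
    "maintenance":  ([0.1, 0.2, 0.3], [0.2, 0.5, 0.3]),
}
subsystem_weights = {"power_supply": 0.4, "operator_use": 0.35,
                     "maintenance": 0.25}

# Level 1: risk-estimating value of each sub-system
sub_risk = {name: weighted_sum(v, w) for name, (v, w) in subsystems.items()}
# Level 2: the site-level status indicator
status_indicator = sum(sub_risk[n] * subsystem_weights[n] for n in sub_risk)
```

Re-evaluating the sums as indicator values change over time gives the dynamic monitoring the abstract describes.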
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting risks that may affect method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, through quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated critical control points of the method procedure, we conclude that the analytical methodology with the lowest risk of performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events from residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization.
However, the technique is a challenging method to implement in a quality control laboratory: it is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternative techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and to assess and manage method performance risk. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool makes it possible to find the method procedural steps with the highest impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology for residual cellular DNA analysis. © PDA, Inc. 2018.
Analysis of labour risks in the Spanish industrial aerospace sector.
Laguardia, Juan; Rubio, Emilio; Garcia, Ana; Garcia-Foncillas, Rafael
2016-01-01
Labour risk prevention is an activity integrated within Safety and Hygiene at Work in Spain. In 2003, the Electronic Declaration for Accidents at Work, Delt@ (DELTA), was introduced. The industrial aerospace sector is subject to various risks. Our objective is to analyse the Spanish Industrial Aerospace Sector (SIAS) using the ACSOM methodology to assess its labour risks and to prioritise preventive actions. The SIAS and Services Subsector (SS) datasets were created and the relevant accident rate data were obtained. The ACSOM method was applied through double contrast (deviation and translocation) of the SIAS or SS risk polygon against the chosen pattern: accidents from all sectors (ACSOM G) or the SIAS. A list of risks was obtained, ordered by action phase. In the SIAS vs. ACSOM G analysis, radiation risks were the worst, followed by overstrains. Accidents caused by living beings were also significant in the SS vs. SIAS comparison, results that can be used to improve risk prevention. Radiation is the most significant risk in the SIAS and the SS. Preventive actions will be primary and secondary. ACSOM has shown itself to be a valid tool for the analysis of labour risks.
Tenenhaus-Aziza, Fanny; Daudin, Jean-Jacques; Maffre, Alexandre; Sanaa, Moez
2014-01-01
According to Codex Alimentarius Commission recommendations, management options applied at the process production level should be based on good hygiene practices, the HACCP system, and new risk management metrics such as the food safety objective. To follow this last recommendation, quantitative microbiological risk assessment is an appealing approach for linking new risk-based metrics to management options that may be applied by food operators. Through a specific case study, Listeria monocytogenes in soft cheese made from pasteurized milk, the objective of the present article is to show in practical terms how quantitative risk assessment can be used to direct potential intervention strategies at different food processing steps. Based on many assumptions, the model developed estimates the risk of listeriosis at the moment of consumption, taking into account the entire manufacturing process and potential sources of contamination. From pasteurization to consumption, the amplification of a primo-contamination event of the milk, the fresh cheese, or the process environment is simulated over time, space, and between products, accounting for the impact of management options such as hygienic operations and sampling plans. A sensitivity analysis of the model helps prioritize the data to be collected for improving and validating the model. What-if scenarios were simulated and allowed for the identification of major parameters contributing to the risk of listeriosis and the optimization of preventive and corrective measures. © 2013 Society for Risk Analysis.
Seismic Hazard Analysis — Quo vadis?
NASA Astrophysics Data System (ADS)
Klügel, Jens-Uwe
2008-05-01
The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of the different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of the different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies that limit their practical application. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as revealed in some recent large-scale studies. They result in an inability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not improved the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of the technical as well as the financial risks associated with potential earthquake damage requires a risk analysis, current practice is based on a probabilistic approach with its unsolved deficiencies.
Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design of critical infrastructures, especially when combined with systematic sensitivity analyses based on validated phenomenological models. Deterministic seismic hazard analysis incorporates uncertainties in the safety factors. These factors are derived from experience as well as from expert judgment. Deterministic methods associated with high safety factors may lead to overly conservative results, especially if applied to generally short-lived civil structures. Scenarios used in deterministic seismic hazard analysis have a clear physical basis. They are related to seismic sources discovered by geological, geomorphological, geodetic, and seismological investigations or derived from historical references. Scenario-based methods can be expanded to risk analysis applications with an extended data analysis providing the frequency of seismic events. Such an extension provides a better-informed risk model that is suitable for risk-informed decision making.
Risk Interfaces to Support Integrated Systems Analysis and Development
NASA Technical Reports Server (NTRS)
Mindock, Jennifer; Lumpkins, Sarah; Shelhamer, Mark; Anton, Wilma; Havenhill, Maria
2016-01-01
Objectives for the systems analysis capability: develop an integrated understanding of how a complex human physiological-socio-technical mission system behaves in spaceflight. Why? To support development of integrated solutions that prevent unwanted outcomes (implementable approaches to minimize mission resources such as mass, power, and crew time) and to support development of tools for autonomy, a need for exploration (assessing and maintaining resilience of individuals, teams, and the integrated system). Outputs of this exercise: a representation of interfaces based on Human System Risk Board (HSRB) Risk Summary information with simple status drawn from the Human Research Roadmap; consolidated HSRB information applied to support communication; a point of departure for HRP Element planning; and the ability to track and communicate the status of collaborations.
Decision Analysis Techniques for Adult Learners: Application to Leadership
ERIC Educational Resources Information Center
Toosi, Farah
2017-01-01
Most decision analysis techniques are not taught at higher education institutions. Leaders, project managers and procurement agents in industry have strong technical knowledge, and it is crucial for them to apply this knowledge at the right time to make critical decisions. There are uncertainties, problems, and risks involved in business…
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factor in attaining high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
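A toy Monte Carlo in the spirit of the described failure process simulation, assuming a redundant thruster set with a constant per-period failure probability; all counts and rates are invented placeholders, not mission data.

```python
import random

# Toy Monte Carlo of a redundant-thruster mission: each thruster must
# survive every burn period, and the mission succeeds if enough
# thrusters survive. All numbers are invented for illustration.
random.seed(1)

def mission_success_prob(n_thrusters=6, needed=4, failure_rate=0.02,
                         burn_periods=10, trials=20_000):
    p_survive_period = 1.0 - failure_rate
    successes = 0
    for _ in range(trials):
        alive = sum(
            all(random.random() < p_survive_period
                for _ in range(burn_periods))
            for _ in range(n_thrusters)
        )
        successes += (alive >= needed)
    return successes / trials

p_success = mission_success_prob()
```

Sweeping `failure_rate` in such a sketch shows how strongly component failure rates limit mission reliability, the abstract's main finding.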
A Simplified Approach to Risk Assessment Based on System Dynamics: An Industrial Case Study.
Garbolino, Emmanuel; Chery, Jean-Pierre; Guarnieri, Franck
2016-01-01
Seveso plants are complex sociotechnical systems, which makes it appropriate to support any risk assessment with a model of the system. However, more often than not, this step is only partially addressed, simplified, or avoided in safety reports. At the same time, investigations have shown that the complexity of industrial systems is frequently a factor in accidents, due to interactions between their technical, human, and organizational dimensions. In order to handle both this complexity and changes in the system over time, this article proposes an original and simplified qualitative risk evaluation method based on the system dynamics theory developed by Forrester in the early 1960s. The methodology supports the development of a dynamic risk assessment framework dedicated to industrial activities. It consists of 10 complementary steps grouped into two main activities: system dynamics modeling of the sociotechnical system and risk analysis. This system dynamics risk analysis is applied to a case study of a chemical plant and provides a way to assess the technological and organizational components of safety. © 2016 Society for Risk Analysis.
The dissection of risk: a conceptual analysis.
O'Byrne, Patrick
2008-03-01
Recently, patient safety has gained popularity in the nursing literature. While this topic is used extensively and has been analyzed thoroughly, some of the concepts upon which it relies, such as risk, have remained undertheorized. In fact, despite its considerable use, the term 'risk' has been largely assumed to be inherently neutral, meaning that its definition and discovery are seen as objective and impartial, and that risk avoidance is natural and logical. Such an oversight in evaluation requires that the concept of risk be thoroughly analyzed as it relates to nursing practices, particularly in relation to practices surrounding bio-political nursing care, such as public health, as well as other more trendy nursing topics, such as patient safety. Thus, this paper applies the Evolutionary Model of concept analysis to explore 'risk' and to expose it as one mechanism of maintaining prescribed/proscribed social practices. In doing so, the analysis of risk expands the definitions and roles of the discipline and profession of nursing from being dedicated solely to patient care to also encompassing its function as a governmental body that unwittingly maintains hegemonic infrastructures.
Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A
1999-03-01
A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of the D and z values of target microorganisms depending on the range of deviations of environmental factors, to determine the critical factors, and to specify their critical tolerances. The analysis is based on sets of sensitivity functions applied to a specific case of experimental data on the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effect of the following factors was analyzed: the type of target microorganism; the nature of the heating substrate; pH; temperature; the type of acid employed; and the NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food), and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for the acidification of products and of the NaCl concentration can be treated as negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of the pH value and NaCl concentration were 5%. These results relate to a specific case study, so they cannot be directly generalized.
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
NASA Astrophysics Data System (ADS)
Prayogo, Galang Sandy; Haryadi, Gunawan Dwi; Ismail, Rifky; Kim, Seon Jin
2016-04-01
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant: corrosion damage can force the HRSG, and thus the plant, to stop operating, and it can threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. With this methodology, the risk caused by unexpected failure can be estimated as a function of the probability and consequence of failure. This paper presents a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard thus places the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The prominent damage mechanism throughout the equipment is thinning. The evaluation of the risk approach was carried out with the aim of reducing risk by optimizing risk assessment activities.
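The semi-quantitative RBI idea of combining a probability-of-failure category (1-5) with a consequence category (A-E) into cells such as "4C" can be sketched as follows. The banding into qualitative levels is an invented illustration chosen to match the levels quoted above, not the actual API 581 matrix.

```python
# Toy 5x5 risk matrix in the spirit of semi-quantitative RBI: a
# probability-of-failure category (1-5) and a consequence category
# (A-E) combine into a cell such as "4C". The banding below is a
# hypothetical illustration, not the API 581 table.
def risk_cell(pof_category, cof_category):
    """Matrix cell label, e.g. risk_cell(4, 'C') -> '4C'."""
    return f"{pof_category}{cof_category}"

def risk_level(pof_category, cof_category):
    """Map a cell to a qualitative risk level via an invented banding."""
    score = pof_category + "ABCDE".index(cof_category)
    if score <= 3:
        return "low"
    elif score <= 5:
        return "medium"
    elif score <= 7:
        return "medium-high"
    return "high"
```

Under this banding, cell 4C maps to medium-high and 3C to medium, consistent with the equipment levels quoted in the abstract.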
Li, Xianbo; Zuo, Rui; Teng, Yanguo; Wang, Jinsheng; Wang, Bin
2015-01-01
Increasing pressure on water supply worldwide, especially in arid areas, has resulted in groundwater overexploitation and contamination, and subsequent deterioration of the groundwater quality and threats to public health. Environmental risk assessment of regional groundwater is an important tool for groundwater protection. This study presents a new approach for assessing the environmental risk assessment of regional groundwater. It was carried out with a relative risk model (RRM) coupled with a series of indices, such as a groundwater vulnerability index, which includes receptor analysis, risk source analysis, risk exposure and hazard analysis, risk characterization, and management of groundwater. The risk map is a product of the probability of environmental contamination and impact. The reliability of the RRM was verified using Monte Carlo analysis. This approach was applied to the lower Liaohe River Plain (LLRP), northeastern China, which covers 23604 km2. A spatial analysis tool within GIS which was used to interpolate and manipulate the data to develop environmental risk maps of regional groundwater, divided the level of risk from high to low into five ranks (V, IV, III, II, I). The results indicate that areas of relative risk rank (RRR) V cover 2324 km2, covering 9.8% of the area; RRR IV covers 3986 km2, accounting for 16.9% of the area. It is a new and appropriate method for regional groundwater resource management and land use planning, and is a rapid and effective tool for improving strategic decision making to protect groundwater and reduce environmental risk. PMID:26020518
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
Risk-sensitive reinforcement learning.
Shen, Yun; Tobia, Michael J; Sommer, Tobias; Obermayer, Klaus
2014-07-01
We derive a family of risk-sensitive reinforcement learning methods for agents who face sequential decision-making tasks in uncertain environments. By applying a utility function to the temporal difference (TD) error, nonlinear transformations are effectively applied not only to the received rewards but also to the true transition probabilities of the underlying Markov decision process. When appropriate utility functions are chosen, the agents' behaviors express key features of human behavior as predicted by prospect theory (Kahneman & Tversky, 1979), for example, different risk preferences for gains and losses, as well as the shape of subjective probability curves. We derive a risk-sensitive Q-learning algorithm, which is necessary for modeling human behavior when transition probabilities are unknown, and prove its convergence. As a proof of principle for the applicability of the new framework, we apply it to quantify human behavior in a sequential investment task. We find that the risk-sensitive variant provides a significantly better fit to the behavioral data and that it leads to an interpretation of the subject's responses that is indeed consistent with prospect theory. The analysis of simultaneously measured fMRI signals shows a significant correlation of the risk-sensitive TD error with BOLD signal change in the ventral striatum. In addition we find a significant correlation of the risk-sensitive Q-values with neural activity in the striatum, cingulate cortex, and insula that is not present if standard Q-values are used.
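A minimal sketch of the core idea: a utility function is applied to the TD error before the Q-update. The piecewise-linear utility with loss aversion lam > 1 below is an invented prospect-theory-like choice, not the authors' exact function.

```python
from collections import defaultdict

# Sketch of risk-sensitive Q-learning: a utility function is applied to
# the TD error before the update. The piecewise-linear utility with
# loss aversion lam > 1 is a hypothetical illustration.
def utility(td_error, lam=2.0):
    # Losses (negative TD errors) weigh more heavily than gains.
    return td_error if td_error >= 0 else lam * td_error

def q_update(q, s, a, r, s_next, actions, alpha=0.1, gamma=0.95):
    td = r + gamma * max(q[(s_next, b)] for b in actions) - q[(s, a)]
    q[(s, a)] += alpha * utility(td)
    return q

q = defaultdict(float)
q_update(q, "s0", 0, +1.0, "s1", actions=[0, 1])  # positive TD error
q_update(q, "s0", 1, -1.0, "s1", actions=[0, 1])  # negative TD error
```

With lam = 2, a negative TD error moves the Q-value twice as far as an equal positive one, giving the asymmetric risk preferences for gains and losses described above.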
2009-04-16
Aeronautics Technical Seminar: Dr. Elisabeth Pate-Cornell, Burt and Deedee McMurtry professor and chair of the Department of Management Science and Engineering at Stanford University presents 'Lessons Learned in Applying Engineering Risk Analysis'.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software was a potential cause or was involved in preventing the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
2012-01-01
Background To demonstrate the use of risk-benefit analysis for comparing multiple competing interventions in the absence of randomized trials, we applied this approach to the evaluation of five anticoagulants to prevent thrombosis in patients undergoing orthopedic surgery. Methods Using a cost-effectiveness approach from a clinical perspective (i.e., risk-benefit analysis), we compared thromboprophylaxis with warfarin, low molecular weight heparin, unfractionated heparin, fondaparinux, or ximelagatran in patients undergoing major orthopedic surgery, with sub-analyses according to surgery type. Proportions and variances of the events defining risk (major bleeding) and benefit (thrombosis averted) were obtained through a meta-analysis and used to define beta distributions. Monte Carlo simulations were conducted and used to calculate incremental risks, benefits, and risk-benefit ratios. Finally, net clinical benefit was calculated for all replications across a range of risk-benefit acceptability thresholds, with a reference range obtained by estimating the case fatality rate, the ratio of thrombosis to bleeding. Results The analysis showed that, compared to placebo, ximelagatran was superior to the other options, but the final results were influenced by the type of surgery: ximelagatran was superior in total knee replacement but not in total hip replacement. Conclusions Using simulation and economic techniques, we demonstrate a method that allows comparison of multiple competing interventions in the absence of randomized trials with multiple arms by determining the option with the best risk-benefit profile. It can be helpful in clinical decision making since it incorporates risk, benefit, and personal risk acceptance. PMID:22233221
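The simulation machinery can be sketched as follows, assuming Beta-distributed event proportions; the Beta parameters below are invented placeholders, not the meta-analysis estimates used in the paper.

```python
import random

# Sketch of the Monte Carlo step: event proportions for risk (major
# bleeding) and benefit (thrombosis averted) are drawn from Beta
# distributions and combined into a risk-benefit ratio per replication.
# The Beta parameters are hypothetical placeholders.
random.seed(42)

def simulate_ratios(risk_a, risk_b, benefit_a, benefit_b, n=10_000):
    ratios = []
    for _ in range(n):
        risk = random.betavariate(risk_a, risk_b)        # P(major bleed)
        benefit = random.betavariate(benefit_a, benefit_b)  # P(averted)
        ratios.append(risk / benefit)
    return ratios

ratios = simulate_ratios(3, 97, 20, 80, n=10_000)
median_ratio = sorted(ratios)[len(ratios) // 2]
```

Comparing the replication-level ratios across treatments against an acceptability threshold yields the net clinical benefit curves the abstract describes.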
Bacterial vaginosis in pregnancy and the risk of prematurity: a meta-analysis.
Flynn, C A; Helwig, A L; Meurer, L N
1999-11-01
We conducted this meta-analysis to determine the magnitude of risk conferred by bacterial vaginosis during pregnancy on preterm delivery. We selected articles from a combination of the results of a MEDLINE search (1966-1996), a manual search of bibliographies, and contact with leading researchers. We included case-control and cohort studies evaluating the risk of preterm delivery, low birth weight, preterm premature rupture of membranes, or preterm labor for pregnant women who had bacterial vaginosis and those who did not. Two investigators independently conducted literature searches, applied inclusion criteria, performed data extraction, and critically appraised the included studies. Summary estimates of risk were calculated as odds ratios (ORs) using the fixed and random effects models. We included 19 studies in the final analysis. Bacterial vaginosis during pregnancy was associated with a statistically significant increased risk for all outcomes evaluated. In the subanalyses for preterm delivery, bacterial vaginosis remained a significant risk factor. Pooling adjusted ORs yielded a 60% increased risk of preterm delivery given the presence of bacterial vaginosis. Bacterial vaginosis is an important risk factor for prematurity and pregnancy morbidity. Further studies will help clarify the benefits of treating bacterial vaginosis and the potential role of screening during pregnancy.
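The fixed-effects pooling step is standard inverse-variance weighting on the log odds ratio scale; the three (OR, SE) pairs below are invented illustrative study results, not data from the included studies.

```python
import math

# Inverse-variance fixed-effect pooling of log odds ratios, the
# standard machinery behind such a meta-analysis. The (OR, SE) pairs
# are hypothetical illustrative study results.
def pooled_or(studies):
    """studies: list of (odds_ratio, standard_error_of_log_or)."""
    weights = [1.0 / se ** 2 for _, se in studies]
    log_pool = sum(w * math.log(or_)
                   for (or_, _), w in zip(studies, weights))
    log_pool /= sum(weights)
    return math.exp(log_pool)

studies = [(1.8, 0.30), (1.5, 0.20), (1.7, 0.25)]
pooled = pooled_or(studies)
```

Pooling on the log scale keeps the estimate symmetric in risk and protection; a random-effects variant would add a between-study variance term to each weight.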
ERIC Educational Resources Information Center
Sanders, Jackie; Munford, Robyn; Thimasarn-Anwar, Tewaporn; Liebenberg, Linda
2017-01-01
Purpose: This article reports on an examination of the psychometric properties of the 28-item Child and Youth Resilience Measure (CYRM-28). Methods: Exploratory factor analysis, confirmatory factor analysis, Cronbach's alpha, t-tests, correlations, and multivariate analysis of variance were applied to data collected via interviews from 593…
ERIC Educational Resources Information Center
Guilamo-Ramos, Vincent; Jaccard, James; Dittus, Patricia; Gonzalez, Bernardo; Bouris, Alida
2008-01-01
A framework for the analysis of adolescent problem behaviors was explicated that draws on five major theories of human behavior. The framework emphasizes intentions to perform behaviors and factors that influence intentions as well as moderate the impact of intentions on behavior. The framework was applied to the analysis of adolescent sexual risk…
Anthropic Risk Assessment on Biodiversity
NASA Astrophysics Data System (ADS)
Piragnolo, M.; Pirotti, F.; Vettore, A.; Salogni, G.
2013-01-01
This paper presents a methodology for risk assessment of anthropic activities on habitats and species. The method has been developed for the Veneto Region in order to simplify and improve the quality of the EIA procedure (VINCA). Habitats and species, animal and plant, are protected by European Directives 92/43/EEC and 2009/147/EC, but they remain exposed to hazards from pollution produced by human activities. Biodiversity risks may lead to deterioration and disturbance of ecological niches, with consequent loss of biodiversity. Ecological risk assessment applied to the Natura 2000 network is needed for best practice in the management and monitoring of the environment and natural resources. Threats, pressures and activities, stresses and indicators may be managed in a geodatabase and analysed using GIS technology. The method used is classic risk assessment in an ecological context: it defines the natural hazard as influence, the element at risk as interference, and vulnerability, and it introduces a new parameter called pressure. It uses a risk matrix for risk analysis on spatial and temporal scales. The methodology is qualitative and applies the precautionary principle in environmental assessment. The final product is a matrix that allows risk to be excluded and could find application in the development of a territorial information system.
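A qualitative risk matrix of the kind the method relies on can be sketched as a simple lookup over ordinal scores. The 1-5 scales and band cutoffs below are illustrative assumptions, not the ones defined for the VINCA procedure.

```python
def risk_class(likelihood, severity):
    """Classic qualitative risk matrix: both factors rated on a 1-5 scale,
    combined multiplicatively and mapped to a risk band."""
    assert 1 <= likelihood <= 5 and 1 <= severity <= 5
    score = likelihood * severity
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    if score >= 4:
        return "low"
    return "negligible"
```

In a screening application, activities falling in the "negligible" band could be excluded from further assessment, which is the kind of exclusion the final matrix supports.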
Through ARIPAR-GIS the quantified area risk analysis supports land-use planning activities.
Spadoni, G; Egidi, D; Contini, S
2000-01-07
The paper first summarises the main aspects of the ARIPAR methodology, whose steps can be applied to quantify the impact on a territory of major accident risks due to processing, storing and transporting dangerous substances. Then the capabilities of the new decision support tool ARIPAR-GIS, implementing the mentioned procedure, are described, together with its main features and types of results. These are clearly shown through a short description of the updated ARIPAR study (reference year 1994), in which the impact of changes due to industrial and transportation dynamics on the Ravenna territory in Italy was evaluated. A brief explanation of how the results have been used by local administrations offers the opportunity to discuss the advantages of the quantitative area risk analysis tool in supporting risk management, risk control and land-use planning activities.
Translational benchmark risk analysis
Piegorsch, Walter W.
2010-01-01
Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
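The benchmark-dose idea can be illustrated with a small numerical sketch: given a fitted dose-response curve, the benchmark dose (BMD) is the dose at which extra risk over background reaches a chosen benchmark response (BMR). The logistic curve and coefficients below are hypothetical, not values from any study discussed above.

```python
import math

def extra_risk(dose, b0, b1):
    """Extra risk over background for a logistic dose-response model:
    (P(d) - P(0)) / (1 - P(0))."""
    p = lambda d: 1.0 / (1.0 + math.exp(-(b0 + b1 * d)))
    return (p(dose) - p(0.0)) / (1.0 - p(0.0))

def benchmark_dose(bmr, b0, b1, hi=1000.0, tol=1e-9):
    """Bisect for the dose at which extra risk equals the benchmark response
    (assumes extra risk is increasing in dose, i.e. b1 > 0)."""
    lo = 0.0
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if extra_risk(mid, b0, b1) < bmr:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0
```

With b0 = -3 and b1 = 0.1, a BMR of 10% extra risk is reached at a dose of roughly 12, which is the kind of point estimate the benchmark paradigm reports (a lower confidence limit, the BMDL, would be used in practice).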
NASA Technical Reports Server (NTRS)
1972-01-01
Nuclear safety analysis as applied to a space base mission is presented. The nuclear safety analysis document summarizes the mission and the credible accidents/events which may lead to nuclear hazards to the general public. The radiological effects and associated consequences of the hazards are discussed in detail. The probability of occurrence is combined with the potential number of individuals exposed to or above guideline values to provide a measure of accident and total mission risk. The overall mission risk has been determined to be low with the potential exposure to or above 25 rem limited to less than 4 individuals per every 1000 missions performed. No radiological risk to the general public occurs during the prelaunch phase at KSC. The most significant risks occur from prolonged exposure to reactor debris following land impact generally associated with the disposal phase of the mission where fission product inventories can be high.
Applying geologic sensitivity analysis to environmental risk management: The financial implications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, D.T.
The financial risks associated with environmental contamination can be staggering and are often difficult to identify and accurately assess. Geologic sensitivity analysis is gaining recognition as a significant and useful tool that can empower the user with crucial information concerning environmental risk management and brownfield redevelopment. It is particularly useful when (1) evaluating the potential risks associated with redevelopment of historical industrial facilities (brownfields) and (2) planning for future development, especially in areas of rapid development, because the number of potential contaminating sources often increases with an increase in economic development. An examination of the financial implications of geologic sensitivity analysis in southeastern Michigan, drawn from numerous case studies, indicates that the environmental cost of contamination may be 100 to 1,000 times greater at a geologically sensitive location compared to the least sensitive location. Geologic sensitivity analysis has demonstrated that near-surface geology may influence the environmental impact of a contaminated site to a greater extent than the amount and type of industrial development.
Network Analysis: A Novel Approach to Understand Suicidal Behaviour
de Beurs, Derek
2017-01-01
Although suicide is a major public health issue worldwide, we understand little about the onset and development of suicidal behaviour. Suicidal behaviour is argued to be the end result of the complex interaction between psychological, social and biological factors. Epidemiological studies have identified a range of risk factors for suicidal behaviour, but we do not yet understand how their interaction increases the risk of suicidal behaviour. A new approach called network analysis can help us better understand this process, as it allows us to visualize and quantify the complex associations between many different symptoms or risk factors. A network analysis of data containing information on suicidal patients can help us understand how risk factors interact and how their interaction is related to suicidal thoughts and behaviour. A network perspective has been successfully applied to the fields of depression and psychosis, but not yet to the field of suicidology. In this theoretical article, I introduce the concept of network analysis to the field of suicide prevention, and offer directions for future applications and studies.
Decision strategies to reduce teenage and young adult deaths in the United States.
Keeney, Ralph L; Palley, Asa B
2013-09-01
This article uses decision analysis concepts and techniques to address an extremely important problem to any family with children, namely, how to avoid the tragic death of a child during the high-risk ages of 15-24. Descriptively, our analysis indicates that of the 35,000 annual deaths among this age group in the United States, approximately 20,000 could be avoided if individuals chose readily available alternatives for decisions relating to these deaths. Prescriptively, we develop a decision framework for parents and a child to both identify and proactively pursue decisions that can lower that child's exposure to life-threatening risks and positively alter decisions when facing such risks. Applying this framework for parents and the youth themselves, we illustrate the logic and process of generating proactive alternatives with numerous examples that each could pursue to lower these life-threatening risks and possibly avoid a tragic premature death, and discuss some public policy implications of our findings. © 2013 Society for Risk Analysis.
Biomechanical analysis on fracture risk associated with bone deformity
NASA Astrophysics Data System (ADS)
Kamal, Nur Amalina Nadiah Mustafa; Som, Mohd Hanafi Mat; Basaruddin, Khairul Salleh; Daud, Ruslizam
2017-09-01
Osteogenesis Imperfecta (OI) is a disease related to bone deformity and is also known as `brittle bone' disease. Currently, medical personnel predict the bone fracture solely based on their experience. In this study, the prediction for risk of fracture was carried out by using finite element analysis on the simulated OI bone of femur. The main objective of this research was to analyze the fracture risk of OI-affected bone with respect to various loadings. A total of 12 models of OI bone were developed by applying four load cases and the angle of deformation for each of the models was calculated. The models were differentiated into four groups, namely standard, light, mild and severe. The results show that only a small amount of load is required to increase the fracture risk of the bone when the model is tested with hopping conditions. The analysis also shows that the torsional load gives a small effect to the increase of the fracture risk of the bone.
Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits
NASA Astrophysics Data System (ADS)
Friedl, L.; Kiefer, D. A.; Turner, W.
2013-12-01
This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.
Product Quality Improvement Using FMEA for Electric Parking Brake (EPB)
NASA Astrophysics Data System (ADS)
Dumitrescu, C. D.; Gruber, G. C.; Tişcă, I. A.
2016-08-01
One of the most frequently used methods to improve product quality is Failure Modes and Effects Analysis (FMEA). Various FMEA variants are known in the literature, depending on the object and the targets of the analysis; among them are Process FMEA and Failure Mode, Effects and Criticality Analysis (FMECA). Whichever variant the work team adopts, the goal of the method is the same: to optimize product design activities in research and design, the implementation of manufacturing processes, and the exploitation of the product by its beneficiaries. According to a market survey of parts suppliers to vehicle manufacturers, the FMEA method is used by 75% of them. One purpose of its application is to detect any errors that remain after research and product development are considered finished; another is to initiate appropriate measures to avoid mistakes. Achieving these two goals encourages wide application of the method, so that errors are avoided already in the design phase of the product, thereby preventing the emergence of additional costs in later stages of product manufacturing. The application of the FMEA method uses standardized forms; with their help, the initial assemblies of the product structure are established, in which all components are initially viewed as error-free. The work is an application of the FMEA method to optimize the quality of the components of the Electric Parking Brake (EPB). This component, attached to the wheel braking system, replaces the conventional mechanical parking brake in automobiles while ensuring comfort, functionality and durability, and saves space in the passenger compartment.
The paper describes the levels addressed in applying FMEA, the working arrangements at the four distinct levels of analysis, and how the Risk Priority Number (RPN) is determined; it also presents the analysis of risk factors and the measures the authors imposed to reduce or completely eliminate risk in this complex product.
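The Risk Priority Number mentioned above is conventionally the product of severity, occurrence and detection ratings taken from the FMEA worksheet; a minimal sketch, assuming the common 1-10 scales:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number = severity x occurrence x detection,
    each rated on the conventional 1-10 FMEA scale."""
    assert all(1 <= v <= 10 for v in (severity, occurrence, detection))
    return severity * occurrence * detection
```

Failure modes with the highest RPN (e.g. a severity-8, occurrence-5, detection-3 mode scoring 120) are the ones prioritized for corrective measures.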
Ecological Risk Assessment with MCDM of Some Invasive Alien Plants in China
NASA Astrophysics Data System (ADS)
Xie, Guowen; Chen, Weiguang; Lin, Meizhen; Zheng, Yanling; Guo, Peiguo; Zheng, Yisheng
Alien plant invasion is an urgent global issue that threatens ecosystem health and sustainable development. The study of its ecological risk assessment (ERA) could help us prevent and reduce invasion risk more effectively. Based on the theory of ERA and the analytic hierarchy process (AHP) method of multi-criteria decision-making (MCDM), and through analyses of the characteristics and processes of alien plant invasion, this paper discusses methodologies for ERA of alien plant invasion. The assessment procedure consists of risk source analysis, receptor analysis, exposure and hazard assessment, integral assessment, and countermeasures for risk management. The indicator system of risk source assessment, as well as the indices and formulas applied to measure ecological loss and risk, were established, and a method for comprehensively assessing the ecological risk of alien plant invasion was worked out. The results of the ecological risk analysis of 9 representative invasive alien plants in China show that the ecological risk of Erigeron annuus, Ageratum conyzoides, Alternanthera philoxeroides and Mikania micrantha is highest (grades 1-2), that of Oxalis corymbosa and Wedelia chinensis comes next (grade 3), while Mirabilis jalapa, Pilea microphylla and Calendula officinalis rank last (grade 4). Risk strategies are put forward on this basis.
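The AHP step can be illustrated by deriving a priority vector from a pairwise comparison matrix. The geometric-mean approximation below is a standard textbook shortcut (the exact method uses the principal eigenvector), and the matrix values are hypothetical, not the paper's indicator weights.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via the normalized geometric mean
    of the rows of a (reciprocal) pairwise comparison matrix."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gm)
    return [g / total for g in gm]
```

For a consistent 2x2 matrix saying criterion A is twice as important as B, `ahp_weights([[1, 2], [0.5, 1]])` returns weights of 2/3 and 1/3.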
Bates, Matthew E; Keisler, Jeffrey M; Zussblatt, Niels P; Plourde, Kenton J; Wender, Ben A; Linkov, Igor
2016-02-01
Risk research for nanomaterials is currently prioritized by means of expert workshops and other deliberative processes. However, analytical techniques that quantify and compare alternative research investments are increasingly recommended. Here, we apply value of information and portfolio decision analysis, methods commonly applied in financial and operations management, to prioritize risk research for multiwalled carbon nanotubes and nanoparticulate silver and titanium dioxide. We modify the widely accepted CB Nanotool hazard evaluation framework, which combines nano- and bulk-material properties into a hazard score, to operate probabilistically with uncertain inputs. The literature is reviewed to develop uncertain estimates for each input parameter, and a Monte Carlo simulation is applied to assess how different research strategies can improve hazard classification. The relative cost of each research experiment is elicited from experts, which enables identification of efficient research portfolios: combinations of experiments that lead to the greatest improvement in hazard classification at the lowest cost. Nanoparticle shape, diameter, solubility and surface reactivity were most frequently identified within efficient portfolios in our results.
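The probabilistic hazard-score step can be sketched as follows: uncertain inputs are drawn from ranges, a weighted score is computed, and the stability of the resulting hazard band measures classification certainty (research that narrows an input range raises this certainty, which is the value-of-information logic). The weights, ranges and band cutoffs here are illustrative assumptions, not the CB Nanotool's actual scheme.

```python
import random

def hazard_band(score):
    """Map a 0-100 hazard score to a band (cutoffs are assumed)."""
    if score >= 70:
        return "high"
    if score >= 40:
        return "medium"
    return "low"

def classification_certainty(param_ranges, weights, n=5000, seed=1):
    """Monte Carlo a weighted hazard score with uncertain inputs; return the
    probability mass of the most frequent band (1.0 = fully certain)."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(n):
        score = sum(w * rng.uniform(lo, hi)
                    for w, (lo, hi) in zip(weights, param_ranges))
        band = hazard_band(score)
        counts[band] = counts.get(band, 0) + 1
    return max(counts.values()) / n
```

A fully pinned-down input yields certainty 1.0, while a range straddling a band cutoff leaves the classification ambiguous.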
Estimating risk of foreign exchange portfolio: Using VaR and CVaR based on GARCH-EVT-Copula model
NASA Astrophysics Data System (ADS)
Wang, Zong-Run; Chen, Xiao-Hong; Jin, Yan-Bo; Zhou, Yan-Ju
2010-11-01
This paper introduces GARCH-EVT-Copula model and applies it to study the risk of foreign exchange portfolio. Multivariate Copulas, including Gaussian, t and Clayton ones, were used to describe a portfolio risk structure, and to extend the analysis from a bivariate to an n-dimensional asset allocation problem. We apply this methodology to study the returns of a portfolio of four major foreign currencies in China, including USD, EUR, JPY and HKD. Our results suggest that the optimal investment allocations are similar across different Copulas and confidence levels. In addition, we find that the optimal investment concentrates on the USD investment. Generally speaking, t Copula and Clayton Copula better portray the correlation structure of multiple assets than Normal Copula.
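Once portfolio losses have been simulated (by a GARCH-EVT-Copula model or any other generator), VaR and CVaR reduce to order statistics of the simulated loss distribution; a minimal empirical sketch:

```python
def var_cvar(losses, alpha=0.95):
    """Empirical Value-at-Risk and Conditional VaR (expected shortfall)
    at level alpha from a sample of simulated portfolio losses."""
    s = sorted(losses)
    idx = int(alpha * len(s))      # index of the alpha-quantile loss
    tail = s[idx:]                 # losses at or beyond the VaR level
    return s[idx], sum(tail) / len(tail)
```

CVaR is always at least as large as VaR, since it averages the losses beyond the VaR quantile; that tail average is what the EVT component of the authors' model is designed to estimate well.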
Research on Upgrading Structures for Host and Risk Area Shelters
1982-09-01
both "as-built" and upgraded configurations. These analysis and prediction techniques have been applied to floors and roofs constructed of many...scale program and were previously applied to full-scale wood floor tests (Ref. 2). TEST ELEMENTS AND PROCEDURES Three tests were conducted on 8-inch...weights. A 14,000-lb crane counterweight was used for the preload, applying a load of 7,000 lb to each one-third point on the plank. The drop weight was
The development of the Problematic Online Gaming Questionnaire (POGQ).
Demetrovics, Zsolt; Urbán, Róbert; Nagygyörgy, Katalin; Farkas, Judit; Griffiths, Mark D; Pápay, Orsolya; Kökönyei, Gyöngyi; Felvinczi, Katalin; Oláh, Attila
2012-01-01
Online gaming has become increasingly popular. However, this has led to concerns that these games might induce serious problems and/or lead to dependence for a minority of players. The aim of this study was to uncover and operationalize the components of problematic online gaming. A total of 3415 gamers (90% males; mean age 21 years), were recruited through online gaming websites. A combined method of exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) was applied. Latent profile analysis was applied to identify persons at-risk. EFA revealed a six-factor structure in the background of problematic online gaming that was also confirmed by a CFA. For the assessment of the identified six dimensions--preoccupation, overuse, immersion, social isolation, interpersonal conflicts, and withdrawal--the 18-item Problematic Online Gaming Questionnaire (POGQ) proved to be exceedingly suitable. Based on the latent profile analysis, 3.4% of the gamer population was considered to be at high risk, while another 15.2% was moderately problematic. The POGQ seems to be an adequate measurement tool for the differentiated assessment of gaming related problems on six subscales.
Goode, Natassia; Salmon, Paul M; Lenné, Michael G; Hillard, Peter
2014-07-01
Injuries resulting from manual handling tasks represent an on-going problem for the transport and storage industry. This article describes an application of a systems theory-based approach, Rasmussen's (1997, Safety Science 27, 183) risk management framework, to the analysis of the factors influencing safety during manual handling activities in a freight handling organisation. Observations of manual handling activities, cognitive decision method interviews with workers (n=27) and interviews with managers (n=35) were used to gather information about three manual handling activities. Hierarchical task analysis and thematic analysis were used to identify potential risk factors and performance shaping factors across the levels of Rasmussen's framework. These different data sources were then integrated using Rasmussen's Accimap technique to provide an overall analysis of the factors influencing safety during manual handling activities in this context. The findings demonstrate how a systems theory-based approach can be applied to this domain, and suggest that policy-orientated, rather than worker-orientated, changes are required to prevent future manual handling injuries. Copyright © 2013 Elsevier Ltd. All rights reserved.
How to make the most of failure mode and effect analysis.
Stalhandske, Erik; DeRosier, Joseph; Patail, Bryanne; Gosbee, John
2003-01-01
Current accreditation standards issued by the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) require hospitals to carry out a proactive risk assessment on at least one high-risk activity each year for each accredited program. Because hospital risk managers and patient safety managers generally do not have the knowledge or level of comfort for conducting a proactive risk assessment, they will appreciate the expertise offered by biomedical equipment technicians (BMETs), occupational safety and health professionals, and others. The skills that BMETs and others have developed while conducting job safety analyses or failure mode and effect analyses can now be applied to a health care proactive analysis. This article touches on the Health Care Failure Mode and Effect Analysis (HFMEA) model that the Department of Veterans Affairs (VA) National Center for Patient Safety developed for proactive risk assessment within the health care community. The goal of this article is to enlighten BMETs and others on the growth of proactive risk assessment within health care and also on the support documents and materials produced by the VA. For additional information on HFMEA, visit the VA website at www.patientsafety.gov/HFMEA.html.
Ganga, G M D; Esposto, K F; Braatz, D
2012-01-01
The occupational exposure limits of different risk factors for development of low back disorders (LBDs) have not yet been established. One of the main problems in setting such guidelines is the limited understanding of how different risk factors for LBDs interact in causing injury, since the nature and mechanism of these disorders are relatively unknown phenomena. Industrial ergonomists' role becomes further complicated because the potential risk factors that may contribute towards the onset of LBDs interact in a complex manner, which makes it difficult to discriminate in detail among the jobs that place workers at high or low risk of LBDs. The purpose of this paper was to develop a comparative study between predictions based on the neural network-based model proposed by Zurada, Karwowski & Marras (1997) and a linear discriminant analysis model, for making predictions about industrial jobs according to their potential risk of low back disorders due to workplace design. The results obtained through applying the discriminant analysis-based model proved that it is as effective as the neural network-based model. Moreover, the discriminant analysis-based model proved to be more advantageous regarding cost and time savings for future data gathering.
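A two-class linear discriminant of the kind compared here can be sketched in a few lines for two features; this is a generic Fisher discriminant on toy data, not the authors' fitted model of LBD risk factors.

```python
def fit_lda(class0, class1):
    """Two-class Fisher discriminant for 2-D features: w = Sw^-1 (m1 - m0),
    with the decision threshold midway between the projected class means."""
    n0, n1 = len(class0), len(class1)
    m0 = [sum(v[i] for v in class0) / n0 for i in (0, 1)]
    m1 = [sum(v[i] for v in class1) / n1 for i in (0, 1)]
    s = [[0.0, 0.0], [0.0, 0.0]]            # pooled within-class scatter
    for cls, m in ((class0, m0), (class1, m1)):
        for v in cls:
            d0, d1 = v[0] - m[0], v[1] - m[1]
            s[0][0] += d0 * d0; s[0][1] += d0 * d1
            s[1][0] += d1 * d0; s[1][1] += d1 * d1
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [( s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
         (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det]
    proj = lambda v: v[0] * w[0] + v[1] * w[1]
    return w, (proj(m0) + proj(m1)) / 2

def classify(v, w, threshold):
    """1 = high-risk class, 0 = low-risk class."""
    return 1 if v[0] * w[0] + v[1] * w[1] > threshold else 0
```

With features such as lift rate and trunk moment standing in for the job descriptors, each job is projected onto a single discriminant axis and labeled high or low risk, which is the comparison the paper makes against the neural network model.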
Simulation of investment returns of toll projects.
DOT National Transportation Integrated Search
2013-08-01
This research develops a methodological framework to illustrate key stages in applying the simulation of investment returns of toll projects, acting as an example process of helping agencies conduct numerical risk analysis by taking certain uncertain...
High-performance concrete : applying life-cycle cost analysis and developing specifications.
DOT National Transportation Integrated Search
2016-12-01
Numerous studies and transportation agency experience across the nation have established that high-performance concrete (HPC) technology improves concrete quality and extends the service life of concrete structures at risk of chloride-induced cor...
A Multifaceted Mathematical Approach for Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexander, F.; Anitescu, M.; Bell, J.
2012-03-07
Applied mathematics has an important role to play in developing the tools needed for the analysis, simulation, and optimization of complex problems. These efforts require the development of the mathematical foundations for scientific discovery, engineering design, and risk analysis based on a sound integrated approach for the understanding of complex systems. However, maximizing the impact of applied mathematics on these challenges requires a novel perspective on approaching the mathematical enterprise. Previous reports that have surveyed the DOE's research needs in applied mathematics have played a key role in defining research directions with the community. Although these reports have had significant impact, accurately assessing current research needs requires an evaluation of today's challenges against the backdrop of recent advances in applied mathematics and computing. To address these needs, the DOE Applied Mathematics Program sponsored a Workshop for Mathematics for the Analysis, Simulation and Optimization of Complex Systems on September 13-14, 2011. The workshop had approximately 50 participants from both the national labs and academia. The goal of the workshop was to identify new research areas in applied mathematics that will complement and enhance the existing DOE ASCR Applied Mathematics Program efforts that are needed to address problems associated with complex systems. This report describes recommendations from the workshop and subsequent analysis of the workshop findings by the organizing committee.
Organizational sensemaking about risk controls: the case of offshore hydrocarbons production.
Busby, J S; Collins, A M
2014-09-01
In the same way that individuals' risk perceptions can influence how they behave toward risks, how organizational members make sense of risk controls is an important influence on how they apply and maintain such controls. In this article, we describe an analysis of sensemaking about the control of risk in offshore hydrocarbons production, an industry that continues to produce disasters of societal significance. A field study of 80 interviews was conducted in five offshore oil and gas companies and the agency that regulates them. The interviews were analyzed using qualitative template analysis. This provided a categorization of the many ways of acting through which informants made sense of the risk control task, and indicated that the organizations placed substantially different emphases on different ways of acting. Nevertheless, this sensemaking fell into two broad classes: that which tended to limit or be pessimistic about organizational controls, and that which tended to extend or be optimistic about organizational controls. All the participating organizations collectively placed a balanced emphasis on these two classes. We argue that this balanced sensemaking is an adaptation rather than a deliberate choice, but that it is an important element of controlling risk in its own right. © 2014 Society for Risk Analysis.
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Improved FTA methodology and application to subsea pipeline reliability design.
Lin, Jing; Yuan, Yongbo; Zhang, Mingyuan
2014-01-01
An innovative logic tree, Failure Expansion Tree (FET), is proposed in this paper, which improves on traditional Fault Tree Analysis (FTA). It describes a different thinking approach for risk factor identification and reliability risk assessment. By providing a more comprehensive and objective methodology, the rather subjective nature of FTA node discovery is significantly reduced and the resulting mathematical calculations for quantitative analysis are greatly simplified. Applied to the Useful Life phase of a subsea pipeline engineering project, the approach provides a more structured analysis by constructing a tree following the laws of physics and geometry. Resulting improvements are summarized in comparison table form.
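Whether built as a traditional fault tree or an FET, quantitative evaluation ultimately combines basic-event probabilities through AND/OR gates; a minimal sketch, assuming independent basic events:

```python
def and_gate(probs):
    """P(all basic events occur), assuming independence."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """P(at least one basic event occurs), assuming independence."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p
```

A top event fed by an AND of two basic events (0.1, 0.2) OR a third event (0.05) then has probability `or_gate([and_gate([0.1, 0.2]), 0.05])` = 0.069; the FET approach aims to make the discovery of those gate structures less subjective.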
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and its conceptual approaches as a comprehensive and practical management tool.
NASA Astrophysics Data System (ADS)
Sugarindra, Muchamad; Ragil Suryoputro, Muhammad; Tiya Novitasari, Adi
2017-06-01
A plantation company needed to identify hazards and perform a risk assessment for crime and safety, which was approached using JSA (Job Safety Analysis). The aim was to identify the potential hazards that might pose a risk of workplace accidents so that preventive action could be taken to minimize them. Data were collected by direct observation of the workers concerned, and the results were recorded on a Job Safety Analysis form. The jobs assessed were forklift operator, macerator worker, creeper worker, shredder worker, workshop worker, mechanical line worker, trolley cleaning worker, and crepe decline worker. The results showed that the shredder job scored a risk value of 30, placing it at the extreme risk level (risk values above 20). To minimize accidents, the company could provide appropriate Personal Protective Equipment (PPE) and information about health and safety, supervise the activities of workers, and reward workers who obey the rules applied in the plantation.
Bahouth, George; Digges, Kennerly; Schulman, Carl
2012-01-01
This paper presents methods to estimate crash injury risk based on crash characteristics captured by some passenger vehicles equipped with Advanced Automatic Crash Notification technology. The resulting injury risk estimates could be used within an algorithm to optimize rescue care. Regression analysis was applied to the National Automotive Sampling System / Crashworthiness Data System (NASS/CDS) to determine how variations in a specific injury risk threshold would influence the accuracy of predicting crashes with serious injuries. The recommended thresholds for classifying crashes with severe injuries are 0.10 for frontal crashes and 0.05 for side crashes. The regression analysis of NASS/CDS indicates that these thresholds will provide sensitivity above 0.67 while maintaining a positive predictive value in the range of 0.20. PMID:23169132
Read, Gemma J M; Salmon, Paul M; Lenné, Michael G; Stanton, Neville A
2016-03-01
Pedestrian fatalities at rail level crossings (RLXs) are a public safety concern for governments worldwide. There is little literature examining pedestrian behaviour at RLXs and no previous studies have adopted a formative approach to understanding behaviour in this context. In this article, cognitive work analysis is applied to understand the constraints that shape pedestrian behaviour at RLXs in Melbourne, Australia. The five phases of cognitive work analysis were developed using data gathered via document analysis, behavioural observation, walk-throughs and critical decision method interviews. The analysis demonstrates the complex nature of pedestrian decision making at RLXs and the findings are synthesised to provide a model illustrating the influences on pedestrian decision making in this context (i.e. time, effort and social pressures). Further, the CWA outputs are used to inform an analysis of the risks to safety associated with pedestrian behaviour at RLXs and the identification of potential interventions to reduce risk. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.
Moutel, G; Hergon, E; Duchange, N; Bellier, L; Rouger, P; Hervé, C
2005-02-01
The precautionary principle first appeared in France during the health crisis following the contamination of patients with HIV via blood transfusion. This study analyses whether the risk associated with blood transfusion was taken into account early enough considering the context of scientific uncertainty between 1982 and 1985. The aim was to evaluate whether a precautionary principle was applied and whether it was relevant. First, we investigated the context of scientific uncertainty and controversies prevailing between 1982 and 1985. Then we analysed the attitude and decisions of the French authorities in this situation to determine whether a principle of precaution was applied. Finally, we explored the reasons at the origin of the delay in controlling the risk. Despite the scientific uncertainties associated with the potential risk of HIV contamination by transfusion in 1983, we found that a list of recommendations aiming to reduce this risk was published in June of that year. In the prevailing climate of uncertainty, these measures could be seen as precautionary. However, the recommended measures were not widely applied. Cultural, structural and economic factors hindered their implementation. Our analysis provides insight into the use of precautionary principle in the domain of blood transfusion and, more generally, medicine. It also sheds light on the expectations that health professionals should have of this principle. The aim of the precautionary principle is to manage rather than to reduce scientific uncertainty. The principle is not a futile search for zero risk. Rather, it is a principle for action allowing precautionary measures to be taken. However, we show that these measures must appear legitimate to be applied. This legitimacy requires an adapted decision-making process, involving all those concerned in the management of collective risks.
Ponzetti, Clemente; Canciani, Monica; Farina, Massimo; Era, Sara; Walzer, Stefan
2016-01-01
In oncology, an important parameter of safety is the potential for treatment errors in hospitals. The hypothesis analyzed is that subcutaneous therapies provide a safety benefit over intravenous therapies through fixed-dose administration, examined here for trastuzumab and rituximab. For the calculation of risk levels, the Failure Mode and Effect Analysis approach was applied. Within this approach, the critical treatment path is followed and a risk classification for each individual step is estimated. For oncology and hematology administration, 35 different risk steps were assessed. The study was executed in 17 hematology and 16 breast cancer centers in Italy. As intravenous and subcutaneous were the only injection routes available for trastuzumab and rituximab in oncology at the time of the study, these two therapies were chosen. When the risk classes were calculated, eight high-risk areas were identified for the administration of an intravenous therapy in hematology or oncology; 13 areas were classified as medium risk and 14 areas as low risk (total risk areas: n=35). When the new subcutaneous formulation was applied, 23 different risk levels could be completely eliminated (65% reduction). Important high-risk classes such as dose calculation, preparation and package labeling, preparation of venous access, pump infusion preparation, and infusion monitoring were among those eliminated. The overall risk level for intravenous administration was estimated at 756 (ex-ante) and could be reduced by 70% (ex-post). The potential harm compensation for pharmacy-related errors would decrease from eight risk classes to only three risk classes.
The subcutaneous administration of trastuzumab (breast cancer) and rituximab (hematology) might lower the risk of administration and treatment errors for patients and could hence indirectly have a positive financial impact for hospitals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dandini, Vincent John; Duran, Felicia Angelica; Wyss, Gregory Dane
2003-09-01
This article describes how features of event tree analysis and Monte Carlo-based discrete event simulation can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology, with some of the best features of each. The resultant object-based event scenario tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible. Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST methodology is then applied to an aviation safety problem that considers mechanisms by which an aircraft might become involved in a runway incursion incident. The resulting OBEST model demonstrates how a close link between human reliability analysis and probabilistic risk assessment methods can provide important insights into aviation safety phenomenology.
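The core OBEST idea that every scenario comes with a built-in likelihood, because branch probabilities are part of the model definition, can be sketched in a few lines. The runway-incursion states and branch probabilities below are invented for illustration, not taken from the Sandia model.

```python
# Toy event-scenario tree with probabilistic branching: each node maps
# to its possible successor states and their branch probabilities.
# States and numbers are hypothetical.
TREE = {
    "taxiing": [("holds_short", 0.98), ("crosses_hold_line", 0.02)],
    "crosses_hold_line": [("tower_detects", 0.90), ("incursion", 0.10)],
}

def outcome_prob(target, state="taxiing", p=1.0):
    """Probability of reaching `target` from `state`, summed over all
    branch paths; the running product p is the path likelihood."""
    if state == target:
        return p
    if state not in TREE:
        return 0.0  # terminal state other than the target
    return sum(outcome_prob(target, nxt, p * q) for nxt, q in TREE[state])
```

Because the likelihood is accumulated along each path, every sampled or enumerated scenario carries its probability automatically, e.g. `outcome_prob("incursion")` multiplies the two branch probabilities on its only path.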
ERIC Educational Resources Information Center
Kim, Dongil; Kim, Woori; Koh, Hyejung; Lee, Jaeho; Shin, Jaehyun; Kim, Heeju
2014-01-01
The purpose of this study was to identify students at risk of reading comprehension difficulties by using the responsiveness to intervention (RTI) approach. The participants were 177 students in Grades 1-3 in three elementary schools in South Korea. The students received Tier 1 instruction of RTI from March to May 2011, and their performance was…
Milá, Lorely; Valdés, Rodolfo; Tamayo, Andrés; Padilla, Sigifredo; Ferro, Williams
2012-03-01
The CB.Hep-1 monoclonal antibody (mAb) is used for manufacturing a recombinant Hepatitis B vaccine, which is included in a worldwide vaccination program against Hepatitis B disease. The use of this mAb as an immunoligand has been incorporated into one of the most efficient steps of the active pharmaceutical ingredient purification process. In this regard, Quality Risk Management (QRM) provides an excellent framework for the use of risk management in pharmaceutical manufacturing and quality decision-making applications. Consequently, this study sought to apply a prospective risk analysis methodology, Failure Mode Effects Analysis (FMEA), as a QRM tool for analyzing different CB.Hep-1 mAb manufacturing technologies. As a main conclusion, FMEA was successfully used to assess risks associated with potential problems in CB.Hep-1 mAb manufacturing processes. The analysis of risk severity and occurrence showed that very highly severe risks made up 31.0-38.7% of all risks, and that the large majority of risks had a very low occurrence level (61.9-83.3%) in all assessed technologies. Finally, the additive Risk Priority Number ranked the technologies in descending order as follows: transgenic plants (2636), ascites (2577), transgenic animals (2046) and hollow fiber bioreactors (1654), which corroborated that in vitro technology should be the technology of choice for CB.Hep-1 mAb manufacturing in terms of risks and mAb molecule quality. Copyright © 2011 The International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
Zhao, Lue Ping; Carlsson, Annelie; Larsson, Helena Elding; Forsander, Gun; Ivarsson, Sten A; Kockum, Ingrid; Ludvigsson, Johnny; Marcus, Claude; Persson, Martina; Samuelsson, Ulf; Örtqvist, Eva; Pyo, Chul-Woo; Bolouri, Hamid; Zhao, Michael; Nelson, Wyatt C; Geraghty, Daniel E; Lernmark, Åke
2017-11-01
It is of interest to predict possible lifetime risk of type 1 diabetes (T1D) in young children for recruiting high-risk subjects into longitudinal studies of effective prevention strategies. Utilizing a case-control study in Sweden, we applied a recently developed next generation targeted sequencing technology to genotype class II genes and applied an object-oriented regression to build and validate a prediction model for T1D. In the training set, estimated risk scores were significantly different between patients and controls (P = 8.12 × 10⁻⁹²), and the area under the curve (AUC) from the receiver operating characteristic (ROC) analysis was 0.917. Using the validation data set, we validated the result with AUC of 0.886. Combining both training and validation data resulted in a predictive model with AUC of 0.903. Further, we performed a "biological validation" by correlating risk scores with 6 islet autoantibodies, and found that the risk score was significantly correlated with IA-2A (Z-score = 3.628, P < 0.001). When applying this prediction model to the Swedish population, where the lifetime T1D risk ranges from 0.5% to 2%, we anticipate identifying approximately 20 000 high-risk subjects after testing all newborns, and this calculation would identify approximately 80% of all patients expected to develop T1D in their lifetime. Through both empirical and biological validation, we have established a prediction model for estimating lifetime T1D risk, using class II HLA. This prediction model should prove useful for future investigations to identify high-risk subjects for prevention research in high-risk populations. Copyright © 2017 John Wiley & Sons, Ltd.
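The AUC values quoted above can be computed without any ROC machinery via the equivalent rank (Mann-Whitney) formulation: the probability that a randomly chosen case outscores a randomly chosen control. The risk scores below are hypothetical, not the study's HLA-based scores.

```python
def auc(scores_cases, scores_controls):
    """Rank-based AUC (Mann-Whitney U divided by n_cases * n_controls):
    fraction of case/control pairs where the case scores higher,
    counting ties as half a win."""
    wins = 0.0
    for s in scores_cases:
        for t in scores_controls:
            if s > t:
                wins += 1.0
            elif s == t:
                wins += 0.5
    return wins / (len(scores_cases) * len(scores_controls))

# Hypothetical risk scores for patients vs controls.
cases    = [2.1, 1.8, 2.5, 1.2, 2.9]
controls = [0.9, 1.5, 0.4, 1.9, 0.7]
```

With these illustrative scores, `auc(cases, controls)` gives 0.88; a perfectly separating model would give 1.0 and a random one about 0.5.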
Schinasi, L; De Roos, AJ; Ray, RM; Edlefsen, KL; Parks, CG; Howard, BV; Meliker; Bonner, MR; Wallace, RB; LaCroix, AZ
2017-01-01
Purpose: Relationships of farm history and insecticide exposure at home or work with lymphohematopoietic (LH) neoplasm risk were investigated in a large prospective cohort of United States women. Methods: In questionnaires, women self-reported a history of living or working on a farm, personally mixing or applying insecticides, insecticide application in the home or workplace by a commercial service, and treating pets with insecticides. Relationships with non-Hodgkin lymphoma (NHL), chronic lymphocytic leukemia/small lymphocytic lymphoma (CLL/SLL), diffuse large B-cell lymphoma (DLBCL), follicular lymphoma, plasma cell neoplasms, and myeloid leukemia were investigated using Cox proportional hazards models. Age and farming history were explored as effect modifiers. Results: The analysis included 76,493 women and 822 NHL cases. Women who ever lived or worked on a farm had 1.12 times the risk of NHL (95% CI: 0.95-1.32) compared to those who did not. Women who reported that a commercial service ever applied insecticides in their immediate surroundings had 65% higher risk of CLL/SLL (95% CI: 1.15-2.38). Women younger than 65 who ever applied insecticides had 87% higher risk of DLBCL (95% CI: 1.13-3.09). Conclusions: Insecticide exposures may contribute to the risk of CLL/SLL and DLBCL. Future studies should examine relationships of LH subtypes with specific types of household insecticides. PMID:26365305
Jungle Act Featured in RPLR Road Show.
ERIC Educational Resources Information Center
Kozlowski, James C.
1984-01-01
Some rules of law and legal analysis applied by courts to resolve issues of personal injury and negligence liability in youth sports programs are discussed in this article. Assumption of risk and responsibility for injury are explored. (DF)
A novel risk classification system for 30-day mortality in children undergoing surgery
Walter, Arianne I.; Jones, Tamekia L.; Huang, Eunice Y.; Davis, Robert L.
2018-01-01
A simple, objective and accurate way of grouping children undergoing surgery into clinically relevant risk groups is needed. The purpose of this study is to develop and validate a preoperative risk classification system for postsurgical 30-day mortality for children undergoing a wide variety of operations. The National Surgical Quality Improvement Project-Pediatric participant use file data for calendar years 2012-2014 were analyzed to determine the preoperative variables most associated with death within 30 days of operation (D30). Risk groups were created using classification tree analysis based on these preoperative variables. The resulting risk groups were validated using 2015 data and applied to neonates and higher-risk CPT codes to determine validity in high-risk subpopulations. A five-level risk classification was found to be most accurate. The preoperative need for ventilation, oxygen support, or inotropic support, sepsis, the need for emergent surgery, and a do-not-resuscitate order defined non-overlapping groups with observed rates of D30 varying from 0.075% (Very Low Risk) to 38.6% (Very High Risk). When CPT codes for which death was never observed are eliminated, or when the system is applied to neonates, the groupings remained predictive of death in an ordinal manner. PMID:29351327
Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
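The core of an MCDA model of this kind is the aggregation of criterion utilities under decision-maker weights. The sketch below shows a minimal weighted-sum variant; the criteria, weights, and site utilities are illustrative assumptions, not values from the CCS study.

```python
# Minimal weighted-sum MCDA sketch: each alternative is scored on
# criterion utilities in [0, 1]; the weights express the decision-
# maker's priorities and must sum to 1. All numbers are hypothetical.
WEIGHTS = {"cost": 0.3, "leakage_risk": 0.4, "social_acceptance": 0.3}

def mcda_score(utilities, weights=WEIGHTS):
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * utilities[c] for c in weights)

# Hypothetical CO2 storage alternatives (higher utility = better).
storage_sites = {
    "saline_aquifer": {"cost": 0.6, "leakage_risk": 0.8,
                       "social_acceptance": 0.7},
    "depleted_field": {"cost": 0.8, "leakage_risk": 0.5,
                       "social_acceptance": 0.6},
}

best = max(storage_sites, key=lambda s: mcda_score(storage_sites[s]))
```

In the full model described above, each utility would itself be an uncertain quantity sampled in a Monte Carlo loop, and sensitivity analysis would vary the weights to test the robustness of `best`.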
Risk analysis of heat recovery steam generator with semi quantitative risk based inspection API 581
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prayogo, Galang Sandy, E-mail: gasandylang@live.com; Haryadi, Gunawan Dwi; Ismail, Rifky
Corrosion is a major problem that most often occurs in power plants. The heat recovery steam generator (HRSG) is a piece of equipment that poses a high risk to the power plant. Corrosion damage can force the HRSG power plant to stop operating; furthermore, it could threaten the safety of employees. The Risk Based Inspection (RBI) guidelines of the American Petroleum Institute (API) 581 were used for risk analysis of HRSG 1. Using this methodology, the risk caused by unexpected failure, as a function of the probability and consequence of failure, can be estimated. This paper presented a case study of risk analysis in the HRSG, starting with a summary of the basic principles and procedures of risk assessment and applying corrosion RBI for process industries. The risk level of each piece of HRSG equipment was analyzed: the HP superheater has a medium-high risk (4C), the HP evaporator has a medium-high risk (4C), and the HP economizer has a medium risk (3C). The risk assessment using the semi-quantitative method of the API 581 standard placed the existing equipment at medium risk; in fact, there is no critical problem in the equipment components. The damage mechanism prominent throughout the equipment is thinning. The evaluation of the risk approach was done with the aim of reducing risk by optimizing the risk assessment activities.
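Rankings such as "4C" combine a probability-of-failure category (1-5) with a consequence category (A-E) on a 5x5 risk matrix. The lookup below is a simplified sketch of that idea; the band boundaries are illustrative assumptions, not the exact matrix from API 581.

```python
# Illustrative 5x5 semi-quantitative risk matrix in the API 581 style.
# The assignment of cells to bands is a hypothetical simplification.
RISK_BANDS = {
    "low":         {"1A", "1B", "2A", "1C", "2B", "3A"},
    "medium":      {"1D", "2C", "3B", "4A", "1E", "2D", "3C", "4B", "5A"},
    "medium-high": {"2E", "3D", "4C", "5B", "3E", "4D", "5C"},
    "high":        {"4E", "5D", "5E"},
}

def risk_level(probability, consequence):
    """probability: 1-5 category, consequence: 'A'-'E' category,
    combined as in a '4C' ranking."""
    cell = f"{probability}{consequence}"
    for band, cells in RISK_BANDS.items():
        if cell in cells:
            return band
    raise ValueError(f"unknown risk cell {cell}")
```

With these illustrative bands, the HP superheater's 4C cell falls in the medium-high band and the HP economizer's 3C cell in the medium band, matching the rankings reported above.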
Regulatory Science in Professional Education.
Akiyama, Hiroshi
2017-01-01
In the field of pharmaceutical sciences, the subject of regulatory science (RS) includes pharmaceuticals, food, and living environments. For pharmaceuticals, considering the balance between efficacy and safety is a point required for public acceptance, and in that balance, more importance is given to efficacy in curing disease. For food, however, safety is the most important consideration for public acceptance because food should be essentially free of risk. To ensure food safety, first, any hazard that is an agent in food or condition of food with the potential to cause adverse health effects should be identified and characterized. Then the risk that it will affect public health is scientifically analyzed. This process is called risk assessment. Second, risk management should be conducted to reduce a risk that has the potential to affect public health found in a risk assessment. Furthermore, risk communication, which is the interactive exchange of information and opinions concerning risk and risk management among risk assessors, risk managers, consumers, and other interested parties, should be conducted. Food safety is ensured based on risk analysis consisting of the three components of risk assessment, risk management, and risk communication. RS in the field of food safety supports risk analysis, such as scientific research and development of test methods to evaluate food quality, efficacy, and safety. RS is also applied in the field of living environments because the safety of environmental chemical substances is ensured based on risk analysis, similar to that conducted for food.
Stocker, Elena; Becker, Karin; Hate, Siddhi; Hohl, Roland; Schiemenz, Wolfgang; Sacher, Stephan; Zimmer, Andreas; Salar-Behzadi, Sharareh
2017-01-01
This study aimed to apply quality risk management based on the International Conference on Harmonisation guideline Q9 to the early development stage of hot melt coated multiparticulate systems for oral administration. N-acetylcysteine crystals were coated with a formulation comprising tripalmitin and polysorbate 65. The critical quality attributes (CQAs) were initially prioritized using failure mode and effects analysis. The CQAs of the coated material were defined as particle size, taste-masking efficiency, and immediate release profile. The hot melt coating process was characterized via a flowchart, based on the identified potential critical process parameters (CPPs) and their impact on the CQAs. These CPPs were prioritized using a process failure mode, effects, and criticality analysis, and their critical impact on the CQAs was experimentally confirmed using a statistical design of experiments. Spray rate, atomization air pressure, and air flow rate were identified as CPPs. Coating amount and content of polysorbate 65 in the coating formulation were identified as critical material attributes. A hazard and critical control points analysis was applied to define control strategies at the critical process points. A fault tree analysis evaluated causes for potential process failures. We successfully demonstrated that a standardized quality risk management approach optimizes the product development sustainability and supports the regulatory aspects. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Garnett, Kenisha; Parsons, David J
2017-03-01
The precautionary principle was formulated to provide a basis for political action to protect the environment from potentially severe or irreversible harm in circumstances of scientific uncertainty that prevent a full risk or cost-benefit analysis. It underpins environmental law in the European Union and has been extended to include public health and consumer safety. The aim of this study was to examine how the precautionary principle has been interpreted and subsequently applied in practice, whether these applications were consistent, and whether they followed the guidance from the Commission. A review of the literature was used to develop a framework for analysis, based on three attributes: severity of potential harm, standard of evidence (or degree of uncertainty), and nature of the regulatory action. This was used to examine 15 pieces of legislation or judicial decisions. The decision whether or not to apply the precautionary principle appears to be poorly defined, with ambiguities inherent in determining what level of uncertainty and significance of hazard justifies invoking it. The cases reviewed suggest that the Commission's guidance was not followed consistently in forming legislation, although judicial decisions tended to be more consistent and to follow the guidance by requiring plausible evidence of potential hazard in order to invoke precaution. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Hansen, Christian; Schlichting, Stefan; Zidowitz, Stephan; Köhn, Alexander; Hindennach, Milo; Kleemann, Markus; Peitgen, Heinz-Otto
2008-03-01
Tumor resections from the liver are complex surgical interventions. With recent planning software, risk analyses based on individual liver anatomy can be carried out preoperatively. However, additional tumors within the liver are frequently detected during oncological interventions using intraoperative ultrasound. These tumors are not visible in preoperative data and their existence may require changes to the resection strategy. We propose a novel method that allows an intraoperative risk analysis adaptation by merging newly detected tumors with a preoperative risk analysis. To determine the exact positions and sizes of these tumors we make use of a navigated ultrasound-system. A fast communication protocol enables our application to exchange crucial data with this navigation system during an intervention. A further motivation for our work is to improve the visual presentation of a moving ultrasound plane within a complex 3D planning model including vascular systems, tumors, and organ surfaces. In case the ultrasound plane is located inside the liver, occlusion of the ultrasound plane by the planning model is an inevitable problem for the applied visualization technique. Our system allows the surgeon to focus on the ultrasound image while perceiving context-relevant planning information. To improve orientation ability and distance perception, we include additional depth cues by applying new illustrative visualization algorithms. Preliminary evaluations confirm that in case of intraoperatively detected tumors a risk analysis adaptation is beneficial for precise liver surgery. Our new GPU-based visualization approach provides the surgeon with a simultaneous visualization of planning models and navigated 2D ultrasound data while minimizing occlusion problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheong, S-K; Kim, J
Purpose: The aim of the study is the application of a Failure Modes and Effects Analysis (FMEA) to assess the risks for patients undergoing Low Dose Rate (LDR) Prostate Brachytherapy Treatment. Methods: FMEA was applied to identify all the sub-processes involved in the stages of patient identification, source handling, treatment preparation, treatment delivery, and post treatment. These processes characterize the radiation treatment associated with LDR Prostate Brachytherapy. The potential failure modes, together with their causes and effects, were identified and ranked in order of their importance. Three indexes were assigned for each failure mode: the occurrence rating (O), the severity rating (S), and the detection rating (D). A ten-point scale was used to score each category, ten indicating the most severe, most frequent, and least detectable failure mode, respectively. The risk priority number (RPN) was calculated as the product of the three attributes: RPN = O × S × D. The analysis was carried out by a working group (WG) at UPMC. Results: A total of 56 failure modes were identified, including 32 modes before the treatment, 13 modes during the treatment, and 11 modes after the treatment. In addition to the protocols already adopted in clinical practice, prioritized risk management will be implemented for the high-risk procedures on the basis of RPN score. Conclusion: The effectiveness of the FMEA method was established. The FMEA methodology provides a structured and detailed assessment method for the risk analysis of the LDR Prostate Brachytherapy procedure and can be applied to other radiation treatment modes.
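The RPN scheme described above is straightforward to implement. The sketch below shows the calculation and ranking step; the failure modes and ratings are hypothetical examples, not the UPMC working group's data.

```python
# Illustrative FMEA ranking: RPN = Occurrence x Severity x Detection,
# each rated 1-10 (10 = most frequent / most severe / least
# detectable). Failure modes and ratings here are invented.

def rpn(occurrence, severity, detection):
    for rating in (occurrence, severity, detection):
        if not 1 <= rating <= 10:
            raise ValueError("FMEA ratings must lie in 1..10")
    return occurrence * severity * detection

failure_modes = {
    "wrong seed count loaded":    (2, 9, 4),
    "patient mis-identification": (1, 10, 2),
    "source calibration error":   (3, 8, 5),
}

# Rank failure modes by descending RPN to prioritize risk management.
ranked = sorted(failure_modes.items(),
                key=lambda kv: rpn(*kv[1]), reverse=True)
for mode, ratings in ranked:
    print(f"{mode}: RPN = {rpn(*ratings)}")
```

Note that RPN ranking treats the three scales as equally important; a rarely occurring but barely detectable failure can outrank a frequent, easily caught one.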
Downside Risk analysis applied to the Hedge Funds universe
NASA Astrophysics Data System (ADS)
Perelló, Josep
2007-09-01
Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM's simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures using the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
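The distinction between good and bad returns is what downside indicators formalize: only shortfalls below the investor's goal enter the risk measure. The sketch below computes downside deviation and the Sortino ratio (a common downside analogue of the Sharpe ratio); the fund returns are hypothetical.

```python
import math

def downside_deviation(returns, target=0.0):
    """Root-mean-square of shortfalls below the investor's target."""
    shortfalls = [min(r - target, 0.0) for r in returns]
    return math.sqrt(sum(s * s for s in shortfalls) / len(returns))

def sortino_ratio(returns, target=0.0):
    """Mean excess return over target, scaled by downside deviation."""
    mean_excess = sum(r - target for r in returns) / len(returns)
    dd = downside_deviation(returns, target)
    return mean_excess / dd if dd > 0 else float("inf")

# Hypothetical monthly returns for two funds with the same mean
# (0.9% per month) but very different downside profiles.
fund_a = [0.02, 0.01, -0.01, 0.03, -0.005]
fund_b = [0.05, -0.04, 0.06, -0.03, 0.005]
```

A Sharpe-style ratio based on total volatility would penalize fund_b's upside swings as much as its losses; the Sortino ratio penalizes only the shortfalls, so fund_a ranks clearly higher here despite the equal means.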
Hazmat transport: a methodological framework for the risk analysis of marshalling yards.
Cozzani, Valerio; Bonvicini, Sarah; Spadoni, Gigliola; Zanelli, Severino
2007-08-17
A methodological framework was outlined for the comprehensive risk assessment of marshalling yards in the context of quantified area risk analysis. Three accident typologies were considered for yards: (i) "in-transit-accident-induced" releases; (ii) "shunting-accident-induced" spills; and (iii) "non-accident-induced" leaks. A specific methodology was developed for the assessment of expected release frequencies and equivalent release diameters, based on the application of HazOp and Fault Tree techniques to reference schemes defined for the more common types of railcar vessels used for "hazmat" transportation. The approach was applied to the assessment of an extended case study. The results evidenced that "non-accident-induced" leaks in marshalling yards represent an important contribution to the overall risk associated with these zones. Furthermore, the results confirmed the considerable role of these fixed installations in the overall risk associated with "hazmat" transportation.
Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei
2011-12-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, a comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of the risk estimation. The risk analysis indicated that the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for the adult and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly acceptable. However, specific regions, such as the Lanzhou reach of the Yellow River and the Qiantang River, warrant more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on the carcinogenic risk of PAHs in source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
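A Monte Carlo risk estimate of this type can be sketched with the standard ingestion-pathway formula, risk = (C × IR × EF × ED) / (BW × AT) × SF, sampling the BaP-equivalent concentration C from an assumed distribution. All parameter values and the lognormal distribution below are illustrative assumptions, not the study's monitoring data.

```python
import random

random.seed(42)

# Hypothetical exposure parameters (ingestion pathway).
SF = 7.3          # oral slope factor for BaP, (mg/kg/day)^-1
IR = 2.0          # water ingestion rate, L/day
EF, ED = 365, 30  # exposure frequency (days/yr) and duration (yr)
BW = 60.0         # body weight, kg
AT = 70 * 365     # averaging time, days

def simulate_risks(n=10_000, mu=-10.5, sigma=0.8):
    """Sample BaP-equivalent concentrations (mg/L) from an assumed
    lognormal and convert each draw to a lifetime cancer risk."""
    risks = []
    for _ in range(n):
        c = random.lognormvariate(mu, sigma)
        risks.append(c * IR * EF * ED / (BW * AT) * SF)
    return risks

risks = simulate_risks()
# Probability that the risk exceeds the 1e-5 acceptability threshold.
p_exceed = sum(r > 1e-5 for r in risks) / len(risks)
```

The exceedance probability `p_exceed` plays the same role as the 5.8-6.7% figures reported above: rather than a single point estimate, the simulation yields the fraction of the uncertainty distribution lying beyond the accepted value.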
Li, Chen; Yichao, Jin; Jiaxin, Lin; Yueting, Zhang; Qin, Lu; Tonghua, Yang
2015-01-01
Reported evidence supports a role for methylenetetrahydrofolate reductase (MTHFR) in the risk of chronic myelogenous leukemia (CML). However, these reports arrived at inconclusive and even conflicting results regarding the association between two common MTHFR polymorphisms (C677T and A1298C) and CML risk. Thus, a meta-analysis was carried out to clarify a more precise association between these two polymorphisms and CML risk by updating the available publications. Pooled odds ratios (OR) with corresponding 95% confidence intervals (95% CI) and stratification analysis were used to estimate the relationship between MTHFR polymorphisms and the risk of CML under different genetic comparison models. Data from the meta-analysis showed no significant association between the MTHFR C677T polymorphism and CML risk. However, significant associations were found between MTHFR A1298C variants and CML risk under the homozygous comparison model (CC vs AA, OR=1.62, 95% CI=1.11-2.36, p=0.01) and the dominant comparison model (CC+AC vs AA, OR=1.68, 95% CI=1.17-2.43, p=0.005) in the overall population; especially pronounced effects were noticed for Asian populations in the subgroup analysis under the homozygous model (CC vs AA, OR=2.00, 95% CI=1.25-3.21, p=0.004) and the dominant model (CC+AC vs AA, OR=2.49, 95% CI=1.42-4.36, p=0.001), but this did not apply in Caucasian populations. The results of this meta-analysis suggested no significant association between the MTHFR C677T polymorphism and CML risk, while an increased CML risk was noticed for 1298C variant carriers, especially in Asian populations but not in Caucasian populations, which suggests ethnicity differences in the association between MTHFR A1298C polymorphisms and risk of CML.
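The pooled OR and 95% CI figures above follow from standard 2x2-table arithmetic: the log odds ratio has standard error sqrt(1/a + 1/b + 1/c + 1/d), and a fixed-effect pool weights each study by the inverse of that variance. The counts below are hypothetical, not the studies included in this meta-analysis.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR and 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

def pooled_or(tables):
    """Fixed-effect (inverse-variance) pooled OR across studies."""
    num = den = 0.0
    for a, b, c, d in tables:
        log_or = math.log((a * d) / (b * c))
        w = 1.0 / (1/a + 1/b + 1/c + 1/d)  # inverse variance weight
        num += w * log_or
        den += w
    return math.exp(num / den)

# Hypothetical CC-vs-AA counts (a, b, c, d) from three studies.
studies = [(30, 100, 20, 120), (25, 90, 15, 110), (40, 150, 22, 160)]
```

A random-effects pool (e.g. DerSimonian-Laird) would additionally widen the weights for between-study heterogeneity; the fixed-effect version above is the simpler baseline.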
The role of building models in the evaluation of heat-related risks
NASA Astrophysics Data System (ADS)
Buchin, Oliver; Jänicke, Britta; Meier, Fred; Scherer, Dieter; Ziegler, Felix
2016-04-01
Hazard-risk relationships in epidemiological studies are generally based on the outdoor climate, despite the fact that most of humans' lifetime is spent indoors. By coupling indoor and outdoor climates with a building model, the risk concept developed can still be based on the outdoor conditions but also includes exposure to the indoor climate. The influence of non-linear building physics and the impact of air conditioning on heat-related risks can be assessed in a plausible manner using this risk concept. For proof of concept, the proposed risk concept is compared to a traditional risk analysis. As an example, daily and city-wide mortality data of the age group 65 and older in Berlin, Germany, for the years 2001-2010 are used. Four building models with differing complexity are applied in a time-series regression analysis. This study shows that indoor hazard better explains the variability in the risk data compared to outdoor hazard, depending on the kind of building model. Simplified parameter models include the main non-linear effects and are proposed for the time-series analysis. The concept shows that the definitions of heat events, lag days, and acclimatization in a traditional hazard-risk relationship are influenced by the characteristics of the prevailing building stock.
Yang, Jun; Goddard, Ellen
2011-01-01
Cluster analysis is applied in this study to group Canadian households by two characteristics, their risk perceptions and risk attitudes toward beef. There are some similarities in demographic profiles, meat purchases, and bovine spongiform encephalopathy (BSE) media recall between the cluster that perceives beef to be the most risky and the cluster that has little willingness to accept the risks of eating beef. There are similarities between the medium risk perception cluster and the medium risk attitude cluster, as well as between the cluster that perceives beef to have little risk and the cluster that is most willing to accept the risks of eating beef. Regression analysis shows that risk attitudes have a larger impact on household-level beef purchasing decisions than do risk perceptions for all consumer clusters. This implies that it may be more effective to undertake policies that reduce the risks associated with eating beef, instead of enhancing risk communication to improve risk perceptions. Only for certain clusters with higher willingness to accept the risks of eating beef might enhancing risk communication increase beef consumption significantly. The different role of risk perceptions and risk attitudes in beef consumption needs to be recognized during the design of risk management policies.
A Western Dietary Pattern Increases Prostate Cancer Risk: A Systematic Review and Meta-Analysis.
Fabiani, Roberto; Minelli, Liliana; Bertarelli, Gaia; Bacci, Silvia
2016-10-12
Dietary patterns were recently applied to examine the relationship between eating habits and prostate cancer (PC) risk. While the associations between PC risk and the glycemic index and Mediterranean score have been reviewed, no meta-analysis is currently available on dietary patterns defined by "a posteriori" methods. A literature search was carried out (PubMed, Web of Science) to identify studies reporting the relationship between dietary patterns and PC risk. Relevant dietary patterns were selected and risk estimates were calculated using a random-effects model. Multivariable-adjusted odds ratios (ORs), for a first-percentile increase in dietary pattern score, were combined by a dose-response meta-analysis. Twelve observational studies were included in the meta-analysis, which identified a "Healthy pattern" and a "Western pattern". The Healthy pattern was not related to PC risk (OR = 0.96; 95% confidence interval (CI): 0.88-1.04) while the Western pattern significantly increased it (OR = 1.34; 95% CI: 1.08-1.65). In addition, the "Carbohydrate pattern", which was analyzed in four articles, was positively associated with a higher PC risk (OR = 1.64; 95% CI: 1.35-2.00). A significant linear trend between the Western (p = 0.011) pattern, the Carbohydrate (p = 0.005) pattern, and the increment of PC risk was observed. The small number of studies included in the meta-analysis suggests that further investigation is necessary to support these findings.
Accidental Risk Analyses of the Istanbul and Canakkale Straits
NASA Astrophysics Data System (ADS)
Essiz, Betül; Dagkiran, Berat
2017-12-01
Maritime transportation plays an important role in the world. Commercial shipping and naval operations are international maritime activities conducted by many countries. Straits and channels make these activities easier and faster, and Turkey, because of its geographical location, holds a crucial position among them. The Turkish Straits are a series of internationally significant waterways connecting the Mediterranean Sea and the Black Sea. They consist of the Canakkale Strait, the Sea of Marmara, and the Istanbul Strait, all part of the sovereign sea territory of Turkey and subject to the regime of internal waters. They are conventionally considered the boundary between the continents of Europe and Asia. Because of this geographical importance, vessels of all sizes and high-volume cargo transport constantly pass through this waterway. On the other hand, the more maritime activity grows, the more accident risks increase, so the accident risks in the Istanbul and Canakkale Straits can be examined and a risk analysis performed for them. The study presents general information on the Turkish Straits and their regulatory regime. In addition, tables of vessel movements in the Turkish Straits by year are provided in order to show the variation in traffic. Risk analyses are also described in sections with many variables. This paper outlines ship accidents, and a risk analysis of ship accidents is applied to the Turkish Straits. The last chapter concerns the Vessel Traffic Service (VTS) System in the Turkish Straits.
Brenner, M H
1983-01-01
This paper discusses a first-stage analysis of the link of unemployment rates, as well as other economic, social and environmental health risk factors, to mortality rates in postwar Britain. The results presented represent part of an international study of the impact of economic change on mortality patterns in industrialized countries. The mortality patterns examined include total and infant mortality and (by cause) cardiovascular (total), cerebrovascular and heart disease, cirrhosis of the liver, and suicide, homicide and motor vehicle accidents. Among the most prominent factors that beneficially influence postwar mortality patterns in England/Wales and Scotland are economic growth and stability and health service availability. A principal detrimental factor to health is a high rate of unemployment. Additional factors that have an adverse influence on mortality rates are cigarette consumption and heavy alcohol use and unusually cold winter temperatures (especially in Scotland). The model of mortality that includes both economic changes and behavioral and environmental risk factors was successfully applied to infant mortality rates in the interwar period. In addition, the "simple" economic change model of mortality (using only economic indicators) was applied to other industrialized countries. In Canada, the United States, the United Kingdom, and Sweden, the simple version of the economic change model could be successfully applied only if the analysis was begun before World War II; for analysis beginning in the postwar era, the more sophisticated economic change model, including behavioral and environmental risk factors, was required. In France, West Germany, Italy, and Spain, by contrast, some success was achieved using the simple economic change model.
Tago, Damian; Andersson, Henrik; Treich, Nicolas
2014-01-01
This study presents literature reviews for the period 2000-2013 on (i) the health effects of pesticides and on (ii) preference valuation of health risks related to pesticides, as well as a discussion of the role of benefit-cost analysis applied to pesticide regulatory measures. This study indicates that the health literature has focused on individuals with direct exposure to pesticides, i.e. farmers, while the literature on preference valuation has focused on those with indirect exposure, i.e. consumers. The discussion highlights the need to clarify the rationale for regulating pesticides, the role of risk perceptions in benefit-cost analysis, and the importance of inter-disciplinary research in this area. This study relates findings of different disciplines (health, economics, public policy) regarding pesticides, and identifies gaps for future research.
NASA Astrophysics Data System (ADS)
Melliana, Armen, Yusrizal, Akmal, Syarifah
2017-11-01
PT Nira Murni Construction is a contractor of PT Chevron Pacific Indonesia engaged in contracting, fabrication, maintenance construction supplies, and labor services. The high accident rate in this company is caused by a lack of awareness of workplace safety. Therefore, an effort is required to reduce the company's accident rate so that financial losses can be minimized. In this study, the Safe T-Score method is used to analyze the accident rate by measuring the frequency level. The analysis continues with risk management methods covering hazard identification, risk measurement, and risk control. The final analysis uses Job Safety Analysis (JSA), which identifies the effects of accidents. From the results of this study it can be concluded that the JSA method has not been implemented properly. Therefore, the JSA method needs follow-up in a subsequent study so that it can be well applied to the prevention of occupational accidents.
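A minimal sketch of the Safe T-Score computation, in one common formulation (accident frequency rates per million man-hours, with ±2 as the significance band); the incident counts and man-hours below are invented for illustration, not the company's figures.

```python
import math

def frequency_rate(incidents, man_hours):
    """Accidents per million man-hours worked."""
    return incidents * 1_000_000 / man_hours

def safe_t_score(now_incidents, now_hours, past_incidents, past_hours):
    """Safe T-Score in a common formulation:
    STS = (FR_now - FR_past) / sqrt(FR_past / (now_hours / 1e6)).
    |STS| < 2: no significant change; STS > +2: significantly worse;
    STS < -2: significantly improved."""
    fr_now = frequency_rate(now_incidents, now_hours)
    fr_past = frequency_rate(past_incidents, past_hours)
    return (fr_now - fr_past) / math.sqrt(fr_past / (now_hours / 1_000_000))

# Illustrative figures: 8 incidents in 500,000 man-hours this period,
# 5 incidents in 600,000 man-hours in the comparison period.
sts = safe_t_score(now_incidents=8, now_hours=500_000,
                   past_incidents=5, past_hours=600_000)
verdict = ("worse" if sts > 2 else
           "improved" if sts < -2 else
           "no significant change")
print(f"STS = {sts:.2f} -> {verdict}")
```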
Sala, Emma; Bonfiglioli, Roberta; Fostinellil, Jacopo; Tomasi, Cesare; Graziosi, Francesca; Violante, Francesco S; Apostoli, Pietro
2014-01-01
Risk assessment for upper-extremity work-related musculoskeletal disorders by applying six ergonomic methods: a ten-year experience. The objective of this research was to verify and validate the multiple-step method suggested by the SIMLII guidelines and to compare results obtained using these methods: Washington State Standard, OCRA, HAL, RULA, OREGE and STRAIN INDEX. 598 workstations, for a total of 1800 analyses by the different methods, were considered, adopting the following multiple-step procedure: preliminary evaluation by the Washington State method and OCRA checklist in all workstations, RULA or HAL as first-level evaluation, OREGE or SI as second-level evaluation. The preliminary evaluation was negative (risk absent) in 75% of the examined workstations; an optimal-acceptable condition was found in 58% of analyses using the OCRA checklist, 92% using HAL, 100% using RULA, 64% using OREGE, and 70% of the examined working positions using SI. We observed similar evaluations of strain among the methods; the main differences were observed in posture and frequency assessment. The preliminary evaluation by the Washington State method appears to be an adequate instrument for identifying working conditions at risk. All the adopted methods were in good agreement in the two extreme situations, high risk and absent risk, especially in absent-risk conditions. The level of agreement varied on the basis of their rationale and the role of their different components, so the SIMLII indications about the critical use of biomechanical methods and about the possible use of more than one of them (considering working characteristics) have been confirmed.
On the Concept and Definition of Terrorism Risk.
Aven, Terje; Guikema, Seth
2015-12-01
In this article, we provide some reflections on how to define and understand the concept of terrorism risk in a professional risk assessment context. As a basis for this discussion we introduce a set of criteria that we believe should apply to any conceptualization of terrorism risk. These criteria are based on both criteria used in other areas of risk analysis and our experience with terrorism risk analysis. That is, these criteria offer our perspective. We show that several of the suggested perspectives and definitions have weaknesses in relation to these criteria. A main problem identified is the idea that terrorism risk can be conceptualized as a function of probability and consequence, not as a function of the interactions between adaptive individuals and organizations. We argue that perspectives based solely on probability and consequence should be used cautiously or not at all because they fail to reflect the essential features of the concept of terrorism risk, the threats and attacks, their consequences, and the uncertainties, all in the context of adaptation by the adversaries. These three elements should in our view constitute the main pillars of the terrorism risk concept. From this concept we can develop methods for assessing the risk by identifying a set of threats, attacks, and consequence measures associated with the possible outcome scenarios together with a description of the uncertainties and interactions between the adversaries. © 2015 Society for Risk Analysis.
Multivariate generalized multifactor dimensionality reduction to detect gene-gene interactions
2013-01-01
Background Recently, one of the greatest challenges in genome-wide association studies is to detect gene-gene and/or gene-environment interactions for common complex human diseases. Ritchie et al. (2001) proposed multifactor dimensionality reduction (MDR) method for interaction analysis. MDR is a combinatorial approach to reduce multi-locus genotypes into high-risk and low-risk groups. Although MDR has been widely used for case-control studies with binary phenotypes, several extensions have been proposed. One of these methods, a generalized MDR (GMDR) proposed by Lou et al. (2007), allows adjusting for covariates and applying to both dichotomous and continuous phenotypes. GMDR uses the residual score of a generalized linear model of phenotypes to assign either high-risk or low-risk group, while MDR uses the ratio of cases to controls. Methods In this study, we propose multivariate GMDR, an extension of GMDR for multivariate phenotypes. Jointly analysing correlated multivariate phenotypes may have more power to detect susceptible genes and gene-gene interactions. We construct generalized estimating equations (GEE) with multivariate phenotypes to extend generalized linear models. Using the score vectors from GEE we discriminate high-risk from low-risk groups. We applied the multivariate GMDR method to the blood pressure data of the 7,546 subjects from the Korean Association Resource study: systolic blood pressure (SBP) and diastolic blood pressure (DBP). We compare the results of multivariate GMDR for SBP and DBP to the results from separate univariate GMDR for SBP and DBP, respectively. We also applied the multivariate GMDR method to the repeatedly measured hypertension status from 5,466 subjects and compared its result with those of univariate GMDR at each time point. 
Results Results from the univariate GMDR and multivariate GMDR in two-locus model with both blood pressures and hypertension phenotypes indicate best combinations of SNPs whose interaction has significant association with risk for high blood pressures or hypertension. Although the test balanced accuracy (BA) of multivariate analysis was not always greater than that of univariate analysis, the multivariate BAs were more stable with smaller standard deviations. Conclusions In this study, we have developed multivariate GMDR method using GEE approach. It is useful to use multivariate GMDR with correlated multiple phenotypes of interests. PMID:24565370
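The core MDR step described above, pooling multi-locus genotypes into high-risk and low-risk groups by comparing each cell's case:control ratio to the overall ratio, can be sketched as follows. The two-SNP genotype data are invented for illustration, not the Korean Association Resource data, and the GMDR/GEE extension (score residuals in place of case counts) is omitted.

```python
from collections import Counter

# Hypothetical two-SNP genotype data: (snp1, snp2, is_case).
samples = [
    (0, 0, 1), (0, 0, 1), (0, 0, 0),
    (0, 1, 0), (0, 1, 0), (0, 1, 1),
    (1, 0, 1), (1, 0, 1), (1, 0, 1), (1, 0, 0),
    (1, 1, 0), (1, 1, 0),
]

def mdr_labels(samples):
    """Core MDR step: pool multi-locus genotype cells into 'high' or 'low'
    risk by comparing each cell's case:control ratio to the overall ratio."""
    cases, controls = Counter(), Counter()
    for g1, g2, case in samples:
        (cases if case else controls)[(g1, g2)] += 1
    n_cases, n_controls = sum(cases.values()), sum(controls.values())
    threshold = n_cases / n_controls          # overall case:control ratio
    labels = {}
    for cell in set(cases) | set(controls):
        ratio = cases[cell] / max(controls[cell], 1)
        labels[cell] = "high" if ratio > threshold else "low"
    return labels

print(mdr_labels(samples))
```

GMDR replaces the raw case counts in each cell with sums of residual scores from a generalized linear model, which is what allows covariate adjustment and continuous phenotypes.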
Multimedia-modeling integration development environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelton, Mitchell A.; Hoopes, Bonnie L.
2002-09-02
There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR) while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.
A Classification and Analysis of Contracting Literature
1989-12-01
Pricing Model (CAPM). This is a model designed by investment analysts to determine required rates of return given the systematic risk of a company. The ... For the amount of risk they take, these profit margins were not excessively high. The author examined profitability in terms of the Capital Asset ... taxonomy was applied was limited, the results were necessarily qualified. However, at the least this application provided areas for further research
Hazarding health: experiences of body, work, and risk among factory women in Malaysia.
Root, Robin
2009-10-01
In the 1970s, Malaysia launched an export-oriented development strategy as a means of financing the nation's modernization. The success of the strategy hinged significantly on intensive recruitment of women for factory employment. I draw on descriptive qualitative research, including interviews (51), surveys (106), and ethnography in Malaysia to investigate factory women's experiences of work and work-related health risks. Discourse analysis surfaced a latent consciousness of bodily changes in relation to work. A grounded theory analysis showed a compromised access to occupational risk knowledge that may bear negatively on women's well-being and the role women's new labor identities played in mediating the meanings of work and risks. Given the predominance of women workers in low-end manufacturing globally, I aimed to contribute to theoretical and applied understandings of gender, globalization, and health.
Multi-hazard risk assessment applied to hydraulic fracturing operations
NASA Astrophysics Data System (ADS)
Garcia-Aristizabal, Alexander; Gasparini, Paolo; Russo, Raffaella; Capuano, Paolo
2017-04-01
Without exception, the exploitation of any energy resource produces impacts and intrinsically bears risks. Therefore, to make sound decisions about future energy resource exploitation, it is important to clearly understand the potential environmental impacts over the full life-cycle of an energy development project, distinguishing between the specific impacts intrinsically related to exploiting a given energy resource and those shared with the exploitation of other energy resources. Technological advances such as directional drilling and hydraulic fracturing have led to a rapid expansion of unconventional resources (UR) exploration and exploitation; as a consequence, both public health and environmental concerns have risen. The main objective of a multi-hazard risk assessment applied to the development of UR is to assess the rate (or the likelihood) of occurrence of incidents and their potential impacts on the surrounding environment, considering different hazards and their interactions. Such analyses have to be performed considering the different stages of development of a project; however, the discussion in this paper focuses mainly on the analysis applied to the hydraulic fracturing stage of a UR development project. The multi-hazard risk assessment applied to the development of UR poses a number of challenges, making this a particularly complex problem. First, a number of external hazards might be considered as potential triggering mechanisms; such hazards can be either of natural origin or anthropogenic events caused by the same industrial activities. Second, failures might propagate through the industrial elements, leading to complex scenarios according to the layout of the industrial site. Third, there is a number of potential risk receptors, ranging from environmental elements (such as air, soil, surface water, or groundwater) to local communities and ecosystems. 
The multi-hazard risk approach for this problem is set by considering multiple hazards (and their possible interactions) as possible sources of system perturbation that might lead to an incident. Given the complexity of the problem, we adopt a multi-level approach: first, we perform a qualitative analysis oriented to the identification of a wide range of possible scenarios; this process is based on a review of potential impacts on different risk receptors reported in the literature, condensed into a number of causal diagrams created for the different stages of a UR development project. Second, the most important scenarios are selected for quantitative multi-hazard risk analysis. This selection is based on the identification of major risks, i.e., those related to the occurrence of low-probability/high-impact extreme events. The general framework for the quantitative multi-hazard risk analysis is represented using a so-called bow-tie structure. It is composed of a fault tree on the left-hand side of the plot, identifying the possible events causing the critical (or top) event, and an event tree on the right-hand side showing the possible consequences of the critical event. This work was supported under SHEER: "Shale Gas Exploration and Exploitation Induced Risks" project n.640896, funded from Horizon 2020 - R&I Framework Programme, call H2020-LCE-2014-1
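The quantitative left-hand side of such a bow-tie, a fault tree combining basic events through AND/OR gates into a top-event probability, can be sketched as below under an independence assumption. The event structure and probabilities are invented for illustration, not values from the SHEER project.

```python
def or_gate(*ps):
    """P(at least one of several independent events) = 1 - prod(1 - p_i)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

def and_gate(*ps):
    """P(all independent events occur) = prod(p_i)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

# Illustrative fault tree for a hypothetical well-integrity top event
# (all probabilities invented):
#   top = (casing defect AND cement failure) OR monitoring failure
p_casing, p_cement, p_monitoring = 1e-2, 5e-2, 1e-3
p_top = or_gate(and_gate(p_casing, p_cement), p_monitoring)
print(f"P(top event) = {p_top:.2e}")
```

The event tree on the right-hand side of the bow-tie would then branch the top event into consequence scenarios, each weighted by conditional probabilities.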
Seoane-Pillado, María Teresa; Pita-Fernández, Salvador; Valdés-Cañedo, Francisco; Seijo-Bestilleiro, Rocio; Pértega-Díaz, Sonia; Fernández-Rivera, Constantino; Alonso-Hernández, Ángel; González-Martín, Cristina; Balboa-Barreiro, Vanesa
2017-03-07
The high prevalence of cardiovascular risk factors among the renal transplant population accounts for increased mortality. The aim of this study is to determine the incidence of cardiovascular events and factors associated with cardiovascular events in these patients. An observational ambispective follow-up study of renal transplant recipients (n = 2029) in the health district of A Coruña (Spain) during the period 1981-2011 was completed. Competing risk survival analysis methods were applied to estimate the cumulative incidence of developing cardiovascular events over time and to identify which characteristics were associated with the risk of these events. Post-transplant cardiovascular events are defined as the presence of myocardial infarction, invasive coronary artery therapy, cerebral vascular events, new-onset angina, congestive heart failure, rhythm disturbances, peripheral vascular disease and cardiovascular disease and death. The cause of death was identified through the medical history and death certificate using ICD9 (390-459, except: 427.5, 435, 446, 459.0). The mean age of patients at the time of transplantation was 47.0 ± 14.2 years; 62% were male. 16.5% had suffered some cardiovascular disease prior to transplantation and 9.7% had suffered a cardiovascular event. The mean follow-up period for the patients with cardiovascular event was 3.5 ± 4.3 years. Applying competing risk methodology, it was observed that the accumulated incidence of the event was 5.0% one year after transplantation, 8.1% after five years, and 11.9% after ten years. After applying multivariate models, the variables with an independent effect for predicting cardiovascular events are: male sex, age of recipient, previous cardiovascular disorders, pre-transplant smoking and post-transplant diabetes. 
This study makes it possible to determine, in kidney transplant patients and taking competing events into account, the incidence of post-transplant cardiovascular events and the risk factors for these events. Modifiable risk factors are identified; changes in these factors would have a bearing on the incidence of events.
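The competing-risk survival analysis referred to above estimates cumulative incidence while treating death from other causes as a competing event rather than as censoring. A minimal non-parametric (Aalen-Johansen-style) sketch, on invented data rather than the A Coruña cohort:

```python
def cumulative_incidence(times, events, cause=1):
    """Non-parametric cumulative incidence for one cause in the presence of
    competing risks. events: 0 = censored, 1 = event of interest,
    2 = competing event. Returns a list of (time, CIF) pairs."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0        # overall event-free survival just before t
    cif = 0.0
    out = []
    i, n = 0, len(data)
    while i < n:
        t = data[i][0]
        d_cause = d_all = censored = 0
        while i < n and data[i][0] == t:  # group tied times
            if data[i][1] == 0:
                censored += 1
            else:
                d_all += 1
                if data[i][1] == cause:
                    d_cause += 1
            i += 1
        cif += surv * d_cause / at_risk   # cause-specific hazard mass
        surv *= 1 - d_all / at_risk       # update overall survival
        at_risk -= d_all + censored
        out.append((t, cif))
    return out

# Illustrative data: time in years, 1 = cardiovascular event,
# 2 = death from another cause (competing), 0 = censored.
times  = [1, 2, 2, 3, 4, 5, 6, 7, 8, 9]
events = [1, 2, 0, 1, 0, 2, 1, 0, 0, 0]
for t, p in cumulative_incidence(times, events):
    print(f"t={t}: CIF={p:.3f}")
```

Treating competing deaths as ordinary censoring (plain Kaplan-Meier) would overestimate these incidences, which is why the competing-risk estimator matters here.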
An overview of safety assessment, regulation, and control of hazardous material use at NREL
NASA Astrophysics Data System (ADS)
Nelson, B. P.; Crandall, R. S.; Moskowitz, P. D.; Fthenakis, V. M.
1992-12-01
This paper summarizes the methodology we use to ensure the safe use of hazardous materials at the National Renewable Energy Laboratory (NREL). First, we analyze the processes and the materials used in those processes to identify the hazards presented. Then we study federal, state, and local regulations and apply the relevant requirements to our operations. When necessary, we generate internal safety documents to consolidate this information. We design research operations and support systems to conform to these requirements. Before we construct the systems, we perform a semiquantitative risk analysis on likely accident scenarios. All scenarios presenting an unacceptable risk require system or procedural modifications to reduce the risk. Following these modifications, we repeat the risk analysis to ensure that the respective accident scenarios present an acceptable risk. Once all risks are acceptable, we conduct an operational readiness review (ORR). A management-appointed panel performs the ORR ensuring compliance with all relevant requirements. After successful completion of the ORR, operations can begin.
Why risk is not variance: an expository note.
Cox, Louis Anthony Tony
2008-08-01
Variance (or standard deviation) of return is widely used as a measure of risk in financial investment risk analysis applications, where mean-variance analysis is applied to calculate efficient frontiers and undominated portfolios. Why, then, do health, safety, and environmental (HS&E) and reliability engineering risk analysts insist on defining risk more flexibly, as being determined by probabilities and consequences, rather than simply by variances? This note suggests an answer by providing a simple proof that mean-variance decision making violates the principle that a rational decisionmaker should prefer higher to lower probabilities of receiving a fixed gain, all else being equal. Indeed, simply hypothesizing a continuous increasing indifference curve for mean-variance combinations at the origin is enough to imply that a decisionmaker must find unacceptable some prospects that offer a positive probability of gain and zero probability of loss. Unlike some previous analyses of limitations of variance as a risk metric, this expository note uses only simple mathematics and does not require the additional framework of von Neumann Morgenstern utility theory.
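The note's central point can be reproduced numerically: for a binary gamble paying a fixed gain with probability p, the mean is p·gain and the variance is p(1-p)·gain², so any sufficiently steep mean-variance objective ranks a higher win probability below a lower one. The objective U = mean - k·variance and the value k = 3 below are invented purely to exhibit the violation, not taken from the article.

```python
def mean_var(p, gain=1.0):
    """Binary gamble: win `gain` with probability p, else 0."""
    mean = p * gain
    var = p * (1 - p) * gain ** 2
    return mean, var

def mv_score(p, k=3.0):
    """Illustrative mean-variance objective U = mean - k * variance
    (k is invented; any steep-enough tradeoff shows the same effect)."""
    m, v = mean_var(p)
    return m - k * v

# A higher win probability with zero chance of loss scores WORSE
# under this mean-variance objective:
low, high = mv_score(0.1), mv_score(0.3)
print(f"U(p=0.1) = {low:.3f}, U(p=0.3) = {high:.3f}")
```

Here p = 0.3 stochastically dominates p = 0.1 (same payoffs, strictly higher chance of the gain), yet the mean-variance score prefers p = 0.1, which is exactly the violation of rational preference the note proves.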
Delgado, João; Pollard, Simon; Snary, Emma; Black, Edgar; Prpich, George; Longhurst, Phil
2013-08-01
Exotic animal diseases (EADs) are characterized by their capacity to spread global distances, causing impacts on animal health and welfare with significant economic consequences. We offer a critique of current import risk analysis approaches employed in the EAD field, focusing on their capacity to assess complex systems at a policy level. To address the shortcomings identified, we propose a novel method providing a systematic analysis of the likelihood of a disease incursion, developed by reference to the multibarrier system employed for the United Kingdom. We apply the network model to a policy-level risk assessment of classical swine fever (CSF), a notifiable animal disease caused by the CSF virus. In doing so, we document and discuss a sequence of analyses that describe system vulnerabilities and reveal the critical control points (CCPs) for intervention, reducing the likelihood of U.K. pig herds being exposed to the CSF virus. © 2012 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Liu, Hu-Chen; Liu, Long; Li, Ping
2014-10-01
Failure mode and effects analysis (FMEA) has shown its effectiveness in examining potential failures in products, process, designs or services and has been extensively used for safety and reliability analysis in a wide range of industries. However, its approach to prioritise failure modes through a crisp risk priority number (RPN) has been criticised as having several shortcomings. The aim of this paper is to develop an efficient and comprehensive risk assessment methodology using intuitionistic fuzzy hybrid weighted Euclidean distance (IFHWED) operator to overcome the limitations and improve the effectiveness of the traditional FMEA. The diversified and uncertain assessments given by FMEA team members are treated as linguistic terms expressed in intuitionistic fuzzy numbers (IFNs). Intuitionistic fuzzy weighted averaging (IFWA) operator is used to aggregate the FMEA team members' individual assessments into a group assessment. IFHWED operator is applied thereafter to the prioritisation and selection of failure modes. Particularly, both subjective and objective weights of risk factors are considered during the risk evaluation process. A numerical example for risk assessment is given to illustrate the proposed method finally.
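The IFWA aggregation step described above has a standard closed form (group membership 1 - prod((1 - mu_i)^w_i), group non-membership prod(nu_i^w_i)). A sketch on invented team-member ratings follows; the weights and IFNs are illustrative, not the paper's numerical example, and the subsequent IFHWED ranking step is omitted.

```python
from math import prod

def ifwa(ifns, weights):
    """Intuitionistic fuzzy weighted averaging (IFWA) of IFNs (mu, nu):
    mu = 1 - prod((1 - mu_i)^w_i), nu = prod(nu_i^w_i),
    with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9
    mu = 1.0 - prod((1.0 - m) ** w for (m, _), w in zip(ifns, weights))
    nu = prod(n ** w for (_, n), w in zip(ifns, weights))
    return mu, nu

# Three FMEA team members rate one failure mode's severity as IFNs
# (membership, non-membership), with member weights 0.4/0.35/0.25
# (all numbers hypothetical):
ratings = [(0.7, 0.2), (0.6, 0.3), (0.8, 0.1)]
mu, nu = ifwa(ratings, [0.4, 0.35, 0.25])
print(f"group assessment: mu={mu:.3f}, nu={nu:.3f}")
```

The resulting group IFN per risk factor would then feed the distance-based IFHWED ranking of failure modes.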
Enhanced low-template DNA analysis conditions and investigation of allele dropout patterns.
Hedell, Ronny; Dufva, Charlotte; Ansell, Ricky; Mostad, Petter; Hedman, Johannes
2015-01-01
Forensic DNA analysis applying PCR enables profiling of minute biological samples. Enhanced analysis conditions can be applied to further push the limit of detection, coming with the risk of visualising artefacts and allele imbalances. We have evaluated the consecutive increase of PCR cycles from 30 to 35 to investigate the limitations of low-template (LT) DNA analysis, applying the short tandem repeat (STR) analysis kit PowerPlex ESX 16. Mock crime scene DNA extracts of four different quantities (from around 8-84 pg) were tested. All PCR products were analysed using 5, 10 and 20 capillary electrophoresis (CE) injection seconds. Bayesian models describing allele dropout patterns, allele peak heights and heterozygote balance were developed to assess the overall improvements in EPG quality with altered PCR/CE settings. The models were also used to evaluate the impact of amplicon length, STR marker and fluorescent label on the risk for allele dropout. The allele dropout probability decreased for each PCR cycle increment from 30 to 33 PCR cycles. Irrespective of DNA amount, the dropout probability was not affected by further increasing the number of PCR cycles. For the 42 and 84 pg samples, mainly complete DNA profiles were generated applying 32 PCR cycles. For the 8 and 17 pg samples, the allele dropouts decreased from 100% using 30 cycles to about 75% and 20%, respectively. The results for 33, 34 and 35 PCR cycles indicated that heterozygote balance and stutter ratio were mainly affected by DNA amount, and not directly by PCR cycle number and CE injection settings. We found 32 and 33 PCR cycles with 10 CE injection seconds to be optimal, as 34 and 35 PCR cycles did not improve allele detection and also included CE saturation problems. We find allele dropout probability differences between several STR markers. 
Markers labelled with the fluorescent dyes CXR-ET (red in electropherogram) and TMR-ET (shown as black) generally have higher dropout risks compared with those labelled with JOE (green) and fluorescein (blue). Overall, the marker D10S1248 has the lowest allele dropout probability and D8S1179 the highest. The marker effect is mainly pronounced for 30-32 PCR cycles. Such effects would not be expected if the amplification efficiencies were identical for all markers. Understanding allele dropout risks and the variability in peak heights and balances is important for correct interpretation of forensic DNA profiles. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
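The dropout modelling described above can be illustrated with a minimal logistic sketch. This is not the authors' Bayesian model; the coefficients below are hypothetical, chosen only to show the qualitative behaviour reported in the abstract (dropout probability falling with more template DNA and with additional PCR cycles).

```python
import math

def dropout_probability(dna_pg: float, pcr_cycles: int,
                        b0: float = 6.0, b_dna: float = -1.5,
                        b_cycle: float = -0.4) -> float:
    """Illustrative logistic model: P(dropout) falls with more template
    DNA (on a log scale) and with additional PCR cycles beyond 30.
    Coefficients are hypothetical, not estimated from the study's data."""
    logit = b0 + b_dna * math.log(dna_pg) + b_cycle * (pcr_cycles - 30)
    return 1.0 / (1.0 + math.exp(-logit))

# More template and more cycles should both lower the dropout risk.
low_template = dropout_probability(8, 30)
high_template = dropout_probability(84, 30)
more_cycles = dropout_probability(8, 33)
```

A real analysis would also include marker- and dye-specific terms, since the abstract reports clear differences in dropout risk between STR markers and fluorescent labels.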
Health risks of energy systems.
Krewitt, W; Hurley, F; Trukenmüller, A; Friedrich, R
1998-08-01
Health risks from fossil, renewable and nuclear reference energy systems are estimated following a detailed impact pathway approach. Using a set of appropriate air quality models and exposure-effect functions derived from the recent epidemiological literature, a methodological framework for risk assessment has been established and consistently applied across the different energy systems, including the analysis of consequences from a major nuclear accident. A wide range of health impacts resulting from increased air pollution and ionizing radiation is quantified, and the transferability of results derived from specific power plants to a more general context is discussed.
Suchard, Marc A; Zorych, Ivan; Simpson, Shawn E; Schuemie, Martijn J; Ryan, Patrick B; Madigan, David
2013-10-01
The self-controlled case series (SCCS) offers potential as a statistical method for risk identification involving medical products from large-scale observational healthcare data. However, analytic design choices remain open in encoding longitudinal health records into the SCCS framework, and its risk identification performance across real-world databases is unknown. To evaluate the performance of SCCS and its design choices as a tool for risk identification in observational healthcare data, we examined the risk identification performance of SCCS across five design choices using 399 drug-health outcome pairs in five real observational databases (four administrative claims and one electronic health records database). In these databases, the pairs involve 165 positive controls and 234 negative controls. We also consider several synthetic databases with known relative risks between drug-outcome pairs. We evaluate risk identification performance by estimating the area under the receiver-operator characteristics curve (AUC), and bias and coverage probability in the synthetic examples. The SCCS achieves strong predictive performance: twelve of the twenty health outcome-database scenarios return AUCs >0.75 across all drugs. Including all adverse events instead of just the first per patient and applying a multivariate adjustment for concomitant drug use are the most important design choices. However, the SCCS as applied here returns relative risk point-estimates biased towards the null value of 1 with low coverage probability. The SCCS, recently extended to apply a multivariate adjustment for concomitant drug use, offers promise as a statistical tool for risk identification in large-scale observational healthcare databases. Poor estimator calibration dampens enthusiasm, but ongoing work should correct this shortcoming.
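The AUC-based evaluation of positive and negative drug-outcome controls can be sketched with the rank-based (Mann-Whitney) estimator: the probability that a randomly chosen positive control receives a higher risk score than a randomly chosen negative control. The scores below are hypothetical relative-risk estimates, not values from the study.

```python
def auc(positive_scores, negative_scores):
    """Mann-Whitney estimate of the area under the ROC curve:
    the fraction of (positive, negative) pairs in which the positive
    control is ranked above the negative one (ties count half)."""
    wins = 0.0
    for p in positive_scores:
        for n in negative_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positive_scores) * len(negative_scores))

# Hypothetical relative-risk estimates for control drug-outcome pairs.
pos = [2.1, 1.8, 1.5, 3.0, 1.2]   # positive controls (true signals)
neg = [1.0, 0.9, 1.3, 1.1]        # negative controls (no true signal)
```

With these values `auc(pos, neg)` is 0.95, i.e. the method almost always ranks true signals above non-signals.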
Meta-analysis of thirty-two case-control and two ecological radon studies of lung cancer.
Dobrzynski, Ludwik; Fornalski, Krzysztof W; Reszczynska, Joanna
2018-03-01
A re-analysis has been carried out of thirty-two case-control and two ecological studies concerning the influence of radon, a radioactive gas, on the risk of lung cancer. The three mathematically simplest dose-response relationships (models) were tested: constant (zero health effect), linear, and parabolic (linear-quadratic). Health effect end-points reported in the analysed studies are odds ratios or relative risk ratios, related either to morbidity or mortality. In our preliminary analysis, we show that the results of dose-response fitting are qualitatively the same (within uncertainties, given as error bars), whichever of these health effect end-points is applied. Therefore, we deemed it reasonable to aggregate all response data into the so-called Relative Health Factor and jointly analysed such mixed data to obtain better statistical power. In the second part of our analysis, robust Bayesian and classical methods were applied to this combined dataset, selecting different subranges of radon concentrations. In view of substantial differences between the methodologies used by the authors of the case-control and ecological studies, the mathematical relationships (models) were applied mainly to the thirty-two case-control studies. The degree to which the two ecological studies, analysed separately, affect the overall results when combined with the thirty-two case-control studies has also been evaluated. In all, as a result of our meta-analysis of the combined cohort, we conclude that the analysed data concerning radon concentrations below ~1000 Bq/m3 (~20 mSv/year of effective dose to the whole body) do not support the thesis that radon may be a cause of any statistically significant increase in lung cancer incidence.
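The model comparison described above, fitting the simplest dose-response shapes and checking whether a dose term improves the fit over the zero-effect (constant) model, can be sketched with ordinary least squares. The radon concentrations and Relative Health Factor values below are hypothetical; a linear-quadratic fit would proceed analogously with an additional squared term.

```python
def fit_constant(ys):
    """Best-fit constant (zero-effect) model is the mean response.
    Returns (constant, residual sum of squares)."""
    c = sum(ys) / len(ys)
    return c, sum((y - c) ** 2 for y in ys)

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x. Returns (a, b, RSS)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    rss = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    return a, b, rss

# Hypothetical radon concentrations (Bq/m3) and Relative Health Factors.
dose = [50, 200, 400, 600, 800]
rhf = [1.02, 0.98, 1.01, 0.99, 1.00]
```

For data like these, the fitted slope is indistinguishable from zero, which is the situation in which a formal comparison (e.g. by AIC or a Bayesian model posterior) would favour the constant model.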
Rethnam, Ulfin; Yesupalan, Rajam; Gandham, Giri
2008-06-16
A cautious outlook towards neck injuries has been the norm to avoid missing cervical spine injuries. Consequently, there has been an increased use of cervical spine radiography. The Canadian Cervical Spine rule was proposed to reduce unnecessary use of cervical spine radiography in alert and stable patients. Our aim was to see whether applying the Canadian Cervical Spine rule reduced the need for cervical spine radiography without missing significant cervical spine injuries. This was a retrospective study conducted in 2 hospitals. 114 alert and stable patients who had cervical spine radiographs for suspected neck injuries were included in the study. Data on patient demographics, high-risk and low-risk factors as per the Canadian Cervical Spine rule, and cervical spine radiography results were collected and analysed. 28 patients fell into the high-risk category according to the Canadian Cervical Spine rule; 86 patients fell into the low-risk category. If the Canadian Cervical Spine rule had been applied, there would have been a significant reduction in cervical spine radiographs, as 86/114 patients (75.4%) would not have needed a cervical spine radiograph. The 2/114 patients who had significant cervical spine injuries would still have been identified had the Canadian Cervical Spine rule been applied. Applying the Canadian Cervical Spine rule for neck injuries in alert and stable patients would have reduced the use of cervical spine radiographs without missing significant cervical spine injuries, translating into reduced radiation exposure for patients and lower health care costs.
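The decision logic of the Canadian Cervical Spine rule can be sketched roughly as follows. This is a simplified illustration for alert, stable trauma patients, not a clinical tool; the factor names are paraphrased from the published rule, and real application requires the full list of high-risk and low-risk criteria.

```python
def needs_radiography(age, dangerous_mechanism, paresthesias,
                      low_risk_factor_present, can_rotate_45_degrees):
    """Simplified sketch of the Canadian C-Spine rule (illustrative only).
    Step 1: any high-risk factor mandates radiography.
    Step 2: without a low-risk factor, safe assessment of range of
            motion is not possible, so the patient is imaged.
    Step 3: inability to actively rotate the neck 45 degrees left and
            right also leads to imaging."""
    if age >= 65 or dangerous_mechanism or paresthesias:
        return True
    if not low_risk_factor_present:
        return True
    return not can_rotate_45_degrees
```

For example, a 30-year-old with a low-risk presentation who can rotate the neck 45 degrees would be classified as not needing a radiograph, while any patient aged 65 or over is imaged regardless of the remaining factors.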
NASA Astrophysics Data System (ADS)
Ferrero, A. M.; Migliazza, M.; Roncella, R.; Segalini, A.
2011-02-01
The town of Campione del Garda (located on the west coast of Lake Garda) and its access road have historically been subject to rockfall phenomena, with risk to public security in several areas of the coast. This paper presents a study devoted to the determination of risk for the coastal cliffs and the design of mitigation measures. Our study was based on statistical rockfall analysis performed with a commercial code and on stability analysis of rock slopes based on the key block method. Hazards from block kinematics and rock-slope failure are coupled by applying the Rockfall Hazard Assessment Procedure (RHAP). Because of the huge dimensions of the slope and its morphology, the geostructural survey was particularly complicated and demanding. For these reasons, non-contact measurement methods, based on aerial photogrammetry by helicopter, were adopted. A special software program, developed by the authors, was applied for discontinuity identification and orientation measurement. The potential of aerial photogrammetric surveying in rock mechanics applications, and the improvement it brings to rock mass knowledge, are analysed in the article.
The Development of the Problematic Online Gaming Questionnaire (POGQ)
Demetrovics, Zsolt; Urbán, Róbert; Nagygyörgy, Katalin; Farkas, Judit; Griffiths, Mark D.; Pápay, Orsolya; Kökönyei, Gyöngyi; Felvinczi, Katalin; Oláh, Attila
2012-01-01
Background Online gaming has become increasingly popular. However, this has led to concerns that these games might induce serious problems and/or lead to dependence for a minority of players. Aim: The aim of this study was to uncover and operationalize the components of problematic online gaming. Methods A total of 3415 gamers (90% males; mean age 21 years) were recruited through online gaming websites. A combined method of exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) was applied. Latent profile analysis was applied to identify persons at risk. Results EFA revealed a six-factor structure in the background of problematic online gaming that was also confirmed by CFA. For the assessment of the identified six dimensions – preoccupation, overuse, immersion, social isolation, interpersonal conflicts, and withdrawal – the 18-item Problematic Online Gaming Questionnaire (POGQ) proved to be exceedingly suitable. Based on the latent profile analysis, 3.4% of the gamer population was considered to be at high risk, while another 15.2% was moderately problematic. Conclusions The POGQ seems to be an adequate measurement tool for the differentiated assessment of gaming-related problems on six subscales. PMID:22590541
Risk analysis for autonomous underwater vehicle operations in extreme environments.
Brito, Mario Paulo; Griffiths, Gwyn; Challenor, Peter
2010-12-01
Autonomous underwater vehicles (AUVs) are used increasingly to explore hazardous marine environments. Risk assessment for such complex systems is based on subjective judgment and expert knowledge as much as on hard statistics. Here, we describe the use of a risk management process tailored to AUV operations, the implementation of which requires the elicitation of expert judgment. We conducted a formal judgment elicitation process where eight world experts in AUV design and operation were asked to assign a probability of AUV loss given the emergence of each fault or incident from the vehicle's life history of 63 faults and incidents. After discussing methods of aggregation and analysis, we show how the aggregated risk estimates obtained from the expert judgments were used to create a risk model. To estimate AUV survival with mission distance, we adopted a statistical survival function based on the nonparametric Kaplan-Meier estimator. We present theoretical formulations for the estimator, its variance, and confidence limits. We also present a numerical example where the approach is applied to estimate the probability that the Autosub3 AUV would survive a set of missions under Pine Island Glacier, Antarctica in January-March 2009. © 2010 Society for Risk Analysis.
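The nonparametric Kaplan-Meier estimator used above to model AUV survival with mission distance can be sketched directly. The loss and censoring distances below are hypothetical mission records; the study's actual estimates were built from expert-elicited probabilities of loss given each fault.

```python
def kaplan_meier(event_distances, censored_distances):
    """Kaplan-Meier survival estimate over mission distance.
    event_distances: distances at which a vehicle was lost.
    censored_distances: distances at which missions ended successfully.
    Returns (distance, S(distance)) pairs at each loss event."""
    loss_points = sorted(set(event_distances))
    pool = sorted(event_distances + censored_distances)
    survival, curve = 1.0, []
    for d in loss_points:
        n_at_risk = sum(1 for t in pool if t >= d)      # still operating
        n_events = event_distances.count(d)             # losses at d
        survival *= 1.0 - n_events / n_at_risk          # KM product step
        curve.append((d, survival))
    return curve

# Hypothetical records: two losses and four successfully completed
# (censored) missions, distances in km.
losses = [120.0, 340.0]
completed = [200.0, 400.0, 500.0, 650.0]
```

Here the estimated survival steps from 1.0 down to 5/6 at 120 km (one loss among six missions at risk) and to 0.625 at 340 km (one loss among four still at risk). Greenwood's formula would add the variance and confidence limits mentioned in the abstract.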
Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis
Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F.; Mt-Isa, Shahrul; Luo, Sheng
2018-01-01
Benefit-risk (BR) assessment is essential to ensure the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence is accumulating. Regulators and the International Conference on Harmonization (ICH) recommend performing a periodic benefit-risk evaluation report (PBRER) through the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) are combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively, and accounts for the uncertainty of clinical measurements and imprecise or incomplete preference information of decision makers. We apply our approach to two real examples in a post-hoc way for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and may thus be an important complement to the current structured benefit-risk assessment (sBRA) framework, improving the transparency and consistency of the decision-making process. PMID:29505866
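The SMAA component can be sketched with a small Monte Carlo routine that samples uncertain criterion values and random preference weights, then counts how often each drug ranks first (a first-rank acceptability index). The two drugs, their scores, and the noise level below are hypothetical; the paper's framework additionally feeds the criterion values from a Bayesian meta-analysis.

```python
import random

def smaa_first_rank(criteria_means, n_samples=20000, noise=0.05, seed=1):
    """Minimal stochastic multi-criteria acceptability sketch.
    criteria_means: {drug: [benefit score, safety score]} in [0, 1].
    Samples measurement noise and random criterion weights, returning
    the fraction of samples in which each drug ranks first."""
    rng = random.Random(seed)
    drugs = list(criteria_means)
    wins = {d: 0 for d in drugs}
    for _ in range(n_samples):
        w = rng.random()
        weights = [w, 1.0 - w]              # random preference weights
        scores = {}
        for d in drugs:
            vals = [v + rng.gauss(0, noise) for v in criteria_means[d]]
            scores[d] = sum(wi * vi for wi, vi in zip(weights, vals))
        wins[max(scores, key=scores.get)] += 1
    return {d: wins[d] / n_samples for d in drugs}

# Hypothetical drugs: A dominates B on both benefit and safety.
acceptability = smaa_first_rank({"A": [0.8, 0.9], "B": [0.6, 0.7]})
```

Because drug A dominates on both criteria, its first-rank acceptability index is close to 1 for essentially any weighting; non-dominated alternatives would instead split the index according to which weightings favour them.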
Tarafdar, Abhrajyoti; Sinha, Alok
2017-10-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons in soils and sediments was conducted using a probabilistic approach from a national perspective. Published monitoring data of polycyclic aromatic hydrocarbons present in soils and sediments at different study points across India were collected and converted to their corresponding BaP equivalent concentrations. These BaP equivalent concentrations were used to evaluate comprehensive cancer risk for two different age groups. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of the risk estimation. The analysis yields 90th-percentile cancer risk values of 1.770E-5 for children and 3.156E-5 for adults at heavily polluted site soils. Overall carcinogenic risks of polycyclic aromatic hydrocarbons in soils of India were mostly within acceptable limits. However, the food ingestion exposure route for sediments places them in a high-risk zone: the 90th-percentile risk values from sediments are 7.863E-05 for children and 3.999E-04 for adults. Sensitivity analysis reveals exposure duration and the relative skin adherence factor for soil as the most influential parameters of the assessment, followed by the BaP equivalent concentration of polycyclic aromatic hydrocarbons. For sediments, the biota-to-sediment accumulation factor of fish in terms of BaP has the greatest influence on the total outcome, followed by BaP equivalent concentration and exposure duration. Individual exposure route analysis showed dermal contact for soils and food ingestion for sediments as the main exposure pathways. Some specific locations, such as the surrounding areas of Bhavnagar, Raniganj, Sunderban, Raipur, and Delhi, demand potential strategies for carcinogenic risk management and reduction. The current study is probably the first attempt to provide information on the carcinogenic risk of polycyclic aromatic hydrocarbons in soils and sediments across India.
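The probabilistic risk estimation can be sketched with a Monte Carlo simulation of a generic incremental lifetime cancer risk (ILCR) formula for one pathway (soil ingestion). All parameter distributions below are hypothetical screening values, not those used in the study, which combined several pathways and age groups.

```python
import random

def simulate_ilcr(n=10000, seed=7):
    """Monte Carlo sketch of an incremental lifetime cancer risk for
    soil ingestion, using the generic screening formula
    ILCR = (C * IR * EF * ED * CSF * CF) / (BW * AT).
    Distributions are hypothetical, for illustration only."""
    rng = random.Random(seed)
    risks = []
    for _ in range(n):
        c = rng.lognormvariate(0.0, 0.5)   # BaP-equivalent conc., mg/kg
        ir = rng.uniform(50, 200)          # soil ingestion rate, mg/day
        ef = 350.0                         # exposure frequency, days/yr
        ed = rng.uniform(6, 30)            # exposure duration, years
        csf = 7.3                          # BaP cancer slope factor
        bw = rng.gauss(60, 8)              # body weight, kg
        at = 70 * 365.0                    # averaging time, days
        cf = 1e-6                          # mg-to-kg conversion factor
        risks.append(c * ir * ef * ed * csf * cf / (bw * at))
    risks.sort()
    return risks[int(0.9 * n)]             # 90th-percentile risk

p90 = simulate_ilcr()
```

Reporting the 90th percentile of the simulated distribution mirrors the "90th-percentile cancer risk value" quoted in the abstract; a sensitivity analysis would then rank the input parameters by their correlation with the output risk.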
Global computer-assisted appraisal of osteoporosis risk in Asian women: an innovative study.
Chang, Shu F; Hong, Chin M; Yang, Rong S
2011-05-01
To develop a computer-assisted appraisal system of osteoporosis that can predict osteoporosis health risk in community-dwelling women, and to use it in an empirical analysis of the risk in Asian women. As the literature indicates, health risk assessment tools are generally applied in clinical practice for patient diagnosis. However, few studies have explored how to assist community-dwelling women to understand their risk of osteoporosis without invasive data. A longitudinal, evidence-based study. The first stage of this study was to establish a system that combines expertise in nursing, medicine and information technology. This part includes information from random samples (n = 700), including data on bone mineral density, osteoporosis risk factors, knowledge, beliefs and behaviour, which are used as the health risk appraisal system database. The second stage was to conduct an empirical study. The relative risks of osteoporosis of the participants (n = 300) were determined with the system. The participants classified as at risk were randomly grouped into experimental and control groups, and each group was treated using different nursing intervention methods. The sensitivity and specificity of the analytical tools were 75%. In the empirical study, the analysis results indicate that the prevalence of osteoporosis was 14.0%. The data indicate that strategic application of multiple nursing interventions can promote osteoporosis prevention knowledge in high-risk women and enhance the effectiveness of preventive action. The system can also provide people in remote areas or with insufficient medical resources a simple and effective means of managing health risk, and implements the idea of self-evaluation and self-care among community-dwelling women at home, to achieve the final goal of early detection and early treatment of osteoporosis. This study developed a useful approach for providing Asian women with a reliable, valid, convenient and economical self-health-management model. 
Health care professionals can explore the use of advanced information systems and nursing interventions to increase the effectiveness of osteoporosis prevention programmes for women. © 2011 Blackwell Publishing Ltd.
Schinasi, Leah H; De Roos, Anneclaire J; Ray, Roberta M; Edlefsen, Kerstin L; Parks, Christine G; Howard, Barbara V; Meliker, Jaymie R; Bonner, Matthew R; Wallace, Robert B; LaCroix, Andrea Z
2015-11-01
Relationships of farm history and insecticide exposure at home or work with lymphohematopoietic (LH) neoplasm risk were investigated in a large prospective cohort of US women. In questionnaires, women self-reported a history of living or working on a farm, personally mixing or applying insecticides, insecticide application in the home or workplace by a commercial service, and treating pets with insecticides. Relationships with non-Hodgkin lymphoma (NHL), chronic lymphocytic leukemia/small lymphocytic lymphoma (CLL/SLL), diffuse large B-cell lymphoma (DLBCL), follicular lymphoma, plasma cell neoplasms, and myeloid leukemia were investigated using Cox proportional hazards models. Age and farming history were explored as effect modifiers. The analysis included 76,493 women and 822 NHL cases. Women who ever lived or worked on a farm had 1.12 times the risk of NHL (95% confidence interval [CI] = 0.95-1.32) compared to those who did not. Women who reported that a commercial service had ever applied insecticides in their immediate surroundings had 65% higher risk of CLL/SLL (95% CI = 1.15-2.38). Women aged less than 65 years who ever applied insecticides had 87% higher risk of DLBCL (95% CI = 1.13-3.09). Insecticide exposures may contribute to risk of CLL/SLL and DLBCL. Future studies should examine relationships of LH subtypes with specific types of household insecticides. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Giosa, L.; Margiotta, M. R.; Sdao, F.; Sole, A.; Albano, R.; Cappa, G.; Giammatteo, C.; Pagliuca, R.; Piccolo, G.; Statuto, D.
2009-04-01
The Environmental Engineering Faculty of the University of Basilicata offers a higher-level course for students in the field of natural hazards. The curriculum provides expertise in the prediction, prevention and management of earthquake risk, hydrologic-hydraulic risk, and geomorphological risk. These skills contribute to the training of specialists who, as well as having a thorough knowledge of the genesis and phenomenology of natural risks, know how to interpret, evaluate and monitor the dynamics of the environment and the territory. In addition to basic training in mathematics and physics, the course of study provides specific lessons on seismic and structural dynamics, environmental and computational hydraulics, hydrology and applied hydrogeology. In particular, the course organizes two connected examination modules: Laboratory of hydrologic and hydraulic risk management, and Applied geomorphology. These modules foresee the development and resolution of natural hazard problems through the study of a real natural disaster. In the last year, the work project concerned the collapse, on 19 July 1985, of two fluorspar tailings (decantation) basins serving mines in the Stava Valley, northern Italy. During the course, data and event information were collected, a guided tour of the sites of the disaster was organized, and finally mathematical models were applied to simulate the disaster and the results were analysed. The student work was presented in a public workshop.
NASA Astrophysics Data System (ADS)
Baruffini, Mirko
2010-05-01
Due to the topographical conditions in Switzerland, the highways and railway lines are frequently exposed to natural hazards such as rockfalls, debris flows, landslides, avalanches and others. With the rising incidence of these natural hazards, protection measures become an important political issue. However, they are costly, and maximal protection is most probably not economically feasible. Furthermore, risks are distributed in space and time. Consequently, important decision problems arise for public-sector decision makers. This calls for a high level of surveillance and preservation along the transalpine lines. Efficient protection alternatives can consequently be obtained by applying the concept of integral risk management. Risk analysis, as the central part of risk management, has gradually become a generally accepted approach for the assessment of current and future scenarios (Loat & Zimmermann 2004). The procedure aims at risk reduction, which can be reached by conventional mitigation on the one hand and the implementation of land-use planning on the other: a combination of active and passive mitigation measures is applied to prevent damage to buildings, people and infrastructures. A Geographical Information System adapted to run with a tool developed for risk analysis makes it possible to survey the data in time and space, yielding an important system for managing natural risks. As a framework, we adopt the Swiss system for risk analysis of gravitational natural hazards (BUWAL 1999). It offers a complete framework for the analysis and assessment of risks due to natural hazards, ranging from hazard assessment for gravitational natural hazards, such as landslides, collapses, rockfalls, floods, debris flows and avalanches, to vulnerability assessment and risk analysis, and the integration into land-use planning at the cantonal and municipal level. The scheme is limited to the direct consequences of natural hazards. 
Thus, we develop a system which integrates the procedures for a complete risk analysis in a Geographic Information System (GIS) toolbox, to be applied to our testbed, the Alps-crossing corridor of St. Gotthard. The simulation environment is developed within ArcObjects, the development platform for ArcGIS. The topic of ArcObjects usually emerges when users realize that programming ArcObjects can actually reduce the amount of repetitive work, streamline the workflow, and even produce functionalities that are not easily available in ArcGIS. We have adopted Visual Basic for Applications (VBA) for programming ArcObjects. Because VBA is already embedded within ArcMap and ArcCatalog, it is convenient for ArcGIS users to program ArcObjects in VBA. Our tool visualises data obtained from the analysis of historical records (aerial photo imagery, field surveys, documentation of past events) or from environmental modelling (estimates of the area affected by a given event), together with event attributes such as route number and route position, and thematic maps. As a result of this step, the record appears in the WebGIS. The user can select a specific area for an overview of previous hazards in the region. After performing the analysis, a double click on the visualised infrastructure opens the corresponding results. The constantly updated risk maps show all sites that require more protection against natural hazards. The final goal of our work is to offer a versatile tool for risk analysis which can be applied to different situations. Today our GIS application mainly centralises the documentation of natural hazards. Additionally, the system offers information about natural hazards along the Gotthard line. It is very flexible and can be used as a simple program to model the expansion of natural hazards, as a program to quantitatively estimate risks, or for detailed analysis at the municipality level. The tool is extensible and can be expanded with additional modules. 
The initial results of the experimental case study show how useful a GIS-based system can be for effective and efficient disaster response management. In the coming years our GIS application will become a database containing all the information needed for the evaluation of risk sites along the Gotthard line. Our GIS application can help the technical management to decide on protection measures because, in addition to the visualisation, tools for spatial data analysis will be available. REFERENCES Bründl M. (Ed.) 2009: Risikokonzept für Naturgefahren - Leitfaden. Nationale Plattform für Naturgefahren PLANAT, Bern. 416 S. BUWAL 1999: Risikoanalyse bei gravitativen Naturgefahren - Methode, Fallbeispiele und Daten (Risk analyses for gravitational natural hazards). Bundesamt für Umwelt, Wald und Landschaft (BUWAL). Umwelt-Materialien Nr. 107, 1-244. Loat, R. & Zimmermann, M. 2004: La gestion des risques en Suisse (Risk management in Switzerland). In: Veyret, Y., Garry, G., Meschinet de Richemont, N. & Armand Colin (eds): Risques naturels et aménagement en Europe, Colloque Arche de la Défense, 22-24 octobre 2002, 108-120. Maggi R. et al. 2009: Evaluation of the optimal resilience for vulnerable infrastructure networks. An interdisciplinary pilot study on the transalpine transportation corridors, NRP 54 "Sustainable Development of the Built Environment", Project Nr. 405 440, Final Scientific Report, Lugano.
Meyer, Rüdiger; Freitag-Wolf, Sandra; Blindow, Silke; Büning, Jürgen; Habermann, Jens K
2017-02-01
Cancer risk assessment for ulcerative colitis patients by evaluating histological changes through colonoscopy surveillance is still challenging. Thus, additional parameters of high prognostic impact for the development of colitis-associated carcinoma are necessary. This meta-analysis was conducted to clarify the value of aneuploidy as a predictor of individual cancer risk compared with current surveillance parameters. A systematic web-based search identified studies published in English that addressed the relevance of the ploidy status for individual cancer risk during surveillance in comparison to neoplastic mucosal changes. The resulting data were included in a meta-analysis, and odds ratios (OR) were calculated for aneuploidy, dysplasia, and aneuploidy plus dysplasia. Twelve studies addressing the relevance of aneuploidy compared to dysplasia were comprehensively evaluated and further used for the meta-analysis. The meta-analysis revealed that aneuploidy (OR 5.31 [95 % CI 2.03, 13.93]) is an equally effective parameter for cancer risk assessment in ulcerative colitis patients as dysplasia (OR 4.93 [1.61, 15.11]). Strikingly, the combined assessment of dysplasia and aneuploidy is superior to applying each parameter alone (OR 8.99 [3.08, 26.26]). This meta-analysis reveals that aneuploidy is as effective a parameter for individual cancer risk assessment in ulcerative colitis as the detection of dysplasia. More importantly, the combined assessment of dysplasia and aneuploidy outperforms the use of each parameter alone. We suggest image cytometry for ploidy assessment to become an additional feature of consensus criteria to individually assess cancer risk in UC.
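Pooling study-level odds ratios as in a meta-analysis of this kind can be sketched with fixed-effect inverse-variance weighting, recovering the standard error of each log OR from its reported 95% CI. The two entries below reuse the abstract's summary estimates purely as example inputs; the actual meta-analysis pooled twelve individual studies per parameter.

```python
import math

def pooled_or(ors_with_cis):
    """Fixed-effect inverse-variance pooling of odds ratios.
    Each entry is (OR, lower 95% CI, upper 95% CI); the standard
    error of log(OR) is recovered from the CI width as
    (ln(upper) - ln(lower)) / (2 * 1.96)."""
    num = den = 0.0
    for or_, lo, hi in ors_with_cis:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2               # inverse-variance weight
        num += w * math.log(or_)
        den += w
    return math.exp(num / den)

# Summary estimates quoted in the abstract, used here as example input:
studies = [(5.31, 2.03, 13.93),   # aneuploidy
           (4.93, 1.61, 15.11)]   # dysplasia
```

The pooled estimate necessarily lands between the individual ORs, weighted towards the more precise (narrower-CI) study. A random-effects variant would additionally estimate between-study variance.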
Madsen, Ida E H; Hannerz, Harald; Nyberg, Solja T; Magnusson Hanson, Linda L; Ahola, Kirsi; Alfredsson, Lars; Batty, G David; Bjorner, Jakob B; Borritz, Marianne; Burr, Hermann; Dragano, Nico; Ferrie, Jane E; Hamer, Mark; Jokela, Markus; Knutsson, Anders; Koskenvuo, Markku; Koskinen, Aki; Leineweber, Constanze; Nielsen, Martin L; Nordin, Maria; Oksanen, Tuula; Pejtersen, Jan H; Pentti, Jaana; Salo, Paula; Singh-Manoux, Archana; Suominen, Sakari; Theorell, Töres; Toppinen-Tanner, Salla; Vahtera, Jussi; Väänänen, Ari; Westerholm, Peter J M; Westerlund, Hugo; Fransson, Eleonor; Heikkilä, Katriina; Virtanen, Marianna; Rugulies, Reiner; Kivimäki, Mika
2013-01-01
Previous studies have shown that gainfully employed individuals with high work demands and low control at work (denoted "job strain") are at increased risk of common mental disorders, including depression. Most existing studies have, however, measured depression using self-rated symptom scales that do not necessarily correspond to clinically diagnosed depression. In addition, a meta-analysis from 2008 indicated publication bias in the field. This study protocol describes the planned design and analyses of an individual participant data meta-analysis, to examine whether job strain is associated with an increased risk of clinically diagnosed unipolar depression based on hospital treatment registers. The study will be based on data from approximately 120,000 individuals who participated in 14 studies on work environment and health in 4 European countries. The self-reported working conditions data will be merged with national registers on psychiatric hospital treatment, primarily hospital admissions. Study-specific risk estimates for the association between job strain and depression will be calculated using Cox regressions. The study-specific risk estimates will be pooled using random effects meta-analysis. The planned analyses will help clarify whether job strain is associated with an increased risk of clinically diagnosed unipolar depression. As the analysis is based on pre-planned study protocols and an individual participant data meta-analysis, the pooled risk estimates will not be influenced by selective reporting and publication bias. However, the results of the planned study may only pertain to severe cases of unipolar depression, because of the outcome measure applied.
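The pooling step described in the protocol (study-specific estimates combined by random-effects meta-analysis) is commonly implemented with the DerSimonian-Laird estimator. A minimal sketch, using invented log hazard ratios rather than data from any of the 14 cohorts:

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird) of study-level estimates,
    e.g. log hazard ratios with their within-study variances."""
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

With heterogeneous inputs the estimator inflates the study weights' variances by tau-squared, widening the pooled confidence interval, which is exactly why a random-effects model is the conservative choice for multi-cohort data.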
Innovative neuro-fuzzy system of smart transport infrastructure for road traffic safety
NASA Astrophysics Data System (ADS)
Beinarovica, Anna; Gorobetz, Mikhail; Levchenkov, Anatoly
2017-09-01
The proposed study describes the application of neural networks and fuzzy logic in transport control to improve safety through the evaluation of accident risk by intelligent infrastructure devices. Risk is evaluated against multiple criteria: danger, changeability, and the influence of changes on risk increase. Neuro-fuzzy algorithms are described and proposed for the task. The novelty of the proposed system is supported by a thorough analysis of known studies in the field. The structure of the neuro-fuzzy system for risk evaluation and its mathematical model are described in the paper. A simulation model of the intelligent devices for transport infrastructure is proposed to simulate different situations, assess the risks, and propose possible actions for the infrastructure or vehicles to minimize the risk of accidents.
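The paper's neuro-fuzzy system is not specified in detail in this abstract. Purely as an illustration of the fuzzy-logic half of such a scheme, a toy Mamdani-style rule base combining two of the stated criteria (danger and changeability, both assumed normalised to [0, 1]; membership shapes and rules are invented) might look like:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def risk_level(danger, changeability):
    """Toy Mamdani-style inference on two criteria scaled to [0, 1].
    Rules: both high -> high risk; both low -> low risk."""
    hi_d = tri(danger, 0.4, 1.0, 1.6)         # membership in 'high danger'
    lo_d = tri(danger, -0.6, 0.0, 0.6)        # membership in 'low danger'
    hi_c = tri(changeability, 0.4, 1.0, 1.6)
    lo_c = tri(changeability, -0.6, 0.0, 0.6)
    hi_r = min(hi_d, hi_c)                    # rule strength, AND = min
    lo_r = min(lo_d, lo_c)
    if hi_r + lo_r == 0.0:
        return 0.5                            # no rule fires: indifferent
    return hi_r / (hi_r + lo_r)               # centroid of outputs at 1 and 0
```

In the paper's system the rule strengths would additionally be tuned by a neural network; this sketch only shows the fuzzification and rule-combination steps.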
Motamedzade, Majid; Ashuri, Mohammad Reza; Golmohammadi, Rostam; Mahjub, Hossein
2011-06-13
During recent decades, numerous observational methods have been developed to assess the risk factors of work-related musculoskeletal disorders (WMSDs). Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods in this field. This study aimed to compare the ergonomic risk assessment outputs of QEC and REBA in terms of agreement in the distribution of postural loading scores based on an analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied. All jobs were observed by a trained occupational health practitioner, and job information was collected to support completion of the ergonomic risk assessment tools, including QEC and REBA. The results revealed a significant correlation between the final scores (r = 0.731) and the action levels (r = 0.893) of the two methods. Comparison of the action levels and final scores of the two methods showed no significant difference among working departments. Most of the studied postures received low or moderate risk levels in the QEC assessment (low risk = 20%, moderate risk = 50%, high risk = 30%) and in the REBA assessment (low risk = 15%, moderate risk = 60%, high risk = 25%). There is a significant correlation between the two methods: they correlate strongly in identifying risky jobs and in determining the potential risk for the incidence of WMSDs. Researchers may therefore apply the two methods interchangeably for postural risk assessment in appropriate working environments.
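The agreement figures above (r = 0.731 for final scores, r = 0.893 for action levels) are plain Pearson correlations. For reference, a dependency-free sketch of the statistic:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists,
    e.g. QEC and REBA scores for the same jobs."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5
```

Applied to the 40 jobs' paired scores, a value near 0.7-0.9 as reported indicates strong monotone agreement between the two assessment tools.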
Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas
NASA Astrophysics Data System (ADS)
Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.
In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas alongside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. The simulated model results are, of course, affected by errors and uncertainties; possible sources are the model structure, model parameters and input data. Thus, after model validation (comparison of the simulated flood extent to the observed extent taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties in the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions of the minimum DEM resolution required for flood simulation and of the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Socio-economic information and the monetary transfer functions required for a damage risk analysis in particular show high uncertainty. This study therefore helps to identify the weak points of the flood risk and damage risk assessment procedure.
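The Monte Carlo step described above can be sketched in a few lines: perturb each DEM cell with a Gaussian error representing the DEM's vertical accuracy and count how often the cell ends up below the water level. The grid, water level and error sigma below are invented, and a real run would perturb a 2-D raster rather than a short list:

```python
import random

def flood_probability(dem, water_level, sigma, n_runs=2000, seed=7):
    """Per-cell flooding probability: each DEM elevation is perturbed with a
    N(0, sigma) error (sigma = assumed DEM accuracy) and the cell counts as
    flooded when the perturbed elevation falls below the water level."""
    rng = random.Random(seed)
    counts = [0] * len(dem)
    for _ in range(n_runs):
        for i, z in enumerate(dem):
            if z + rng.gauss(0.0, sigma) < water_level:
                counts[i] += 1
    return [c / n_runs for c in counts]
```

Cells far below the water level converge to probability 1, cells far above to 0, and cells near the water surface to intermediate values, which is precisely the flooding-probability map the study derives from its simulation ensemble.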
Statistical analysis of the uncertainty related to flood hazard appraisal
NASA Astrophysics Data System (ADS)
Notaro, Vincenza; Freni, Gabriele
2015-12-01
The estimation of flood hazard frequency statistics for an urban catchment is of great interest in practice: it provides an evaluation of potential flood risk and related damage and supports decision making for flood risk management. Flood risk is usually defined as a function of the probability that a system deficiency causes flooding (hazard) and the expected damage due to the flooding magnitude (damage), taking into account both the exposure and the vulnerability of the goods at risk. The expected flood damage can be evaluated by an a priori estimation of the potential damage caused by flooding or by interpolating real damage data. With regard to flood hazard appraisal, several procedures propose identifying a hazard indicator (HI), such as flood depth or the combination of flood depth and velocity, and assessing the flood hazard of the analyzed area by comparing the HI variables with user-defined threshold values or curves (penalty curves or matrices). However, flooding data are usually unavailable or too piecemeal to allow a reliable flood hazard analysis, so hazard analysis is often performed by means of mathematical simulations aimed at evaluating water levels and flow velocities over the catchment surface. As a result, a great part of the uncertainty intrinsic to flood risk appraisal can be attributed to the hazard evaluation, owing to the uncertainty inherent in modeling results and to the subjectivity of the user-defined hazard thresholds applied to link flood depth to a hazard level. In the present work, a statistical methodology is proposed for evaluating and reducing the uncertainties connected with hazard level estimation. The methodology has been applied to a real urban watershed as a case study.
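The user-defined-threshold step whose subjectivity the abstract highlights can be sketched as follows. Here the hazard indicator is taken as HI = depth x velocity (one common choice, not necessarily the authors'), and the penalty threshold values are invented:

```python
def hazard_level(depth, velocity, thresholds=((0.25, "low"), (0.75, "medium"))):
    """Map a hazard indicator (here HI = depth * velocity, one common choice)
    to a hazard class via user-defined penalty thresholds; any HI above the
    last threshold is classed as high hazard."""
    hi = depth * velocity
    for limit, label in thresholds:
        if hi <= limit:
            return label
    return "high"
```

Because the class assigned to a given simulated depth and velocity depends entirely on the chosen threshold values, shifting them propagates directly into the hazard map, which is the uncertainty source the proposed statistical methodology sets out to quantify.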
NASA Astrophysics Data System (ADS)
Varzakas, Theodoros H.; Arvanitoyannis, Ioannis S.
A Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of poultry slaughtering and manufacturing. In this work, a comparison of ISO 22000 analysis with HACCP is carried out for poultry slaughtering, processing and packaging. Critical Control Points and Prerequisite Programs (PrPs) have been identified and implemented in the cause-and-effect diagram (also known as an Ishikawa, tree or fishbone diagram).
Novel risk score of contrast-induced nephropathy after percutaneous coronary intervention.
Ji, Ling; Su, XiaoFeng; Qin, Wei; Mi, XuHua; Liu, Fei; Tang, XiaoHong; Li, Zi; Yang, LiChuan
2015-08-01
Contrast-induced nephropathy (CIN) after percutaneous coronary intervention (PCI) is a major cause of acute kidney injury. In this study, we established a comprehensive risk score model for assessing the risk of CIN after the PCI procedure that can be easily used in a clinical environment. A total of 805 PCI patients, divided into an analysis cohort (70%) and a validation cohort (30%), were enrolled retrospectively. Risk factors for CIN were identified using univariate analysis and multivariate logistic regression in the analysis cohort. The risk score model was developed from the multiple regression coefficients, and the sensitivity and specificity of the new risk score system were validated in the validation cohort. Comparisons between the new risk score model and previously reported models were made. The incidence of post-PCI CIN in the analysis cohort (n = 565) was 12%, and a considerably higher incidence (50%) was observed in patients with chronic kidney disease (CKD). Age >75, body mass index (BMI) >25, myoglobin level, cardiac function level, hypoalbuminaemia, history of CKD, intra-aortic balloon pump (IABP) and peripheral vascular disease (PVD) were identified as independent risk factors for post-PCI CIN. A novel risk score model was established using the multivariate regression coefficients, which showed the highest sensitivity and specificity (0.917, 95% CI 0.877-0.957) compared with previous models. A new post-PCI CIN risk score model was thus developed based on a retrospective study of 805 patients; application of this model might help predict CIN in patients undergoing the PCI procedure. © 2015 Asian Pacific Society of Nephrology.
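A common way to turn multivariate logistic regression output into a bedside score of the kind described is to scale the coefficients and round them to integer points. A sketch with invented coefficients and factor names (not the model actually fitted in this study):

```python
import math

# Hypothetical log-odds coefficients for illustration only; these are NOT
# the coefficients estimated from the 565-patient analysis cohort.
COEFS = {"age_gt_75": 0.9, "bmi_gt_25": 0.5, "ckd_history": 1.8, "iabp": 1.1}
INTERCEPT = -3.0

def risk_points(patient, unit=0.5):
    """Integer risk score: each present factor's coefficient is scaled by
    `unit` (here half a log-odds unit) and rounded to whole points."""
    return sum(round(COEFS[k] / unit) for k, present in patient.items() if present)

def predicted_probability(patient):
    """Probability from the underlying logistic model, for comparison."""
    logit = INTERCEPT + sum(COEFS[k] for k, present in patient.items() if present)
    return 1.0 / (1.0 + math.exp(-logit))
```

Because the points are proportional to the log-odds coefficients, ranking patients by total points approximately preserves the ranking by modelled probability, which is what allows a simple additive score to stand in for the regression at the bedside.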
A UNIFYING CONCEPT FOR ASSESSING TOXICOLOGICAL INTERACTIONS: CHANGES IN SLOPE
Robust statistical methods are important to the evaluation of interactions among chemicals in a mixture. However, different concepts of interaction as applied to the statistical analysis of chemical mixture toxicology data or as used in environmental risk assessment often can ap...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-30
... with 7 CFR part 305 with a minimum absorbed dose of 400 Gy. If the irradiation treatment is applied... attesting that the fruit received the required irradiation treatment. If the irradiation treatment is...
The development of mountain risk governance: challenges for application
NASA Astrophysics Data System (ADS)
Link, S.; Stötter, J.
2015-01-01
The complexity of the management of mountain risks in the Alps has increased considerably since its institutionalisation in the late nineteenth century. In the history of approaches to dealing with mountain risks, four successive paradigms can be distinguished on the basis of key indicators such as guiding principles, characteristic elements and typical instruments: "hazard protection", "hazard management", "risk management" and "risk governance". In this contribution, special attention is paid to the development of hazard zone planning and the growing importance of communication and participation over the course of this transformation. At present, the risk management paradigm has reached maturity, yet in the Alps risk governance frameworks are not yet applied to risks from natural hazards. Based on a historical analysis, the suitability and applicability of general risk governance frameworks in the context of mountain risks are discussed. Necessary adaptations (e.g., in administrative, educational, and legal structures) are proposed for the upcoming transformation towards mountain risk governance.
NASA Astrophysics Data System (ADS)
Mardi Safitri, Dian; Arfi Nabila, Zahra; Azmi, Nora
2018-03-01
Musculoskeletal disorders (MSDs) are among the ergonomic risks arising from manual activity, non-neutral posture and repetitive motion. The purpose of this study is to measure risk and implement ergonomic interventions to reduce the risk of MSDs at the paper pallet assembly work station. Work posture was measured using the Ovako Working Posture Analysis (OWAS) and Rapid Entire Body Assessment (REBA) methods, while work repetitiveness was measured using the Strain Index (SI) method. Assembly process operators were identified as having the highest risk level, with OWAS, Strain Index and REBA values of 4, 20.25 and 11, respectively. Ergonomic improvements are needed to reduce that level of risk. Proposed improvements were developed using the Quality Function Deployment (QFD) method applied with the Axiomatic House of Quality (AHOQ) and a morphological chart. As a result, the risk levels based on the OWAS and REBA scores fell from 4 and 11 to 1 and 2. A biomechanical analysis of the operator also shows decreased values for L4-L5 moment, compression, joint shear and joint moment strength.
Fault tree analysis for system modeling in case of intentional EMI
NASA Astrophysics Data System (ADS)
Genender, E.; Mleczko, M.; Döring, O.; Garbe, H.; Potthast, S.
2011-08-01
The complexity of modern systems on the one hand and the rising threat of intentional electromagnetic interference (IEMI) on the other increase the necessity for systematic risk analysis. Most of the problems cannot be treated deterministically, since slight changes in the configuration (source, position, polarization, ...) can dramatically change the outcome of an event. For that purpose, methods known from probabilistic risk analysis can be applied. One of the most common approaches is fault tree analysis (FTA), which is used to determine the system failure probability and the main contributors to failure. In this paper the fault tree analysis is introduced and a possible application of the method is shown using a small computer network as an example. The constraints of the method are explained and conclusions for further research are drawn.
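Under the usual assumption of independent basic events, gate probabilities in a fault tree combine multiplicatively. A minimal sketch with a hypothetical two-server network (invented for illustration, not the paper's example system):

```python
def or_gate(probs):
    """P(at least one input event occurs), assuming independence."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all input events occur), assuming independence."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical top event: the network fails if the switch fails OR
# both redundant servers fail.
p_switch, p_server1, p_server2 = 0.01, 0.05, 0.05
p_top = or_gate([p_switch, and_gate([p_server1, p_server2])])
```

Evaluating each gate's contribution separately (here 0.01 for the switch versus 0.0025 for the server pair) is exactly how FTA exposes the main contributors to system failure.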
[Mathematic analysis of risk factors influence on occupational respiratory diseases development].
Budkar', L N; Bugaeva, I V; Obukhova, T Iu; Tereshina, L G; Karpova, E A; Shmonina, O G
2010-01-01
The analysis covered 1348 case histories of workers exposed to industrial dust in the Urals region, applying mathematical processing from survival theory together with correlation analysis. The authors studied the influence of various factors (dust concentration, connective tissue dysplasia, smoking habits) on the time for dust-induced diseases to appear. The findings are that occupational diseases develop reliably faster with higher ambient dust concentrations and with connective tissue dysplasia syndrome. Smoking habits do not alter the duration of pneumoconiosis development, but reliably accelerate the development of occupational dust bronchitis.
Random matrix theory filters in portfolio optimisation: A stability and risk assessment
NASA Astrophysics Data System (ADS)
Daly, J.; Crane, M.; Ruskin, H. J.
2008-07-01
Random matrix theory (RMT) filters, applied to covariance matrices of financial returns, have recently been shown to offer improvements to the optimisation of stock portfolios. This paper studies the effect of three RMT filters on the realised portfolio risk, and on the stability of the filtered covariance matrix, using bootstrap analysis and out-of-sample testing. We propose an extension to an existing RMT filter (based on Krzanowski stability), which is observed to reduce risk and increase stability when compared to other RMT filters tested. We also study a scheme for filtering the covariance matrix directly, as opposed to the standard method of filtering correlation, where the latter is found to lower the realised risk, on average, by up to 6.7%. We consider both equally and exponentially weighted covariance matrices in our analysis, and observe that the overall best method out-of-sample was that of the exponentially weighted covariance, with our Krzanowski stability-based filter applied to the correlation matrix. We also find that the optimal out-of-sample decay factors, for both filtered and unfiltered forecasts, were higher than those suggested by Riskmetrics [J.P. Morgan, Reuters, Riskmetrics technical document, Technical Report, 1996. http://www.riskmetrics.com/techdoc.html], with those for the latter approaching a value of α=1. In conclusion, RMT filtering reduced the realised risk, on average, and in the majority of cases when tested out-of-sample, but increased the realised risk on a marked number of individual days, in some cases more than doubling it.
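The eigenvalue-clipping step at the heart of RMT filtering can be sketched as follows. This is the generic Marchenko-Pastur clipping recipe, not the authors' Krzanowski-stability extension, and it assumes T return observations on n unit-variance series:

```python
import numpy as np

def rmt_filter(corr, T):
    """Generic RMT 'clipping' filter: eigenvalues of the correlation matrix
    below the Marchenko-Pastur upper edge are treated as noise and replaced
    by their mean (preserving the trace); the matrix is then rebuilt."""
    n = corr.shape[0]
    lam_max = (1.0 + np.sqrt(n / T)) ** 2  # MP edge for unit-variance series
    vals, vecs = np.linalg.eigh(corr)
    noise = vals < lam_max
    if noise.any():
        vals = vals.copy()
        vals[noise] = vals[noise].mean()   # replace noise band, keep trace
    filtered = (vecs * vals) @ vecs.T      # = vecs @ diag(vals) @ vecs.T
    np.fill_diagonal(filtered, 1.0)        # correlation matrix: unit diagonal
    return filtered
```

Only eigenvalues above the noise edge (typically a handful corresponding to the market and sector modes) survive unchanged, which is what stabilises the subsequent portfolio optimisation.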
Using incident response trees as a tool for risk management of online financial services.
Gorton, Dan
2014-09-01
The article introduces the use of probabilistic risk assessment for modeling the incident response process of online financial services. The main contribution is the creation of incident response trees, using event tree analysis, which provide a visual tool and a systematic way to estimate the probability of a successful incident response process against the currently known risk landscape, making it possible to measure the balance between front-end and back-end security measures. The model is presented using an illustrative example and is then applied to the incident response process of a Swedish bank. Access to relevant data is confirmed, and the applicability and usability of the proposed model are verified using one year of historical data. Potential advantages and possible shortcomings are discussed, referring to both the design phase and the operational phase, and future work is presented. © 2014 Society for Risk Analysis.
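An incident response tree built with event tree analysis reduces, along each branch, to multiplying conditional probabilities. A toy sketch in that spirit (the barrier names and probabilities are invented, not taken from the bank's data):

```python
def p_successful_attack(p_attempt, barriers):
    """Event-tree sketch for an incident response process: an attempted
    attack succeeds only if every response barrier fails in sequence.
    `barriers` maps barrier name -> probability that the barrier works."""
    p_all_barriers_fail = 1.0
    for p_works in barriers.values():
        p_all_barriers_fail *= (1.0 - p_works)
    return p_attempt * p_all_barriers_fail
```

Comparing the result with different barrier probabilities shows where strengthening front-end (e.g. detection) versus back-end (e.g. recovery) measures buys the most risk reduction, which is the balance the article sets out to measure.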
NASA Technical Reports Server (NTRS)
Greenhalgh, Phillip O.
2004-01-01
In the production of each Space Shuttle Reusable Solid Rocket Motor (RSRM), over 100,000 inspections are performed. ATK Thiokol Inc. reviewed these inspections to ensure a robust inspection system is maintained. The principal effort within this endeavor was the systematic identification and evaluation of inspections considered to be single-point. Single-point inspections are those accomplished on components, materials, and tooling by only one person, involving no other check. The purpose was to more accurately characterize risk and ultimately address and/or mitigate risk associated with single-point inspections. After the initial review of all inspections and identification/assessment of single-point inspections, review teams applied a risk prioritization methodology similar to that used in a Process Failure Modes and Effects Analysis to derive a Risk Prioritization Number for each single-point inspection. After the prioritization of risk, all single-point inspection points determined to have significant risk were provided either with risk-mitigating actions or with rationale for acceptance. This effort gave confidence to the RSRM program that the correct inspections are being accomplished, that there is appropriate justification for those that remain as single-point inspections, and that risk mitigation was applied to further reduce the risk of higher-risk single-point inspections. This paper examines the process, results, and lessons learned in identifying, assessing, and mitigating risk associated with single-point inspections accomplished in the production of the Space Shuttle RSRM.
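The Risk Prioritization Number used in such FMEA-style reviews is conventionally the product of severity, occurrence, and detection ratings. A sketch with hypothetical inspection items (these are invented labels, not actual RSRM inspections):

```python
def rpn(severity, occurrence, detection):
    """Risk Prioritization Number: product of three 1-10 ratings
    (higher = worse), as used in FMEA-style prioritisation."""
    for v in (severity, occurrence, detection):
        if not 1 <= v <= 10:
            raise ValueError("ratings must lie in 1..10")
    return severity * occurrence * detection

# Hypothetical single-point inspections, ranked worst-first by RPN.
inspections = {"seal surface check": (9, 3, 4), "tooling torque check": (5, 2, 2)}
ranked = sorted(inspections, key=lambda k: rpn(*inspections[k]), reverse=True)
```

Ranking by RPN is what lets a review team concentrate mitigation effort on the small set of single-point inspections whose failure would be both severe and hard to detect.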
Hansson, Helena; Lagerkvist, Carl Johan
2013-01-01
This study integrated risk-benefit analysis with prospect theory with the overall objective of identifying the type of management behavior represented by farmers’ choices of mastitis control options (MCOs). Two exploratory factor analyses, based on 163 and 175 Swedish farmers, respectively, highlighted attitudes to MCOs related to: (1) grouping cows and applying milking order to prevent spread of existing infection and (2) working in a precautionary way to prevent mastitis occurring. This was interpreted as being based on (1) reactive management behavior on detection of udder-health problems in individual cows and (2) proactive management behavior to prevent mastitis developing. Farmers’ assessments of these MCOs were found to be based on asymmetrical evaluations of risks and benefits, suggesting that farmers’ management behavior depends on their individual reference point. In particular, attitudes to MCOs related to grouping cows and applying milking order to prevent the spread of mastitis once infected cows were detected were stronger in the risk domain than in the benefit domain, in accordance with loss aversion. In contrast, attitudes to MCOs related to working in a precautionary way to prevent cows from becoming infected in the first place were stronger in the benefit domain than in the risk domain, in accordance with reverse loss aversion. These findings are of practical importance for farmers and agribusiness and in public health protection work to reduce the current extensive use of antibiotics in dairy herds. PMID:24372180
Bite the apple, get driven out of the garden: A risky story telling at the ASME town meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Majumdar, K.C.
1994-11-01
Risk, the all-encompassing four-letter word, became a widely used household cliche and an institutional mantra in the nineties. Risk analysis models from the Garden of Eden to the Capitol Hill lawn have made a number of sharp paradigm shifts, evolving as a decision-making tool from individual risk perception to societal risk-based regulation. Risk always coexists with benefit and is arbitrated by costs. Risk-benefit analysis has long been used in business and industry in economic ventures. Only recently has risk management, in its current state of development, evolved into a regulatory tool for controlling large technological systems that have potential impacts on the health and safety of the public and on the sustainability of the ecology and the environment. This paper summarizes the evolution of risk management concepts and models in industry and the regulatory agencies in the US over the last three decades. It also discusses the benefits and limitations of this evolving discipline as it is applied to high-risk technologies, from nuclear power plants and the petrochemical industry to nuclear weapons technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuo, H; Tome, W; FOX, J
2014-06-15
Purpose: To study the feasibility of applying a cancer risk model established from treated patients to predict the risk of recurrence on follow-up mammography after radiation therapy for both the ipsilateral and contralateral breast. Methods: An extensive set of textural feature functions was applied to a set of 196 mammograms from 50 patients. 56 mammograms from 28 patients were used as the training set, 44 mammograms from 22 patients were used as the test set, and the rest were used for prediction. Feature functions included histogram, gradient, co-occurrence matrix, run-length matrix and wavelet energy features. An optimum subset of the feature functions was selected by Fisher coefficient (FO), by mutual information (MI) (up to the top 10 features), or by a method combining FO, MI and principal components (FMP) (up to the top 30 features). One-nearest neighbor (1-NN), linear discriminant analysis (LDA) and nonlinear discriminant analysis (NDA) were utilized to build a risk model of breast cancer from the training set of mammograms at the time of diagnosis. The risk model was then used to predict the risk of recurrence from mammograms taken one year and three years after RT. Results: FMP with NDA had the best classification power in separating the training-set mammograms with lesions from those without lesions, achieving a true positive (TP) rate of 82% compared with 45.5% using FO with 1-NN. The best false positive (FP) rates were 0% and 3.6% in the contralateral breast at 1 and 3 years after RT, and 10.9% in the ipsilateral breast at 3 years after RT. Conclusion: Texture analysis offers high dimensionality for differentiating breast tissue in mammograms. Using NDA to classify mammograms with lesions from those without, rather high TP and low FP rates can be achieved in the mammographic surveillance of patients treated with conservative surgery combined with RT.
Hou, Wen-Hsuan; Kang, Chun-Mei; Ho, Mu-Hsing; Kuo, Jessie Ming-Chuan; Chen, Hsiao-Lien; Chang, Wen-Yin
2017-03-01
To evaluate the accuracy of the inpatient fall risk screening tool and to identify the most critical fall risk factors in inpatients. Variations exist in several screening tools applied in acute care hospitals for examining risk factors for falls and identifying high-risk inpatients. Secondary data analysis. A subset of inpatient data for the period from June 2011-June 2014 was extracted from the nursing information system and adverse event reporting system of an 818-bed teaching medical centre in Taipei. Data were analysed using descriptive statistics, receiver operating characteristic curve analysis and logistic regression analysis. During the study period, 205 fallers and 37,232 nonfallers were identified. The results revealed that the inpatient fall risk screening tool (cut-off point of ≥3) had a low sensitivity level (60%), satisfactory specificity (87%), a positive predictive value of 2·0% and a negative predictive value of 99%. The receiver operating characteristic curve analysis revealed an area under the curve of 0·805 (sensitivity, 71·8%; specificity, 78%). To increase sensitivity, the Youden index suggests 1·5 points as the most suitable cut-off point for the inpatient fall risk screening tool. Multivariate logistic regression analysis revealed a considerably increased fall risk in patients with impaired balance and impaired elimination. Fall risk was also significantly associated with length of hospital stay and with admission to surgical wards. The findings can raise awareness of the two most critical risk factors for falls among future clinical nurses and other healthcare professionals and thus facilitate the development of fall prevention interventions. This study highlights the need to redefine the cut-off point of the inpatient fall risk screening tool to effectively identify inpatients at a high risk of falls. 
Furthermore, inpatients with impaired balance and impaired elimination should be closely monitored by nurses to prevent falling during hospitalisations. © 2016 John Wiley & Sons Ltd.
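The Youden index mentioned above selects the cut-off that maximises sensitivity + specificity - 1 over candidate thresholds. A small sketch on invented screening scores (not the Taipei data set):

```python
def youden_cutoff(scores_pos, scores_neg, candidates):
    """Return the cut-off maximising Youden's J = sensitivity + specificity - 1,
    treating score >= cut-off as a positive (high-risk) screen."""
    best, best_j = None, -1.0
    for c in candidates:
        sens = sum(s >= c for s in scores_pos) / len(scores_pos)
        spec = sum(s < c for s in scores_neg) / len(scores_neg)
        j = sens + spec - 1.0
        if j > best_j:
            best, best_j = c, j
    return best, best_j
```

Lowering the chosen cut-off trades specificity for sensitivity, which is exactly the trade-off behind the study's recommendation to move the screening tool's threshold from 3 down to 1.5 points.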
Probability and possibility-based representations of uncertainty in fault tree analysis.
Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje
2013-01-01
Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
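One standard probability-possibility transformation of the kind the article builds on, due to Dubois and Prade, converts a discrete probability distribution into the most specific possibility distribution that dominates it. A sketch on an invented three-outcome distribution (the article itself works with continuous limiting-relative-frequency probabilities):

```python
def prob_to_poss(probs):
    """Dubois-Prade transformation: turn a discrete probability distribution
    into the most specific possibility distribution that dominates it
    (larger probabilities get larger possibilities; the mode gets 1)."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    poss = [0.0] * len(probs)
    tail = 1.0
    for i in order:
        poss[i] = tail       # possibility = prob. of this outcome or rarer ones
        tail -= probs[i]
    return poss
```

The resulting possibility values always dominate the probabilities they came from, reflecting the fact that a possibility distribution encodes weaker (more imprecise) information than the probability distribution it replaces.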
Labib, Sarah; Bourdon-Lacombe, Julie; Kuo, Byron; Buick, Julie K.; Lemieux, France; Williams, Andrew; Halappanavar, Sabina; Malik, Amal; Luijten, Mirjam; Aubrecht, Jiri; Hyduke, Daniel R.; Fornace, Albert J.; Swartz, Carol D.; Recio, Leslie; Yauk, Carole L.
2015-01-01
Toxicogenomics is proposed to be a useful tool in human health risk assessment. However, a systematic comparison of traditional risk assessment approaches with those applying toxicogenomics has never been done. We conducted a case study to evaluate the utility of toxicogenomics in the risk assessment of benzo[a]pyrene (BaP), a well-studied carcinogen, for drinking water exposures. Our study was intended to compare methodologies, not to evaluate drinking water safety. We compared traditional (RA1), genomics-informed (RA2) and genomics-only (RA3) approaches. RA2 and RA3 applied toxicogenomics data from human cell cultures and mice exposed to BaP to determine if these data could provide insight into BaP's mode of action (MOA) and derive tissue-specific points of departure (POD). Our global gene expression analysis supported that BaP is genotoxic in mice and allowed the development of a detailed MOA. Toxicogenomics analysis in human lymphoblastoid TK6 cells demonstrated a high degree of consistency in perturbed pathways with animal tissues. Quantitatively, the PODs for traditional and transcriptional approaches were similar (liver 1.2 vs. 1.0 mg/kg-bw/day; lung 0.8 vs. 3.7 mg/kg-bw/day; forestomach 0.5 vs. 7.4 mg/kg-bw/day). RA3, which applied toxicogenomics in the absence of apical toxicology data, demonstrates that this approach provides useful information in data-poor situations. Overall, our study supports the use of toxicogenomics as a relatively fast and cost-effective tool for hazard identification, preliminary evaluation of potential carcinogens, and carcinogenic potency, in addition to identifying current limitations and practical questions for future work. PMID:25605026
Continuous Risk Management: A NASA Program Initiative
NASA Technical Reports Server (NTRS)
Hammer, Theodore F.; Rosenberg, Linda
1999-01-01
NPG 7120.5A, "NASA Program and Project Management Processes and Requirements", enacted in April 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with developing and teaching a systems-level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions of risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew
2013-05-01
Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
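The RPN referred to above is the standard FMEA product of three ordinal scores. A minimal sketch of the calculation; the scoring scales and failure modes below are hypothetical illustrations, not taken from the study:

```python
# Illustrative FMEA risk priority number (RPN) calculation.
# RPN = severity x occurrence x detectability, each scored 1-10;
# the scales and failure modes below are hypothetical.

def rpn(severity: int, occurrence: int, detectability: int) -> int:
    """Return the risk priority number for one failure mode."""
    for score in (severity, occurrence, detectability):
        if not 1 <= score <= 10:
            raise ValueError("each score must be between 1 and 10")
    return severity * occurrence * detectability

# Hypothetical failure modes in a manual blood-grouping process.
failure_modes = {
    "sample mislabelled": rpn(9, 4, 5),
    "reagent added to wrong well": rpn(8, 3, 4),
    "result transcribed incorrectly": rpn(7, 5, 3),
}

# Prioritize attention by descending RPN.
for mode, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
    print(mode, score)
```

Ranking failure modes by descending RPN is what directs improvement effort to the riskiest process steps first.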
Spatio-temporal assessment of food safety risks in Canadian food distribution systems using GIS.
Hashemi Beni, Leila; Villeneuve, Sébastien; LeBlanc, Denyse I; Côté, Kevin; Fazil, Aamir; Otten, Ainsley; McKellar, Robin; Delaquis, Pascal
2012-09-01
While geographic information systems (GIS) are widely applied in public health, there have been comparatively few examples of applications that extend to the assessment of risks in food distribution systems. GIS can provide decision makers with strong computing platforms for spatial data management, integration, analysis, querying and visualization. The present report addresses spatio-temporal analyses in a complex food distribution system and defines influence areas as travel time zones generated through road network analysis on a national scale rather than on a community scale. In addition, a dynamic risk index is defined to translate a contamination event into a public health risk as time progresses. More specifically, in this research, GIS is used to map the Canadian produce distribution system, analyze accessibility to contaminated product by consumers, and estimate the level of risk associated with a contamination event over time, as illustrated in a scenario. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix B provides a description of Browns Ferry, Unit 1, plant systems and the failure evaluation of those systems as they apply to accidents at Browns Ferry. Information is presented concerning front-line system fault analysis; support system fault analysis; human error models and probabilities; and generic control circuit analyses.
Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity
Harbin Li; Steven G. McNulty
2007-01-01
Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL...
Relating Data and Models to Characterize Parameter and Prediction Uncertainty
Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...
Greenwood, Taylor J; Lopez-Costa, Rodrigo I; Rhoades, Patrick D; Ramírez-Giraldo, Juan C; Starr, Matthew; Street, Mandie; Duncan, James; McKinstry, Robert C
2015-01-01
The marked increase in radiation exposure from medical imaging, especially in children, has caused considerable alarm and spurred efforts to preserve the benefits but reduce the risks of imaging. Applying the principles of the Image Gently campaign, data-driven process and quality improvement techniques such as process mapping and flowcharting, cause-and-effect diagrams, Pareto analysis, statistical process control (control charts), failure mode and effects analysis, "lean" or Six Sigma methodology, and closed feedback loops led to a multiyear program that has reduced overall computed tomographic (CT) examination volume by more than fourfold and concurrently decreased radiation exposure per CT study without compromising diagnostic utility. This systematic approach involving education, streamlining access to magnetic resonance imaging and ultrasonography, auditing with comparison with benchmarks, applying modern CT technology, and revising CT protocols has led to a more than twofold reduction in CT radiation exposure between 2005 and 2012 for patients at the authors' institution while maintaining diagnostic utility. (©)RSNA, 2015.
Spatial Analysis of Ambient PM2.5 Exposure and Bladder Cancer Mortality in Taiwan
Yeh, Hsin-Ling; Hsu, Shang-Wei; Chang, Yu-Chia; Chan, Ta-Chien; Tsou, Hui-Chen; Chang, Yen-Chen; Chiang, Po-Huang
2017-01-01
Fine particulate matter (PM2.5) is an air pollutant that is receiving intense regulatory attention in Taiwan. In previous studies, the effect of air pollution on bladder cancer has been explored. This study was conducted to elucidate the effect of atmospheric PM2.5 and other local risk factors on bladder cancer mortality based on available 13-year mortality data. Geographically weighted regression (GWR) was applied to estimate and interpret the spatial variability of the relationships between bladder cancer mortality and ambient PM2.5 concentrations, and other variables were covariates used to adjust for the effect of PM2.5. After applying a GWR model, the concentration of ambient PM2.5 showed a positive correlation with bladder cancer mortality in males in northern Taiwan and females in most of the townships in Taiwan. This is the first time PM2.5 has been identified as a risk factor for bladder cancer based on the statistical evidence provided by GWR analysis. PMID:28489042
Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan
2016-07-01
Source water areas are facing many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established aiming at evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. Then the proposed model was applied to assess water pollution risks in the region of Shiyan, in which China's key source water area, the Danjiangkou Reservoir supplying the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have a high risk value in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have a high risk value in terms of agricultural pollution. Overall, the risk values of northern regions close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The results of risk level indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV) and three at the highest risk level (i.e., level V). Risks from industrial discharge are also higher than those from the agricultural sector. It is thus essential to manage the pillar industry of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
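The entropy weight method mentioned above has a standard closed form: column-normalize the indicator matrix, compute each indicator's information entropy, and weight each indicator by its divergence from maximum entropy. A minimal sketch with an invented indicator matrix (not the Shiyan data):

```python
import math

def entropy_weights(matrix):
    """Entropy weight method: rows = alternatives, columns = indicators.
    Assumes all entries are positive, benefit-type values."""
    n = len(matrix)
    m = len(matrix[0])
    k = 1.0 / math.log(n)
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [x / total for x in col]
        # Information entropy of indicator j; zero terms contribute nothing.
        e = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        divergences.append(1.0 - e)
    s = sum(divergences)
    return [d / s for d in divergences]

# Hypothetical matrix: 4 pollution sources x 3 risk indicators.
X = [[0.2, 30, 5],
     [0.8, 10, 9],
     [0.5, 20, 7],
     [0.1, 40, 3]]
weights = entropy_weights(X)
print(weights)  # sums to 1; more dispersed indicators weigh more
```

The weights then multiply the (normalized) indicator values to score each pollution source.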
Higgins, Agnes; Doyle, Louise; Morrissey, Jean; Downes, Carmel; Gill, Ailish; Bailey, Sive
2016-08-01
Despite the articulated need for policies and processes to guide risk assessment and safety planning, limited guidance exists on the processes or procedures to be used to develop such policies, and there is no body of research that examines the quality or content of the risk-management policies developed. The aim of the present study was to analyse the policies of risk and safety management used to guide mental health nursing practice in Ireland. A documentary analysis was performed on 123 documents received from 22 of the 23 directors of nursing contacted. Findings from the analysis revealed a wide variation in how risk, risk assessment, and risk management were defined. Emphasis within the risk documentation submitted was on risk related to self and others, with minimal attention paid to other types of risks. In addition, there was limited evidence of recovery-focused approaches to positive risk taking that involved service users and their families within the risk-related documentation. Many of the risk-assessment tools had not been validated, and lacked consistency or guidance in relation to how they were to be used or applied. The tick-box approach and absence of space for commentary within documentation have the potential to impact severely on the quality of information collected and documented, and subsequent clinical decision-making. Managers, and those tasked with ensuring safety and quality, need to ensure that policies and processes are, where possible, informed by best evidence and are in line with national mental health policy on recovery. © 2016 Australian College of Mental Health Nurses Inc.
Degelman, Michelle L; Herman, Katya M
2017-10-01
Despite being one of the most common neurological disorders globally, the cause(s) of multiple sclerosis (MS) remain unknown. Cigarette smoking has been studied with regards to both the development and progression of MS. The Bradford Hill criteria for causation can contribute to a more comprehensive evaluation of a potentially causal risk factor-disease outcome relationship. The objective of this systematic review and meta-analysis was to assess the relationship between smoking and both MS risk and MS progression, subsequently applying Hill's criteria to further evaluate the likelihood of causal associations. The Medline, EMBASE, CINAHL, PsycInfo, and Cochrane Library databases were searched for relevant studies up until July 28, 2015. A random-effects meta-analysis was conducted for three outcomes: MS risk, conversion from clinically isolated syndrome (CIS) to clinically definite multiple sclerosis (CDMS), and progression from relapsing-remitting multiple sclerosis (RRMS) to secondary-progressive multiple sclerosis (SPMS). Dose-response relationships and risk factor interactions, and discussions of mechanisms and analogous associations were noted. Hill's criteria were applied to assess causality of the relationships between smoking and each outcome. The effect of second-hand smoke exposure was also briefly reviewed. Smoking had a statistically significant association with both MS risk (conservative: OR/RR 1.54, 95% CI [1.46-1.63]) and SPMS risk (HR 1.80, 95% CI [1.04-3.10]), but the association with progression from CIS to CDMS was non-significant (HR 1.13, 95% CI [0.73-1.76]). Using Hill's criteria, there was strong evidence of a causal role of smoking in MS risk, but only moderate evidence of a causal association between smoking and MS progression. 
Heterogeneity in study designs and target populations, inconsistent results, and an overall scarcity of studies point to the need for more research on second-hand smoke exposure in relation to MS prior to conducting a detailed meta-analysis. This first review to supplement systematic review and meta-analytic methods with Hill's criteria to analyze the smoking-MS association provides evidence supporting the causal involvement of smoking in the development and progression of MS. Smoking prevention and cessation programs and policies should consider MS as an additional health risk when aiming to reduce smoking prevalence in the population. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Jo, J. A.; Fang, Q.; Papaioannou, T.; Qiao, J. H.; Fishbein, M. C.; Beseth, B.; Dorafshar, A. H.; Reil, T.; Baker, D.; Freischlag, J.; Marcu, L.
2006-02-01
This study introduces new methods of time-resolved laser-induced fluorescence spectroscopy (TR-LIFS) data analysis for tissue characterization. These analytical methods were applied for the detection of atherosclerotic vulnerable plaques. Upon pulsed nitrogen laser (337 nm, 1 ns) excitation, TR-LIFS measurements were obtained from carotid atherosclerotic plaque specimens (57 endarterectomy patients) at 492 distinct areas. The emission was both spectrally- (360-600 nm range at 5 nm interval) and temporally- (0.3 ns resolution) resolved using a prototype clinically compatible fiber-optic catheter TR-LIFS apparatus. The TR-LIFS measurements were subsequently analyzed using a standard multiexponential deconvolution and a recently introduced Laguerre deconvolution technique. Based on their histopathology, the lesions were classified as early (thin intima), fibrotic (collagen-rich intima), and high-risk (thin cap over necrotic core and/or inflamed intima). Stepwise linear discriminant analysis (SLDA) was applied for lesion classification. Normalized spectral intensity values and Laguerre expansion coefficients (LEC) at discrete emission wavelengths (390, 450, 500 and 550 nm) were used as features for classification. The Laguerre based SLDA classifier provided discrimination of high-risk lesions with high sensitivity (SE>81%) and specificity (SP>95%). Based on these findings, we believe that TR-LIFS information derived from the Laguerre expansion coefficients can provide a valuable additional dimension for the diagnosis of high-risk vulnerable atherosclerotic plaques.
Risk assessment in gynecology and obstetrics in Sicily: an approach based on Wolff's Criteria.
Matranga, D; Marsala, M G L; Vadalà, M; Morici, M; Restivo, V; Ferrara, C; Vitale, F; Firenze, A
2013-01-01
To apply Wolff's Criteria to hospital discharge records (HDR) in order to detect adverse events worthy of further study. Gynecology and Obstetrics Units of three Sicilian hospitals were considered and HDR regarding ordinary and day hospital admissions in 2008 were collected. A matched case-control study was designed, by random selection of 10 controls at maximum for each case. Matching was performed on the variables age and speciality of admission (gynecology or obstetrics). Out of a total of 7011 HDR examined, 114 cases were identified with Wolff's Criteria. Multivariate analysis confirmed a statistically significant association with the origin of admission, diagnosis at acceptance and length of stay: there was a decreased risk of a Wolff's event in patients having urgent admission compared to elective (OR = 0.47, 95% CI = [0.28-0.78]), an increased risk in patients reporting tumor (OR = 5.41, 95% CI = [1.89-15.47]) and other causes (OR = 2.16, 95% CI = [1.10-4.24]) compared to delivery diagnosis at acceptance, and in patients whose length of stay was more than 6 days (OR = 23.17, 95% CI = [12.56-42.7]) compared to 3 days or fewer. Wolff's Criteria can be applied for the analysis of clinical risk in hospitals with different structural characteristics, on condition that the HDR database is complete and of good quality.
NASA Astrophysics Data System (ADS)
Sun, Wenqing; Tseng, Tzu-Liang B.; Zheng, Bin; Zhang, Jianying; Qian, Wei
2015-03-01
A novel breast cancer risk analysis approach is proposed for enhancing performance of computerized breast cancer risk analysis using bilateral mammograms. Based on the intensity of breast area, five different sub-regions were acquired from one mammogram, and bilateral features were extracted from every sub-region. Our dataset includes 180 bilateral mammograms from 180 women who underwent routine screening examinations, all interpreted as negative and not recalled by the radiologists during the original screening procedures. A computerized breast cancer risk analysis scheme using four image processing modules, including sub-region segmentation, bilateral feature extraction, feature selection, and classification was designed to detect and compute image feature asymmetry between the left and right breasts imaged on the mammograms. The highest computed area under the curve (AUC) is 0.763 ± 0.021 when applying the multiple sub-region features to our testing dataset. The positive predictive value and the negative predictive value were 0.60 and 0.73, respectively. The study demonstrates that (1) features extracted from multiple sub-regions can improve the performance of our scheme compared to using features from whole breast area only; (2) a classifier using asymmetry bilateral features can effectively predict breast cancer risk; (3) incorporating texture and morphological features with density features can boost the classification accuracy.
Heart failure disease management programs: a cost-effectiveness analysis.
Chan, David C; Heidenreich, Paul A; Weinstein, Milton C; Fonarow, Gregg C
2008-02-01
Heart failure (HF) disease management programs have shown impressive reductions in hospitalizations and mortality, but in studies limited to short time frames and high-risk patient populations. Current guidelines thus only recommend disease management targeted to high-risk patients with HF. This study applied a new technique to infer the degree to which clinical trials have targeted patients by risk based on observed rates of hospitalization and death. A Markov model was used to assess the incremental life expectancy and cost of providing disease management for high-risk to low-risk patients. Sensitivity analyses of various long-term scenarios and of reduced effectiveness in low-risk patients were also considered. The incremental cost-effectiveness ratio of extending coverage to all patients was $9700 per life-year gained in the base case. In aggregate, universal coverage almost quadrupled life-years saved as compared to coverage of only the highest quintile of risk. A worst case analysis with simultaneous conservative assumptions yielded an incremental cost-effectiveness ratio of $110,000 per life-year gained. In a probabilistic sensitivity analysis, 99.74% of possible incremental cost-effectiveness ratios were <$50,000 per life-year gained. Heart failure disease management programs are likely cost-effective in the long-term along the whole spectrum of patient risk. Health gains could be extended by enrolling a broader group of patients with HF in disease management.
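The headline figure is an incremental cost-effectiveness ratio (ICER): the extra cost of one strategy over another, divided by the extra life expectancy it buys. A sketch with made-up per-patient inputs, chosen here only so the arithmetic lands near the abstract's $9700 base case:

```python
def icer(cost_new: float, effect_new: float,
         cost_old: float, effect_old: float) -> float:
    """Incremental cost-effectiveness ratio: $ per life-year gained."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical per-patient costs ($) and life expectancies (life-years)
# for covering only the highest-risk quintile vs. all HF patients.
cost_high_risk, ly_high_risk = 12_000.0, 6.10
cost_universal, ly_universal = 14_910.0, 6.40

print(round(icer(cost_universal, ly_universal,
                 cost_high_risk, ly_high_risk)))  # 9700
```

Comparing this ratio against a willingness-to-pay threshold (commonly $50,000 per life-year) is what makes the probabilistic sensitivity result in the abstract interpretable.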
Augmenting the Deliberative Method for Ranking Risks.
Susel, Irving; Lasley, Trace; Montezemolo, Mark; Piper, Joel
2016-01-01
The Department of Homeland Security (DHS) characterized and prioritized the physical cross-border threats and hazards to the nation stemming from terrorism, market-driven illicit flows of people and goods (illegal immigration, narcotics, funds, counterfeits, and weaponry), and other nonmarket concerns (movement of diseases, pests, and invasive species). These threats and hazards pose a wide diversity of consequences with very different combinations of magnitudes and likelihoods, making it very challenging to prioritize them. This article presents the approach that was used at DHS to arrive at a consensus regarding the threats and hazards that stand out from the rest based on the overall risk they pose. Due to time constraints for the decision analysis, it was not feasible to apply multiattribute methodologies like multiattribute utility theory or the analytic hierarchy process. Using a holistic approach was considered, such as the deliberative method for ranking risks first published in this journal. However, an ordinal ranking alone does not indicate relative or absolute magnitude differences among the risks. Therefore, the use of the deliberative method for ranking risks is not sufficient for deciding whether there is a material difference between the top-ranked and bottom-ranked risks, let alone deciding what the stand-out risks are. To address this limitation of ordinal rankings, the deliberative method for ranking risks was augmented by adding an additional step to transform the ordinal ranking into a ratio scale ranking. This additional step enabled the selection of stand-out risks to help prioritize further analysis. © 2015 Society for Risk Analysis.
Harris, Meagan J; Stinson, Jonah; Landis, Wayne G
2017-07-01
We conducted a regional-scale integrated ecological and human health risk assessment by applying the relative risk model with Bayesian networks (BN-RRM) to a case study of the South River, Virginia mercury-contaminated site. Risk to four ecological services of the South River (human health, water quality, recreation, and the recreational fishery) was evaluated using a multiple stressor-multiple endpoint approach. These four ecological services were selected as endpoints based on stakeholder feedback and prioritized management goals for the river. The BN-RRM approach allowed for the calculation of relative risk to 14 biotic, human health, recreation, and water quality endpoints from chemical and ecological stressors in five risk regions of the South River. Results indicated that water quality and the recreational fishery were the ecological services at highest risk in the South River. Human health risk for users of the South River was low relative to the risk to other endpoints. Risk to recreation in the South River was moderate with little spatial variability among the five risk regions. Sensitivity and uncertainty analysis identified stressors and other parameters that influence risk for each endpoint in each risk region. This research demonstrates a probabilistic approach to integrated ecological and human health risk assessment that considers the effects of chemical and ecological stressors across the landscape. © 2017 Society for Risk Analysis.
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.
2015-12-01
The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally a certain tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. Hence, the model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll.
The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as identify the main risk drivers. The research leading to these results has received funding from CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of Prevention Structures fOr enhanced tsunami DIsaster resilience http://www.ngi.no/en/Project-pages/RAPSODI/), and from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 603839 (Project ASTARTE - Assessment, STrategy And Risk reduction for Tsunamis in Europe http://www.astarte-project.eu/).
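The risk model described above convolves hazard, exposure, and vulnerability per grid cell. A minimal sketch of that convolution; the flow depths, population counts, and the fragility curve are invented illustrative values, not the NGI model's parameters:

```python
# Expected fatalities = sum over grid cells of
#   P(killed | flow depth, building class) * exposed population.

def p_killed(depth_m: float) -> float:
    """Toy depth-mortality fragility curve for one building class."""
    if depth_m <= 0.5:
        return 0.0
    return min(1.0, 0.1 * (depth_m - 0.5))

# One entry per grid cell: maximum tsunami flow depth (m), population.
depths = [0.2, 1.5, 3.0, 6.0]
population = [120, 300, 80, 40]

expected_fatalities = sum(
    p_killed(d) * pop for d, pop in zip(depths, population)
)
print(round(expected_fatalities, 1))  # 72.0
```

In the real model the fragility curve varies by building class and the population field varies in time, but the cell-by-cell product-and-sum structure is the same.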
Development of probabilistic multimedia multipathway computer codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, C.; LePoire, D.; Gnanapragasam, E.
2002-01-01
The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
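The probabilistic enhancement described above amounts to propagating parameter distributions through the dose model by sampling. A minimal Monte Carlo sketch; the dose equation and all distributions are hypothetical stand-ins, not RESRAD defaults:

```python
import random
import statistics

random.seed(42)

def sample_dose() -> float:
    """One Monte Carlo draw of a toy dose model:
    dose = concentration * intake / body_weight."""
    concentration = random.lognormvariate(0.0, 0.5)  # Bq/g, hypothetical
    intake = random.uniform(50.0, 150.0)             # g/yr, hypothetical
    body_weight = random.gauss(70.0, 10.0)           # kg, hypothetical
    return concentration * intake / body_weight

doses = sorted(sample_dose() for _ in range(10_000))
mean = statistics.fmean(doses)
p95 = doses[int(0.95 * len(doses))]
print(f"mean dose {mean:.2f}, 95th percentile {p95:.2f}")
```

Reporting a high percentile alongside the mean is what distinguishes the probabilistic output from the single number a deterministic run produces.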
Magnetic resonance angiography for the nonpalpable testis: a cost and cancer risk analysis.
Eggener, S E; Lotan, Y; Cheng, E Y
2005-05-01
For the unilateral nonpalpable testis standard management is open surgical or laparoscopic exploration. An ideal imaging technique would reliably identify testicular nubbins and safely allow children to forgo surgical exploration without compromising future health or fertility. Our goal was to perform a cost and risk analysis of magnetic resonance angiography (MRA) for unilateral nonpalpable cryptorchid testes. A search of the English medical literature revealed 3 studies addressing the usefulness of MRA for the nonpalpable testicle. We performed a meta-analysis and applied the results to a hypothetical set of patients using historical testicular localization data. Analysis was then performed using 3 different management protocols: MRA with removal of testicular nubbin tissue, MRA with observation of testicular nubbin tissue, and diagnostic laparoscopy. A cancer risk and cost analysis was then performed. MRA with observation of testicular nubbin tissue results in 29% of patients avoiding surgery without any increased cost of care. Among the 29% of boys with testicular nubbins left in situ and observed, the highest estimated risk was 1 in 300 of cancer developing, and 1 in 5,300 of dying of cancer. A protocol using MRA with observation of inguinal nubbins results in nearly a third of boys avoiding surgical intervention at a similar cost to standard care without any significant increased risk of development of testis cancer.
Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model
NASA Astrophysics Data System (ADS)
Niu, Wei; Wang, Xifu
2018-01-01
Railway transport is currently the most important mode of coal transportation in China, and while the national railway coal transportation network has become increasingly well developed, capacity remains insufficient and some lines are close to saturation. In this paper, risk assessment theory, the analytic hierarchy process, and a multi-level grey evaluation model are applied to the risk evaluation of China's railway coal transportation network. An example analysis of the Shanxi railway coal transportation network illustrates how the results can guide improvements to the network's internal structure and market competitiveness.
TH-EF-BRC-03: Fault Tree Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomadsen, B.
2016-06-15
This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will each be introduced with a 5-minute refresher presentation, followed by a 30-minute small-group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.
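For independent basic events, the fault-tree arithmetic behind such an exercise reduces to two gate formulas. A minimal sketch with a hypothetical radiotherapy-flavored tree; the event probabilities are invented for illustration:

```python
from math import prod

def p_or(*ps: float) -> float:
    """OR gate: output event occurs if any independent input occurs."""
    return 1.0 - prod(1.0 - p for p in ps)

def p_and(*ps: float) -> float:
    """AND gate: output event occurs only if all independent inputs occur."""
    return prod(ps)

# Hypothetical tree: a wrong dose is delivered if planning fails,
# OR if both the independent check AND the machine interlock fail.
p_planning_error = 1e-3
p_check_fails = 1e-2
p_interlock_fails = 1e-2

p_top = p_or(p_planning_error, p_and(p_check_fails, p_interlock_fails))
print(f"{p_top:.7f}")  # 0.0010999
```

The AND gate shows why redundant, independent checks drive the top-event probability down multiplicatively, which is the central insight the workshop's fault-tree exercise aims at.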
Te Beest, D E; Paveley, N D; Shaw, M W; van den Bosch, F
2013-07-01
A method is presented to calculate economic optimum fungicide doses accounting for the risk aversion of growers responding to variability in disease severity between crops. Simple dose-response and disease-yield loss functions are used to estimate net disease-related costs (fungicide cost plus disease-induced yield loss) as a function of dose and untreated severity. With fairly general assumptions about the shapes of the probability distribution of disease severity and the other functions involved, we show that a choice of fungicide dose which minimizes net costs, on average, across seasons results in occasional large net costs caused by inadequate control in high disease seasons. This may be unacceptable to a grower with limited capital. A risk-averse grower can choose to reduce the size and frequency of such losses by applying a higher dose as insurance. For example, a grower may decide to accept "high-loss" years 1 year in 10 or 1 year in 20 (i.e., specifying a proportion of years in which disease severity and net costs will be above a specified level). Our analysis shows that taking into account disease severity variation and risk aversion will usually increase the dose applied by an economically rational grower. The analysis is illustrated with data on Septoria tritici leaf blotch of wheat caused by Mycosphaerella graminicola. Observations from untreated field plots at sites across England over 3 years were used to estimate the probability distribution of disease severities at mid-grain filling. In the absence of a fully reliable disease forecasting scheme, reducing the frequency of high-loss years requires substantially higher doses to be applied to all crops. Disease-resistant cultivars reduce both the optimal dose at all levels of risk and the disease-related costs at all doses.
Godos, J; Bella, F; Torrisi, A; Sciacca, S; Galvano, F; Grosso, G
2016-12-01
Current evidence suggests that dietary patterns may play an important role in colorectal cancer risk. The present study aimed to perform a systematic review and meta-analysis of observational studies exploring the association between dietary patterns and colorectal adenomas (a precancerous condition). The Pubmed and EMBASE electronic databases were systematically searched to retrieve eligible studies. Only studies exploring the risk of, or association with, colorectal adenomas for the highest versus lowest category of exposure to a posteriori dietary patterns were included in the quantitative analysis. Random-effects models were applied to calculate relative risks (RRs) of colorectal adenomas for high adherence to healthy or unhealthy dietary patterns. Statistical heterogeneity and publication bias were explored. Twelve studies were reviewed. Three studies explored a priori dietary patterns using scores identifying adherence to the Mediterranean, Paleolithic, and Dietary Approaches to Stop Hypertension (DASH) diets and reported an association with decreased colorectal adenoma risk. Two studies tested the association between a posteriori dietary patterns and colorectal adenomas, showing lower odds of disease for plant-based compared to meat-based dietary patterns. Seven studies identified 23 a posteriori dietary patterns, and the analysis revealed that higher adherence to healthy and unhealthy dietary patterns was significantly associated with decreased and increased risk of colorectal adenomas, respectively (RR = 0.81, 95% confidence interval = 0.71, 0.94 and RR = 1.24, 95% confidence interval = 1.13, 1.35), with no evidence of heterogeneity or publication bias. The results of this systematic review and meta-analysis indicate that dietary patterns may be associated with the risk of colorectal adenomas. © 2016 The British Dietetic Association Ltd.
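The random-effects pooling step can be illustrated with a short sketch. This is a generic DerSimonian-Laird calculation, not the authors' code; the study-level RRs and standard errors below are invented for illustration.

```python
import math

def random_effects_pool(log_rrs, ses):
    """DerSimonian-Laird random-effects pooling of log relative risks."""
    w = [1 / se**2 for se in ses]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1 / (se**2 + tau2) for se in ses]          # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    se_pooled = math.sqrt(1 / sum(w_star))
    lo, hi = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled
    return math.exp(pooled), math.exp(lo), math.exp(hi)

# Illustrative (invented) study-level RRs for a "healthy pattern" comparison.
rrs = [0.75, 0.85, 0.80, 0.90]
ses = [0.10, 0.12, 0.08, 0.15]
rr, lo, hi = random_effects_pool([math.log(r) for r in rrs], ses)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```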
Study of a risk-based piping inspection guideline system.
Tien, Shiaw-Wen; Hwang, Wen-Tsung; Tsai, Chih-Hung
2007-02-01
A risk-based inspection system and a piping inspection guideline model were developed in this study. The research procedure consists of two parts: building a risk-based inspection model for piping and constructing a risk-based piping inspection guideline model. Field visits at the plant were conducted to develop the risk-based inspection and strategic analysis system. A knowledge-based model was built in accordance with international standards and local government regulations, and the rational unified process was applied to reduce discrepancy in the development of the models. The models were designed to analyze damage factors, damage models, and potential damage positions of piping in petrochemical plants. The purpose of this study was to provide inspection-related personnel with optimal planning tools for piping inspections, enabling effective prediction of potential piping risks and enhancing the degree of safety that petrochemical plant operations can be expected to achieve. A risk analysis was conducted on the piping system of a petrochemical plant. The outcome indicated that most of the risk resulted from a small number of pipelines.
Risk evaluation of highway engineering project based on the fuzzy-AHP
NASA Astrophysics Data System (ADS)
Yang, Qian; Wei, Yajun
2011-10-01
Engineering projects are social activities that integrate technology, economy, management, and organization. Uncertainties exist in every aspect of an engineering project, so risk management urgently needs strengthening. Based on an analysis of the characteristics of highway engineering and a study of the basic theory of risk evaluation, the paper builds an index system for highway project risk evaluation. Using fuzzy mathematics together with the analytic hierarchy process (AHP), a comprehensive fuzzy-AHP appraisal model is set up for the risk evaluation of expressway concession projects. The validity and practicability of the approach were verified by applying the model to an actual project.
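A minimal sketch of the fuzzy-AHP combination described above, with invented numbers: AHP supplies criterion weights from a pairwise comparison matrix (geometric-mean approximation of the principal eigenvector), and a fuzzy comprehensive evaluation aggregates assumed membership degrees into a risk grade.

```python
# Geometric-mean approximation to the AHP principal eigenvector, then a
# weighted-average fuzzy comprehensive evaluation. All numbers are invented.

def prod(xs):
    p = 1.0
    for x in xs:
        p *= x
    return p

def ahp_weights(pairwise):
    n = len(pairwise)
    gm = [prod(row) ** (1 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]                   # normalized weights

# Pairwise comparisons of three hypothetical risk factors for a highway
# project: construction, financial, policy.
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_weights(A)

# Assumed fuzzy membership of each factor in (low, medium, high) risk grades.
R = [[0.2, 0.5, 0.3],
     [0.5, 0.4, 0.1],
     [0.3, 0.5, 0.2]]

# b_j = sum_i w_i * r_ij, then take the grade with maximum membership.
b = [sum(w[i] * R[i][j] for i in range(3)) for j in range(3)]
grade = ("low", "medium", "high")[b.index(max(b))]
print([round(x, 3) for x in b], grade)
```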
Risk-based decision making for terrorism applications.
Dillon, Robin L; Liebe, Robert M; Bestafka, Thomas
2009-03-01
This article describes the anti-terrorism risk-based decision aid (ARDA), a risk-based decision-making approach for prioritizing anti-terrorism measures. The ARDA model was developed as part of a larger effort to assess investments for protecting U.S. Navy assets at risk and determine whether the most effective anti-terrorism alternatives are being used to reduce the risk to the facilities and war-fighting assets. With ARDA and some support from subject matter experts, we examine thousands of scenarios composed of 15 attack modes against 160 facility types on two installations and hundreds of portfolios of 22 mitigation alternatives. ARDA uses multiattribute utility theory to solve some of the commonly identified challenges in security risk analysis. This article describes the process and documents lessons learned from applying the ARDA model for this application.
Consumer default risk assessment in a banking institution
NASA Astrophysics Data System (ADS)
Costa e Silva, Eliana; Lopes, Isabel Cristina; Correia, Aldina; Faria, Susana
2016-12-01
Credit scoring is an application of financial risk forecasting to consumer lending. In this study, statistical analysis is applied to credit scoring data from a financial institution to evaluate the default risk of consumer loans. The default risk was found to be influenced by the spread, the age of the consumer, and the number of credit cards owned by the consumer. A lower spread, a higher number of credit cards, and a younger age of the borrower are factors that decrease the risk of default. Clients receiving their salary in the same banking institution as the loan are less likely to default than clients receiving their salary in another institution. We also found that clients in the lowest income tax echelon have more propensity to default.
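The reported directions of effect can be illustrated with a logistic scoring sketch. The coefficients below are invented (the abstract does not report them); they merely encode the signs found in the study: a higher spread and an older borrower increase default risk, while more credit cards and a salary account at the lending bank decrease it.

```python
import math

# Illustrative scoring function with invented coefficients matching the
# direction of effects reported in the study.
COEF = {
    "intercept": -4.0,
    "spread": 0.9,           # higher spread -> higher default risk
    "age": 0.03,             # older borrower -> higher default risk
    "n_cards": -0.4,         # more credit cards -> lower default risk
    "salary_at_bank": -0.8,  # salary account at the lender -> lower risk
}

def default_probability(spread, age, n_cards, salary_at_bank):
    z = (COEF["intercept"] + COEF["spread"] * spread + COEF["age"] * age
         + COEF["n_cards"] * n_cards
         + COEF["salary_at_bank"] * int(salary_at_bank))
    return 1 / (1 + math.exp(-z))   # logistic link

p_low = default_probability(spread=1.0, age=30, n_cards=3, salary_at_bank=True)
p_high = default_probability(spread=3.0, age=55, n_cards=1, salary_at_bank=False)
print(f"{p_low:.3f} vs {p_high:.3f}")
```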
Cost/benefit analysis of advanced materials technology candidates for the 1980's, part 2
NASA Technical Reports Server (NTRS)
Dennis, R. E.; Maertins, H. F.
1980-01-01
Cost/benefit analyses to evaluate advanced material technologies projects considered for general aviation and turboprop commuter aircraft through estimated life-cycle costs, direct operating costs, and development costs are discussed. Specifically addressed is the selection of technologies to be evaluated; development of property goals; assessment of candidate technologies on typical engines and aircraft; sensitivity analysis of the changes in property goals on performance and economics, cost, and risk analysis for each technology; and ranking of each technology by relative value. The cost/benefit analysis was applied to a domestic, nonrevenue producing, business-type jet aircraft configured with two TFE731-3 turbofan engines, and to a domestic, nonrevenue producing, business type turboprop aircraft configured with two TPE331-10 turboprop engines. In addition, a cost/benefit analysis was applied to a commercial turboprop aircraft configured with a growth version of the TPE331-10.
Predictive susceptibility analysis of typhoon induced landslides in Central Taiwan
NASA Astrophysics Data System (ADS)
Shou, Keh-Jian; Lin, Zora
2017-04-01
Climate change caused by global warming has affected Taiwan significantly over the past decade. Extreme rainfall events are increasingly frequent, and their concentrated, intense rainfall generally causes geohazards, including landslides and debris flows. Extraordinary typhoons, such as Mindulle (2004) and Morakot (2009), hit Taiwan and induced serious flooding and landslides. This study employs rainfall frequency analysis together with atmospheric general circulation model (AGCM) downscaling estimation to understand temporal rainfall trends, distributions, and intensities in the Wu River watershed in Central Taiwan. To assess the spatial hazard of the landslides, landslide susceptibility analysis was also applied. Different types of rainfall factors were tested in the susceptibility models to improve accuracy, and the routes of typhoons were also considered in the predictive analysis. The results of the predictive analysis can be applied to risk prevention and management in the study area.
Bonofiglio, Federico; Beyersmann, Jan; Schumacher, Martin; Koller, Michael; Schwarzer, Guido
2016-09-01
Meta-analysis of a survival endpoint is typically based on the pooling of hazard ratios (HRs). If competing risks occur, the HRs may lose translation into changes of survival probability. The cumulative incidence functions (CIFs), the expected proportion of cause-specific events over time, re-connect the cause-specific hazards (CSHs) to the probability of each event type. We use CIF ratios to measure treatment effect on each event type. To retrieve information on aggregated, typically poorly reported, competing risks data, we assume constant CSHs. Next, we develop methods to pool CIF ratios across studies. The procedure computes pooled HRs alongside and checks the influence of follow-up time on the analysis. We apply the method to a medical example, showing that follow-up duration is relevant both for pooled cause-specific HRs and CIF ratios. Moreover, if all-cause hazard and follow-up time are large enough, CIF ratios may reveal additional information about the effect of treatment on the cumulative probability of each event type. Finally, to improve the usefulness of such analysis, better reporting of competing risks data is needed. Copyright © 2015 John Wiley & Sons, Ltd.
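Under the constant cause-specific hazards assumption used to recover poorly reported competing risks data, the CIF has a closed form: CIF_k(t) = (h_k / h)(1 - exp(-h t)), with h the all-cause hazard. The sketch below uses invented two-arm hazards to show how the CIF ratio drifts away from the cause-specific HR as follow-up lengthens, which is why follow-up duration matters for the pooled analysis.

```python
import math

def cif(h_cause, h_all, t):
    """Cumulative incidence under constant cause-specific hazards:
    CIF_k(t) = (h_k / h_all) * (1 - exp(-h_all * t))."""
    return (h_cause / h_all) * (1 - math.exp(-h_all * t))

# Invented two-arm example: event of interest (cause 1) plus a competing
# event (cause 2); the cause-specific HR for cause 1 is 0.5.
h1_trt, h2_trt = 0.05, 0.10   # treatment arm hazards per year
h1_ctl, h2_ctl = 0.10, 0.10   # control arm hazards per year

ratios = {}
for t in (1, 5, 20):
    ratios[t] = cif(h1_trt, h1_trt + h2_trt, t) / cif(h1_ctl, h1_ctl + h2_ctl, t)
    print(f"t = {t:2d} y: cause-1 CIF ratio = {ratios[t]:.3f} (CSH-HR = 0.50)")
```

At short follow-up the CIF ratio tracks the cause-specific HR; with a large all-cause hazard and long follow-up it moves toward 1, carrying extra information about cumulative event probabilities.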
Javidi Sabbaghian, Reza; Zarghami, Mahdi; Nejadhashemi, A Pouyan; Sharifi, Mohammad Bagher; Herman, Matthew R; Daneshvar, Fariborz
2016-03-01
Effective watershed management requires the evaluation of agricultural best management practice (BMP) scenarios which carefully consider the relevant environmental, economic, and social criteria involved. In the Multiple Criteria Decision-Making (MCDM) process, scenarios are first evaluated and then ranked to determine the most desirable outcome for the particular watershed. The main challenge of this process is the accurate identification of the best solution for the watershed in question, despite the various risk attitudes presented by the associated decision-makers (DMs). This paper introduces a novel approach for implementation of the MCDM process based on a comparative neutral-risk/risk-based decision analysis, which results in the selection of the most desirable scenario for use in the entire watershed. At the sub-basin level, each scenario includes multiple BMPs with scores that have been calculated using the criteria derived from two cases of neutral-risk and risk-based decision-making. The simple additive weighting (SAW) operator is applied for neutral-risk decision-making, while the ordered weighted averaging (OWA) and induced OWA (IOWA) operators are effective for risk-based decision-making. At the watershed level, the BMP scores of the sub-basins are aggregated to calculate each scenario's combined goodness measurement; the most desirable scenario for the entire watershed is then selected based on the combined goodness measurements. Our final results illustrate how the type of operator and the risk attitudes needed to satisfy the relevant criteria within the sub-basins ultimately affect the final ranking of the given scenarios. The methodology proposed here has been successfully applied to the Honeyoey Creek-Pine Creek watershed in Michigan, USA to evaluate various BMP scenarios and determine the best solution for both the stakeholders and the overall stream health. Copyright © 2015 Elsevier Ltd. All rights reserved.
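The difference between the neutral-risk SAW operator and a risk-based OWA operator can be shown in a few lines. The scenario scores and weights below are invented; the point is that a pessimistic OWA, which loads weight onto the worst-ranked criterion score, can reverse the SAW ranking.

```python
# SAW scores each alternative by a weighted sum with fixed criterion weights
# (risk-neutral); OWA sorts the criterion scores first and applies weights by
# rank, so loading weight on the worst scores expresses risk aversion.
# All numbers are invented for illustration.

def saw(scores, weights):
    return sum(s * w for s, w in zip(scores, weights))

def owa(scores, order_weights):
    # Pessimistic OWA: weights applied to scores sorted ascending, so the
    # heaviest weight falls on the worst criterion score.
    return sum(s * w for s, w in zip(sorted(scores), order_weights))

criteria_w = [0.4, 0.3, 0.3]      # e.g. environmental, economic, social
pessimistic_w = [0.6, 0.3, 0.1]   # emphasize the worst outcome

scenarios = {
    "A": [0.9, 0.2, 0.8],  # strong overall but with one weak criterion
    "B": [0.6, 0.6, 0.6],  # uniformly moderate
}
saw_best = max(scenarios, key=lambda k: saw(scenarios[k], criteria_w))
owa_best = max(scenarios, key=lambda k: owa(scenarios[k], pessimistic_w))
print(saw_best, owa_best)
```

Here the risk-neutral SAW operator prefers the high-average scenario, while the pessimistic OWA operator prefers the scenario with no weak criterion, mirroring how risk attitude changes the final ranking.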
Allometric scaling: analysis of LD50 data.
Burzala-Kowalczyk, Lidia; Jongbloed, Geurt
2011-04-01
The need to identify toxicologically equivalent doses across different species is a major issue in toxicology and risk assessment. In this article, we investigate interspecies scaling based on the allometric equation applied to the single oral LD50 data previously analyzed by Rhomberg and Wolff. We focus on the statistical approach, namely, regression analysis of the mentioned data. In contrast to Rhomberg and Wolff's analysis of species pairs, we perform an overall analysis based on the whole data set. From our study it follows that if one assumes a single scaling rule for all species and substances in the data set, then β = 1 is the most natural choice among the set of candidates known in the literature. In fact, we obtain quite narrow confidence intervals for this parameter. However, the estimate of the variance in the model is relatively high, resulting in rather wide prediction intervals. © 2010 Society for Risk Analysis.
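The allometric equation LD50 = a·W^β is linear on the log scale, so β can be estimated by ordinary least squares on log-transformed data. The four data points below are invented (constructed to scale roughly as β = 1, the value favored by the analysis); they are not the Rhomberg and Wolff data.

```python
import math

# OLS on log-transformed data: log(LD50_total) = log(a) + beta * log(weight).
weights_kg = [0.02, 0.2, 2.0, 70.0]   # mouse-, rat-, cat-, human-sized bodies
ld50_mg = [1.0, 9.0, 110.0, 3300.0]   # hypothetical total doses

x = [math.log(w) for w in weights_kg]
y = [math.log(d) for d in ld50_mg]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
beta = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        / sum((xi - xbar) ** 2 for xi in x))   # slope = allometric exponent
log_a = ybar - beta * xbar                     # intercept = log of coefficient
print(f"beta = {beta:.2f}, a = {math.exp(log_a):.1f}")
```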
Liyanage, H; de Lusignan, S; Liaw, S-T; Kuziemsky, C E; Mold, F; Krause, P; Fleming, D; Jones, S
2014-08-15
Generally benefits and risks of vaccines can be determined from studies carried out as part of regulatory compliance, followed by surveillance of routine data; however there are some rarer and more long term events that require new methods. Big data generated by increasingly affordable personalised computing, and from pervasive computing devices is rapidly growing and low cost, high volume, cloud computing makes the processing of these data inexpensive. To describe how big data and related analytical methods might be applied to assess the benefits and risks of vaccines. We reviewed the literature on the use of big data to improve health, applied to generic vaccine use cases, that illustrate benefits and risks of vaccination. We defined a use case as the interaction between a user and an information system to achieve a goal. We used flu vaccination and pre-school childhood immunisation as exemplars. We reviewed three big data use cases relevant to assessing vaccine benefits and risks: (i) Big data processing using crowdsourcing, distributed big data processing, and predictive analytics, (ii) Data integration from heterogeneous big data sources, e.g. the increasing range of devices in the "internet of things", and (iii) Real-time monitoring for the direct monitoring of epidemics as well as vaccine effects via social media and other data sources. Big data raises new ethical dilemmas, though its analysis methods can bring complementary real-time capabilities for monitoring epidemics and assessing vaccine benefit-risk balance.
Increasing cancer detection yield of breast MRI using a new CAD scheme of mammograms
NASA Astrophysics Data System (ADS)
Tan, Maxine; Aghaei, Faranak; Hollingsworth, Alan B.; Stough, Rebecca G.; Liu, Hong; Zheng, Bin
2016-03-01
Although breast MRI is the most sensitive imaging modality for detecting early breast cancer, its cancer detection yield in breast cancer screening to date is quite low (less than 3 to 4%, even for the small group of high-risk women). The purpose of this preliminary study is to test the potential of developing and applying a new computer-aided detection (CAD) scheme of digital mammograms to identify women at high risk of harboring mammography-occult breast cancers, which can be detected by breast MRI. For this purpose, we retrospectively assembled a dataset involving 30 women who had both mammography and breast MRI screening examinations. All mammograms were interpreted as negative, while 5 cancers were detected using breast MRI. We developed a CAD scheme of mammograms, which includes a new risk model based on quantitative mammographic image feature analysis, to stratify women into two groups with high and low risk of harboring mammography-occult cancer. Among the 30 women, 9 were classified into the high-risk group by the CAD scheme, including all 5 women who had cancer detected by breast MRI. All 21 low-risk women remained negative on the breast MRI examinations. The cancer detection yield of breast MRI applied to this dataset substantially increased from 16.7% (5/30) to 55.6% (5/9), while eliminating 84% (21/25) of unnecessary breast MRI screenings. The study demonstrated the potential of applying a new CAD scheme to significantly increase the cancer detection yield of breast MRI, while simultaneously reducing the number of negative MRIs in breast cancer screening.
A Value Measure for Public-Sector Enterprise Risk Management: A TSA Case Study.
Fletcher, Kenneth C; Abbas, Ali E
2018-05-01
This article presents a public value measure that can be used to aid executives in the public sector to better assess policy decisions and maximize value to the American people. Using Transportation Security Administration (TSA) programs as an example, we first identify the basic components of public value. We then propose a public value account to quantify the outcomes of various risk scenarios, and we determine the certain equivalent of several important TSA programs. We illustrate how this proposed measure can quantify the effects of two main challenges that government organizations face when conducting enterprise risk management: (1) short-term versus long-term incentives and (2) avoiding potential negative consequences even if they occur with low probability. Finally, we illustrate how this measure enables the use of various tools from decision analysis to be applied in government settings, such as stochastic dominance arguments and certain equivalent calculations. Regarding the TSA case study, our analysis demonstrates the value of continued expansion of the TSA trusted traveler initiative and increasing the background vetting for passengers who are afforded expedited security screening. © 2017 Society for Risk Analysis.
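The certain-equivalent calculation mentioned above can be sketched with an exponential utility, a common choice in decision analysis (the article does not specify its utility function). The program outcomes, probabilities, and risk tolerance below are invented; the example shows how a rare, large negative consequence pulls the certain equivalent below the expected value.

```python
import math

# Certain equivalent of an uncertain public value X under exponential utility
# with risk tolerance rho: CE = -rho * ln(E[exp(-X / rho)]).
def certain_equivalent(outcomes, probs, rho):
    expected_u = sum(p * math.exp(-x / rho) for x, p in zip(outcomes, probs))
    return -rho * math.log(expected_u)

# Invented program: a large benefit usually, a rare large negative consequence.
outcomes = [100.0, -500.0]   # public value units
probs = [0.99, 0.01]

ev = sum(p * x for x, p in zip(outcomes, probs))   # risk-neutral expected value
ce = certain_equivalent(outcomes, probs, rho=200.0)
print(f"EV = {ev:.1f}, CE = {ce:.1f}")
```

For a risk-averse decision maker the certain equivalent falls well below the expected value, quantifying the aversion to low-probability, high-consequence outcomes that the article identifies as a challenge for government organizations.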
A theoretical treatment of technical risk in modern propulsion system design
NASA Astrophysics Data System (ADS)
Roth, Bryce Alexander
2000-09-01
A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for the development of formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method that enables analysis of risk via a consistent, comprehensive treatment of aerothermodynamic and mass properties aspects of vehicle design. The key elements enabling the creation of this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. It is shown that each possesses unique properties making them useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated on the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunity for design improvement is. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. 
This technique is then applied as a means to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.
Rah, Jeong-Eun; Manger, Ryan P; Yock, Adam D; Kim, Gwe-Ya
2016-12-01
To examine the abilities of a traditional failure mode and effects analysis (FMEA) and a modified healthcare FMEA (m-HFMEA) by comparing the degree of congruence of their scoring methods in identifying high-risk failures. The authors applied the two prospective quality management methods to surface image guided, linac-based radiosurgery (SIG-RS). For the traditional FMEA, decisions on how to improve an operation were based on the risk priority number (RPN). The RPN is a product of three indices: occurrence, severity, and detectability. The m-HFMEA approach utilized two indices, severity and frequency. A risk inventory matrix was divided into four categories: very low, low, high, and very high. For high-risk events, an additional evaluation was performed: based upon the criticality of the process, it was decided whether additional safety measures were needed and what they should comprise. The two methods were independently applied and compared to determine whether the results and rated risks matched. The authors' results showed an agreement of 85% between the FMEA and m-HFMEA approaches for the top 20 risks of SIG-RS-specific failure modes. The main differences between the two approaches were the distribution of the values and the observation that failure modes (52, 54, 154) with high m-HFMEA scores do not necessarily have high FMEA-RPN scores. In the m-HFMEA analysis, once the risk score is determined, the failure mode should be investigated more thoroughly on the basis of the established HFMEA Decision Tree™. m-HFMEA is inductive because it requires the identification of consequences from causes, and semi-quantitative because it allows the prioritization of high risks and mitigation measures. It is therefore a useful prospective risk analysis tool for radiotherapy.
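The two scoring schemes can be contrasted in a few lines. The failure modes, index values, and risk-matrix thresholds below are invented, not taken from the paper; the example reproduces the qualitative finding that a high-severity, poorly detectable failure mode can score low on RPN yet land in a high m-HFMEA risk category.

```python
# Traditional FMEA ranks by RPN = occurrence x severity x detectability;
# the m-HFMEA-style score maps severity x frequency onto a risk matrix.
# All failure modes, index values, and thresholds are invented.

def rpn(occ, sev, det):
    return occ * sev * det

def hfmea_category(sev, freq):
    score = sev * freq
    if score >= 12:
        return "very high"
    if score >= 8:
        return "high"
    if score >= 4:
        return "low"
    return "very low"

failure_modes = {
    # name: (occurrence, severity, detectability, frequency), all invented
    "camera obstruction": (6, 4, 7, 3),
    "wrong isocenter":    (2, 9, 2, 1),   # severe but rare and hard to detect
    "monitor freeze":     (5, 6, 6, 3),
}

for name, (occ, sev, det, freq) in failure_modes.items():
    print(f"{name:20s} RPN = {rpn(occ, sev, det):3d}  "
          f"m-HFMEA = {hfmea_category(sev, freq)}")
```

With these numbers "wrong isocenter" has the lowest RPN of the three yet falls in the "high" m-HFMEA category, illustrating why the two prioritizations need not agree.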
Vulnerability and risk of deltaic social-ecological systems exposed to multiple hazards.
Hagenlocher, Michael; Renaud, Fabrice G; Haas, Susanne; Sebesvari, Zita
2018-08-01
Coastal river deltas are hotspots of global change impacts. Sustainable delta futures are increasingly threatened due to rising hazard exposure combined with high vulnerabilities of deltaic social-ecological systems. While the need for integrated multi-hazard approaches has been clearly articulated, studies on vulnerability and risk in deltas either focus on local case studies or single hazards and do not apply a social-ecological systems perspective. As a result, vulnerabilities and risks in areas with strong social and ecological coupling, such as coastal deltas, are not fully understood, and the identification of risk reduction and adaptation strategies is often based on incomplete assumptions. To overcome these limitations, we propose an innovative modular indicator library-based approach for the assessment of multi-hazard risk of social-ecological systems across and within coastal deltas globally, and apply it to the Amazon, Ganges-Brahmaputra-Meghna (GBM), and Mekong deltas. Results show that multi-hazard risk is highest in the GBM delta and lowest in the Amazon delta. The analysis reveals major differences between social and environmental vulnerability across the three deltas, notably in the Mekong and the GBM deltas, where environmental vulnerability is significantly higher than social vulnerability. Hotspots and drivers of risk vary spatially, thus calling for spatially targeted risk reduction and adaptation strategies within the deltas. Ecosystems have been identified as both an important element at risk and an entry point for risk reduction and adaptation strategies. Copyright © 2018. Published by Elsevier B.V.
Caparros-Midwood, Daniel; Barr, Stuart; Dawson, Richard
2017-11-01
Future development in cities needs to manage increasing populations, climate-related risks, and sustainable development objectives such as reducing greenhouse gas emissions. Planners therefore face a challenge of multidimensional, spatial optimization in order to balance potential tradeoffs and maximize synergies between risks and other objectives. To address this, a spatial optimization framework has been developed. This uses a spatially implemented genetic algorithm to generate a set of Pareto-optimal results that provide planners with the best set of trade-off spatial plans for six risk and sustainability objectives: (i) minimize heat risks, (ii) minimize flooding risks, (iii) minimize transport travel costs to minimize associated emissions, (iv) maximize brownfield development, (v) minimize urban sprawl, and (vi) prevent development of greenspace. The framework is applied to Greater London (U.K.) and shown to generate spatial development strategies that are optimal for specific objectives and differ significantly from the existing development strategies. In addition, the analysis reveals tradeoffs between different risks as well as between risk and sustainability objectives. While increases in heat or flood risk can be avoided, there are no strategies that do not increase at least one of these. Tradeoffs between risk and other sustainability objectives can be more severe, for example, minimizing heat risk is only possible if future development is allowed to sprawl significantly. The results highlight the importance of spatial structure in modulating risks and other sustainability objectives. However, not all planning objectives are suited to quantified optimization and so the results should form part of an evidence base to improve the delivery of risk and sustainability management in future urban development. © 2017 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
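The Pareto-optimality test at the heart of such a framework is simple to state in code. The sketch below (invented plan names and objective scores, all objectives minimized) filters candidate spatial plans down to the non-dominated set; a genetic algorithm such as the one in the paper searches a vastly larger space but relies on the same dominance relation.

```python
# Objective scores are (heat risk, flood risk, sprawl); lower is better on
# every objective. Plan names and scores are invented.

def dominates(a, b):
    """a dominates b if a is no worse on every objective and better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans):
    return {name: s for name, s in plans.items()
            if not any(dominates(other, s) for other in plans.values()
                       if other is not s)}

plans = {
    "compact":   (0.2, 0.8, 0.1),
    "dispersed": (0.7, 0.3, 0.9),
    "riverside": (0.4, 0.9, 0.3),   # dominated by "compact"
    "balanced":  (0.4, 0.5, 0.4),
    "sprawling": (0.8, 0.9, 0.9),   # dominated by "dispersed"
}
front = pareto_front(plans)
print(sorted(front))  # the non-dominated trade-off set
```

The surviving plans form the trade-off set from which planners choose: no remaining plan can be improved on one objective without worsening another, echoing the paper's finding that heat and flood risk cannot both be reduced simultaneously.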
NASA Astrophysics Data System (ADS)
Garland, A.
2015-12-01
The Arctic Risk Management Network (ARMNet) was conceived as a trans-disciplinary hub to encourage and facilitate greater cooperation, communication and exchange among American and Canadian academics and practitioners actively engaged in the research, management and mitigation of risks, emergencies and disasters in the Arctic regions. Its aim is to assist regional decision-makers through the sharing of applied research and best practices and to support greater inter-operability and bilateral collaboration through improved networking, joint exercises, workshops, teleconferences, radio programs, and virtual communications (e.g. webinars). Most importantly, ARMNet is a clearinghouse for all information related to the management of the frequent hazards of Arctic climate and geography in North America, including new and emerging challenges arising from climate change, increased maritime polar traffic and expanding economic development in the region. ARMNet is an outcome of the Arctic Observing Network (AON) for Long Term Observations, Governance, and Management Discussions, www.arcus.org/search-program. The AON goals continue with CRIOS (www.ariesnonprofit.com/ARIESprojects.php) and coastal erosion research (www.ariesnonprofit.com/webinarCoastalErosion.php) led by the North Slope Borough Risk Management Office with assistance from ARIES (Applied Research in Environmental Sciences Nonprofit, Inc.). The constituency for ARMNet will include all northern academics and researchers, Arctic-based corporations, First Responders (FRs), Emergency Management Offices (EMOs) and Risk Management Offices (RMOs), military, Coast Guard, northern police forces, Search and Rescue (SAR) associations, boroughs, territories and communities throughout the Arctic. This presentation will be of interest to all those engaged in Arctic affairs, describe the genesis of ARMNet and present the results of stakeholder meetings and webinars designed to guide the next stages of the Project.
Zhang, Yan; Zhong, Ming
2013-01-01
Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process and fuzzy comprehensive evaluation. Firstly, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weights of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information.
Performing a preliminary hazard analysis applied to administration of injectable drugs to infants.
Hfaiedh, Nadia; Kabiche, Sofiane; Delescluse, Catherine; Balde, Issa-Bella; Merlin, Sophie; Carret, Sandra; de Pontual, Loïc; Fontan, Jean-Eudes; Schlatter, Joël
2017-08-01
Errors in hospitals during the preparation and administration of intravenous drugs to infants and children have been reported at rates of 13% to 84%. This study aimed to investigate the potential for hazardous events that may lead to an accident during the preparation and administration of injectable drugs in a pediatric department, and to describe a risk reduction plan. The preliminary hazard analysis (PHA) method was implemented by a multidisciplinary working group over a period of 5 months (April-August 2014) for infants aged from 28 days to 2 years. The group identified required hazard controls and follow-up actions to reduce the risk of error. To analyze the results, the STATCART APR software was used. During the analysis, 34 hazardous situations were identified, 17 of which were rated very critical, yielding 69 risk scenarios. After follow-up actions, the scenarios with unacceptable risk declined from 17.4% to 0%, and those with risk acceptable under control from 46.4% to 43.5%. The PHA can be used as an aid in the prioritization of corrective actions and the implementation of control measures to reduce risk. The PHA complements the a posteriori risk management already in place. © 2017 John Wiley & Sons, Ltd.
Periodic benefit-risk assessment using Bayesian stochastic multi-criteria acceptability analysis.
Li, Kan; Yuan, Shuai Sammy; Wang, William; Wan, Shuyan Sabrina; Ceesay, Paulette; Heyse, Joseph F; Mt-Isa, Shahrul; Luo, Sheng
2018-04-01
Benefit-risk (BR) assessment is essential to ensure that the best decisions are made for a medical product in the clinical development process, regulatory marketing authorization, post-market surveillance, and coverage and reimbursement decisions. One challenge of BR assessment in practice is that the benefit and risk profile may keep evolving while new evidence accumulates. Regulators and the International Conference on Harmonization (ICH) recommend producing a periodic benefit-risk evaluation report (PBRER) throughout the product's lifecycle. In this paper, we propose a general statistical framework for periodic benefit-risk assessment, in which Bayesian meta-analysis and stochastic multi-criteria acceptability analysis (SMAA) are combined to synthesize the accumulating evidence. The proposed approach allows us to compare the acceptability of different drugs dynamically and effectively, and accounts for the uncertainty of clinical measurements and the imprecise or incomplete preference information of decision makers. We apply our approach post hoc to two real examples for illustration purposes. The proposed method may easily be modified for other pre- and post-market settings, and may thus be an important complement to the current structured benefit-risk assessment (sBRA) framework, improving the transparency and consistency of the decision-making process. Copyright © 2018 Elsevier Inc. All rights reserved.
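A minimal Monte Carlo sketch of the SMAA idea: first-rank acceptability for two hypothetical drugs under sampled uncertain measurements and random preference weights. All numbers are assumed, and the paper's Bayesian meta-analysis step is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Assumed posterior summaries (mean, sd) for two drugs on one
# benefit and one risk criterion (higher benefit better, higher
# risk worse); purely illustrative.
benefit = {"A": (0.60, 0.05), "B": (0.55, 0.05)}
risk    = {"A": (0.10, 0.02), "B": (0.05, 0.02)}

first_rank = {"A": 0, "B": 0}
for _ in range(n):
    # Sample a random benefit weight (risk weight = 1 - w) to
    # reflect unknown decision-maker preferences.
    w = rng.uniform()
    scores = {}
    for d in ("A", "B"):
        b = rng.normal(*benefit[d])
        r = rng.normal(*risk[d])
        scores[d] = w * b - (1 - w) * r  # simple additive value model
    best = max(scores, key=scores.get)
    first_rank[best] += 1

# First-rank acceptability index per drug.
acceptability = {d: c / n for d, c in first_rank.items()}
```

Re-running this as evidence accumulates (i.e. as the posterior summaries are updated) gives the periodic comparison the abstract describes.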
Li, Yifan; Wang, Juanle; Gao, Mengxu; Fang, Liqun; Liu, Changhua; Lyu, Xin; Bai, Yongqing; Zhao, Qiang; Li, Hairong; Yu, Hongjie; Cao, Wuchun; Feng, Liqiang; Wang, Yanjun; Zhang, Bin
2017-05-26
Tick-borne encephalitis (TBE) is a natural focus disease transmitted by ticks. Its distribution and transmission are closely related to geographic and environmental factors. Identification of the environmental determinants of TBE is of great importance to understanding the general distribution of existing and potential TBE natural foci. Hulunbuir, one of the most severely endemic areas of the disease, was selected as the study area. Statistical analysis, global and local spatial autocorrelation analysis, and regression methods were applied to detect the spatiotemporal characteristics, compare the degree of impact of associated factors, and model the risk distribution accounting for heterogeneity. The statistical analysis of gridded geographic and environmental factors and TBE incidence shows that TBE cases mainly occurred during spring and summer and that there is a significant positive spatial autocorrelation between the distribution of TBE cases and environmental characteristics. The degree of impact of these factors on TBE risk has the following descending order: temperature, relative humidity, vegetation coverage, precipitation and topography. A high-risk area with a triangular shape was identified in the central part of Hulunbuir; the low-risk area is located in two belts next to the outside edge of the central triangle. The TBE risk distribution revealed that the impact of the geographic factors changed depending on the heterogeneity.
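The global spatial autocorrelation step can be illustrated with Moran's I on a toy adjacency structure; the regions, weight matrix and values below are made up for illustration.

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x under spatial weight matrix W."""
    x = np.asarray(x, float)
    z = x - x.mean()
    num = (W * np.outer(z, z)).sum()   # weighted cross-products
    return len(x) / W.sum() * num / (z @ z)

# Four regions in a row, rook adjacency (binary, symmetric).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], float)

clustered   = [10, 9, 1, 2]   # similar neighbours -> positive I
alternating = [10, 1, 10, 1]  # dissimilar neighbours -> negative I
```

A positive I for incidence values, as reported in the abstract, indicates that high-TBE cells tend to neighbour other high-TBE cells.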
RiskLab - a joint Teaching Lab on Hazard and Risk Management
NASA Astrophysics Data System (ADS)
Baruffini, Mi.; Baruffini, Mo.; Thuering, M.
2009-04-01
In the future, natural disasters are expected to increase due to climatic changes that strongly affect environmental, social and economic systems. For this reason, and because of limited resources, governments require analytical risk analyses for better mitigation planning. Risk analysis is a process to determine the nature and extent of risk by estimating potential hazards and evaluating existing conditions of vulnerability that could pose a threat or harm to people, property, livelihoods and the environment. This process has become a generally accepted approach for the assessment of cost-benefit scenarios; originating in the field of technical risks, it has been applied to natural hazards in Switzerland for several years now. Starting from these premises, "RiskLab", a joint collaboration between the Institute of Earth Sciences of the University of Applied Sciences of Southern Switzerland and the Institute for Economic Research of the University of Lugano, was started in 2006, aiming to become a competence centre for risk analysis and evaluation. The main issue studied by the lab is the question "What security at what price?", and its activities follow the philosophy of integral risk management as proposed by PLANAT, which defines the process as a cycle containing different, interrelated phases. The final aim is to change how the population and technicians think about risk, from "defending against danger" to "being aware of risks", through a dedicated academic course addressed to young people. In fact, the most important activity of the laboratory is a degree course, offered both to engineering and architecture students of the University of Applied Sciences of Southern Switzerland and to economics students of the University of Lugano. 
The course is structured in two main parts: an introductory, theoretical part, composed of class lessons in which the main aspects of natural hazards, risk perception and evaluation, and risk management are presented and analyzed; and a second, practical part, in which students learn specific statistical methods and test and use technical software. Special importance is given to seminars held by experts or members of civil protection and risk management institutes. Excursions are often organized to directly see and study practical cases (e.g. the city of Locarno and the Lake Maggiore inundations). The course follows a "classical" structure (it is mainly held in a classroom or a computer lab), but students can also benefit from a special web portal, powered by "e.coursers", the official USI/SUPSI Learning Management System, where they can find materials and documents about natural hazards and risk management. The main pedagogical value is that students can attend a course entirely devoted to dealing with natural and man-made hazards and risk, allowing them to bring together geological, spatial planning and economic issues and to face real case studies in a challenging and holistic environment. The final aim of the course is to provide students with a useful, integrated "toolbox", essential for coping with and resolving the overwhelming problems due to the increasing vulnerability and danger of present-day society. The course has now reached its third academic year and the initial results are encouraging: beyond the knowledge and expertise acquired, the graduates, who are now mostly working in engineering offices or private companies, have shown that they have acquired a mentality devoted to understanding and managing risk. REFERENCES PLANAT HTTP://WWW.CENAT.CH/INDEX.PHP?USERHASH=79598753&L=D&NAVID=154 ECOURSES HTTP://CORSI.ELEARNINGLAB.ORG/ NAHRIS HTTP://WWW.NAHRIS.CH/
Applying artificial neural networks to predict communication risks in the emergency department.
Bagnasco, Annamaria; Siri, Anna; Aleo, Giuseppe; Rocco, Gennaro; Sasso, Loredana
2015-10-01
To describe the utility of artificial neural networks in predicting communication risks. In health care, effective communication reduces the risk of error; it is therefore important to identify the predictive factors of effective communication. Non-technical skills are needed to achieve effective communication. This study explores how artificial neural networks can be applied to predict the risk of communication failures in emergency departments. A multicentre observational study. Data were collected between March and May 2011 by observing the communication interactions of 840 nurses with their patients during routine activities in emergency departments. The tools used for observation were a questionnaire collecting personal and descriptive data, level of training and experience, and Guilbert's observation grid, applying the Situation-Background-Assessment-Recommendation technique to communication in emergency departments. A total of 840 observations were made of the nurses working in the emergency departments. Based on Guilbert's observation grid, the output variables likely to influence the risk of communication failure were 'terminology', 'listening', 'attention' and 'clarity', whereas nurses' personal characteristics were used as input variables in the artificial neural network model. A model based on the multilayer perceptron topology was developed and trained. The receiver operating characteristic analysis confirmed that the artificial neural network model correctly predicted more than 80% of the communication failures. The artificial neural network model could offer a valid tool to forecast and prevent harmful communication errors in the emergency department. © 2015 John Wiley & Sons Ltd.
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches for dimensioning components and qualitative methods for managing product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on optimizing reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features while simplifying the process to be applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, owing to the well-developed methods for predicting these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
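The recommended Monte Carlo load-resistance step might look like the following sketch; the distributions and cost figures are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Assumed lognormal resistance and normal load (arbitrary units);
# failure occurs whenever the sampled load exceeds the resistance.
resistance = rng.lognormal(mean=np.log(500), sigma=0.10, size=n)
load = rng.normal(loc=350, scale=40, size=n)

p_fail = np.mean(load > resistance)

# Expected life cycle cost of the design option: build cost plus
# failure probability times failure consequence cost (illustrative).
cost = 10_000 + p_fail * 250_000
```

Comparing this expected cost across candidate dimensions is the kind of quantitative trade-off the deterministic safety-margin approach cannot express.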
NASA Astrophysics Data System (ADS)
Moreira, Francisco; Silva, Nuno
2016-08-01
Safety systems require accident avoidance. This is addressed by application standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially concerning completeness of the hazards, formalization, and analysis performed early enough to influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever a significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. When and how to apply them, and the relation and similarities of these processes to industry standards and system life cycles, are highlighted. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper also shows how the CSM-RA principles can be used in other domains, particularly for space system evolution.
The EP-3E vs. the BAMS UAS: An Operating and Support Cost Comparison
2012-09-01
Accountability Office HALE High Altitude Long Endurance ISR Intelligence, Surveillance and Reconnaissance JCC Joint Architecture...others are very complex high altitude long endurance (HALE) aircraft. However, most share the common need for satellite bandwidth. The DoD plan is...collection sites, and risks as they apply to the BAMS UAS. These factors were not adequately considered in the original O&S analysis. Once the analysis
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and fault tree analysis (FTA) are powerful tools for evaluating the reliability of systems. Although single failure modes can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory and copula theory are integrated into a unified hybrid method called the fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters is characterized by fuzzy numbers to obtain a fuzzy weighted geometric mean (FWGM) RPN for a single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine the fuzzy risk priority numbers (FRPN) of each MCS. Moreover, copula theory is applied to analyze the correlation of multiple failure modes and derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the copula modeling approach eliminates the error in reliability analysis. Furthermore, for quantitative analysis, probability importance weights derived from the failure probabilities are assigned to the FWGM RPN to reassess the risk priority, generalizing the definitions of probability weight and FRPN and resulting in a more accurate estimation than that of traditional models. Finally, a fatigue analysis case drawn from turbine and compressor blades in an aeroengine is used to demonstrate the effectiveness and robustness of the presented method. 
The results provide important insights into fatigue reliability analysis and risk priority assessment of structural systems under failure correlations.
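A minimal sketch of the weighted-geometric-mean part of the method, with severity, occurrence and detection as triangular fuzzy numbers; the ratings and weights are assumed, and the full FPWGM method (cut sets, copulas, importance weights) is not reproduced.

```python
def wgm(triples, weights):
    """Weighted geometric mean applied component-wise to (l, m, u)
    triangular fuzzy numbers; weights are assumed to sum to 1."""
    out = []
    for i in range(3):
        p = 1.0
        for tri, w in zip(triples, weights):
            p *= tri[i] ** w
        out.append(p)
    return tuple(out)

# Hypothetical expert ratings on a 1-10 scale as (l, m, u) triples.
severity   = (6, 7, 8)
occurrence = (3, 4, 5)
detection  = (4, 5, 6)
weights = (0.5, 0.3, 0.2)   # relative importance, sums to 1

l, m, u = wgm([severity, occurrence, detection], weights)
crisp_rpn = (l + m + u) / 3   # simple centroid defuzzification
```

Because the geometric mean with weights summing to 1 stays within the rating scale, the defuzzified RPN is directly comparable across failure modes.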
Martins, Marcelo Ramos; Schleder, Adriana Miralles; Droguett, Enrique López
2014-12-01
This article presents an iterative six-step risk analysis methodology based on hybrid Bayesian networks (BNs). In typical risk analysis, systems are usually modeled via fault trees as discrete, Boolean variables with constant failure rates. Nevertheless, in many cases, it is not possible to perform an efficient analysis using only discrete and Boolean variables. The proposed methodology makes use of BNs and incorporates recent developments that facilitate the use of continuous variables whose values may follow any probability distribution. This makes the methodology particularly useful in cases where the available data for quantifying hazardous event probabilities are scarce or nonexistent, there is dependence among events, or nonbinary events are involved. The methodology is applied to the risk analysis of a regasification system of liquefied natural gas (LNG) on board an FSRU (floating, storage, and regasification unit). LNG is becoming an important energy source option and the world's capacity to produce LNG is surging. Large reserves of natural gas exist worldwide, particularly in areas where the resources exceed the demand. This natural gas is liquefied for shipping, and storage and regasification usually occur at onshore plants. However, a new option for LNG storage and regasification has been proposed: the FSRU. As very few FSRUs have been put into operation, relevant failure data on FSRU systems are scarce. The results show the usefulness of the proposed methodology for cases where the risk analysis must be performed under considerable uncertainty. © 2014 Society for Risk Analysis.
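As a much-simplified illustration of how a Bayesian network propagates event probabilities, here is a toy discrete network, not the hybrid continuous BNs of the article; all probabilities are invented.

```python
# Hypothetical chain for an LNG-style scenario:
# leak -> ignition -> accident, with all numbers invented.
p_leak = 0.01
p_ign_given_leak = 0.1
p_acc = {  # P(accident | leak, ignition)
    (True, True): 0.9,
    (True, False): 0.05,
    (False, True): 0.0,   # no leak: ignition of a leak impossible
    (False, False): 0.0,
}

# Marginal P(accident) by enumerating the joint distribution.
p_accident = 0.0
for leak in (True, False):
    pl = p_leak if leak else 1 - p_leak
    for ign in (True, False):
        if leak:
            pi = p_ign_given_leak if ign else 1 - p_ign_given_leak
        else:
            pi = 0.0 if ign else 1.0
        p_accident += pl * pi * p_acc[(leak, ign)]
```

Replacing these point probabilities with distributions over continuous variables is precisely what the hybrid-BN machinery of the article enables.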
A Probabilistic Risk Assessment of Groundwater-Related Risks at Excavation Sites
NASA Astrophysics Data System (ADS)
Jurado, A.; de Gaspari, F.; Vilarrasa, V.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Tartakovsky, D. M.; Bolster, D.
2010-12-01
Excavation sites such as those associated with the construction of subway lines, railways and highway tunnels are hazardous places, posing risks to workers, machinery and surrounding buildings. Many of these risks can be groundwater related. In this work we develop a general framework based on a probabilistic risk assessment (PRA) to quantify such risks. This approach is compatible with standard PRA practices and it employs many well-developed risk analysis tools, such as fault trees. The novelty and computational challenges of the proposed approach stem from the reliance on stochastic differential equations, rather than reliability databases, to compute the probabilities of basic events. The general framework is applied to a specific case study in Spain. It is used to estimate and minimize risks for a potential construction site of an underground station for the new subway line in the Barcelona metropolitan area.
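For independent basic events, the fault-tree component of such a PRA reduces to simple gate algebra; the events and probabilities below are hypothetical, not from the Barcelona case study.

```python
from math import prod

def p_or(*ps):
    """OR gate: top event occurs if any input event occurs."""
    return 1 - prod(1 - p for p in ps)

def p_and(*ps):
    """AND gate: top event occurs only if all input events occur."""
    return prod(ps)

# Hypothetical basic-event probabilities for an excavation scenario.
p_pump_fails = 0.02
p_power_loss = 0.01
p_barrier_fails = 0.05

# Top event: flooding if dewatering fails (pump failure OR power
# loss) AND the secondary barrier also fails.
p_dewatering_fails = p_or(p_pump_fails, p_power_loss)
p_flooding = p_and(p_dewatering_fails, p_barrier_fails)
```

The novelty described in the abstract is that basic-event probabilities like these come from stochastic groundwater flow models rather than reliability databases.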
The benefits of integrating cost-benefit analysis and risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fisher, K.; Clarke-Whistler, K.
1995-12-31
It has increasingly been recognized that knowledge of risks in the absence of benefits and costs cannot dictate appropriate public policy choices. Recent evidence of this recognition includes the proposed EPA Risk Assessment and Cost-Benefit Analysis Act of 1995, a number of legislative changes in Canada and the US, and the increasing demand for field studies combining measures of impacts, risks, costs and benefits. Failure to consider relative environmental and human health risks, benefits, and costs in making public policy decisions has resulted in allocating scarce resources away from areas offering the highest levels of risk reduction and improvements in health and safety. The authors discuss the implications of not taking costs and benefits into account in addressing environmental risks, drawing on examples from both Canada and the US. The authors also present the results of their recent field work demonstrating the advantages of considering costs and benefits in making public policy and site remediation decisions, including a study on the benefits and costs of prevention, remediation and monitoring techniques applied to groundwater contamination; the benefits and costs of banning the use of chlorine; and the benefits and costs of Canada's concept for disposing of high-level nuclear waste. The authors conclude that a properly conducted cost-benefit analysis can provide critical input to a risk assessment and can ensure that risk management decisions are efficient, cost-effective and maximize improvements to environmental and human health.
Reduced Order Model Implementation in the Risk-Informed Safety Margin Characterization Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mandelli, Diego; Smith, Curtis L.; Alfonsi, Andrea
2015-09-01
The RISMC project aims to develop new advanced simulation-based tools to perform Probabilistic Risk Analysis (PRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermo-hydraulic behavior of the reactor primary and secondary systems but also the temporal evolution of external events and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale one (both spatial, µm-mm-m, and temporal, ms-s-minutes-years). As part of the RISMC PRA approach, a large number of computationally expensive simulation runs is required. An important aspect is that, even though computational power is regularly growing, the overall computational cost of a RISMC analysis may not be viable in certain cases. A solution being evaluated is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by reducing the number of simulation runs to perform and employing surrogate models instead of the actual simulation codes. This report focuses on reduced order modeling techniques that can be applied to any RISMC analysis to generate, analyze and visualize data. In particular, we focus on surrogate models that approximate the simulation results in a much shorter time (µs instead of hours/days). We apply reduced order and surrogate modeling techniques to several RISMC types of analyses using RAVEN and RELAP-7 and show the advantages that can be gained.
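The surrogate idea can be sketched in a few lines: replace an expensive code with a cheap fit trained on a handful of runs. The "simulation" here is an analytic stand-in, not RELAP-7.

```python
import numpy as np

# Stand-in for an expensive simulation code (e.g. one full
# thermo-hydraulic run); an analytic function keeps this runnable.
def expensive_sim(x):
    return np.exp(-x) * np.sin(3 * x)

# Train a cheap polynomial surrogate on a small set of runs, then
# evaluate it at many points where real runs would be too costly.
x_train = np.linspace(0, 2, 15)
y_train = expensive_sim(x_train)
surrogate = np.poly1d(np.polyfit(x_train, y_train, deg=8))

x_test = np.linspace(0, 2, 200)
max_err = np.max(np.abs(surrogate(x_test) - expensive_sim(x_test)))
```

The trade shown here, a bounded approximation error in exchange for microsecond evaluations, is exactly what makes large Monte Carlo sampling over the surrogate feasible.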
NASA Technical Reports Server (NTRS)
Diorio, Kimberly A.; Voska, Ned (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on Human Factors Process Failure Modes and Effects Analysis (HF PFMEA). HF PFMEA includes the following 10 steps: Describe mission; Define system; Identify human-machine interfaces; List human actions; Identify potential errors; Identify factors that affect error; Determine likelihood of error; Determine potential effects of errors; Evaluate risk; Generate solutions (manage error). The presentation also describes how this analysis was applied to a liquid oxygen pump acceptance test.
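The "evaluate risk" step can be sketched as a likelihood-severity ranking; the error descriptions and ratings below are hypothetical, not taken from the pump acceptance test.

```python
# Hypothetical human-error entries with likelihood and severity
# rated on 1-5 scales; risk is their product, used to prioritize.
errors = [
    {"error": "skip pre-test valve check", "likelihood": 4, "severity": 5},
    {"error": "misread pressure gauge",    "likelihood": 3, "severity": 4},
    {"error": "log entry transposed",      "likelihood": 5, "severity": 1},
]

for e in errors:
    e["risk"] = e["likelihood"] * e["severity"]

# Highest-risk errors first: these feed the "generate solutions" step.
ranked = sorted(errors, key=lambda e: e["risk"], reverse=True)
```

Note how a frequent but low-severity error ranks below a rarer high-severity one, which is the point of combining both dimensions.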
1990-12-01
Armstrong Aerospace Medical Research Laboratory, Wright-Patterson Air Force Base, and Drs. Melvin Andersen and Michael Gargas, formerly with the Harry G...based on the arterial blood concentration surrogate were more similar to those derived in the traditional manner than were the estimates based on...pharmacokinetic modeling. Prepared by Office of Risk Analysis, Oak Ridge National Laboratory, Oak Ridge, Tennessee. Prepared under Contract No. DE-AC05-84
Pham, Clarabelle; Caffrey, Orla; Ben-Tovim, David; Hakendorf, Paul; Crotty, Maria; Karnon, Jonathan
2012-08-21
Methods for the cost-effectiveness analysis of health technologies are now well established, but such methods may also have a useful role in the context of evaluating the effects of variation in applied clinical practice. This study illustrates a general methodology for the comparative analysis of applied clinical practice at alternative institutions--risk adjusted cost-effectiveness (RAC-E) analysis--with an application that compares acute hospital services for stroke patients admitted to the main public hospitals in South Australia. Using linked, routinely collected data on all South Australian hospital separations from July 2001 to June 2008, an analysis of the RAC-E of services provided at four metropolitan hospitals was undertaken using a decision analytic framework. Observed (plus extrapolated) and expected lifetime costs and survival were compared across patient populations, from which the relative cost-effectiveness of services provided at the different hospitals was estimated. Unadjusted results showed that at one hospital patients incurred fewer costs and gained more life years than at the other hospitals (i.e. it was the dominant hospital). After risk adjustment, the cost minimizing hospital incurred the lowest costs, but with fewer life-years gained than one other hospital. The mean incremental cost per life-year gained of services provided at the most effective hospital was under $20,000, with an associated 65% probability of being cost-effective at a $50,000 per life year monetary threshold. RAC-E analyses can be used to identify important variation in the costs and outcomes associated with clinical practice at alternative institutions. Such data provides an impetus for further investigation to identify specific areas of variation, which may then inform the dissemination of best practice service delivery and organisation.
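A schematic of the ICER and cost-effectiveness-probability calculation on synthetic hospital data; all figures are invented, and the study's risk adjustment and survival extrapolation steps are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed per-patient costs and life-years at two hospitals after
# risk adjustment (synthetic, illustrative only).
cost_a, ly_a = rng.normal(24_000, 6_000, 500), rng.normal(5.0, 1.2, 500)
cost_b, ly_b = rng.normal(21_000, 6_000, 500), rng.normal(4.8, 1.2, 500)

# Incremental cost-effectiveness ratio of hospital A vs hospital B.
icer = (cost_a.mean() - cost_b.mean()) / (ly_a.mean() - ly_b.mean())

# Probability A is cost-effective at a $50,000/life-year threshold,
# via a nonparametric bootstrap of the incremental net benefit.
wins = []
for _ in range(2000):
    ia = rng.integers(0, 500, 500)
    ib = rng.integers(0, 500, 500)
    inb = (50_000 * (ly_a[ia].mean() - ly_b[ib].mean())
           - (cost_a[ia].mean() - cost_b[ib].mean()))
    wins.append(inb > 0)
p_cost_effective = np.mean(wins)
```

The bootstrap distribution of incremental net benefit is what underlies statements like the abstract's "65% probability of being cost-effective" at a given monetary threshold.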
NASA Astrophysics Data System (ADS)
Ferrer, Laetitia; Curt, Corinne; Tacnet, Jean-Marc
2018-04-01
Major hazard prevention is a major challenge, given that it relies on information communicated to the public. In France, preventive information is notably provided through local regulatory documents. Unfortunately, the law specifies little about their content; one can therefore question how the way a document is actually produced affects its impact on the general population. The purpose of our work is thus to propose an analytical methodology for evaluating the effectiveness of preventive risk communication documents. The methodology is based on dependability approaches and is applied here to the Document d'Information Communal sur les Risques Majeurs (DICRIM; in English, Municipal Information Document on Major Risks). DICRIMs must be prepared by mayors and addressed to the public to provide information on the major hazards affecting their municipalities. An analysis of the legal compliance of the document is carried out through the identification of regulatory detection elements, applied to a database of 30 DICRIMs. This analysis leads to a discussion of points such as the usefulness of the missing elements. External and internal function analysis permits the identification of the form and content requirements and the service and technical functions of the document and its components (here, its sections). These results are used to carry out an FMEA (failure modes and effects analysis), which allows us to define failures and identify detection elements, permitting evaluation of the effectiveness of the form and content of each component of the document. The outputs are validated by experts from the different fields investigated. These results will be used, in future work, to build a decision support model for the municipalities (or specialised consulting firms) in charge of drawing up the documents.
NASA Astrophysics Data System (ADS)
Macián-Cervera, Javier; Escuder-Bueno, Ignacio
2017-04-01
One of the main hazards to water quality in supply systems fed by surface raw water is Cryptosporidium, considered by the World Health Organization to be the most dangerous emerging pathogen. Analytical methods for Cryptosporidium are expensive and laborious and do not have sufficient precision; moreover, laboratories analyze discrete samples, while drinking water production is a continuous process. The introduction of risk models is therefore necessary to verify the safety of the water produced. Advances in tools able to quantify the risk of conventional drinking water treatment plants are quite useful for operators, supporting decisions on operation and investment. The model is applied to a real facility. From the results, it is possible to derive useful guidelines and policies for improving the plant's operating mode. The main conclusion is that conventional treatment can act as an effective barrier against Cryptosporidium, but the risk must be assessed while the plant is operating: taking into account the limitations of knowledge, the estimated risk can rise to non-tolerable levels. In that situation, the plant must invest in the treatment and improve its operation to reach tolerable risk levels.
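A minimal quantitative sketch of the treatment-barrier risk logic described above; the raw-water concentration, log removal, intake and dose-response parameter are assumed placeholder values, not figures from the facility studied.

```python
from math import exp

# Assumed inputs for a Cryptosporidium risk screening.
raw_oocysts_per_L = 2.0   # source water concentration
log_removal = 4.0         # conventional treatment barrier performance
litres_per_day = 1.0      # assumed unboiled tap-water consumption
r = 0.004                 # exponential dose-response parameter (assumed)

# Concentration after treatment, daily dose, and infection risk
# via an exponential dose-response model.
treated = raw_oocysts_per_L * 10 ** (-log_removal)
dose = treated * litres_per_day
p_daily = 1 - exp(-r * dose)
p_annual = 1 - (1 - p_daily) ** 365
```

The sensitivity of p_annual to log_removal is the point of the abstract: a degraded barrier (lower log removal) during operation can push the estimated risk above a tolerable level.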
The SOBANE strategy for the management of risk, as applied to whole-body or hand-arm vibration.
Malchaire, J; Piette, A
2006-06-01
The objective was to develop a coherent set of methods to be used effectively in industry to prevent and manage the risks associated with exposure to vibration, by coordinating the progressive intervention of the workers, their management, the occupational health and safety (OHS) professionals and the experts. The methods were developed separately for exposure to whole-body and to hand-arm vibration. The SOBANE strategy of risk prevention includes four levels of intervention: level 1, Screening; level 2, Observation; level 3, Analysis; and level 4, Expertise. The methods making it possible to apply this strategy were developed for 14 types of risk factors. The article presents the methods specific to the prevention of the risks associated with exposure to vibration. The strategy is similar to those published for the risks associated with exposure to noise, heat and musculoskeletal disorders. It explicitly recognizes the qualifications of the workers and their management with regard to the work situation, and shares the principle that measuring the exposure of the workers is not necessarily the first step towards improving these situations. It attempts to optimize the recourse to the competences of the OHS professionals and the experts, in order to arrive more rapidly, effectively and economically at practical control measures.
Effects of BMI on the risk and frequency of AIS 3+ injuries in motor-vehicle crashes.
Rupp, Jonathan D; Flannagan, Carol A C; Leslie, Andrew J; Hoff, Carrie N; Reed, Matthew P; Cunningham, Rebecca M
2013-01-01
Determine the effects of BMI on the risk of serious-to-fatal injury (Abbreviated Injury Scale ≥ 3, or AIS 3+) to different body regions for adults in frontal, nearside, farside, and rollover crashes. Multivariate logistic regression analysis was applied to a probability sample of adult occupants involved in crashes, generated by combining the National Automotive Sampling System (NASS-CDS) with a pseudoweighted version of the Crash Injury Research and Engineering Network database. Logistic regression models were applied to the weighted data to estimate the change in the number of occupants with AIS 3+ injuries if no occupants were obese. Increasing BMI increased the risk of lower-extremity injury in frontal crashes, decreased the risk of lower-extremity injury in nearside impacts, increased the risk of upper-extremity injury in frontal and nearside crashes, and increased the risk of spine injury in frontal crashes. Several of these findings were affected by interactions with gender and vehicle type. If no occupants in frontal crashes were obese, 7% fewer occupants would sustain AIS 3+ upper-extremity injuries, 8% fewer occupants would sustain AIS 3+ lower-extremity injuries, and 28% fewer occupants would sustain AIS 3+ spine injuries. The results of this study have implications for the design and evaluation of vehicle safety systems. Copyright © 2013 The Obesity Society.
Rhomberg, Lorenz R; Mayfield, David B; Goodman, Julie E; Butler, Eric L; Nascarella, Marc A; Williams, Daniel R
2015-01-01
The International Agency for Research on Cancer qualitatively characterized occupational exposure to oxidized bitumen emissions during roofing as probably carcinogenic to humans (Group 2A). We examine chemistry, exposure, epidemiology and animal toxicity data to explore quantitative risks for roofing workers applying built-up roofing asphalt (BURA). Epidemiology studies do not consistently report elevated risks, and generally do not have sufficient exposure information or adequately control for confounders, precluding their use for dose-response analysis. Dermal carcinogenicity bioassays using mice report increased tumor incidence with single high doses. In order to quantify potential cancer risks, we develop time-to-tumor model methods [consistent with U.S. Environmental Protection Agency (EPA) dose-response analysis and mixtures guidelines] using the dose-time-response shape of concurrent exposures to benzo[a]pyrene (B[a]P) as concurrent controls (which had several exposure levels) to infer presumed parallel dose-time-response curves for BURA-fume condensate. We compare EPA relative potency factor approaches, based on observed relative potency of BURA to B[a]P in similar experiments, and direct observation of the inferred BURA dose-time-response (scaled to humans) as means for characterizing a dermal unit risk factor. We apply similar approaches to limited data on asphalt-fume inhalation and respiratory cancers in rats. We also develop a method for adjusting potency estimates for asphalts that vary in composition using measured fluorescence. Overall, the various methods indicate that cancer risks to roofers from both dermal and inhalation exposure to BURA are within a range typically deemed acceptable within regulatory frameworks. The approaches developed may be useful in assessing carcinogenic potency of other complex mixtures of polycyclic aromatic compounds.
Heart Rate Variability Dynamics for the Prognosis of Cardiovascular Risk
Ramirez-Villegas, Juan F.; Lam-Espinosa, Eric; Ramirez-Moreno, David F.; Calvo-Echeverry, Paulo C.; Agredo-Rodriguez, Wilfredo
2011-01-01
Statistical, spectral, multi-resolution and non-linear methods were applied to heart rate variability (HRV) series, linked with classification schemes, for the prognosis of cardiovascular risk. A total of 90 HRV records were analyzed: 45 from healthy subjects and 45 from cardiovascular risk patients. A total of 52 features from all the analysis methods were evaluated using the standard two-sample Kolmogorov-Smirnov test (KS-test). The results of this statistical procedure provided input to multi-layer perceptron (MLP) neural networks, radial basis function (RBF) neural networks and support vector machines (SVM) for data classification. These schemes showed high performance on both training and test sets across many combinations of features (with a maximum accuracy of 96.67%). In addition, breathing frequency emerged as a particularly relevant feature in the HRV analysis. PMID:21386966
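The feature-screening step described above can be illustrated with a minimal sketch. This is a hand-rolled two-sample KS statistic applied to invented feature values, not the paper's 52 HRV features or its actual screening pipeline.

```python
# Hypothetical illustration of KS-test feature screening: compute the
# two-sample Kolmogorov-Smirnov statistic (sup-distance between empirical
# CDFs) for a candidate feature. All values below are invented.

def ks_statistic(sample_a, sample_b):
    """Sup |F_a(x) - F_b(x)| over the pooled sample points."""
    a, b = sorted(sample_a), sorted(sample_b)
    points = sorted(set(a) | set(b))
    cdf = lambda s, x: sum(v <= x for v in s) / len(s)
    return max(abs(cdf(a, x) - cdf(b, x)) for x in points)

# A feature that separates the groups scores near 1; an uninformative one near 0.
healthy = [0.61, 0.58, 0.64, 0.59, 0.62]   # e.g. a normalized HRV index
at_risk = [0.31, 0.28, 0.35, 0.30, 0.33]
print(ks_statistic(healthy, at_risk))       # 1.0: perfect separation
```

Features with large KS statistics would then be passed to the MLP/RBF/SVM classifiers; in practice one would use `scipy.stats.ks_2samp`, which also returns a p-value.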
TH-EF-BRC-04: Quality Management Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yorke, E.; Huq, M.; Dunscombe, P.
2016-06-15
This Hands-on Workshop will be focused on providing participants with experience with the principal tools of TG 100 and hence start to build both competence and confidence in the use of risk-based quality management techniques. The three principal tools forming the basis of TG 100's risk analysis (process mapping, failure modes and effects analysis, and fault tree analysis) will be introduced with a 5-minute refresher presentation, and each presentation will be followed by a 30-minute small group exercise. An exercise on developing QM from the risk analysis follows. During the exercise periods, participants will apply the principles in 2 different clinical scenarios. At the conclusion of each exercise there will be ample time for participants to discuss with each other and the faculty their experience and any challenges encountered. Learning Objectives: To review the principles of Process Mapping, Failure Modes and Effects Analysis and Fault Tree Analysis. To gain familiarity with these three techniques in a small group setting. To share and discuss experiences with the three techniques with faculty and participants. Director, TreatSafely, LLC. Director, Center for the Assessment of Radiological Sciences. Occasional Consultant to the IAEA and Varian.
C-17 Centerlining - Analysis of Paratrooper Trajectory
2005-06-01
Marginalization and School Nursing
ERIC Educational Resources Information Center
Smith, Julia Ann
2004-01-01
The concept of marginalization was first analyzed by nursing researchers Hall, Stevens, and Meleis. Although nursing literature frequently refers to this concept when addressing "at risk" groups such as the homeless, gays and lesbians, and those infected with HIV/AIDS, the concept can also be applied to nursing. Analysis of current school nursing…
Beginning Special Education Teachers: At Risk for Attrition.
ERIC Educational Resources Information Center
Karge, Belinda Dunnick; Freiberg, Melissa R.
Recognizing the importance of early experience to job satisfaction and commitment, this study was conducted to investigate the effect of support from administration on the induction and retention of 457 beginning public school, special education teachers. Secondary analysis techniques were applied to information derived from the 1987-88 cross…
Sequential use of simulation and optimization in analysis and planning
Hans R. Zuuring; Jimmie D. Chew; J. Greg Jones
2000-01-01
Management activities are analyzed at landscape scales employing both simulation and optimization. SIMPPLLE, a stochastic simulation modeling system, is initially applied to assess the risks associated with a specific natural process occurring on the current landscape without management treatments, but with fire suppression. These simulation results are input into...
Common cause evaluations in applied risk analysis of nuclear power plants (PWR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taniguchi, T.; Ligon, D.; Stamatelatos, M.
1983-04-01
Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
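The baseline Beta Factor method that the abstract says was extended can be sketched in a few lines. The numbers below are illustrative placeholders, not values from the PWR study; the key point is how a common-cause fraction defeats redundancy.

```python
# Minimal sketch of the classic Beta Factor common-cause model for a
# 1-out-of-3 redundant system (system fails only if all three trains fail).
# beta is the fraction of a train's failure probability attributed to
# common-cause failures that take out all trains at once.

def one_out_of_three_failure(q_total, beta):
    """q_total: failure probability of a single train; beta: CCF fraction."""
    q_ind = (1.0 - beta) * q_total   # independent part: all three must fail
    q_ccf = beta * q_total           # common-cause part: bypasses redundancy
    return q_ind ** 3 + q_ccf

print(one_out_of_three_failure(1e-2, 0.0))   # ~1e-6: full redundancy benefit
print(one_out_of_three_failure(1e-2, 0.1))   # ~1e-3: the CCF term dominates
```

Even a 10% common-cause fraction raises the system failure probability by roughly three orders of magnitude here, which is why CCF modeling matters for redundant safety systems.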
An improved method for risk evaluation in failure modes and effects analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Rachieru, N.; Belu, N.; Anghel, D. C.
2015-11-01
Failure mode and effects analysis (FMEA) is one of the most popular reliability analysis tools for identifying, assessing and eliminating potential failure modes in a wide range of industries. In general, failure modes in FMEA are evaluated and ranked through the risk priority number (RPN), which is obtained by multiplying crisp values of the risk factors: the occurrence (O), severity (S), and detection (D) of each failure mode. However, the crisp RPN method has been criticized for several deficiencies. In this paper, linguistic variables, expressed as Gaussian, trapezoidal or triangular fuzzy numbers, are used to assess the ratings and weights of the risk factors S, O and D. A new risk assessment system based on fuzzy set theory and fuzzy rule base theory is applied to assess and rank the risks associated with failure modes that could appear in the functioning of the Turn 55 Lathe CNC. Two case studies are presented to demonstrate the methodology. A parallel is drawn between the RPNs obtained by the traditional method and by fuzzy logic. The results show that the proposed approach can reduce duplicate RPN values and yield a more accurate, reasonable risk assessment. As a result, the stability of the product and process can be assured.
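The contrast between the crisp RPN and a fuzzy rating can be sketched as follows. This is a simplified stand-in, not the paper's method: a full fuzzy-rule-base system also needs membership functions, rules, and a proper defuzzification step, whereas here a triangular fuzzy product is defuzzified by a simple centroid.

```python
# Crisp RPN versus a toy triangular-fuzzy variant. All ratings are invented.

def crisp_rpn(severity, occurrence, detection):
    """Classic RPN = S * O * D on 1-10 crisp scales."""
    return severity * occurrence * detection

def fuzzy_rpn(s, o, d):
    """Approximate product of triangular fuzzy numbers (a, b, c), taken
    component-wise, then defuzzified by the centroid of the triangle."""
    prod = tuple(s[i] * o[i] * d[i] for i in range(3))
    return sum(prod) / 3.0

print(crisp_rpn(7, 5, 6))                                     # 210
print(fuzzy_rpn((6, 7, 8), (4, 5, 6), (5, 6, 7)))             # 222.0
```

The fuzzy version carries each expert's uncertainty band through the calculation, so two failure modes that collide on the same crisp RPN can still be ranked apart, which is the duplication problem the paper addresses.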
NASA Astrophysics Data System (ADS)
Tonini, R.; Anita, G.
2011-12-01
In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, with a minor percentage attributable to all other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kinds of potential tsunamigenic sources that characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e., pyroclastic flows and flank collapses, as predominant in the probabilistic tsunami hazard assessment (PTHA), partly because of the difficulty of estimating the corresponding recurrence times. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for this town in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes and tsunamis) applied to the city of Naples.
Consumers' Risk Perception of Household Cleaning and Washing Products.
Bearth, Angela; Miesler, Linda; Siegrist, Michael
2017-04-01
A large share of accidental and nonaccidental poisonings are caused by household cleaning and washing products, such as drain cleaner or laundry detergent. The main goal of this article was to investigate consumers' risk perception and misconceptions of a variety of cleaning and washing products in order to inform future risk communication efforts. For this, a sorting task including 33 commonly available household cleaning and washing products was implemented. A total of 60 female consumers were asked to place the cleaning and washing products on a reference line 3 m in length with the poles "dangerous" and "not dangerous." The gathered data were analyzed qualitatively and by means of multidimensional scaling, cluster analysis, and linear regression. The dimensionality of the sorting data suggests that both analytically (i.e., written and graphical hazard notes and perceived effectiveness) and intuitively driven risk judgments (i.e., eco vs. regular products) were applied by the participants. Furthermore, results suggest the presence of misconceptions, particularly related to consumers' perceptions of eco cleaning products, which were generally regarded as safer than their regular counterparts. Future risk communication should aim at dispelling these misconceptions and promoting accurate risk perceptions of particular household cleaning and washing products. © 2016 Society for Risk Analysis.
Online Information Sharing About Risks: The Case of Organic Food.
Hilverda, Femke; Kuttschreuter, Margôt
2018-03-23
Individuals have to make sense of an abundance of information to decide whether or not to purchase certain food products. One of the means to sense-making is information sharing. This article reports on a quantitative study examining online information sharing behavior regarding the risks of organic food products. An online survey among 535 respondents was conducted in the Netherlands to examine the determinants of information sharing behavior, and their relationships. Structural equation modeling was applied to test both the measurement model and the structural model. Results showed that the intention to share information online about the risks of organic food was low. Conversations and email were the preferred channels to share information; of the social media Facebook stood out. The developed model was found to provide an adequate description of the data. It explained 41% of the variance in information sharing. Injunctive norms and outcome expectancies were most important in predicting online information sharing, followed by information-related determinants. Risk-perception-related determinants showed a significant, but weak, positive relationship with online information sharing. Implications for authorities communicating on risks associated with food are addressed. © 2018 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
Choi, Wan-Suk; Moon, Ok-Kon; Yeum, Dong-Moon
2017-10-07
This study investigated the characteristics and health behavior profiles of 1,803 workers who had experienced industrial accidents. Average weekly exercise days, average number of cigarettes smoked per day, average daily sleep duration, and number of days of alcohol consumption were selected to investigate health behavior profiles. Specifically, latent profile analysis was applied to identify the health behavior profiles of people who had completed industrial accident care; the latent classes were the health-conscious type (n=240), the potential-risk type (n=850), and the high-risk type (n=713). Comparison of the health-conscious and potential-risk types indicated that younger subjects, the employed, and those with lower social status and life satisfaction were more likely to be the potential-risk type. Comparison of the health-conscious and high-risk types revealed that males, younger subjects, the employed, those without chronic illnesses, and those with lower social status and life satisfaction were more likely to be the high-risk type. The results suggest that industrial accident victims who have completed accident care have different health behaviors and it is necessary to improve health promotion based on health type characteristics.
Comparing risk in conventional and organic dairy farming in the Netherlands: an empirical analysis.
Berentsen, P B M; Kovacs, K; van Asseldonk, M A P M
2012-07-01
This study was undertaken to contribute to the understanding of why most dairy farmers do not convert to organic farming. Therefore, the objective of this research was to assess and compare risks for conventional and organic farming in the Netherlands with respect to gross margin and the underlying price and production variables. To investigate the risk factors a farm accountancy database was used containing panel data from both conventional and organic representative Dutch dairy farms (2001-2007). Variables with regard to price and production risk were identified using a gross margin analysis scheme. Price risk variables were milk price and concentrate price. The main production risk variables were milk yield per cow, roughage yield per hectare, and veterinary costs per cow. To assess risk, an error component implicit detrending method was applied and the resulting detrended standard deviations were compared between conventional and organic farms. Results indicate that the risk included in the gross margin per cow is significantly higher in organic farming. This is caused by both higher price and production risks. Price risks are significantly higher in organic farming for both milk price and concentrate price. With regard to production risk, only milk yield per cow poses a significantly higher risk in organic farming. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Yang, Yu; Jiang, Yong-Hai; Lian, Xin-Ying; Xi, Bei-Dou; Ma, Zhi-Fei; Xu, Xiang-Jian; An, Da
2016-12-01
Hazardous waste landfill sites are a significant source of groundwater pollution. To ensure that these landfills with a significantly high risk of groundwater contamination are properly managed, a risk-based ranking method related to groundwater contamination is needed. In this research, a risk-based prioritization method for the classification of groundwater pollution from hazardous waste landfills was established. The method encompasses five phases, including risk pre-screening, indicator selection, characterization, classification and, lastly, validation. In the risk ranking index system employed here, 14 indicators involving hazardous waste landfills and migration in the vadose zone as well as aquifer were selected. The boundary of each indicator was determined by K-means cluster analysis and the weight of each indicator was calculated by principal component analysis. These methods were applied to 37 hazardous waste landfills in China. The result showed that the risk for groundwater contamination from hazardous waste landfills could be ranked into three classes from low to high risk. In all, 62.2 % of the hazardous waste landfill sites were classified in the low and medium risk classes. The process simulation method and standardized anomalies were used to validate the result of risk ranking; the results were consistent with the simulated results related to the characteristics of contamination. The risk ranking method was feasible, valid and can provide reference data related to risk management for groundwater contamination at hazardous waste landfill sites.
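A toy version of the ranking pipeline above can make the mechanics concrete: normalize the indicators across sites, aggregate with weights, and bin into low/medium/high classes. The weights here are made-up stand-ins for the PCA-derived ones, and the class breakpoints are fixed rather than produced by K-means clustering as in the paper.

```python
# Illustrative risk-ranking sketch: min-max normalize indicators, combine
# with weights, classify by score. All site data and weights are invented.

def min_max(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def risk_rank(sites, weights, cuts=(0.33, 0.66)):
    """sites: {name: [indicator values]}; indicators normalized across sites."""
    names = list(sites)
    columns = list(zip(*sites.values()))          # one column per indicator
    norm_cols = [min_max(col) for col in columns]
    scores = {name: sum(w * col[i] for w, col in zip(weights, norm_cols))
              for i, name in enumerate(names)}
    label = lambda s: "low" if s < cuts[0] else "medium" if s < cuts[1] else "high"
    return {name: (round(s, 2), label(s)) for name, s in scores.items()}

# Two hypothetical indicators, e.g. waste volume and vadose-zone permeability:
sites = {"A": [2.0, 10.0], "B": [5.0, 55.0], "C": [9.0, 90.0]}
print(risk_rank(sites, weights=[0.5, 0.5]))   # A -> low, B -> medium, C -> high
```

In the paper itself the weights come from principal component analysis and the class boundaries from K-means; the sketch only shows how a ranked three-class output arises from a weighted indicator system.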
Klein, A A; Collier, T; Yeates, J; Miles, L F; Fletcher, S N; Evans, C; Richards, T
2017-09-01
A simple and accurate scoring system to predict risk of transfusion for patients undergoing cardiac surgery is lacking. We identified independent risk factors associated with transfusion by performing univariate analysis, followed by logistic regression. We then simplified the score to an integer-based system and tested it using the area under the receiver operator characteristic (AUC) statistic with a Hosmer-Lemeshow goodness-of-fit test. Finally, the scoring system was applied to the external validation dataset and the same statistical methods applied to test the accuracy of the ACTA-PORT score. Several factors were independently associated with risk of transfusion, including age, sex, body surface area, logistic EuroSCORE, preoperative haemoglobin and creatinine, and type of surgery. In our primary dataset, the score accurately predicted risk of perioperative transfusion in cardiac surgery patients with an AUC of 0.76. The external validation confirmed accuracy of the scoring method with an AUC of 0.84 and good agreement across all scores, with a minor tendency to under-estimate transfusion risk in very high-risk patients. The ACTA-PORT score is a reliable, validated tool for predicting risk of transfusion for patients undergoing cardiac surgery. This and other scores can be used in research studies for risk adjustment when assessing outcomes, and might also be incorporated into a Patient Blood Management programme. © The Author 2017. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com
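The AUC statistic used to validate the integer score above can be computed directly from its pair-counting (Mann-Whitney) definition. The scores and outcomes below are fabricated for illustration; they are not ACTA-PORT data.

```python
# AUC as the probability that a randomly chosen positive case (transfused)
# outranks a randomly chosen negative case, with ties counting one half.

def auc(scores_pos, scores_neg):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

transfused     = [8, 9, 7, 10, 6]   # hypothetical integer risk scores
not_transfused = [3, 5, 4, 6, 2]
print(auc(transfused, not_transfused))   # 0.98: strong discrimination
```

An AUC of 0.5 means the score is no better than chance, 1.0 means perfect separation; the reported 0.76 (derivation) and 0.84 (external validation) sit in the "good discrimination" range commonly cited for clinical scores.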
BBN-Based Portfolio Risk Assessment for NASA Technology R&D Outcome
NASA Technical Reports Server (NTRS)
Geuther, Steven C.; Shih, Ann T.
2016-01-01
The NASA Aeronautics Research Mission Directorate (ARMD) vision falls into six strategic thrusts that are aimed to support the challenges of the Next Generation Air Transportation System (NextGen). In order to achieve the goals of the ARMD vision, the Airspace Operations and Safety Program (AOSP) is committed to developing and delivering new technologies. To meet the dual challenges of constrained resources and timely technology delivery, program portfolio risk assessment is critical for communication and decision-making. This paper describes how Bayesian Belief Network (BBN) is applied to assess the probability of a technology meeting the expected outcome. The network takes into account the different risk factors of technology development and implementation phases. The use of BBNs allows for all technologies of projects in a program portfolio to be separately examined and compared. In addition, the technology interaction effects are modeled through the application of object-oriented BBNs. The paper discusses the development of simplified project risk BBNs and presents various risk results. The results presented include the probability of project risks not meeting success criteria, the risk drivers under uncertainty via sensitivity analysis, and what-if analysis. Finally, the paper shows how program portfolio risk can be assessed using risk results from BBNs of projects in the portfolio.
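The core BBN operations the abstract relies on (marginalizing over risk states, and querying backwards with Bayes' rule) can be shown with a tiny hand-rolled two-node fragment. All probabilities below are invented placeholders, not values from the AOSP portfolio model.

```python
# Two-node Bayesian-network sketch: a development-phase risk node feeding
# the probability that a technology meets its expected outcome.

# P(development risk state)
p_dev = {"low": 0.6, "medium": 0.3, "high": 0.1}
# P(outcome met | development risk state)
p_met_given_dev = {"low": 0.9, "medium": 0.6, "high": 0.2}

# Forward query, marginalizing over the parent node:
p_met = sum(p_dev[s] * p_met_given_dev[s] for s in p_dev)
print(round(p_met, 2))   # 0.74

# Diagnostic query via Bayes' rule: P(dev = high | outcome NOT met)
p_not_met = 1.0 - p_met
p_high_given_not_met = p_dev["high"] * (1 - p_met_given_dev["high"]) / p_not_met
print(round(p_high_given_not_met, 2))   # 0.31
```

A real portfolio model chains many such nodes (implementation risk, technology interactions) and would typically use a BBN library, but every query reduces to these two operations.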
Cheng, Yue; Yu, Chengxiao; Huang, Mingtao; Du, Fangzhi; Song, Ci; Ma, Zijian; Zhai, Xiangjun; Yang, Yuan; Liu, Jibin; Bei, Jin-Xin; Jia, Weihua; Jin, Guangfu; Li, Shengping; Zhou, Weiping; Liu, Jianjun; Dai, Juncheng; Hu, Zhibin
2017-10-01
Observational studies show an association between telomere length and hepatocellular carcinoma (HCC) risk, but the relationship is controversial. In particular, it remains unclear whether the association is due to confounding or biases inherent in conventional epidemiological studies. Here, we applied a Mendelian randomization approach to evaluate whether telomere length is causally associated with HCC risk. Individual-level data were from HBV-related HCC genome-wide association studies (1,538 HBV-positive HCC patients and 1,465 HBV-positive controls). A genetic risk score derived from nine telomere length-associated genetic variants, serving as a proxy for measured telomere length, was used to evaluate the effect of telomere length on HCC risk. We observed a significant risk signal between genetically increased telomere length and HBV-related HCC risk (OR=2.09, 95% CI 1.32-3.31, P=0.002). Furthermore, a U-shaped curve was fitted by the restricted cubic spline, indicating that either short or long telomere length would increase HCC risk (P=0.0022 for non-linearity test). Subgroup analysis did not reveal significant heterogeneity between different age, gender, smoking status and drinking status groups. Our results indicate that a genetic background that favors longer or shorter telomere length may increase HBV-related HCC risk, a U-shaped association. Copyright © 2017 Elsevier Ltd. All rights reserved.
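The genetic risk score used above as an instrument is, in its simplest form, a weighted sum of risk-allele counts. The effect sizes and genotypes below are fabricated for illustration; they are not the paper's nine actual variants.

```python
# Illustrative genetic risk score: sum of (allele count x per-allele effect).

def genetic_risk_score(allele_counts, betas):
    """allele_counts: 0/1/2 copies of the length-increasing allele per SNP;
    betas: per-allele effect sizes on telomere length (hypothetical)."""
    assert len(allele_counts) == len(betas)
    return sum(c * b for c, b in zip(allele_counts, betas))

betas  = [0.08, 0.05, 0.12]   # invented per-allele effects
person = [2, 1, 0]            # genotypes at the three variants
print(round(genetic_risk_score(person, betas), 2))   # 0.21
```

Because genotypes are fixed at conception, such a score is (under the usual instrumental-variable assumptions) unconfounded by lifestyle factors, which is what lets Mendelian randomization probe causality.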
Duncan, Michael J; Vale, Susana; Santos, Maria Paula; Ribeiro, José Carlos; Mota, Jorge
2013-01-01
To examine the efficacy of aerobic fitness thresholds in predicting weight status and cardiovascular disease (CVD) risk in young people. A cross-sectional school-based study was conducted on 414 Portuguese young people (235 girls and 179 boys) aged 10-16 years (mean age ± SD = 13.6 ± 1.8 years). Height and mass were assessed to determine body mass index (BMI). The 20-m multistage shuttle fitness test (MSFT) was used as an estimate of aerobic fitness. Capillary blood sampling was used to determine total cholesterol, triglycerides, and high- and low-density lipoprotein. These were combined with measures of systolic blood pressure as z-scores and summed to create a CVD risk score. Analysis of covariance, controlling for sexual maturation, indicated a significant main effect for BMI as a result of fitness category (P = 0.0001). When applied to CVD risk data, there was no difference between "fit" and "unfit" groups (P = 0.136). Subsequent receiver operating characteristic (ROC) analysis indicated significant diagnostic accuracy of 20-m MSFT performance for boys and girls (both P = 0.0001), with resulting cut-offs of estimated VO2 peak of 49.5 ml kg(-1) min(-1) for girls and 47.7 ml kg(-1) min(-1) for boys. When applied to BMI and CVD risk data, there was a significant main effect as a result of fitness category for both BMI (P = 0.0001) and CVD risk score (P = 0.0001). Recently established cut-points proposed by Boddy et al. (Boddy et al. [2012]: PLoS One 7(9): e45755) show validity in distinguishing between weight statuses but not CVD risk in Portuguese young people. The alternative ROC-generated cut-points significantly predicted BMI and CVD risk in this sample. Copyright © 2013 Wiley Periodicals, Inc.
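One common way a ROC-generated cut-point like those above is chosen is by scanning candidate thresholds and maximizing Youden's J = sensitivity + specificity - 1. The fitness values below are invented, and this particular criterion is an assumption for illustration (the abstract does not state which ROC optimality rule was used).

```python
# Pick a VO2-peak cut-off by maximizing Youden's J over observed values.

def best_cutoff(fit_values, unfit_values):
    """Return the candidate threshold maximizing sensitivity + specificity - 1,
    classifying a subject as 'fit' when value >= threshold."""
    candidates = sorted(set(fit_values) | set(unfit_values))
    def youden(c):
        sens = sum(v >= c for v in fit_values) / len(fit_values)
        spec = sum(v < c for v in unfit_values) / len(unfit_values)
        return sens + spec - 1
    return max(candidates, key=youden)

fit   = [50.1, 49.8, 52.3, 51.0]   # estimated VO2 peak, ml/kg/min (invented)
unfit = [44.2, 46.5, 45.1, 47.9]
print(best_cutoff(fit, unfit))   # 49.8: the threshold separating the groups
```

With real data the two groups overlap, and the chosen threshold trades sensitivity against specificity rather than separating them cleanly.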
Moojong, Park; Hwandon, Jun; Minchul, Shin
2008-01-01
Sediments entering the sewer in urban areas reduce the conveyance of sewer pipes, which increases inundation risk. To estimate sediment yields, individual landuse areas in each sub-basin should be obtained. However, because of the complex nature of an urban area, this is almost impossible to do manually. Thus, a methodology to obtain individual landuse areas for each sub-basin has been suggested for estimating sediment yields. Using GIS, an urban area is divided into sub-basins with respect to the sewer layout, and the area of individual landuse is estimated for each sub-basin. The sediment yield per unit area for each sub-basin is then calculated. The suggested method was applied to the GunJa basin in Seoul. To analyze the relation between sediment yields and inundation risk, sub-basins were ordered by sediment yield per unit area and compared with historical inundation areas. From this analysis, sub-basins with higher order were found to match the historical inundation areas. Copyright IWA Publishing 2008.
Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework
NASA Astrophysics Data System (ADS)
Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.
2017-12-01
The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.
NASA Astrophysics Data System (ADS)
Babendreier, J. E.
2002-05-01
Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. 
The design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described, along with the conceptual layout of an accompanying Java-based parallelization software toolset. Preliminary work is also reported for a scenario involving benzene disposal that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.
Methodology for national risk analysis and prioritization of toxic industrial chemicals.
Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina
2013-01-01
The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.
Qiao, Yuanhua; Keren, Nir; Mannan, M Sam
2009-08-15
Risk assessment and management of transportation of hazardous materials (HazMat) require the estimation of accident frequency. This paper presents a methodology to estimate hazardous materials transportation accident frequency by utilizing publicly available databases and expert knowledge. The estimation process addresses route-dependent and route-independent variables. Negative binomial regression is applied to an analysis of the Department of Public Safety (DPS) accident database to derive basic accident frequency as a function of route-dependent variables, while the effects of route-independent variables are modeled by fuzzy logic. The integrated methodology provides the basis for an overall transportation risk analysis, which can be used later to develop a decision support system.
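A minimal sketch of the two-part estimation described above: a log-link negative-binomial-style mean function for route-dependent variables, scaled by a fuzzy-logic modifier for a route-independent one. All coefficients, membership bounds, and the wet-weather uplift are invented for illustration; a real application would fit the regression to the DPS database and elicit the fuzzy rules from experts.

```python
import math

# Hypothetical coefficients for a log-link accident frequency model:
# log(freq per mile-year) = b0 + b1 * AADT + b2 * curvature.
B0, B_TRAFFIC, B_CURVE = -6.0, 0.04, 0.3   # illustrative values only

def base_frequency(aadt_thousands, curves_per_mile):
    # Route-dependent part, as estimated by the regression.
    return math.exp(B0 + B_TRAFFIC * aadt_thousands + B_CURVE * curves_per_mile)

def triangular(x, a, b, c):
    # Triangular fuzzy membership on [a, c] peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def weather_modifier(rain_days_frac):
    # Route-independent part: blend "dry" and "wet" rules.
    wet = triangular(rain_days_frac, 0.1, 0.5, 0.9)
    dry = 1.0 - wet
    return dry * 1.0 + wet * 1.4   # assume wet conditions raise frequency 40%

def accident_frequency(aadt_thousands, curves_per_mile, rain_days_frac):
    return base_frequency(aadt_thousands, curves_per_mile) * weather_modifier(rain_days_frac)
```

The multiplicative structure mirrors the paper's integration: the regression supplies a base rate, the fuzzy system adjusts it.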
Logistic regression for risk factor modelling in stuttering research.
Reed, Phil; Wu, Yaqiong
2013-06-01
To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
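The mechanics of the technique can be demonstrated with a tiny, fully hypothetical dataset (one binary predictor, say family history of stuttering, against persistence), fitting the model by plain gradient ascent on the log-likelihood:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=5000):
    # Gradient ascent on the log-likelihood; each row of X already
    # includes a leading 1.0 for the intercept.
    w = [0.0] * len(X[0])
    for _ in range(steps):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            err = yi - sigmoid(sum(a * b for a, b in zip(w, xi)))
            for j, xij in enumerate(xi):
                grad[j] += err * xij
        w = [wj + lr * g / len(X) for wj, g in zip(w, grad)]
    return w

# Hypothetical data: [intercept, family history (0/1)] vs persistence (0/1).
X = [[1, 0], [1, 0], [1, 0], [1, 1], [1, 1], [1, 1], [1, 0], [1, 1]]
y = [0, 0, 1, 1, 1, 0, 0, 1]
w = fit_logistic(X, y)
odds_ratio = math.exp(w[1])
```

Because the single predictor is binary, the fitted exp(coefficient) reproduces the empirical odds ratio between the two groups, which is the quantity usually reported in risk factor studies.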
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension to our published microbial risk assessment methodology to estimate infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein, we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. Sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach, capturing the full range of uncertainty, increased risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and design engineers as they consider which unit treatment processes should be employed for particular projects. Published by Elsevier Ltd.
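The benchmark logic described above can be sketched under simple assumptions: an exponential dose-response model, independent daily exposures, and an annual infection-risk benchmark of 10^-4. The pathogen density, ingested volume, and dose-response parameter below are illustrative placeholders, not the study's inputs.

```python
import math

def daily_risk(dose, r):
    # Exponential dose-response: P(infection) = 1 - exp(-r * dose).
    return 1.0 - math.exp(-r * dose)

def annual_risk(p_daily, days=365):
    # Independent daily exposures compounded over a year.
    return 1.0 - (1.0 - p_daily) ** days

def required_log_reduction(raw_conc, volume_l, r, benchmark=1e-4):
    # Smallest whole-log treatment reduction keeping annual risk below benchmark.
    logs = 0
    while annual_risk(daily_risk(raw_conc * volume_l * 10.0 ** (-logs), r)) > benchmark:
        logs += 1
    return logs

# Placeholder inputs: 1e3 organisms/L in raw wastewater, 2 L/day ingested.
logs_needed = required_log_reduction(1e3, 2.0, 0.5)
```

With these placeholder inputs 10 logs are required; the study's 14-log finding for viruses reflects its actual density and dose-response inputs and their uncertainty.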
Risk of Resource Failure and Toolkit Variation in Small-Scale Farmers and Herders
Collard, Mark; Ruttle, April; Buchanan, Briggs; O’Brien, Michael J.
2012-01-01
Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers. PMID:22844421
Comparative Risk Analysis for Metropolitan Solid Waste Management Systems
NASA Astrophysics Data System (ADS)
Chang, Ni-Bin; Wang, S. F.
1996-01-01
Conventional solid waste management planning usually focuses on economic optimization, in which the related environmental impacts or risks are rarely considered. The purpose of this paper is to illustrate the methodology of how optimization concepts and techniques can be applied to structure and solve risk management problems such that the impacts of air pollution, leachate, traffic congestion, and noise increments can be regulated in the long-term planning of metropolitan solid waste management systems. Management alternatives are sequentially evaluated by adding several environmental risk control constraints stepwise in an attempt to improve the management strategies and reduce the risk impacts in the long run. Statistics associated with those risk control mechanisms are presented as well. Siting, routing, and financial decision making in such solid waste management systems can also be achieved with respect to various resource limitations and disposal requirements.
de Knegt, Leonardo V; Pires, Sara M; Löfström, Charlotta; Sørensen, Gitte; Pedersen, Karl; Torpdahl, Mia; Nielsen, Eva M; Hald, Tine
2016-03-01
Salmonella is an important cause of bacterial foodborne infections in Denmark. To identify the main animal-food sources of human salmonellosis, risk managers have relied on a routine application of a microbial subtyping-based source attribution model since 1995. In 2013, multiple locus variable number tandem repeat analysis (MLVA) substituted phage typing as the subtyping method for surveillance of S. Enteritidis and S. Typhimurium isolated from animals, food, and humans in Denmark. The purpose of this study was to develop a modeling approach applying a combination of serovars, MLVA types, and antibiotic resistance profiles for the Salmonella source attribution, and assess the utility of the results for the food safety decision-makers. Full and simplified MLVA schemes from surveillance data were tested, and model fit and consistency of results were assessed using statistical measures. We conclude that loci schemes STTR5/STTR10/STTR3 for S. Typhimurium and SE9/SE5/SE2/SE1/SE3 for S. Enteritidis can be used in microbial subtyping-based source attribution models. Based on the results, we discuss that an adjustment of the discriminatory level of the subtyping method applied will often be required to fit the purpose of the study and the available data. The issues discussed are also considered highly relevant when applying, e.g., extended multi-locus sequence typing or next-generation sequencing techniques. © 2015 Society for Risk Analysis.
Modelling tsunami inundation for risk analysis at the Andaman Sea Coast of Thailand
NASA Astrophysics Data System (ADS)
Kaiser, G.; Kortenhaus, A.
2009-04-01
The mega-tsunami of Dec. 26, 2004 strongly impacted the Andaman Sea coast of Thailand and devastated coastal ecosystems as well as towns, settlements and tourism resorts. In addition to the tragic loss of many lives, the destruction or damage of life-supporting infrastructure, such as buildings, roads, water & power supply etc. caused high economic losses in the region. To mitigate future tsunami impacts there is a need to assess the tsunami hazard and vulnerability in flood prone areas at the Andaman Sea coast in order to determine the spatial distribution of risk and to develop risk management strategies. In the bilateral German-Thai project TRAIT research is performed on integrated risk assessment for the Provinces Phang Nga and Phuket in southern Thailand, including a hazard analysis, i.e. modelling tsunami propagation to the coast, tsunami wave breaking and inundation characteristics, as well as vulnerability analysis of the socio-economic and the ecological system in order to determine the scenario-based, specific risk for the region. In this presentation results of the hazard analysis and the inundation simulation are presented and discussed. Numerical modelling of tsunami propagation and inundation is an indispensable tool for risk analysis, risk management and evacuation planning. While numerous investigations have been made to model tsunami wave generation and propagation in the Indian Ocean, there is still a gap in determining detailed inundation patterns, i.e. water depth and flow dynamics. However, for risk management and evacuation planning this knowledge is essential. As the accuracy of the inundation simulation is strongly dependent on the available bathymetric and topographic data, a multi-scale approach is chosen in this work.
The ETOPO Global Relief Model as a bathymetric basis and the Shuttle Radar Topography Mission (SRTM90) have been widely applied in tsunami modelling approaches as these data are free and almost world-wide available. However, to model tsunami-induced inundation for risk analysis and management purposes the accuracy of these data is not sufficient as the processes in the near-shore zone cannot be modelled accurately enough and the spatial resolution of the topography is weak. Moreover, the SRTM data provide a digital surface model which includes vegetation and buildings in the surface description. To improve the data basis additional bathymetric data were used in the near shore zone of the Phang Nga and Phuket coastlines and various remote sensing techniques as well as additional GPS measurements were applied to derive a high resolution topography from satellite and airborne data. Land use classifications and filter methods were developed to correct the digital surface models to digital elevation models. Simulations were then performed with a non-linear shallow water model to model the 2004 Asian Tsunami and to simulate possible future ones. Results of water elevation near the coast were compared with field measurements and observations, and the influence of the resolution of the topography on inundation patterns like water depth, velocity, dispersion and duration of the flood were analysed. The inundation simulation provides detailed hazard maps and is considered a reliable basis for risk assessment and risk zone mapping. Results are regarded vital for estimation of tsunami induced damages and evacuation planning. Results of the aforementioned simulations will be discussed during the conference. Differences of the numerical results using topographic data of different scales and modified by different post processing techniques will be analysed and explained. Further use of the results with respect to tsunami risk analysis and management will also be demonstrated.
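The sensitivity to bathymetry follows directly from the long-wave celerity c = sqrt(g h) underlying shallow-water models: propagation speed, and hence arrival time and shoaling, depends on water depth. A rough travel-time estimate over a hypothetical depth profile (the segment lengths and depths below are invented, not project data):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def long_wave_speed(depth_m):
    # Shallow-water (long-wave) celerity: c = sqrt(g * h).
    return math.sqrt(G * depth_m)

def travel_time_minutes(segments):
    # segments: list of (length_km, mean_depth_m) along the propagation path.
    seconds = sum(1000.0 * length / long_wave_speed(depth)
                  for length, depth in segments)
    return seconds / 60.0

# Hypothetical path: deep basin, continental shelf, near-shore zone.
path = [(500.0, 3000.0), (150.0, 200.0), (20.0, 30.0)]
```

The last shallow 20 km contributes a disproportionate share of the travel time, which is why near-shore bathymetric resolution dominates the accuracy of inundation timing.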
Benefits and Limitations of Real Options Analysis for the Practice of River Flood Risk Management
NASA Astrophysics Data System (ADS)
Kind, Jarl M.; Baayen, Jorn H.; Botzen, W. J. Wouter
2018-04-01
Decisions on long-lived flood risk management (FRM) investments are complex because the future is uncertain. Flexibility and robustness can be used to deal with future uncertainty. Real options analysis (ROA) provides a welfare-economics framework to design and evaluate robust and flexible FRM strategies under risk or uncertainty. Although its potential benefits are large, ROA is hardly used in today's FRM practice. In this paper, we investigate benefits and limitations of a ROA, by applying it to a realistic FRM case study for an entire river branch. We illustrate how ROA identifies optimal short-term investments and values future options. We develop robust dike investment strategies and value the flexibility offered by additional room for the river measures. We benchmark the results of ROA against those of a standard cost-benefit analysis and show ROA's potential policy implications. The ROA for a realistic case requires a high level of geographical detail, a large ensemble of scenarios, and the inclusion of stakeholders' preferences. We found several limitations of applying the ROA. It is complex. In particular, relevant sources of uncertainty need to be recognized, quantified, integrated, and discretized in scenarios, requiring subjective choices and expert judgment. Decision trees have to be generated and stakeholders' preferences have to be translated into decision rules. On the basis of this study, we give general recommendations to use high discharge scenarios for the design of measures with high fixed costs and few alternatives. Lower scenarios may be used when alternatives offer future flexibility.
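The option value that ROA attributes to flexibility can be shown with a deliberately small two-stage decision sketch; the costs, the upgrade option, and the scenario probability below are hypothetical, not the case-study figures:

```python
# Two-stage sketch: build a high dike now, or a modest dike now plus the
# option to upgrade after observing whether discharges trend high or low.
# All values are hypothetical present-value costs in million EUR.
P_HIGH = 0.4  # probability the high-discharge scenario materialises

def fixed_strategy():
    # Large dike now; covers both scenarios regardless of what happens.
    return 100.0

def flexible_strategy():
    base = 60.0     # modest dike now
    upgrade = 55.0  # later heightening, needed only if discharges run high
    return base + P_HIGH * upgrade

best = min(fixed_strategy(), flexible_strategy())
```

Here the flexible strategy's expected cost is 82 against 100 for the fixed design, so the option to wait is worth 18 in expectation; a full ROA extends this comparison over decision trees spanning many discharge scenarios and time steps.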
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copping, Andrea E.; Hanna, Luke A.
2011-11-01
Potential environmental effects of offshore wind (OSW) energy development are not well understood, and yet regulatory agencies are required to make decisions in spite of substantial uncertainty about environmental impacts and their long-term consequences. An understanding of risks associated with interactions between OSW installations and avian and aquatic receptors, including animals, habitats, and ecosystems, can help define key uncertainties and focus regulatory actions and scientific studies on interactions of most concern. During FY 2011, Pacific Northwest National Laboratory (PNNL) scientists adapted and applied the Environmental Risk Evaluation System (ERES), first developed to examine the effects of marine and hydrokinetic energy devices on aquatic environments, to offshore wind development. PNNL scientists conducted a risk screening analysis on two initial OSW cases: a wind project in Lake Erie and a wind project off the Atlantic coast of the United States near Atlantic City, New Jersey. The screening analysis revealed that top-tier stressors in the two OSW cases were the dynamic effects of the device (e.g., strike), accidents/disasters, and effects of the static physical presence of the device, such as alterations in bottom habitats. Receptor interactions with these stressors at the highest tiers of risk were dominated by threatened and endangered animals. Risk to the physical environment from changes in flow regime also ranked high. Peer review of this process and results will be conducted during FY 2012. The ERES screening analysis provides an assessment of the vulnerability of environmental receptors to stressors associated with OSW installations; a probability analysis is needed to determine specific risk levels to receptors. As more data become available that document effects of offshore wind farms on specific receptors in U.S. coastal and Great Lakes waters, probability analyses will be performed.
Gariani, Karim; Mavrakanas, Thomas; Combescure, Christophe; Perrier, Arnaud; Marti, Christophe
2016-03-01
Diabetes mellitus is a well-established risk factor for atherosclerotic disease, but its role in the occurrence of venous thromboembolism (VTE) has not been elucidated. We conducted a meta-analysis of published cohort and case-control studies to assess whether diabetes mellitus is a risk factor for VTE. We systematically searched MEDLINE and EMBASE for case-control and prospective cohort studies assessing association between the risk of venous thromboembolism and diabetes. Odds ratios (OR) from case-control studies were combined while for prospective studies hazard ratios (HR) were combined. Models with random effects were used. Meta-analyses were conducted separately for raw and adjusted measures of association. 24 studies were identified including 10 cohort studies (274,501 patients) and 14 case-control studies (1,157,086 patients). Meta-analysis of the prospective cohort studies demonstrated a significant association between diabetes and VTE (HR 1.60; 95% CI 1.35 to 1.89). This association was no longer present after analysis of multi-adjusted HRs (HR 1.10; 95% CI 0.77 to 1.56). Meta-analysis of case-control studies showed a significant association between diabetes and VTE (OR 1.57; 95% CI 1.17 to 2.12), but this association was no longer present when adjusted ORs were used (OR 1.18; 95% CI 0.89 to 1.56). The increased risk of VTE associated with diabetes mainly results from confounders rather than an intrinsic effect of diabetes on venous thrombotic risk. Therefore, no specific recommendations should apply for the management of diabetic patients at risk for VTE. Copyright © 2015 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
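The random-effects pooling used in such meta-analyses is commonly the DerSimonian-Laird estimator on the log-odds-ratio scale; a sketch with invented study values (not the studies pooled above):

```python
import math

def pooled_or_random_effects(odds_ratios, variances):
    # DerSimonian-Laird pooling on the log-OR scale.
    y = [math.log(o) for o in odds_ratios]        # log odds ratios
    w = [1.0 / v for v in variances]              # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    return math.exp(pooled)

# Hypothetical study-level ORs and variances of their log-ORs.
pooled = pooled_or_random_effects([1.4, 1.8, 1.2], [0.05, 0.08, 0.04])
```

When the between-study variance estimate is zero the result collapses to the fixed-effect pooled OR, which is the behaviour of these illustrative inputs.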
Hamada, Tsuyoshi; Nakai, Yousuke; Isayama, Hiroyuki; Togawa, Osamu; Kogure, Hirofumi; Kawakubo, Kazumichi; Tsujino, Takeshi; Sasahira, Naoki; Hirano, Kenji; Yamamoto, Natsuyo; Ito, Yukiko; Sasaki, Takashi; Mizuno, Suguru; Toda, Nobuo; Tada, Minoru; Koike, Kazuhiko
2014-03-01
Self-expandable metallic stent (SEMS) placement is widely carried out for distal malignant biliary obstruction, and survival analysis is used to evaluate the cumulative incidences of SEMS dysfunction (e.g. the Kaplan-Meier [KM] method and the log-rank test). However, these statistical methods might be inappropriate in the presence of 'competing risks' (here, death without SEMS dysfunction), which affects the probability of experiencing the event of interest (SEMS dysfunction); that is, SEMS dysfunction can no longer be observed after death. A competing risk analysis has rarely been done in studies on SEMS. We introduced the concept of a competing risk analysis and illustrated its impact on the evaluation of SEMS outcomes using hypothetical and actual data. Our illustrative study included 476 consecutive patients who underwent SEMS placement for unresectable distal malignant biliary obstruction. A significant difference between cumulative incidences of SEMS dysfunction in male and female patients via the KM method (P = 0.044 by the log-rank test) disappeared after applying a competing risk analysis (P = 0.115 by Gray's test). In contrast, although cumulative incidences of SEMS dysfunction via the KM method were similar with and without chemotherapy (P = 0.647 by the log-rank test), cumulative incidence of SEMS dysfunction in the non-chemotherapy group was shown to be significantly lower (P = 0.031 by Gray's test) in a competing risk analysis. Death as a competing risk event needs to be appropriately considered in estimating a cumulative incidence of SEMS dysfunction, otherwise analytical results may be biased. © 2013 The Authors. Digestive Endoscopy © 2013 Japan Gastroenterological Endoscopy Society.
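The divergence between 1 minus Kaplan-Meier (deaths treated as censoring) and the competing-risks cumulative incidence can be reproduced on toy data; the times and statuses below are invented, with status 1 for SEMS dysfunction, 2 for death without dysfunction, and 0 for censoring:

```python
def one_minus_km(records):
    # 1 - Kaplan-Meier, treating competing deaths (status 2) as censoring.
    surv = 1.0
    n = len(records)  # subjects at risk just before each (distinct) time
    for t, status in sorted(records):
        if status == 1:
            surv *= (n - 1) / n
        n -= 1
    return 1.0 - surv

def cumulative_incidence(records):
    # Aalen-Johansen estimate: deaths deplete the overall survival term,
    # so the dysfunction probability is not inflated.
    surv, cif = 1.0, 0.0
    n = len(records)
    for t, status in sorted(records):
        if status == 1:
            cif += surv / n
        if status in (1, 2):
            surv *= (n - 1) / n
        n -= 1
    return cif

data = [(1, 1), (2, 2), (3, 1), (4, 2), (5, 0), (6, 1), (7, 2), (8, 0)]
naive = one_minus_km(data)        # about 0.514
proper = cumulative_incidence(data)  # 5/12, about 0.417
```

The naive complement overstates the dysfunction probability, which is exactly the bias the paper's competing risk analysis removes; Gray's test then compares such cumulative incidence curves between groups.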
A financial network perspective of financial institutions' systemic risk contributions
NASA Astrophysics Data System (ADS)
Huang, Wei-Qiang; Zhuang, Xin-Tian; Yao, Shuang; Uryasev, Stan
2016-08-01
This study considers the effects of financial institutions' local topology structure in the financial network on their systemic risk contribution, using data from the Chinese stock market. We first measure the systemic risk contribution with the Conditional Value-at-Risk (CoVaR), which is estimated by applying a dynamic conditional correlation multivariate GARCH model (DCC-MVGARCH). Financial networks are constructed from dynamic conditional correlations (DCC) with the graph filtering method of minimum spanning trees (MSTs). Then we investigate the dynamics of financial institutions' systemic risk contributions. We also study the dynamics of each institution's local topology structure in the financial network. Finally, we analyze the quantitative relationships between the local topology structure and systemic risk contribution with panel data regression analysis. We find that financial institutions with greater node strength, larger node betweenness centrality, larger node closeness centrality and larger node clustering coefficient tend to be associated with larger systemic risk contributions.
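The MST filtering step can be sketched with Prim's algorithm on the correlation distance d_ij = sqrt(2(1 - rho_ij)) commonly used for this purpose; the correlation matrix below is invented rather than estimated from DCC-MVGARCH:

```python
import math

def mst_edges(corr):
    # Prim's algorithm on the metric d_ij = sqrt(2 * (1 - rho_ij)),
    # which maps correlation +1 to distance 0 and -1 to distance 2.
    n = len(corr)
    dist = [[math.sqrt(2.0 * (1.0 - corr[i][j])) for j in range(n)]
            for i in range(n)]
    in_tree = {0}
    edges = []
    while len(in_tree) < n:
        best = None
        for i in in_tree:
            for j in range(n):
                if j not in in_tree and (best is None or dist[i][j] < best[2]):
                    best = (i, j, dist[i][j])
        edges.append(best[:2])
        in_tree.add(best[1])
    return edges

# Hypothetical conditional correlations among four institutions.
rho = [[1.0, 0.8, 0.3, 0.2],
       [0.8, 1.0, 0.4, 0.1],
       [0.3, 0.4, 1.0, 0.7],
       [0.2, 0.1, 0.7, 1.0]]
edges = mst_edges(rho)
```

From the resulting tree, node-level measures such as strength, betweenness, closeness, and clustering can then be computed and regressed against CoVaR-based contributions, as in the paper.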
Estimating the concordance probability in a survival analysis with a discrete number of risk groups.
Heller, Glenn; Mo, Qianxing
2016-04-01
A clinical risk classification system is an important component of a treatment decision algorithm. A measure used to assess the strength of a risk classification system is discrimination, and when the outcome is survival time, the most commonly applied global measure of discrimination is the concordance probability. The concordance probability represents the pairwise probability of lower patient risk given longer survival time. The c-index and the concordance probability estimate have been used to estimate the concordance probability when patient-specific risk scores are continuous. In the current paper, the concordance probability estimate and an inverse probability censoring weighted c-index are modified to account for discrete risk scores. Simulations are generated to assess the finite sample properties of the concordance probability estimate and the weighted c-index. An application of these measures of discriminatory power to a metastatic prostate cancer risk classification system is examined.
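An unweighted version of the pairwise estimate, with ties between discrete risk scores scored 0.5, can be sketched as follows; the inverse probability censoring weighted variant described in the paper additionally reweights each usable pair, and the data below are invented:

```python
def concordance(times, events, risks):
    # Pairwise concordance for right-censored data: a pair is usable when
    # the shorter time is an observed event; credit 1 if the higher risk
    # score goes with the shorter survival, 0.5 for a risk-score tie.
    num = den = 0.0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if i == j or not events[i] or times[i] >= times[j]:
                continue
            den += 1
            if risks[i] > risks[j]:
                num += 1.0
            elif risks[i] == risks[j]:
                num += 0.5
    return num / den

# Discrete risk groups (higher = worse prognosis); invented outcomes.
times = [2, 4, 5, 7, 9]
events = [1, 1, 0, 1, 1]   # 0 = censored
risks = [3, 3, 2, 1, 1]
cp = concordance(times, events, risks)
```

With only a few risk groups many pairs are tied, which is precisely why the paper modifies the continuous-score estimators for the discrete case.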
The social value of mortality risk reduction: VSL versus the social welfare function approach.
Adler, Matthew D; Hammitt, James K; Treich, Nicolas
2014-05-01
We examine how different welfarist frameworks evaluate the social value of mortality risk reduction. These frameworks include classical, distributively unweighted cost-benefit analysis--i.e., the "value per statistical life" (VSL) approach-and various social welfare functions (SWFs). The SWFs are either utilitarian or prioritarian, applied to policy choice under risk in either an "ex post" or "ex ante" manner. We examine the conditions on individual utility and on the SWF under which these frameworks display sensitivity to wealth and to baseline risk. Moreover, we discuss whether these frameworks satisfy related properties that have received some attention in the literature, namely equal value of risk reduction, preference for risk equity, and catastrophe aversion. We show that the particular manner in which VSL ranks risk-reduction measures is not necessarily shared by other welfarist frameworks. Copyright © 2014 Elsevier B.V. All rights reserved.
Food allergy and risk assessment: Current status and future directions
NASA Astrophysics Data System (ADS)
Remington, Benjamin C.
2017-09-01
Risk analysis is a three part, interactive process that consists of a scientific risk assessment, a risk management strategy and an exchange of information through risk communication. Quantitative risk assessment methodologies are now available and widely used for assessing risks regarding the unintentional consumption of major, regulated allergens but new or modified proteins can also pose a risk of de-novo sensitization. The risks due to de-novo sensitization to new food allergies are harder to quantify. There is a need for a systematic, comprehensive battery of tests and assessment strategy to identify and characterise de-novo sensitization to new proteins and the risks associated with them. A risk assessment must be attuned to answer the risk management questions and needs. Consequently, the hazard and risk assessment methods applied and the desired information are determined by the requested outcome for risk management purposes and decisions to be made. The COST Action network (ImpARAS, www.imparas.eu) has recently started to discuss these risk management criteria from first principles and will continue with the broader subject of improving strategies for allergen risk assessment throughout 2016-2018/9.
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying cause of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
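Tool support aside, the quantitative core of FTA, propagating basic-event probabilities through AND/OR gates under an independence assumption, is straightforward; the tree and probabilities below are hypothetical:

```python
def gate_prob(node):
    # node: ("basic", p) | ("and", [children]) | ("or", [children]),
    # assuming statistically independent basic events.
    kind = node[0]
    if kind == "basic":
        return node[1]
    probs = [gate_prob(child) for child in node[1]]
    if kind == "and":
        out = 1.0
        for p in probs:
            out *= p          # all children must fail
        return out
    out = 1.0
    for p in probs:
        out *= (1.0 - p)      # OR: any child failing suffices
    return 1.0 - out

# Hypothetical software fault tree: hazardous output if the primary
# controller fails AND (watchdog fails OR failover logic is faulty).
tree = ("and", [("basic", 1e-3),
                ("or", [("basic", 1e-2), ("basic", 5e-3)])])
top = gate_prob(tree)
```

SFTA extends exactly this structure below the "software" leaf, replacing a single opaque basic event with a subtree over software faults.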
MetaDP: a comprehensive web server for disease prediction of 16S rRNA metagenomic datasets.
Xu, Xilin; Wu, Aiping; Zhang, Xinlei; Su, Mingming; Jiang, Taijiao; Yuan, Zhe-Ming
2016-01-01
High-throughput sequencing-based metagenomics has garnered considerable interest in recent years. Numerous methods and tools have been developed for the analysis of metagenomic data. However, it is still a daunting task to install a large number of tools and complete a complicated analysis, especially for researchers with minimal bioinformatics backgrounds. To address this problem, we constructed an automated software named MetaDP for 16S rRNA sequencing data analysis, including data quality control, operational taxonomic unit clustering, diversity analysis, and disease risk prediction modeling. Furthermore, a support vector machine-based prediction model for intestinal bowel syndrome (IBS) was built by applying MetaDP to microbial 16S sequencing data from 108 children. The success of the IBS prediction model suggests that the platform may also be applied to other diseases related to gut microbes, such as obesity, metabolic syndrome, or intestinal cancer, among others (http://metadp.cn:7001/).
Is higher risk sex common among male or female youths?
Berhan, Yifru; Berhan, Asres
2015-01-01
There are several studies showing a high prevalence of high-risk sexual behaviors among youths, but little is known about how the proportion of higher risk sex compares between male and female youths. A meta-analysis was done using 26 countries' Demographic and Health Survey data from within and outside Africa to compare higher risk sex among the most vulnerable groups of male and female youths. A random effects analytic model was applied and the pooled odds ratios were determined using the Mantel-Haenszel statistical method. In this meta-analysis, 19,148 male and 65,094 female youths who reported sexual intercourse in a 12-month period were included. The overall OR demonstrated that higher risk sex was ten times more prevalent in male youths than in female youths. The practice of higher risk sex by male youths aged 15-19 years was more than 27-fold higher than that of their female counterparts. Similarly, male youths in urban areas, belonging to families with middle to highest wealth index, and educated to secondary level or above were more than ninefold, eightfold and sixfold more likely, respectively, to practice higher risk sex than their female counterparts. In conclusion, this meta-analysis demonstrated that the practice of risky sexual intercourse by male youths was markedly higher than that of female youths. Future protective interventions against risky sex should be tailored to male youths in urban areas with secondary and above education.
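The Mantel-Haenszel method pools stratum-specific 2x2 tables into a single odds ratio; a sketch with invented strata (exposure = male sex, outcome = higher risk sex), not the survey data above:

```python
def mantel_haenszel_or(strata):
    # Each stratum is (a, b, c, d): exposed cases, exposed non-cases,
    # unexposed cases, unexposed non-cases.
    num = sum(a * d / (a + b + c + d) for a, b, c, d in strata)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in strata)
    return num / den

# Two hypothetical country strata of 200 respondents each.
or_mh = mantel_haenszel_or([(30, 70, 10, 90), (45, 55, 15, 85)])
```

Stratifying before pooling keeps country-level differences in baseline prevalence from confounding the male-versus-female comparison.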
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whipple, C
Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.
Comparative analysis on the probability of being a good payer
NASA Astrophysics Data System (ADS)
Mihova, V.; Pavlov, V.
2017-10-01
Credit risk assessment is crucial for the bank industry. The current practice uses various approaches for the calculation of credit risk. The core of these approaches is the use of multiple regression models, applied in order to assess the risk associated with the approval of people applying for certain products (loans, credit cards, etc.). Based on data from the past, these models try to predict what will happen in the future. Different data requires different types of models. This work studies the causal link between the conduct of an applicant upon payment of the loan and the data that he completed at the time of application. A database of 100 borrowers from a commercial bank is used for the purposes of the study. The available data includes information from the time of application and credit history while paying off the loan. Customers are divided into two groups, based on the credit history: Good and Bad payers. Linear and logistic regression are applied in parallel to the data in order to estimate the probability of being good for new borrowers. A variable containing the value 1 for Good borrowers and 0 for Bad ones is modeled as the dependent variable. To decide which of the variables listed in the database should be used in the modelling process (as independent variables), a correlation analysis is made. Based on its results, several combinations of independent variables are tested as initial models, both with linear and logistic regression. The best linear and logistic models are obtained after initial transformation of the data and following a set of standard and robust statistical criteria. A comparative analysis between the two final models is made and scorecards are obtained from both models to assess new customers at the time of application.
A cut-off level of points, below which applications are rejected and above which they are accepted, has been suggested for both models, applying the strategy of keeping the same Accept Rate as in the current data.
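As a rough sketch of the scorecard idea described above (fit a logistic model for the probability of being a Good payer, then choose a cut-off), the following pure-Python example fits a one-variable logistic regression by gradient ascent; the data, the single predictor, and the cut-off rule are purely illustrative, not the bank data used in the study:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit a one-variable logistic model P(good) = sigmoid(b0 + b1*x)
    by plain gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

def prob_good(b0, b1, x):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# Toy data: x = a normalised application variable, y = 1 for a Good
# payer, 0 for a Bad payer.
xs = [0.1, 0.3, 0.4, 0.6, 0.8, 0.9]
ys = [0,   0,   0,   1,   1,   1]
b0, b1 = fit_logistic(xs, ys)

# Accept an applicant only if P(good) clears a cut-off chosen to
# keep the current accept rate (here 50% of the sample).
cutoff = sorted(prob_good(b0, b1, x) for x in xs)[len(xs) // 2]
print(prob_good(b0, b1, 0.9) > cutoff)   # strong applicant accepted
```

In a real scorecard the probability is rescaled to points, but the accept/reject logic is the same threshold comparison.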
Toroody, Ahmad Bahoo; Abaei, Mohammad Mahdy; Gholamnia, Reza
2016-12-01
Risk assessment can be classified into two broad categories: traditional and modern. This paper contrasts the functional resonance analysis method (FRAM), a modern approach, with fault tree analysis (FTA), a traditional method, for assessing the risks of a complex system. The methodology by which the risk assessment is carried out is presented for each approach. The FRAM network is also executed with regard to the nonlinear interaction of human and organizational levels to assess the safety of technological systems. The methodology is implemented for lifting structures in deep offshore operations. The main finding of this paper is that the combined application of FTA and FRAM during risk assessment could provide complementary perspectives and may contribute to a more comprehensive understanding of an incident. Finally, it is shown that coupling a FRAM network with a suitable quantitative method will result in a plausible outcome for a predefined accident scenario.
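The FTA side of the comparison rests on propagating basic-event probabilities through AND/OR gates. A minimal evaluator, assuming independent basic events and an invented lifting-failure tree, might look like this:

```python
# Minimal fault-tree evaluator: a tree is a nested tuple
# ("AND"|"OR", child, child, ...) with floats as basic-event
# probabilities. Assumes independent basic events (the textbook
# FTA simplification).

def top_event_prob(node):
    if isinstance(node, float):
        return node
    gate, *children = node
    ps = [top_event_prob(c) for c in children]
    if gate == "AND":                      # all children must fail
        out = 1.0
        for p in ps:
            out *= p
        return out
    if gate == "OR":                       # any child failing suffices
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError(gate)

# Hypothetical lifting-operation tree: a load drop occurs if the sling
# fails OR both the brake and its backup fail.
tree = ("OR", 0.01, ("AND", 0.05, 0.02))
print(round(top_event_prob(tree), 6))   # 0.01099
```

FRAM, by contrast, has no such closed-form propagation; that asymmetry is part of why the paper treats the two methods as complementary.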
Probabilistic cost-benefit analysis of disaster risk management in a development context.
Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan
2013-07-01
Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.
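The core arithmetic of such a probabilistic CBA (expected annual damages from a loss-exceedance curve, discounted benefits, and a benefit-cost ratio) can be sketched as follows; all probabilities, losses, and the 60% damage-reduction factor are invented for illustration:

```python
# Sketch of a stochastic CBA for a flood-protection measure, assuming
# a discrete loss-exceedance curve (event probability, loss) and a
# fixed damage-reduction factor. All numbers are illustrative.

def expected_annual_damage(events):
    """events: list of (annual exceedance probability, loss)."""
    return sum(p * loss for p, loss in events)

def npv(annual_benefit, cost, rate, years):
    """Net present value of a constant annual benefit minus up-front cost."""
    pv = sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))
    return pv - cost

baseline = [(0.1, 10.0), (0.02, 50.0), (0.005, 200.0)]  # ($m losses)
reduction = 0.6          # intervention removes 60% of damages
cost = 8.0               # up-front cost in $m

benefit = reduction * expected_annual_damage(baseline)
value = npv(benefit, cost, rate=0.05, years=30)
bcr = (value + cost) / cost   # benefit-cost ratio
print(round(bcr, 2))
```

A full analysis of the kind described would sample the loss curve and discount rate from distributions rather than fixing them, yielding a distribution over the benefit-cost ratio.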
FMEA: a model for reducing medical errors.
Chiozza, Maria Laura; Ponzetti, Clemente
2009-06-01
Patient safety is a management issue, in view of the fact that clinical risk management has become an important part of hospital management. Failure Mode and Effect Analysis (FMEA) is a proactive technique for error detection and reduction, firstly introduced within the aerospace industry in the 1960s. Early applications in the health care industry dating back to the 1990s included critical systems in the development and manufacture of drugs and in the prevention of medication errors in hospitals. In 2008, the Technical Committee of the International Organization for Standardization (ISO), licensed a technical specification for medical laboratories suggesting FMEA as a method for prospective risk analysis of high-risk processes. Here we describe the main steps of the FMEA process and review data available on the application of this technique to laboratory medicine. A significant reduction of the risk priority number (RPN) was obtained when applying FMEA to blood cross-matching, to clinical chemistry analytes, as well as to point-of-care testing (POCT).
Morra, P; Lisi, R; Spadoni, G; Maschio, G
2009-06-01
The impact of industrial and civil activities on an agricultural and residential area is presented in a detailed and global analysis. The examined area is the Pace river valley situated in the northern zone of Messina (Italy). The sources of pollution present in the area are: a Municipal Solid Waste Incinerator operating since 1979, a disused urban solid waste landfill which was used for 30 years, an urban solid waste treatment facility with heavy vehicles traffic, and two open pits for the production of bitumen. Large quantities of toxic, carcinogenic substances and criteria pollutants are released into the environment and represent potential hazards to human health. The analysis is performed using the EHHRA-GIS tool which employs an integrated, multimedia, multi-exposure pathways and multi-receptor risk assessment model that is able to manage all the steps which constitute the human health risk analysis in a georeferenced structure. The transport of pollutants in different environmental media is assessed applying models (AERMOD, GMS, CALINE) that take into account the particular three-dimensional morphology of the terrain. The results obtained, combined with a probabilistic risk assessment and a sensitivity analysis of calculation parameters, are a comprehensive assessment of the total human health risk in the area. Finally human health risks caused by toxic and carcinogenic substances are compared with acceptable legal limits in order to support environmental managers' decisions.
Short-term vs. long-term heart rate variability in ischemic cardiomyopathy risk stratification.
Voss, Andreas; Schroeder, Rico; Vallverdú, Montserrat; Schulz, Steffen; Cygankiewicz, Iwona; Vázquez, Rafael; Bayés de Luna, Antoni; Caminal, Pere
2013-01-01
In industrialized countries with aging populations, heart failure affects 0.3-2% of the general population. The investigation of 24 h-ECG recordings revealed the potential of nonlinear indices of heart rate variability (HRV) for enhanced risk stratification in patients with ischemic heart failure (IHF). However, long-term analyses are time-consuming, expensive, and delay the initial diagnosis. The objective of this study was to investigate whether 30 min short-term HRV analysis is sufficient for comparable risk stratification in IHF in comparison to 24 h-HRV analysis. From 256 IHF patients [221 at low risk (IHFLR) and 35 at high risk (IHFHR)], (a) the 24 h beat-to-beat time series, (b) the first 30 min segment, (c) the most stationary 30 min daytime segment, and (d) the most stationary 30 min nighttime segment were investigated. We calculated linear (time and frequency domain) and nonlinear HRV analysis indices. Optimal parameter sets for risk stratification in IHF were determined for 24 h and for each 30 min segment by applying discriminant analysis on significant clinical and non-clinical indices. Long- and short-term HRV indices from the frequency domain and particularly from nonlinear dynamics revealed high univariate significances (p < 0.01) discriminating between IHFLR and IHFHR. For multivariate risk stratification, optimal mixed parameter sets consisting of 5 indices (clinical and nonlinear) achieved 80.4% AUC (area under the curve of receiver operating characteristics) from 24 h HRV analysis, 84.3% AUC from the first 30 min, 82.2% AUC from the daytime 30 min, and 81.7% AUC from the nighttime 30 min. The optimal parameter set obtained from the first 30 min showed nearly the same classification power as the optimal 24 h parameter set. Results from the stationary daytime and nighttime 30 min segments indicate that short-term analyses of 30 min may provide at least comparable risk stratification power in IHF compared to a 24 h analysis period.
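For readers unfamiliar with the linear time-domain indices mentioned above, two standard ones (SDNN and RMSSD) are easy to compute from an RR-interval series; the data below are illustrative, not taken from the study:

```python
import math

# Two standard time-domain HRV indices computed from RR intervals (ms).

def sdnn(rr):
    """Standard deviation of all RR intervals."""
    m = sum(rr) / len(rr)
    return math.sqrt(sum((x - m) ** 2 for x in rr) / len(rr))

def rmssd(rr):
    """Root mean square of successive RR-interval differences."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795, 815]   # illustrative RR series, ms
print(round(sdnn(rr), 2), round(rmssd(rr), 2))
```

The nonlinear indices the study emphasises (e.g. symbolic dynamics, compression entropy) are more involved, but they start from the same beat-to-beat series.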
Wang, Hao; Lau, Nathan; Gerdes, Ryan M
2018-04-01
The aim of this study was to apply work domain analysis for cybersecurity assessment and design of supervisory control and data acquisition (SCADA) systems. Adoption of information and communication technology in cyberphysical systems (CPSs) for critical infrastructures enables automated and distributed control but introduces cybersecurity risk. Many CPSs employ SCADA industrial control systems that have become the target of cyberattacks, which inflict physical damage without use of force. Given that absolute security is not feasible for complex systems, cyberintrusions that introduce unanticipated events will occur; a proper response will in turn require human adaptive ability. Therefore, analysis techniques that can support security assessment and human factors engineering are invaluable for defending CPSs. We conducted work domain analysis using the abstraction hierarchy (AH) to model a generic SCADA implementation to identify the functional structures and means-ends relations. We then adopted a case study approach examining the Stuxnet cyberattack by developing and integrating AHs for the uranium enrichment process, SCADA implementation, and malware to investigate the interactions between the three aspects of cybersecurity in CPSs. The AHs for modeling a generic SCADA implementation and studying the Stuxnet cyberattack are useful for mapping attack vectors, identifying deficiencies in security processes and features, and evaluating proposed security solutions with respect to system objectives. Work domain analysis is an effective analytical method for studying cybersecurity of CPSs for critical infrastructures in a psychologically relevant manner. Work domain analysis should be applied to assess cybersecurity risk and inform engineering and user interface design.
2011-01-01
Background The analysis of risk for the population residing and/or working in contaminated areas raises the topic of commuting. In fact, especially in contaminated areas, commuting groups are likely to be subject to lower exposure than residents. Only very recently environmental epidemiology has started considering the role of commuting as a differential source of exposure in contaminated areas. In order to improve the categorization of groups, this paper applies a gravitational model to the analysis of residential risk for workers in the Gela petrochemical complex, which began life in the early 60s in the municipality of Gela (Sicily, Italy) and is the main source of industrial pollution in the local area. Results A logistic regression model is implemented to measure the capacity of Gela "central location" to attract commuting flows from other sites. Drawing from gravity models, the proposed methodology: a) defines the probability of finding commuters from municipalities outside Gela as a function of the origin's "economic mass" and of its distance from each destination; b) establishes "commuting thresholds" relative to the origin's mass. The analysis includes 367 out of the 390 Sicilian municipalities. Results are applied to define "commuters" and "residents" within the cohort of petrochemical workers. The study population is composed of 5,627 workers. Different categories of residence in Gela are compared calculating Mortality Rate Ratios for lung cancer through a Poisson regression model, controlling for age and calendar period. The mobility model correctly classifies almost 90% of observations. Its application to the mortality analysis confirms a major risk for lung cancer associated with residence in Gela. Conclusions Commuting is a critical aspect of the health-environment relationship in contaminated areas. 
The proposed methodology can be replicated to different contexts when residential information is lacking or unreliable; however, a careful consideration of the territorial characteristics ("insularity" and its impact on transportation time and costs, in our case) is suggested when specifying the area of application for the mobility analysis. PMID:21272299
Montorselli, Niccolò Brachetti; Lombardini, Carolina; Magagnotti, Natascia; Marchi, Enrico; Neri, Francesco; Picchi, Gianni; Spinelli, Raffaele
2010-11-01
The study compared the performance of four different logging crews with respect to productivity, organization and safety. To this purpose, the authors developed a data collection method capable of providing a quantitative analysis of risk-taking behavior. Four crews were tested under the same working conditions, representative of close-to-nature alpine forestry. Motor-manual working methods were applied, since these methods are still prevalent in the specific study area, despite the growing popularity of mechanical processors. Crews from public companies showed a significantly lower frequency of risk-taking behavior. The best safety performance was offered by the only (public) crew that had received formal safety training. The study seems to refute the common prejudice that safety practice is inversely proportional to productivity. Instead, productivity is increased by introducing more efficient working methods and equipment. The quantitative analysis of risk-taking behavior developed in this study can be applied to a number of industrial fields besides forestry. Characterizing risk-taking behavior for a given case may eventually lead to the development of custom-made training programmes, which can address problem areas while avoiding weakening the message with redundant information. In the specific case of logging crews in the central Alps, the study suggests that current training courses may be weak on ergonomics, and advocates a staged training programme, focusing first on accident reduction and then expanding to the prevention of chronic illness. 2010 Elsevier Ltd. All rights reserved.
Draicchio, F; Silvetti, A; Badellino, E; Vinci, F
2007-01-01
There is little in the literature about the risks of manual handling of material in supermarkets, and what there is refers solely to storehouse work. This contrasts with the substantial number of studies of the risk of repeated arm movements among supermarket cash-desk staff. The scarcity of information is partly due to the difficulties of applying widely employed, standardized evaluation methods in this sector. One of the conditions limiting the application of the NIOSH protocol in this retail sector is that lifting tasks are so often closely tied to transport. The biomechanical analysis method we used brought to light considerable risks in many of the steps investigated: unpacking the pallet, unloading the crates from the pallet to the ground, lifting them from the floor onto display stands, and filling the boxes on the stands with goods before the shop opens. Images acquired on site were analyzed in the laboratory. We selected the most indicative images, which were then studied as regards posture and biomechanics using Apalys 3.0 software (ILMCAD GmbH, Ilmenau, Germany). Biomechanical analysis was done on the following movements: unloading crates from the pallet, positioning them on fruit and vegetable department display stands, and filling the boxes on the stands. We obtained a prediction of 2720 to 5472 N for the load at the lumbosacral junction (L5-S1). Simulation of the NIOSH index gave a value of 2.69 in the only case where the Waters protocol could be applied.
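The Waters (revised NIOSH) protocol mentioned above combines a load constant with six multipliers. A sketch of the metric form, assuming the standard multiplier formulas and treating the frequency and coupling multipliers as given table values, is:

```python
# Revised NIOSH lifting equation (metric form), a sketch assuming the
# standard multiplier formulas; FM and CM normally come from lookup
# tables and are passed in directly here. Task values are illustrative.

def rwl(H, V, D, A, FM, CM, LC=23.0):
    """Recommended weight limit in kg."""
    HM = 25.0 / H                   # horizontal multiplier, H in cm
    VM = 1.0 - 0.003 * abs(V - 75)  # vertical multiplier, V in cm
    DM = 0.82 + 4.5 / D             # distance multiplier, D in cm
    AM = 1.0 - 0.0032 * A           # asymmetry multiplier, A in degrees
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, *args):
    """LI = actual load / recommended weight limit."""
    return load_kg / rwl(*args)

# Illustrative crate lift: 40 cm horizontal reach, origin at 30 cm,
# 50 cm vertical travel, no twisting; FM=0.85, CM=0.90 assumed.
li = lifting_index(12.0, 40, 30, 50, 0, 0.85, 0.90)
print(li > 1.0)   # LI above 1 indicates elevated low-back injury risk
```

An index of 2.69, as reported in the abstract, would correspond to a load well over twice the recommended weight limit for that task.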
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.; Nekrasova, A.
2017-12-01
We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE is an empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum credible magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g. peak ground acceleration, PGA, or macro-seismic intensity). After rigorous testing against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructures (e.g., those based on a census of population, buildings inventory, etc.). This USLE-based methodology of seismic hazard and risk assessment is applied to the territory of the Altai-Sayan region of Russia. This study was supported by the Russian Science Foundation, Grant No. 15-17-30020.
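The USLE relationship quoted above translates directly into code; the coefficients below are illustrative, not the regional estimates from the study:

```python
import math

# The USLE relationship from the abstract:
#   log10 N(M, L) = A + B*(5 - M) + C*log10(L)
# Coefficients A, B, C below are illustrative placeholders.

def annual_rate(A, B, C, M, L):
    """Expected annual number of earthquakes of magnitude M
    within a seismically prone area of linear dimension L (km)."""
    return 10.0 ** (A + B * (5.0 - M) + C * math.log10(L))

A, B, C = -1.0, 0.9, 1.0
print(annual_rate(A, B, C, M=5.0, L=100.0))  # 10.0 events/year
# One unit of magnitude costs a factor of 10**B in frequency:
print(annual_rate(A, B, C, M=6.0, L=100.0))
```

With C = 1 the rate scales linearly with the region's linear dimension; fractal source distributions typically give non-integer C, which is exactly what USLE adds over plain Gutenberg-Richter.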
Flood Hazard Mapping by Applying Fuzzy TOPSIS Method
NASA Astrophysics Data System (ADS)
Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.
2017-12-01
There are many technical methods for integrating various factors in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi-Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are taken as the criteria, and the spatial element units to which they apply are treated as the alternatives. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty, since the simulation results vary with the flood scenario and topographical conditions. This ambiguity in the indices can propagate into the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which can handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas assigned the highest hazard grade in the resulting integrated flood hazard map and compared them with those indicated in the existing flood risk maps. We also expect that applying the flood hazard mapping methodology suggested in this paper to the production of current flood risk maps would yield new flood hazard maps that consider the priorities of hazard areas and carry more varied and important information than before.
Keywords: flood hazard map; levee breach analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
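The crisp core of the TOPSIS scheme described above (normalise, weight, measure distance to the ideal and anti-ideal points, rank by closeness) can be sketched as follows; the three-unit matrix and weights are invented for illustration, and the fuzzy extension is omitted:

```python
import math

# Minimal crisp TOPSIS sketch for flood hazard ranking. Rows are
# element units (alternatives); columns are max depth, max velocity,
# max travel time. Depth and velocity increase hazard; travel time
# decreases it (more warning time). All numbers are illustrative.

matrix = [
    [2.0, 1.5, 30.0],   # unit A
    [0.5, 0.4, 90.0],   # unit B
    [1.2, 1.0, 60.0],   # unit C
]
weights = [0.4, 0.4, 0.2]
benefit = [True, True, False]   # True: larger value = more hazardous

# 1. Vector-normalise and weight each column.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(3)]
V = [[weights[j] * row[j] / norms[j] for j in range(3)] for row in matrix]

# 2. Ideal (most hazardous) and anti-ideal points per column.
ideal = [max(col) if benefit[j] else min(col)
         for j, col in enumerate(zip(*V))]
worst = [min(col) if benefit[j] else max(col)
         for j, col in enumerate(zip(*V))]

# 3. Closeness coefficient: 1 = worst hazard, 0 = least hazard.
def closeness(row):
    dp = math.sqrt(sum((v - i) ** 2 for v, i in zip(row, ideal)))
    dm = math.sqrt(sum((v - w) ** 2 for v, w in zip(row, worst)))
    return dm / (dp + dm)

scores = [closeness(row) for row in V]
ranking = sorted(range(3), key=lambda i: -scores[i])
print(ranking[0])   # unit A: deepest, fastest, least warning time
```

The fuzzy variant replaces each crisp matrix entry with a fuzzy number (e.g. a triangular spread over the scenario results) and uses fuzzy distances, but the ranking logic is the same.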
Linear regression analysis: part 14 of a series on evaluation of scientific publications.
Schneider, Astrid; Hommel, Gerhard; Blettner, Maria
2010-11-01
Regression analysis is an important statistical method for the analysis of medical data. It enables the identification and characterization of relationships among multiple factors. It also enables the identification of prognostically relevant risk factors and the calculation of risk scores for individual prognostication. This article is based on selected textbooks of statistics, a selective review of the literature, and our own experience. After a brief introduction of the uni- and multivariable regression models, illustrative examples are given to explain what the important considerations are before a regression analysis is performed, and how the results should be interpreted. The reader should then be able to judge whether the method has been used correctly and interpret the results appropriately. The performance and interpretation of linear regression analysis are subject to a variety of pitfalls, which are discussed here in detail. The reader is made aware of common errors of interpretation through practical examples. Both the opportunities for applying linear regression analysis and its limitations are presented.
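As a minimal worked example of the univariable case the article starts from, ordinary least squares for one predictor reduces to two sums; the data are illustrative:

```python
# Ordinary least squares for a single predictor, the building block
# the article reviews. Data are illustrative (roughly y = 2x).

def ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx                 # Sxy / Sxx
    return my - slope * mx, slope     # intercept, slope

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.1, 7.9]
intercept, slope = ols(xs, ys)
print(round(slope, 2), round(intercept, 2))   # 1.96 0.1
```

One of the pitfalls the article warns about is visible even here: the fitted line says nothing about causality, and extrapolating beyond the observed x-range is unjustified.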
An ounce of prevention or a pound of cure: bioeconomic risk analysis of invasive species.
Leung, Brian; Lodge, David M; Finnoff, David; Shogren, Jason F; Lewis, Mark A; Lamberti, Gary
2002-12-07
Numbers of non-indigenous species (species introduced from elsewhere) are increasing rapidly worldwide, causing both environmental and economic damage. Rigorous quantitative risk-analysis frameworks for invasive species, however, are lacking. We need to evaluate the risks posed by invasive species and quantify the relative merits of different management strategies (e.g. allocation of resources between prevention and control). We present a quantitative bioeconomic modelling framework to analyse risks from non-indigenous species to economic activity and the environment. The model identifies the optimal allocation of resources to prevention versus control, acceptable invasion risks, and consequences of invasion for optimal investments (e.g. labour and capital). We apply the model to zebra mussels (Dreissena polymorpha), and show that society could benefit by spending up to US$324 000 year(-1) to prevent invasions into a single lake with a power plant. By contrast, the US Fish and Wildlife Service spent US$825 000 in 2001 to manage all aquatic invaders in all US lakes. Thus, greater investment in prevention is warranted.
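The prevention-versus-control trade-off has a simple closed form in a toy version of such a bioeconomic model, assuming invasion probability decays exponentially with prevention spending; all parameter values are made up for illustration and are not those of the paper:

```python
import math

# Toy prevention-vs-control trade-off: annual invasion probability
# falls exponentially with prevention spending x, and an invasion
# costs D. Minimise total expected cost  x + p0*exp(-k*x)*D.
# Setting the derivative to zero gives x* = ln(k*p0*D)/k.

def optimal_prevention(p0, k, D):
    """Closed-form minimiser; 0 if prevention never pays."""
    x = math.log(k * p0 * D) / k
    return max(0.0, x)

p0 = 0.4        # baseline annual invasion probability
k = 0.01        # effectiveness of each $1k of prevention spending
D = 5000.0      # damage from an invasion ($k)

x_star = optimal_prevention(p0, k, D)
total = x_star + p0 * math.exp(-k * x_star) * D
print(round(x_star, 1), round(total, 1))
```

Even in this caricature, spending a substantial amount on prevention beats spending nothing (total expected cost p0*D), which is the qualitative conclusion the zebra-mussel analysis reaches with far richer dynamics.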
NASA Astrophysics Data System (ADS)
Roshan, E.; Mohammadi Khabbazan, M.; Held, H.
2016-12-01
Solar radiation management (SRM) might be able to reduce the anthropogenic global mean temperature rise, but not the changes in other climate variables such as precipitation, particularly with respect to regional disparities due to changes in the planetary energy budget. We apply cost-risk analysis (CRA), a decision-analytic framework that trades off the expected welfare loss from climate policy costs against the climate risks of exceeding an environmental target. Here, in both global- and `Giorgi'-regional-scale analyses, we study the optimal mix of SRM and mitigation under probabilistic knowledge about climate sensitivity, ranging in our numerics from 1.01°C to 7.17°C. To do so, we generalize CRA to include temperature risk and global and regional precipitation risks. Social welfare is maximized in three scenarios, considering a convex combination of climate risks: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. Our global results show 100%, 65%, and 90% compliance with the 2°C temperature target and simultaneously 0%, 100%, and 100% compliance with the 2°C-compatible precipitation corridor in the temperature-risk-only, precipitation-risk-only, and both-risks scenarios, respectively. Our regional results emphasize that SRM would keep the global mean temperature within the 2°C target for about 100%, 95%, and 95% of climate sensitivities in the temperature-risk-only, precipitation-risk-only, and both-risks scenarios, respectively. However, half of the regions suffer very high precipitation risk when society cares only about global temperature reduction, as in the temperature-risk-only scenario. Our results indicate that although SRM might almost substitute for mitigation in the global analysis, it saves only about half of the welfare loss of a purely mitigation-based analysis (from economic costs and climate risks, in terms of BGE) when regional precipitation risks are considered.
Genetic variation associated with cardiovascular risk in autoimmune diseases
Perrotti, Pedro P.; Aterido, Adrià; Fernández-Nebro, Antonio; Cañete, Juan D.; Ferrándiz, Carlos; Tornero, Jesús; Gisbert, Javier P.; Domènech, Eugeni; Fernández-Gutiérrez, Benjamín; Gomollón, Fernando; García-Planella, Esther; Fernández, Emilia; Sanmartí, Raimon; Gratacós, Jordi; Martínez-Taboada, Víctor Manuel; Rodríguez-Rodríguez, Luís; Palau, Núria; Tortosa, Raül; Corbeto, Mireia L.; Lasanta, María L.; Marsal, Sara; Julià, Antonio
2017-01-01
Autoimmune diseases have a higher prevalence of cardiovascular events compared to the general population. The objective of this study was to investigate the genetic basis of cardiovascular disease (CVD) risk in autoimmunity. We analyzed genome-wide genotyping data from 6,485 patients from six autoimmune diseases that are associated with a high socio-economic impact. First, for each disease, we tested the association of established CVD risk loci. Second, we analyzed the association of autoimmune disease susceptibility loci with CVD. Finally, to identify genetic patterns associated with CVD risk, we applied the cross-phenotype meta-analysis approach (CPMA) on the genome-wide data. A total of 17 established CVD risk loci were significantly associated with CVD in the autoimmune patient cohorts. From these, four loci were found to have significantly different genetic effects across autoimmune diseases. Six autoimmune susceptibility loci were also found to be associated with CVD risk. Genome-wide CPMA analysis identified 10 genetic clusters strongly associated with CVD risk across all autoimmune diseases. Two of these clusters are highly enriched in pathways previously associated with autoimmune disease etiology (TNFα and IFNγ cytokine pathways). The results of this study support the presence of specific genetic variation associated with the increase of CVD risk observed in autoimmunity. PMID:28982122
Evaluating the operational risks of biomedical waste using failure mode and effects analysis.
Chen, Ying-Chu; Tsai, Pei-Yi
2017-06-01
The potential problems and risks of biomedical waste generation have become increasingly apparent in recent years. This study applied a failure mode and effects analysis to evaluate the operational problems and risks of biomedical waste. The microbiological contamination of biomedical waste seldom receives the attention of researchers. In this study, the biomedical waste lifecycle was divided into seven processes: Production, classification, packaging, sterilisation, weighing, storage, and transportation. Twenty main failure modes were identified in these phases and risks were assessed based on their risk priority numbers. The failure modes in the production phase accounted for the highest proportion of the risk priority number score (27.7%). In the packaging phase, the failure mode 'sharp articles not placed in solid containers' had the highest risk priority number score, mainly owing to its high severity rating. The sterilisation process is the main difference in the treatment of infectious and non-infectious biomedical waste. The failure modes in the sterilisation phase were mainly owing to human factors (mostly related to operators). This study increases the understanding of the potential problems and risks associated with biomedical waste, thereby increasing awareness of how to improve the management of biomedical waste to better protect workers, the public, and the environment.
A comparative critical study between FMEA and FTA risk analysis methods
NASA Astrophysics Data System (ADS)
Cristea, G.; Constantinescu, DM
2017-10-01
An overwhelming number of different risk analysis techniques is in use today, with acronyms such as: FMEA (Failure Modes and Effects Analysis) and its extension FMECA (Failure Mode, Effects, and Criticality Analysis), DRBFM (Design Review by Failure Mode), FTA (Fault Tree Analysis) and its extension ETA (Event Tree Analysis), HAZOP (Hazard & Operability Studies), HACCP (Hazard Analysis and Critical Control Points), and What-if/Checklist. However, the most used analysis techniques in the mechanical and electrical industry are FMEA and FTA. In FMEA, which is an inductive method, information about the consequences and effects of failures is usually collected through interviews with experienced people with different knowledge, i.e., cross-functional groups. FMEA is used to capture potential failures/risks and impacts and prioritize them on a numeric scale called the Risk Priority Number (RPN), which ranges from 1 to 1000. FTA is a deductive method, i.e., a general system state is decomposed into chains of more basic events of components. The logical interrelationship of how such basic events depend on and affect each other is often described analytically in a reliability structure which can be visualized as a tree. Both methods are very time-consuming to apply thoroughly, which is why this is often not done. As a consequence, possible failure modes may not be identified. To address these shortcomings, it is proposed to use a combination of FTA and FMEA.
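The RPN prioritisation described for FMEA is simple to demonstrate; the failure modes and ratings below are hypothetical:

```python
# RPN prioritisation as described for FMEA: severity, occurrence and
# detection are each rated 1-10, and RPN = S*O*D lies in 1..1000.
# The failure modes and ratings are hypothetical examples.

failure_modes = [
    # (name, severity, occurrence, detection)
    ("connector corrosion",    7, 4, 6),
    ("software watchdog miss", 9, 2, 3),
    ("seal fatigue",           8, 5, 7),
]

def rpn(s, o, d):
    return s * o * d

# Highest RPN first: these failure modes get mitigation priority.
ranked = sorted(failure_modes, key=lambda m: -rpn(*m[1:]))
for name, s, o, d in ranked:
    print(f"{name}: RPN={rpn(s, o, d)}")
```

Note the well-known caveat the multiplication hides: a severity-10 mode with low occurrence can score below a trivial but frequent one, which is one reason the paper argues for combining FMEA with FTA rather than relying on RPN alone.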
Space Transportation System Liftoff Debris Mitigation Process Overview
NASA Technical Reports Server (NTRS)
Mitchell, Michael; Riley, Christopher
2011-01-01
Liftoff debris is a top risk to the Space Shuttle Vehicle. To manage the liftoff debris risk, the Space Shuttle Program created a team within the Propulsion Systems Engineering & Integration Office. The Shuttle Liftoff Debris Team harnesses the Systems Engineering process to identify, assess, mitigate, and communicate the liftoff debris risk. The Liftoff Debris Team leverages the technical knowledge and expertise of engineering groups across multiple NASA centers to integrate total system solutions. These solutions connect the hardware and analyses to identify and characterize debris sources and zones contributing to the liftoff debris risk. The solutions incorporate analyses spanning: the definition and modeling of natural and induced environments; material characterizations; statistical trending analyses; imagery-based trajectory analyses; debris transport analyses; and risk assessments. The verification and validation of these analyses are bound by conservative assumptions and anchored by testing and flight data. The liftoff debris risk mitigation is managed through vigilant collaborative work between the Liftoff Debris Team and Launch Pad Operations personnel and through the management of requirements, interfaces, risk documentation, configurations, and technical data. Furthermore, on the day of launch, decision analysis is used to apply the wealth of analyses to case-specific identified risks. This presentation describes how the Liftoff Debris Team applies Systems Engineering in their processes to mitigate risk and improve the safety of the Space Shuttle Vehicle.
Pfleiderer, Michael; Wichmann, Ole
2015-03-01
Vaccines are among the most effective preventive measures in modern medicine and have led to a dramatic decline and, for a few diseases, even to the elimination of severe infectious diseases. There are some particularities to the risk-benefit assessment of vaccines compared with that of therapeutic drugs. These include the fact that vaccines are applied to healthy individuals with the aim of preventing an infectious disease, while therapeutic drugs are administered to sick people to cure them of an already acquired disease. The acceptable level of risk associated with the application of a vaccine is therefore much lower. In addition, high vaccination coverage can lead to population-level effects (e.g., the indirect protection of unvaccinated individuals) that can confer additional benefits to the population overall. When a marketing authorization application (MAA) for a novel vaccine is evaluated, conclusions are made regarding its quality, safety, and efficacy, and a benefit-risk assessment is carried out accordingly. In contrast, when deciding on the introduction of a new vaccine into a national immunization program or on a recommendation for a specific risk group, the focus is shifted to considerations of how a licensed vaccine can best be used in a population (e.g., which immunization strategy is most effective in preventing deaths or hospitalizations, or in reducing treatment costs for the health care system). Stringent assessment criteria have been developed that require a robust safety analysis before a new vaccine is administered to humans for the first time in pre-licensure studies. Similarly, criteria are applied for calculating the benefit-risk ratio at the time of licensure of a new vaccine and during the entire post-licensure period.
However, when deciding if and how a licensed vaccine can best be integrated into an existing immunization program, additional criteria are applied that are different from, yet complementary to, those applied for granting a marketing authorization. These decisions require, in addition to considerations of vaccine quality, efficacy, and safety, conclusions regarding population-level effects combined with an integrative analysis of the local context (e.g., local epidemiology, cost-effectiveness, and acceptance by the population). To serve these objectives, national authorities such as the Standing Committee on Vaccination (STIKO) in Germany have been established to integrate globally developed vaccines into the national context of immunization strategies.
Advancements in Risk-Informed Performance-Based Asset Management for Commercial Nuclear Power Plants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liming, James K.; Ravindra, Mayasandra K.
2006-07-01
Over the past several years, ABSG Consulting Inc. (ABS Consulting) and the South Texas Project Nuclear Operating Company (STPNOC) have developed a decision support process and associated software for risk-informed, performance-based asset management (RIPBAM) of nuclear power plant facilities. RIPBAM applies probabilistic risk assessment (PRA) tools and techniques in the realm of plant physical and financial asset management. The RIPBAM process applies a tiered set of models and supporting performance measures (or metrics) that can ultimately be applied to support decisions affecting the allocation and management of plant resources (e.g., funding, staffing, scheduling, etc.). In general, the ultimate goal of the RIPBAM process is to continually support decision-making to maximize a facility's net present value (NPV) and long-term profitability for its owners. While the initial applications of RIPBAM have been for nuclear power stations, the methodology can easily be adapted to decision-making support for other types of power stations or complex facilities. RIPBAM can also be designed to focus on performance metrics other than NPV and profitability (e.g., mission reliability, operational availability, probability of mission success per dollar invested, etc.). Recent advancements in the RIPBAM process focus on expanding the scope of previous RIPBAM applications to include not only operations, maintenance, and safety issues, but also broader risk perception components affecting plant owner (stockholder), operator, and regulator biases. Conceptually, RIPBAM is a comprehensive risk-informed cash flow model for decision support. It originated as a tool to help manage plant refueling outage scheduling, and was later expanded to include the full spectrum of operations and maintenance decision support.
However, it differs from conventional business modeling tools in that it employs a systems engineering approach with broadly based probabilistic analysis of organizational 'value streams'. The scope of value stream inclusion in the process can be established by the user, but in its broadest applications, RIPBAM can be used to address how the risk perceptions of plant owners and regulators are affected by plant performance. Plant staffs can expand and refine the scope of RIPBAM models via a phased program of activities over time. This paper shows how the multi-metric uncertainty analysis feature of RIPBAM can apply a wide spectrum of decision-influencing factors to support decisions designed to maximize the probability of achieving, maintaining, and improving upon plant goals and objectives. In this paper, the authors show how this approach can be extremely valuable to plant owners and operators in supporting plant value-impacting decision-making processes. (authors)
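The abstract's core metric, risk-weighted net present value, can be sketched in a few lines. This is a minimal illustration of probability-weighted NPV over risk scenarios, not the actual RIPBAM software; the cash flows, scenario probabilities, and discount rate below are made-up values.

```python
# Minimal sketch of a risk-weighted net present value (NPV) calculation,
# the quantity the RIPBAM process seeks to maximize. All numbers below
# are illustrative assumptions, not taken from the RIPBAM tool.

def npv(cash_flows, rate):
    """Discount a series of yearly cash flows (year 0 first) at `rate`."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def expected_npv(scenarios, rate):
    """Probability-weighted NPV over scenarios: [(prob, cash_flows), ...]."""
    return sum(p * npv(cfs, rate) for p, cfs in scenarios)

# Two hypothetical scenarios: planned performance vs. an extended outage
# that turns the year-1 cash flow negative.
scenarios = [
    (0.95, [-100.0, 40.0, 40.0, 40.0]),   # planned performance
    (0.05, [-100.0, -10.0, 40.0, 40.0]),  # outage reduces year-1 cash flow
]
print(round(expected_npv(scenarios, 0.08), 2))
```

Comparing the expected NPV of alternative resource allocations (e.g., with and without an extra maintenance investment that lowers the outage probability) is the kind of decision support the abstract describes.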
Hu, Yuan Yuan; Yuan, Hua; Jiang, Guang Bing; Chen, Ning; Wen, Li; Leng, Wei Dong; Zeng, Xian Tao; Niu, Yu Ming
2012-01-01
Background To investigate the association between XPD Asp312Asn polymorphism and head and neck cancer risk through this meta-analysis. Methods We performed a meta-analysis of 9 published case-control studies including 2,670 patients with head and neck cancer and 4,452 controls. An odds ratio (OR) with a 95% confidence interval (CI) was applied to assess the association between XPD Asp312Asn polymorphism and head and neck cancer risk. Results Overall, no significant association between XPD Asp312Asn polymorphism and head and neck cancer risk was found in this meta-analysis (Asn/Asn vs. Asp/Asp: OR = 0.95, 95%CI = 0.80–1.13, P = 0.550, P heterogeneity = 0.126; Asp/Asn vs. Asp/Asp: OR = 1.11, 95%CI = 0.99–1.24, P = 0.065, P heterogeneity = 0.663; Asn/Asn+Asp/Asn vs. Asp/Asp: OR = 1.07, 95%CI = 0.97–1.19, P = 0.189, P heterogeneity = 0.627; Asn/Asn vs. Asp/Asp+Asp/Asn: OR = 0.87, 95%CI = 0.68–1.10, P = 0.243, P heterogeneity = 0.089). In the subgroup analysis by HWE, ethnicity, and study design, there was still no significant association detected in all genetic models. Conclusions This meta-analysis demonstrates that XPD Asp312Asn polymorphism may not be a risk factor for developing head and neck cancer. PMID:22536360
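The pooled odds ratios with 95% confidence intervals reported above are typically produced by inverse-variance weighting of per-study log odds ratios. The sketch below shows the fixed-effect version of that machinery; the per-study ORs and CIs are invented for illustration and are not the studies from this meta-analysis.

```python
import math

# Illustrative fixed-effect inverse-variance pooling of odds ratios,
# the standard mechanism behind pooled ORs with 95% CIs in
# meta-analyses. The per-study inputs below are made-up examples.

def pooled_or(studies):
    """studies: list of (or_hat, ci_low, ci_high) tuples with 95% CIs."""
    weights, log_ors = [], []
    for or_hat, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE backed out of the CI
        weights.append(1.0 / se ** 2)                    # inverse-variance weight
        log_ors.append(math.log(or_hat))
    w_sum = sum(weights)
    pooled = sum(w * l for w, l in zip(weights, log_ors)) / w_sum
    se_pooled = math.sqrt(1.0 / w_sum)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

or_hat, lo, hi = pooled_or([(0.90, 0.70, 1.16), (1.05, 0.85, 1.30)])
print(f"OR = {or_hat:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A random-effects model (e.g., DerSimonian-Laird) would additionally inflate the weights' denominators by a between-study variance term when heterogeneity is present.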
Smoking increases the risk of diabetic foot amputation: A meta-analysis.
Liu, Min; Zhang, Wei; Yan, Zhaoli; Yuan, Xiangzhen
2018-02-01
Accumulating evidence suggests that smoking is associated with diabetic foot amputation. However, the currently available results are inconsistent and controversial. Therefore, the present study performed a meta-analysis to systematically review the association between smoking and diabetic foot amputation and to investigate the risk factors of diabetic foot amputation. Public databases, including PubMed and Embase, were searched prior to 29th February 2016. Heterogeneity was assessed using Cochran's Q statistic and the I² statistic, and odds ratios (OR) and 95% confidence intervals (CI) were calculated and pooled appropriately. Sensitivity analysis was performed to evaluate the stability of the results. In addition, Egger's test was applied to assess any potential publication bias. Based on this search, a total of eight studies, including five cohort studies and three case-control studies, were included. The data indicated that smoking significantly increased the risk of diabetic foot amputation (OR=1.65; 95% CI, 1.09-2.50; P<0.0001) compared with non-smoking. Sensitivity analysis demonstrated that the pooled estimate did not vary substantially following the exclusion of any one study. Additionally, there was no evidence of publication bias (Egger's test, t=0.1378; P=0.8958). Furthermore, no significant difference was observed between the minor and major amputation groups in patients who smoked (OR=0.79; 95% CI, 0.24-2.58). The results of the present meta-analysis suggested that smoking is a notable risk factor for diabetic foot amputation. Smoking cessation appears to reduce the risk of diabetic foot amputation.
Interpersonal Relations: A Choice-Theoretic Framework.
ERIC Educational Resources Information Center
Couvillion, L. Michael; Eckstein, Daniel G.
The microeconomic theory relating to utility and cost is applied to the "risk," and the possible "payoff" relative to relationships with others. A good measure of utility is the need or want-satisfying power of an alternative. For the analysis of interpersonal relationships, the needs delineated by Maslow (i.e. food, shelter, belongingness, love,…
Analyzing seasonal patterns of wildfire exposure factors in Sardinia, Italy
Michele Salis; Alan A. Ager; Fermin J. Alcasena; Bachisio Arca; Mark A. Finney; Grazia Pellizzaro; Donatella Spano
2015-01-01
In this paper, we applied landscape-scale wildfire simulation modeling to explore the spatiotemporal patterns of wildfire likelihood and intensity on the island of Sardinia (Italy). We also performed wildfire exposure analysis for selected highly valued resources on the island to identify areas characterized by high risk. We observed substantial variation in burn...
Automating Risk Analysis of Software Design Models
Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P.
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688
NASA Astrophysics Data System (ADS)
Zhou, Qianqian; Panduro, Toke Emil; Thorsen, Bo Jellesmark; Arnbjerg-Nielsen, Karsten
2013-03-01
This paper presents a cross-disciplinary framework for assessment of climate change adaptation to increased precipitation extremes considering pluvial flood risk as well as additional environmental services provided by some of the adaptation options. The ability of adaptation alternatives to cope with extreme rainfalls is evaluated using a quantitative flood risk approach based on urban inundation modeling and socio-economic analysis of corresponding costs and benefits. A hedonic valuation model is applied to capture the local economic gains or losses from more water bodies in green areas. The framework was applied to the northern part of the city of Aarhus, Denmark. We investigated four adaptation strategies that encompassed laissez-faire, larger sewer pipes, local infiltration units, and open drainage system in the urban green structure. We found that when taking into account environmental amenity effects, an integration of open drainage basins in urban recreational areas is likely the best adaptation strategy, followed by pipe enlargement and local infiltration strategies. All three were improvements compared to the fourth strategy of no measures taken.
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
Reliability considerations for the total strain range version of strainrange partitioning
NASA Technical Reports Server (NTRS)
Wirsching, P. H.; Wu, Y. T.
1984-01-01
A proposed total strainrange version of strainrange partitioning (SRP) to enhance the manner in which SRP is applied to life prediction is considered, with emphasis on how advanced reliability technology can be applied to perform risk analysis and to derive safety check expressions. Uncertainties existing in the design factors associated with life prediction of a component which experiences the combined effects of creep and fatigue can be identified. Examples illustrate how reliability analyses of such a component can be performed when all design factors in the SRP model are random variables reflecting these uncertainties. The Rackwitz-Fiessler and Wu algorithms are used, and estimates of the safety index and the probability of failure are demonstrated for a SRP problem. Methods of analysis of creep-fatigue data, with emphasis on procedures for producing synoptic statistics, are presented. The importance of the contribution of the uncertainties associated with small sample sizes (fatigue data) to risk estimates is also discussed. The procedure for deriving a safety check expression for possible use in a design criteria document is presented.
Yang, Meng; Qian, Xin; Zhang, Yuchao; Sheng, Jinbao; Shen, Dengle; Ge, Yi
2011-01-01
Approximately 30,000 dams in China are aging and are considered to be high-level risks. Developing a framework for analyzing spatial multicriteria flood risk is crucial to ranking management scenarios for these dams, especially in densely populated areas. Based on the theories of spatial multicriteria decision analysis, this report generalizes a framework consisting of scenario definition, problem structuring, criteria construction, spatial quantification of criteria, criteria weighting, decision rules, sensitivity analyses, and scenario appraisal. The framework is presented in detail by using a case study to rank dam rehabilitation, decommissioning and existing-condition scenarios. The results show that there was a serious inundation, and that a dam rehabilitation scenario could reduce the multicriteria flood risk by 0.25 in the most affected areas; this indicates a mean risk decrease of less than 23%. Although increased risk (<0.20) was found for some residential and commercial buildings, if the dam were to be decommissioned, the mean risk would not be greater than the current existing risk, indicating that the dam rehabilitation scenario had a higher rank for decreasing the flood risk than the decommissioning scenario, but that dam rehabilitation alone might be of little help in abating flood risk. With adjustments and improvement to the specific methods (according to the circumstances and available data) this framework may be applied to other sites. PMID:21655125
Caous, Cristofer André; Machado, Birajara; Hors, Cora; Zeh, Andrea Kaufmann; Dias, Cleber Gustavo; Amaro Junior, Edson
2012-01-01
To propose a measure (index) of expected risks to evaluate and follow up the performance analysis of research projects involving financial and structural parameters for their development. A ranking of acceptable results regarding research projects with complex variables was used as an index to gauge project performance. In order to implement this method, the ulcer index was applied as the basic model to accommodate the following variables: costs, high-impact publication, fundraising, and patent registry. The proposed structured analysis, named here RoSI (Return on Scientific Investment), comprises a pipeline of analyses to characterize risk, based on a modeling tool that handles multiple variables interacting in semi-quantitative environments. This method was tested with data from three different projects in our institution (projects A, B, and C). Different curves reflected the ulcer indexes, identifying the project with the lowest risk (project C) related to development and expected results according to initial or full investment. The results showed that this model contributes significantly to the analysis of risk and planning, as well as to the definition of necessary investments that consider contingency actions, with benefits to the different stakeholders: the investor or donor, the project manager, and the researchers.
Chevance, Aurélie; Schuster, Tibor; Steele, Russell; Ternès, Nils; Platt, Robert W
2015-10-01
Robustness of an existing meta-analysis can justify decisions on whether to conduct an additional study addressing the same research question. We illustrate the graphical assessment of the potential impact of an additional study on an existing meta-analysis using published data on statin use and the risk of acute kidney injury. A previously proposed graphical augmentation approach is used to assess the sensitivity of the current test and heterogeneity statistics extracted from existing meta-analysis data. In addition, we extended the graphical augmentation approach to assess potential changes in the pooled effect estimate after updating a current meta-analysis, and applied the three graphical contour definitions to data from meta-analyses on statin use and acute kidney injury risk. In the example data considered, the pooled effect estimates and heterogeneity indices proved considerably robust to the addition of a future study. Notably, for some previously inconclusive meta-analyses, a study update might yield a statistically significant increase in kidney injury risk associated with higher statin exposure. The illustrated contour approach could become a standard tool for assessing the robustness of meta-analyses; it can guide decisions on whether to conduct additional studies addressing a relevant research question. Copyright © 2015 Elsevier Inc. All rights reserved.
Modern monitoring with preventive role for a production capacity
NASA Astrophysics Data System (ADS)
Tomescu, Cristian; Lupu, Constantin; Szollosi-Mota, Andrei; Rădoi, Florin; Chiuzan, Emeric
2016-10-01
In the process of coal exploitation, the occurrence of spontaneous combustion represents a risk factor with both subjective and objective causes, which requires the development of appropriate prevention methods. In order to control the risk, early intervention solutions with a preventive function are drawn up, consisting of direct and indirect measurement of the working environment, of the temperature of the coal massif, and of the concentrations of the gases O2, CO2, and CO. Monitoring instruments that fit the modern concept of proactive anticipation are thermography, applied in coal exploitation, and gas chromatography for the analysis of the collected air. The drawing up of thermal maps from the thermograms, together with analysis of the resulting chromatograms, forms the basis for assessing and treating the spontaneous combustion risk, which is discussed in this work.
Quality Interaction Between Mission Assurance and Project Team Members
NASA Technical Reports Server (NTRS)
Kwong-Fu, Helenann H.; Wilson, Robert K.
2006-01-01
Mission Assurance independent assessments started during the development cycle and continued through post-launch operations. In operations, the health and safety of the observatory is of utmost importance. Therefore, Mission Assurance must ensure requirements compliance and focus on the process improvements required across the operational systems, including new or modified products, tools, and procedures. The deployment of the interactive model involves three objectives: team member interaction, good root cause analysis practices, and risk assessment to avoid recurrences. In applying this model, we used a metric-based measurement process, which was found to have the most significant effect; this points to the importance of focusing on a combination of root cause analysis and risk approaches, allowing engineers to prioritize and quantify their corrective actions based on a well-defined set of root cause definitions (i.e., closure criteria for problem reports), success criteria, and risk rating definitions.
On different types of uncertainties in the context of the precautionary principle.
Aven, Terje
2011-10-01
Few policies for risk management have created more controversy than the precautionary principle. A main problem is the extreme number of different definitions and interpretations. Almost all definitions of the precautionary principle identify "scientific uncertainties" as the trigger or criterion for its invocation; however, the meaning of this concept is not clear. For applying the precautionary principle it is not sufficient that the threats or hazards are uncertain. A stronger requirement is needed. This article provides an in-depth analysis of this issue. We question how the scientific uncertainties are linked to the interpretation of the probability concept, expected values, the results from probabilistic risk assessments, the common distinction between aleatory uncertainties and epistemic uncertainties, and the problem of establishing an accurate prediction model (cause-effect relationship). A new classification structure is suggested to define what scientific uncertainties mean. © 2011 Society for Risk Analysis.
Value-driven ERM: making ERM an engine for simultaneous value creation and value protection.
Celona, John; Driver, Jeffrey; Hall, Edward
2011-01-01
Enterprise risk management (ERM) began as an effort to integrate the historically disparate silos of risk management in organizations. More recently, as recognition has grown of the need to cover the upside risks in value creation (financial and otherwise), organizations and practitioners have been searching for the means to do this. Existing tools such as heat maps and risk registers are not adequate for this task. Instead, a conceptually new value-driven framework is needed to realize the promise of enterprise-wide coverage of all risks, for both value protection and value creation. The methodology of decision analysis provides the means of capturing systemic, correlated, and value-creation risks on the same basis as value protection risks and has been integrated into the value-driven approach to ERM described in this article. Stanford Hospital and Clinics Risk Consulting and Strategic Decisions Group have been working to apply this value-driven ERM at Stanford University Medical Center. © 2011 American Society for Healthcare Risk Management of the American Hospital Association.
Chocolate intake and diabetes risk.
Greenberg, James A
2015-02-01
In-vitro and rodent studies, and short-term human trials suggest that compounds in chocolate can enhance insulin sensitivity. Also, a recent prospective Japanese epidemiological analysis found that long-term chocolate consumption was inversely associated with diabetes risk. The objective of the present analysis was to test the epidemiological association between long-term chocolate consumption and diabetes risk in a U.S. cohort. Multivariable prospective Cox Regression analysis with time-dependent covariates was used to examine data from 7802 participants in the prospective Atherosclerosis Risk in Communities Cohort. The data included 861 new diabetes cases during 98,543 person-years of follow up (mean = 13.3 years). Compared to participants who ate 1 oz of chocolate less often than monthly, those who ate it 1-4 times/month, 2-6 times/week and ≥ 1 time/day had relative risks of being diagnosed with diabetes that were lower by 13% (95% confidence interval: -2%, 25%), 34% (18%, 47%) and 18% (-10%, 38%). These relative risks applied to participants without evidence of preexisting serious chronic disease that included diabetes, heart attacks, stroke or cancer. In conclusion, the risk of diabetes decreased as the frequency of chocolate intake increased, up to 2-6 servings (1 oz) per week. Consuming ≥ 1 serving per day did not yield significantly lower relative risk. These results suggest that consuming moderate amount of chocolate may reduce the risk of diabetes. Further research is required to confirm and explore these findings. Copyright © 2014 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
Pereira, Luisa Santos; Müller, Vanessa Teixeira; da Mota Gomes, Marleide; Rotenberg, Alexander; Fregni, Felipe
2016-04-01
Approximately one-third of patients with epilepsy remain with pharmacologically intractable seizures. An emerging therapeutic modality for seizure suppression is repetitive transcranial magnetic stimulation (rTMS). Despite being considered a safe technique, rTMS carries the risk of inducing seizures, among other milder adverse events, and thus, its safety in the population with epilepsy should be continuously assessed. We performed an updated systematic review on the safety and tolerability of rTMS in patients with epilepsy, similar to a previous report published in 2007 (Bae EH, Schrader LM, Machii K, Alonso-Alonso M, Riviello JJ, Pascual-Leone A, Rotenberg A. Safety and tolerability of repetitive transcranial magnetic stimulation in patients with epilepsy: a review of the literature. Epilepsy Behav. 2007; 10 (4): 521-8), and estimated the risk of seizures and other adverse events during or shortly after rTMS application. We searched the literature for reports of rTMS being applied on patients with epilepsy, with no time or language restrictions, and obtained studies published from January 1990 to August 2015. A total of 46 publications were identified, of which 16 were new studies published after the previous safety review of 2007. We noted the total number of subjects with epilepsy undergoing rTMS, medication usage, incidence of adverse events, and rTMS protocol parameters: frequency, intensity, total number of stimuli, train duration, intertrain intervals, coil type, and stimulation site. Our main data analysis included separate calculations for crude per subject risk of seizure and other adverse events, as well as risk per 1000 stimuli. We also performed an exploratory, secondary analysis on the risk of seizure and other adverse events according to the type of coil used (figure-of-8 or circular), stimulation frequency (≤ 1 Hz or > 1 Hz), pulse intensity in terms of motor threshold (<100% or ≥ 100%), and number of stimuli per session (< 500 or ≥ 500). 
Presence or absence of adverse events was reported in 40 studies (n = 426 subjects). A total of 78 (18.3%) subjects reported adverse events, of which 85% were mild. Headache or dizziness was the most common one, occurring in 8.9%. We found a crude per subject seizure risk of 2.9% (95% CI: 1.3-4.5), given that 12 subjects reported seizures out of 410 subjects included in the analysis after data of patients with epilepsia partialis continua or status epilepticus were excluded from the estimate. Only one of the reported seizures was considered atypical in terms of the clinical characteristics of the patients' baseline seizures. The atypical seizure happened during high-frequency rTMS with maximum stimulator output for speech arrest, clinically arising from the region of stimulation. Although we estimated a larger crude per subject seizure risk compared with the previous safety review, the corresponding confidence intervals contained both risks. Furthermore, the exclusive case of atypical seizure was the same as reported in the previous report. We conclude that the risk of seizure induction in patients with epilepsy undergoing rTMS is small and that the risk of other adverse events is similar to that of rTMS applied to other conditions and to healthy subjects. Our results should be interpreted with caution, given the need for adjusted analysis controlling for potential confounders, such as baseline seizure frequency. The similarity between the safety profiles of rTMS applied to the population with epilepsy and to individuals without epilepsy supports further investigation of rTMS as a therapy for seizure suppression. Copyright © 2016. Published by Elsevier Inc.
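The crude per-subject risk quoted above (12 seizures among 410 subjects, 95% CI 1.3-4.5%) can be approximated with a simple normal-approximation (Wald) interval. This is a sketch of that calculation only; the original authors may have used a different interval method, so the upper bound can differ slightly in rounding.

```python
import math

# Reproducing (approximately) the crude per-subject seizure risk quoted
# above: 12 events among 410 subjects, with a Wald 95% interval. This
# assumes the paper used a normal approximation, which is not stated.

def crude_risk_ci(events, n, z=1.96):
    p = events / n
    se = math.sqrt(p * (1 - p) / n)  # standard error of a proportion
    return p, p - z * se, p + z * se

p, lo, hi = crude_risk_ci(12, 410)
print(f"risk = {p:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

For small event counts like this one, an exact (Clopper-Pearson) or Wilson interval is often preferred, since the Wald interval can misbehave near 0%.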
Risk assessment techniques with applicability in marine engineering
NASA Astrophysics Data System (ADS)
Rudenko, E.; Panaitescu, F. V.; Panaitescu, M.
2015-11-01
Nowadays, risk management is a carefully planned process. The task of risk management is organically woven into the general problem of increasing the efficiency of a business. A passive attitude to risk and mere awareness of its existence are being replaced by active management techniques. Risk assessment is one of the most important stages of risk management, since to manage risk it is necessary first to analyze and evaluate it. There are many definitions of this notion, but in the general case risk assessment refers to the systematic process of identifying the factors and types of risk and their quantitative assessment; i.e., risk analysis methodology combines mutually complementary quantitative and qualitative approaches. Purpose of the work: In this paper we consider Fault Tree Analysis (FTA) as a risk assessment technique. The objectives are: to understand the purpose of FTA, to understand and apply the rules of Boolean algebra, to analyze a simple system using FTA, and to weigh FTA's advantages and disadvantages. Research and methodology: The main purpose is to help identify potential causes of system failures before the failures actually occur, and to evaluate the probability of the Top event. The steps of this analysis are: examination of the system from the top down, the use of symbols to represent events, the use of mathematical tools for critical areas, and the use of fault tree logic diagrams to identify the cause of the Top event. Results: The study yields critical areas, fault tree logic diagrams, and the probability of the Top event. These results can be used for risk assessment analyses.
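The Boolean-algebra rules the abstract refers to reduce, for independent basic events, to simple gate arithmetic: an AND gate multiplies input probabilities, and an OR gate combines them as one minus the product of the complements. The sketch below illustrates this; the event names and probabilities are invented, not taken from a real marine system.

```python
# Minimal sketch of fault tree gate arithmetic, assuming statistically
# independent basic events. AND gates multiply probabilities; OR gates
# combine as 1 - prod(1 - p). All event probabilities are illustrative.
from functools import reduce

def and_gate(*probs):
    """All inputs must fail: P = p1 * p2 * ..."""
    return reduce(lambda a, b: a * b, probs)

def or_gate(*probs):
    """Any single input failing suffices: P = 1 - (1-p1)(1-p2)..."""
    return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

# Hypothetical Top event: "pump system fails" =
#   (power loss OR control fault) AND backup pump fails
p_top = and_gate(or_gate(0.01, 0.02), 0.05)
print(f"P(Top event) = {p_top:.6f}")
```

Real fault trees with repeated basic events require minimal cut set analysis rather than naive gate-by-gate evaluation, since the independence assumption breaks when the same event feeds several gates.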
Spatiotemporal analysis of the agricultural drought risk in Heilongjiang Province, China
NASA Astrophysics Data System (ADS)
Pei, Wei; Fu, Qiang; Liu, Dong; Li, Tian-xiao; Cheng, Kun; Cui, Song
2017-06-01
Droughts are natural disasters that pose significant threats to agricultural production as well as living conditions, and a spatial-temporal difference analysis of agricultural drought risk can help determine the spatial distribution and temporal variation of the drought risk within a region. Moreover, this type of analysis can provide a theoretical basis for the identification, prevention, and mitigation of drought disasters. In this study, the overall dispersion and local aggregation of projection points were based on research by Friedman and Tukey (IEEE Trans on Computers 23:881-890, 1974). In this work, high-dimensional samples were clustered by cluster analysis. The clustering results were represented by the clustering matrix, which determined the local density in the projection index. This method avoids the problem of determining a cutoff radius. An improved projection pursuit model is proposed that combines cluster analysis and the projection pursuit model, which offer advantages for classification and assessment, respectively. The improved model was applied to analyze the agricultural drought risk of 13 cities in Heilongjiang Province over 6 years (2004, 2006, 2008, 2010, 2012, and 2014). The risk of an agricultural drought disaster was characterized by 14 indicators and the following four aspects: hazard, exposure, sensitivity, and resistance capacity. The spatial distribution and temporal variation characteristics of the agricultural drought risk in Heilongjiang Province were analyzed. The spatial distribution results indicated that Suihua, Qiqihar, Daqing, Harbin, and Jiamusi are located in high-risk areas, Daxing'anling and Yichun are located in low-risk areas, and the differences among the regions were primarily caused by the exposure and resistance-capacity aspects. The temporal variation results indicated that the risk of agricultural drought in most areas presented an initially increasing and then decreasing trend.
A higher value for the exposure aspect increased the risk of drought, whereas a higher value for the resistance capacity aspect reduced the risk of drought. Over the long term, the exposure level of the region presented limited increases, whereas the resistance capacity presented considerable increases. Therefore, the risk of agricultural drought in Heilongjiang Province will continue to exhibit a decreasing trend.
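The four-aspect structure described above can be illustrated with a minimal sketch, not the paper's projection pursuit model: normalized aspect scores are projected onto a single risk axis, with resistance capacity entering negatively because, as the abstract notes, it reduces risk. The weights and city scores below are hypothetical.

```python
# Illustrative aggregation of four drought-risk aspects into one score.
# Weights are assumptions, not those estimated by the improved model.

def drought_risk_score(hazard, exposure, sensitivity, resistance,
                       weights=(0.3, 0.3, 0.2, 0.2)):
    """Project four aspect scores (each scaled to [0, 1]) onto a single
    risk axis; resistance capacity reduces risk, so it enters negatively."""
    w_h, w_e, w_s, w_r = weights
    return w_h * hazard + w_e * exposure + w_s * sensitivity - w_r * resistance

# Hypothetical aspect scores for two regions:
high_risk = drought_risk_score(0.8, 0.9, 0.7, 0.2)   # exposed, low resistance
low_risk = drought_risk_score(0.4, 0.2, 0.3, 0.8)    # remote, high resistance
assert high_risk > low_risk
```

A real analysis would optimize the projection direction over all 14 indicators rather than fix the weights in advance.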
Schedule Risk Analysis Of Southern Mainway Construction In Jember Regency
NASA Astrophysics Data System (ADS)
Susilo, K.; Wiguna, I. P. A.; Adi, T. J. W.
2017-11-01
In Jember Regency, the Southern Cross Road (JLS) has been built as part of a regional project. During previous construction phases, several events had a negative impact on the project. The purpose of this research is to analyze risks and their effects on the schedule of the JLS construction phase in Jember Regency. Risk identification was carried out through site surveys, literature studies, and supporting data. A Probability and Impact Matrix was used to obtain the level of each risk. Based on the analysis, the six highest-ranked risks that could affect the schedule were obtained: difficult site access, heavy rains, increases in material prices, damaged road pavement work, change orders, and work accidents. The proposed risk responses include agreements to guarantee the stock and price of materials, prioritizing drainage works, and constructing a bridge to resolve the difficult access. Intense on-site coordination, routine quality checks, and the construction of retaining walls are also needed to reduce the possibility of disruption to pavement work. To avoid work accidents, workers must be briefed on the harsh terrain conditions, mutual alertness among supervisors, workers, and others must be maintained, and all personnel must comply with safety rules.
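The Probability and Impact Matrix mentioned above can be sketched as follows, assuming the common 1-5 ordinal scales; the thresholds are illustrative, not those used in the study.

```python
# Minimal Probability and Impact Matrix classifier (hypothetical thresholds).

def risk_level(probability, impact):
    """Classify a risk by the product of its probability and impact scores,
    each on a 1-5 ordinal scale."""
    score = probability * impact
    if score >= 15:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"

# e.g. heavy rain: likely (4) with serious schedule impact (4) -> high risk
assert risk_level(4, 4) == "high"
assert risk_level(2, 2) == "low"
```

Risks classified as "high" would then be the candidates for the response measures listed in the abstract.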
NASA Astrophysics Data System (ADS)
Al-Akad, S.; Akensous, Y.; Hakdaoui, M.
2017-11-01
This article summarizes the application of remote sensing and GIS to studying urban flood risk in Al Mukalla. Satellite acquisition of a flood event in October 2015 in Al Mukalla (Yemen), combined with flood risk mapping techniques, illustrates the potential risk present in this city. Satellite images (Landsat and DEM data that were atmospherically corrected, radiometrically corrected, and rectified for geometric and topographic distortions) are used for flood risk mapping to produce a hazard (vulnerability) map. This map is produced by applying image-processing techniques within a geographic information system (GIS) environment, together with the NDVI and NDWI indices and a method to estimate flood-hazard areas. Five factors were considered in order to estimate the spatial distribution of the hazardous areas: flow accumulation, slope, land use, geology, and elevation. Multi-criteria analysis makes it possible to address vulnerability to flooding and to map the areas of the city of Al Mukalla at risk of flooding. The main objective of this research is to provide a simple and rapid method to reduce and manage flood risk in Yemen, taking the city of Al Mukalla as an example.
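The NDWI index mentioned above is a standard per-pixel band ratio (McFeeters formulation: (Green - NIR) / (Green + NIR)); a minimal sketch follows, with made-up reflectance values for illustration.

```python
# Normalized Difference Water Index for one pixel; values > 0 suggest water.

def ndwi(green, nir):
    """Compute NDWI from green and near-infrared reflectances."""
    if green + nir == 0:
        return 0.0
    return (green - nir) / (green + nir)

water_pixel = ndwi(0.30, 0.05)   # open water reflects green, absorbs NIR
land_pixel = ndwi(0.10, 0.40)    # vegetation and soil reflect strongly in NIR
assert water_pixel > 0 > land_pixel
```

In a GIS workflow the same computation is applied band-wise across the whole Landsat scene to delineate water and flood-prone surfaces.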
Toyabe, Shin-ichi
2014-01-01
Inpatient falls are the most common adverse events that occur in a hospital, and about 3 to 10% of falls result in serious injuries such as bone fractures and intracranial haemorrhages. We previously reported that bone fractures and intracranial haemorrhages were the two major fall-related injuries and that the risk assessment score for osteoporotic bone fracture was significantly associated not only with bone fractures after falls but also with intracranial haemorrhage after falls. Based on these results, we tried to establish a risk assessment tool for predicting fall-related severe injuries in a hospital. Possible risk factors related to fall-related serious injuries were extracted from data on inpatients admitted to a tertiary-care university hospital by using multivariate Cox regression analysis and multiple logistic regression analysis. We found that fall risk score and fracture risk score were the two significant factors, and we constructed models to predict fall-related severe injuries incorporating these factors. When the prediction model was applied to another independent dataset, the constructed model could detect patients with fall-related severe injuries efficiently. The new assessment system could identify patients prone to severe injuries after falls in a reproducible fashion. PMID:25168984
Risk of Death in Infants Who Have Experienced a Brief Resolved Unexplained Event: A Meta-Analysis.
Brand, Donald A; Fazzari, Melissa J
2018-06-01
To estimate an upper bound on the risk of death after a brief resolved unexplained event (BRUE), a sudden alteration in an infant's breathing, color, tone, or responsiveness, previously labeled "apparent life-threatening event" (ALTE). The meta-analysis incorporated observational studies of patients with ALTE that included data on in-hospital and post-discharge deaths with at least 1 week of follow-up after hospital discharge. Pertinent studies were identified from a published review of the literature from 1970 through 2014 and a supplementary PubMed query through February 2017. The 12 included studies (n = 3005) reported 12 deaths, of which 8 occurred within 4 months of the event. Applying a Poisson-normal random effects model to the 8 proximate deaths using a 4-month time horizon yielded a post-ALTE mortality rate of about 1 in 800, which constitutes an upper bound on the risk of death after a BRUE. This risk is about the same as the baseline risk of death during the first year of life. The meta-analysis therefore supports the return-home approach advocated in a recently published clinical practice guideline (not routine hospitalization) for BRUE patients who have been evaluated in the emergency department and determined to be at lower risk. Copyright © 2017 Elsevier Inc. All rights reserved.
76 FR 16234 - Prompt Corrective Action; Amended Definition of Low-Risk Assets
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... guaranteed by NCUA. Assets in this category receive a risk-weighting of zero for regulatory capital purposes... not apply a risk-weighting of zero even when an investment carries no credit risk. The ``Low-risk assets'' risk portfolio, in contrast, does apply a risk-weighting of zero, but the NGNs did not fall...
NASA Technical Reports Server (NTRS)
Prassinos, Peter G.; Lyver, John W., IV; Bui, Chinh T.
2011-01-01
Risk assessment is used in many industries to identify and manage risks. Initially developed for use on aeronautical and nuclear systems, risk assessment has been applied to transportation, chemical, computer, financial, and security systems, among others. It is used to gain an understanding of the weaknesses or vulnerabilities in a system so modifications can be made to increase operability, efficiency, and safety and to reduce failure and down-time. Risk assessment results are primary inputs to risk-informed decision making, where risk information, including uncertainty, is used along with other pertinent information to assist management in the decision-making process. Therefore, to be useful, a risk assessment must be directed at specific objectives. As the world embraces the globalization of trade and manufacturing, understanding the associated risks becomes important to decision making. Applying risk assessment techniques to a global system of development, manufacturing, and transportation can provide insight into how the system can fail, the likelihood of system failure, and the consequences of system failure. The risk assessment can identify those elements that contribute most to risk and identify measures to prevent and mitigate failures, disruptions, and damaging outcomes. In addition, risks associated with public and environmental impacts can be identified. The risk insights gained can be applied to making decisions concerning suitable development and manufacturing locations, supply chains, and transportation strategies. While risk assessment has been mostly applied to mechanical and electrical systems, the concepts and techniques can be applied across other systems and activities. This paper provides a basic overview of the development of a risk assessment.
López-Contreras, María José; López, Maria Ángeles; Canteras, Manuel; Candela, María Emilia; Zamora, Salvador; Pérez-Llamas, Francisca
2014-03-01
To apply cluster analysis to group individuals of similar characteristics in an attempt to identify undernutrition or the risk of undernutrition in this population. A cross-sectional study. Seven public nursing homes in the province of Murcia, on the Mediterranean coast of Spain. 205 subjects aged 65 and older (131 women and 74 men). Dietary intake (energy and nutrients), anthropometric (body mass index, skinfold thickness, mid-arm muscle circumference, mid-arm muscle area, corrected arm muscle area, waist to hip ratio) and biochemical and haematological (serum albumin, transferrin, total cholesterol, total lymphocyte count) variables were analyzed by cluster analysis. The results of the cluster analysis, including intake, anthropometric and analytical data, showed that, of the 205 elderly subjects, 66 (32.2%) were overweight/obese, 72 (35.1%) had an adequate nutritional status and 67 (32.7%) were undernourished or at risk of undernutrition. The undernourished or at-risk group showed the lowest values for dietary intake and the anthropometric and analytical parameters measured. Our study shows that cluster analysis is a useful statistical method for assessing the nutritional status of institutionalized elderly populations. In contrast, use of the specific reference values frequently described in the literature might fail to detect real cases of undernourishment or those at risk of undernutrition. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.
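The clustering idea above can be sketched in miniature: a one-dimensional k-means on BMI alone splits subjects into three nutritional-status groups. This is a deliberate simplification; the actual study clustered on many dietary, anthropometric and biochemical variables, and the BMI values below are invented.

```python
# 1-D Lloyd's k-means with fixed initial centers (deterministic sketch).

def kmeans_1d(values, centers, iters=20):
    """Cluster scalar values around the given initial centers."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for v in values:
            i = min(range(len(centers)), key=lambda k: abs(v - centers[k]))
            groups[i].append(v)
        centers = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# Hypothetical BMI values: undernourished, adequate, overweight subjects
bmi = [17.2, 18.0, 17.5, 22.5, 23.0, 24.1, 29.0, 30.5, 31.2]
centers, groups = kmeans_1d(bmi, centers=[17.0, 23.0, 30.0])
assert len(groups[0]) == 3 and len(groups[2]) == 3
```

With many variables, the same assignment step simply uses a multivariate distance instead of the absolute difference.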
The Validity and Reliability of the Turkish Version of the Neonatal Skin Risk Assessment Scale.
Sari, Çiğdem; Altay, Naime
2017-03-01
The study created a Turkish translation of the Neonatal Skin Risk Assessment Scale (NSRAS) that was developed by Huffines and Longsdon in 1997. Study authors used a cross-sectional survey design in order to determine the validity and reliability of the Turkish translation. The study was conducted at the neonatal intensive care unit of a university hospital in Ankara between March 15 and June 30, 2014. The research sample included 130 neonatal assessments from 17 patients. Data were collected by questionnaire regarding the characteristics of the participating neonates, 7 nurse observers, and the NSRAS and its subarticles. After translation and back-translation were performed to assess language validity of the scale, necessary corrections were made in line with expert suggestions, and content validity was ensured. Internal consistency of the scale was assessed by its homogeneity, Cronbach's α, and subarticle-general scale grade correlation. Cronbach's α for the scale overall was .88, and Cronbach's α values for the subarticles were between .83 and .90. Results showed a positive relationship among all the subarticles and the overall NSRAS scale grade (P < .01), with correlation values between 0.333 and 0.721. Exploratory and predictive factor analysis was applied for structural validity. Kaiser-Meyer-Olkin analysis was applied for sampling adequacy, and the Bartlett test was applied in order to assess the factorability of the sample. The Kaiser-Meyer-Olkin coefficient was 0.73, and the χ² value found according to the Bartlett test was statistically significant at an advanced level (P < .05). In the 6 subarticles of the scale and in the general scale total grade, a high, positive, and significant relationship among the grades given by the researcher and the nurse observers was found (P < .05). The Turkish NSRAS is reliable and valid.
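The Cronbach's α statistic reported above follows a standard formula; a minimal sketch is given below, with invented item scores (the NSRAS data are not reproduced here).

```python
# Cronbach's alpha: internal-consistency coefficient of a multi-item scale.

def cronbach_alpha(items):
    """items: list of item-score lists, one list per scale item,
    all scored over the same respondents (population variances used)."""
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(var(it) for it in items)
    return k / (k - 1) * (1 - item_var / var(totals))

# Two perfectly correlated items give a high alpha (8/9 here):
alpha = cronbach_alpha([[1, 2, 3], [2, 4, 6]])
assert abs(alpha - 8 / 9) < 1e-9
```

Values near the reported .88 indicate that the subarticles consistently measure the same underlying construct.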
NASA Astrophysics Data System (ADS)
Zumpano, Veronica; Balteanu, Dan; Mazzorana, Bruno; Micu, Mihai
2014-05-01
It is increasingly important to provide stakeholders with tools that enable them to better understand the state of the environment in which they live and which they manage, and that help them make decisions aimed at minimizing the consequences of hydro-meteorological hazards. Very often, however, quantitative studies, especially for large areas, are difficult to perform, because it is often not possible to obtain the numerous data required for the analysis. In addition, it has been shown that in scenario analysis deterministic approaches are often unable to detect some features of the system, revealing unexpected behaviors and resulting in the underestimation or omission of some impact factors. Here we present some preliminary results obtained by applying Formative Scenario Analysis, which can be considered a possible solution for landslide risk analysis in cases where the needed data, even if they exist, are not available. This method is an expert-based approach that integrates intuitions and qualitative evaluations of impact factors with a quantitative analysis of the relations between these factors: a group of experts with different but pertinent expertise determines (by a rating procedure) quantitative relations between these factors, and then, through mathematical operations, scenarios describing a certain state of the system are obtained. The approach is applied to Buzau County (Romania), an area belonging to the Curvature Romanian Carpathians and Subcarpathians, a region strongly affected by environmental hazards. The region has previously experienced numerous episodes of severe hydro-meteorological events that caused considerable damage (1975, 2005, 2006). In this application we refer only to one type of landslide, which can be described as shallow to medium-seated with a (mainly) translational movement that can range from slide to flow.
The material involved can be either soil, debris or a mixture of both; in the Romanian literature these typical movements have been described as alunecare curgatoare. The Formative Scenario Analysis approach will be applied to each component of risk (H, V, and A), and the acquired states will then be combined in order to obtain a series of alternative risk scenarios. The approach is structured in two main sections corresponding to a level of influence of conditioning factors and a response; the latter contains the results of the formative scenario approach trained with the conditioning factors of the first level. These factors are divided into two subsets representing two levels of influence: k=1 comprises the global factors, while k=2 contains the local factors. In order to include uncertainty estimation within the analysis, the type-1 fuzzy set method of knowledge representation is introduced, and hence decisions made by experts on certain events are expressed in terms of triangular fuzzy numbers.
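The triangular fuzzy numbers mentioned above admit simple arithmetic; a minimal sketch follows, with invented expert ratings. A triangular number (a, m, b) holds the lower bound, modal value and upper bound of an expert's judgement.

```python
# Triangular fuzzy numbers: addition and centroid defuzzification.

class TriFuzzy:
    def __init__(self, a, m, b):
        assert a <= m <= b
        self.a, self.m, self.b = a, m, b

    def __add__(self, other):
        # Standard interval-arithmetic addition of triangular numbers
        return TriFuzzy(self.a + other.a, self.m + other.m, self.b + other.b)

    def centroid(self):
        # Crisp representative value of the triangle
        return (self.a + self.m + self.b) / 3

low = TriFuzzy(1, 2, 3)     # e.g. expert rating "low influence" on a 0-10 scale
high = TriFuzzy(6, 8, 9)    # expert rating "high influence"
combined = low + high
assert (combined.a, combined.m, combined.b) == (7, 10, 12)
```

Combining the ratings of several experts this way, then defuzzifying at the end, carries the experts' uncertainty through the scenario calculations.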
Simulation Assisted Risk Assessment: Blast Overpressure Modeling
NASA Technical Reports Server (NTRS)
Lawrence, Scott L.; Gee, Ken; Mathias, Donovan; Olsen, Michael
2006-01-01
A probabilistic risk assessment (PRA) approach has been developed and applied to the risk analysis of capsule abort during ascent. The PRA is used to assist in the identification of modeling and simulation applications that can significantly impact the understanding of crew risk during this potentially dangerous maneuver. The PRA approach is also being used to identify the appropriate level of fidelity for the modeling of those critical failure modes. The Apollo launch escape system (LES) was chosen as a test problem for application of this approach. Failure modes that have been modeled and/or simulated to date include explosive overpressure-based failure, explosive fragment-based failure, land landing failures (range limits exceeded either near launch or Mode III trajectories ending on the African continent), capsule-booster re-contact during separation, and failure due to plume-induced instability. These failure modes have been investigated using analysis tools in a variety of technical disciplines at various levels of fidelity. The current paper focuses on the development and application of a blast overpressure model for the prediction of structural failure due to overpressure, including the application of high-fidelity analysis to predict near-field and headwind effects.
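For intuition, a far simpler stand-in for a blast overpressure model is Hopkinson-Cranz scaling combined with an empirical fit for peak side-on overpressure; the sketch below uses the Mills (1987) fit, which is not the model developed in the paper, and the charge mass is hypothetical.

```python
# Hedged sketch: peak side-on overpressure from a surface burst via
# Hopkinson-Cranz scaled distance and the empirical Mills (1987) fit.

def peak_overpressure_kpa(distance_m, tnt_kg):
    """Peak side-on overpressure (kPa) at a standoff distance from a
    burst of the given TNT-equivalent mass; valid only over a limited
    range of scaled distance."""
    z = distance_m / tnt_kg ** (1.0 / 3.0)   # scaled distance, m/kg^(1/3)
    return 1772.0 / z**3 - 114.0 / z**2 + 108.0 / z

# Overpressure falls off rapidly with distance from the explosion:
near = peak_overpressure_kpa(10.0, 1000.0)
far = peak_overpressure_kpa(100.0, 1000.0)
assert near > far > 0
```

In a PRA, curves like this feed the conditional probability that structural failure thresholds are exceeded for a given abort scenario.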
NASA Astrophysics Data System (ADS)
Klügel, J.
2006-12-01
Deterministic scenario-based seismic hazard analysis has a long tradition in earthquake engineering for developing the design basis of critical infrastructures such as dams, transport infrastructures, chemical plants and nuclear power plants. For many applications beyond the design of infrastructures, it is of interest to assess the efficiency of the design measures taken. These applications require a method that allows a meaningful quantitative risk analysis to be performed. A new method for probabilistic scenario-based seismic risk analysis has been developed, based on a probabilistic extension of proven deterministic methods such as the MCE methodology. The input data required for the method are entirely based on the information necessary to perform any meaningful seismic hazard analysis. The method follows the probabilistic risk analysis approach common in nuclear technology, developed originally by Kaplan & Garrick (1981). It is based on (1) a classification of earthquake events into different size classes (by magnitude), (2) the evaluation of the frequency of occurrence of events assigned to the different classes (frequency of initiating events), (3) the development of bounding critical scenarios assigned to each class based on the solution of an optimization problem, and (4) the evaluation of the conditional probability of exceedance of critical design parameters (vulnerability analysis). The advantages of the method in comparison with traditional PSHA are (1) its flexibility, allowing the use of different probabilistic models for earthquake occurrence as well as the incorporation of advanced physical models into the analysis, (2) the mathematically consistent treatment of uncertainties, and (3) the explicit consideration of the lifetime of the critical structure as a criterion for formulating different risk goals.
The method was applied for the evaluation of the risk of production interruption losses of a nuclear power plant during its residual lifetime.
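Steps (2) and (4) of the method as described above can be sketched numerically: annual frequencies of initiating events per magnitude class are combined with conditional exceedance probabilities, and the structure's lifetime converts the resulting annual rate into a lifetime risk. All numbers below are hypothetical.

```python
import math

# class -> (annual frequency of initiating events,
#           conditional probability that the design parameter is exceeded)
classes = {
    "M5.0-5.9": (1e-2, 0.001),
    "M6.0-6.9": (1e-3, 0.05),
    "M7.0+":    (1e-4, 0.30),
}

lifetime_years = 40
# Annual rate of events that exceed the critical design parameter
annual_rate = sum(f * p for f, p in classes.values())
# Probability of at least one such event over the lifetime (Poisson assumption)
risk_over_lifetime = 1 - math.exp(-annual_rate * lifetime_years)
assert 0 < risk_over_lifetime < 1
```

Framing the result over the residual lifetime, rather than per annum, is exactly what advantage (3) of the method refers to.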
Chan, Ramony; Steel, Zachary; Brooks, Robert; Heung, Tracy; Erlich, Jonathan; Chow, Josephine; Suranyi, Michael
2011-11-01
Research into the association between psychosocial factors and depression in End-Stage Renal Disease (ESRD) has expanded considerably in recent years, identifying a range of factors that may act as important risk and protective factors for depression in this population. The present study provides the first systematic review and meta-analysis of this body of research. Published studies reporting associations between any psychosocial factor and depression were identified and retrieved from Medline, Embase, and PsycINFO by applying optimised search strategies. Mean effect sizes were calculated for the associations across five psychosocial constructs (social support, personality attributes, cognitive appraisal, coping process, stress/stressor). Multiple hierarchical meta-regression analysis was applied to examine the moderating effects of methodological and substantive factors on the strength of the observed associations. 57 studies covering 58 independent samples with 5956 participants were identified, resulting in 246 effect sizes of the association between a range of psychosocial factors and depression. The overall mean effect size (Pearson's correlation coefficient) of the association between psychosocial factors and depression was 0.36. The effect sizes between the five psychosocial constructs and depression ranged from medium (0.27) to large (0.46), with personality attributes (0.46) and cognitive appraisal (0.46) having the largest effect sizes. In the meta-regression analyses, identified demographic (gender, age, location of study) and treatment (type of dialysis) characteristics moderated the strength of the associations with depression. The current analysis documents a moderate to large association between the presence of psychosocial risk factors and depression in ESRD. 2011. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Ndu, Obibobi Kamtochukwu
To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.
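The core idea above, inferring a new concept's reliability from precedent systems weighted by similarity, can be loosely sketched under strong simplifying assumptions: a Beta-Binomial update in which each precedent's evidence is discounted by a similarity weight in [0, 1] (the role played by the pseudo-spatial Kriging model in the actual framework). All names and numbers below are hypothetical and do not reproduce the dissertation's method.

```python
# Similarity-weighted Bayesian update for a concept's success probability.

def similarity_weighted_posterior(prior, precedents):
    """prior: (alpha, beta) of a Beta distribution;
    precedents: list of (successes, failures, weight) per precedent system.
    Returns the posterior (alpha, beta) for the new concept."""
    a, b = prior
    for successes, failures, w in precedents:
        a += w * successes   # evidence discounted by similarity weight
        b += w * failures
    return a, b

# Close relative (w=0.9) with a strong record; distant one (w=0.2) weaker:
a, b = similarity_weighted_posterior((1, 1), [(95, 5, 0.9), (60, 40, 0.2)])
mean_reliability = a / (a + b)
assert 0.8 < mean_reliability < 1.0
```

The dissertation's contribution is precisely in deriving principled weights, via non-metric MDS and Kriging over expert similarity judgements, rather than assigning them ad hoc as done here.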
Protection of agriculture against drought in Slovenia based on vulnerability and risk assessment
NASA Astrophysics Data System (ADS)
Dovžak, M.; Stanič, S.; Bergant, K.; Gregorič, G.
2012-04-01
Past and recent extreme events, like earthquakes, extreme droughts, heat waves, flash floods and volcanic eruptions, continuously remind us that natural hazards are an integral component of the global environment. Despite rapid improvement of detection techniques, many of these events evade long-term or even mid-term prediction and can thus have disastrous impacts on affected communities and the environment. Effective mitigation and preparedness strategies can be developed only after gaining an understanding of how and where such hazards may occur, what causes them, what circumstances increase their severity, and what their impacts may be; in recent years their study has emerged under the common title of natural hazard management. The first step in natural risk management is risk identification, which includes hazard analysis and monitoring, vulnerability analysis and determination of the risk level. The presented research focuses on drought, which is at present the most widespread and still largely unpredictable natural hazard. Its primary aim was to assess the frequency and the consequences of droughts in Slovenia based on drought events in the past, to develop a methodology for drought vulnerability and risk assessment that can be applied in Slovenia and more widely in South-Eastern Europe, to prepare maps of drought risk and crop vulnerability, and to prepare guidelines to reduce the vulnerability of the crops. Using the amounts of plant-available water in the soil, slope inclination, solar radiation, land use and irrigation infrastructure data sets as inputs, we obtained vulnerability maps for Slovenia using GIS-based multi-criteria decision analysis with a weighted linear combination of the input parameters. The weight configuration was optimized by comparing the modelled crop damage to the assessed actual damage, which was available for the extensive drought case in 2006.
Drought risk was obtained quantitatively as a function of hazard and vulnerability and presented in the same way as the vulnerability, as a GIS-based map. The risk maps show geographic regions in Slovenia where droughts pose a major threat to agriculture and, together with the vulnerability maps, provide the basis for drought management, in particular for appropriate mitigation and response actions in specific regions. The developed methodology is expected to be applied to the entire region of South-Eastern Europe within the initiative of the Drought Management Centre for Southeastern Europe.
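The weighted linear combination used for the vulnerability maps can be sketched per grid cell: each normalized factor score is multiplied by a weight, with the weights summing to one. The factor names, weights and scores below are illustrative, not the optimized configuration from the study.

```python
# GIS-style weighted linear combination for one map cell (hypothetical weights).

WEIGHTS = {"available_water": 0.35, "slope": 0.15, "radiation": 0.15,
           "land_use": 0.20, "irrigation": 0.15}

def vulnerability(cell):
    """cell: dict of factor -> score in [0, 1] (1 = most drought-prone)."""
    return sum(WEIGHTS[f] * cell[f] for f in WEIGHTS)

dry_field = {"available_water": 0.9, "slope": 0.7, "radiation": 0.8,
             "land_use": 0.9, "irrigation": 1.0}   # no irrigation available
irrigated = {"available_water": 0.3, "slope": 0.2, "radiation": 0.4,
             "land_use": 0.5, "irrigation": 0.1}
assert vulnerability(dry_field) > vulnerability(irrigated)
```

In the study, the weights themselves were calibrated by comparing modelled crop damage against the assessed damage from the 2006 drought.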
A framework for risk assessment and decision-making strategies in dangerous good transportation.
Fabiano, B; Currò, F; Palazzi, E; Pastorino, R
2002-07-01
This paper addresses the risk from dangerous goods transport by road, and strategies for selecting loads and routes, by developing an original site-oriented framework of general applicability at the local level. A realistic evaluation of the frequency must take into account both inherent factors (e.g. tunnels, rail bridges, bend radii, slope, characteristics of the neighborhood) and factors correlated to traffic conditions (e.g. number of dangerous goods trucks). Field data were collected on the selected highway by systematic investigation, providing input data for a database reporting tendencies and intrinsic parameter/site-oriented statistics. The developed technique was applied to a pilot area, considering both individual risk and societal risk and making reference to flammable and explosive scenarios. In this way, a risk assessment sensitive to route features and exposed population is proposed, so that the overall uncertainties in risk analysis can be lowered.
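The societal-risk side of the assessment can be sketched in its simplest form: each accident scenario on a route contributes a frequency-fatality pair (the F-N view), here reduced to the expected number of fatalities per year. The scenario frequencies and consequences below are invented for illustration.

```python
# Expected societal risk for one route from frequency-consequence pairs.

# scenario -> (frequency per year, expected fatalities if it occurs)
scenarios = {
    "pool_fire": (1e-4, 2),
    "BLEVE": (1e-6, 50),
    "vapour_cloud_explosion": (1e-7, 200),
}

expected_fatalities = sum(f * n for f, n in scenarios.values())
# Alternative routes would be ranked by this figure (and by individual risk)
assert expected_fatalities < 1e-3
```

Individual risk, by contrast, is evaluated at fixed locations along the route as the annual probability that a person there is harmed, which is why both measures are reported.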
A review of machine learning in obesity.
DeGregory, K W; Kuiper, P; DeSilvio, T; Pleuss, J D; Miller, R; Roginski, J W; Fisher, C B; Harness, D; Viswanath, S; Heymsfield, S B; Dungan, I; Thomas, D M
2018-05-01
Rich sources of obesity-related data arising from sensors, smartphone apps, electronic medical health records and insurance data can bring new insights for understanding, preventing and treating obesity. For such large datasets, machine learning provides sophisticated and elegant tools to describe, classify and predict obesity-related risks and outcomes. Here, we review machine learning methods that predict and/or classify, such as linear and logistic regression, artificial neural networks, deep learning and decision tree analysis. We also review methods that describe and characterize data, such as cluster analysis, principal component analysis, network science and topological data analysis. We introduce each method with a high-level overview followed by examples of successful applications. The algorithms were then applied to National Health and Nutrition Examination Survey data to demonstrate methodology, utility and outcomes. The strengths and limitations of each method were also evaluated. This summary of machine learning algorithms provides a unique overview of the state of data analysis applied specifically to obesity. © 2018 World Obesity Federation.
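The decision-tree analysis mentioned in the review can be illustrated in its most reduced form: a one-split "stump" that picks the threshold on a single feature minimizing misclassification. The BMI values and obesity labels below are invented, not NHANES data.

```python
# Decision stump: the one-node special case of a decision tree.

def best_stump(xs, ys):
    """Return (threshold, error_count) for the split `x > t` that
    minimizes misclassifications of the boolean labels ys."""
    best = (None, len(ys))
    for t in sorted(set(xs)):
        errors = sum((x > t) != y for x, y in zip(xs, ys))
        if errors < best[1]:
            best = (t, errors)
    return best

bmi = [21.0, 23.5, 24.9, 30.2, 33.1, 35.0]
obese = [False, False, False, True, True, True]
threshold, errors = best_stump(bmi, obese)
assert errors == 0
```

A full decision tree applies this split search recursively to each resulting subset, over many candidate features at once.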
Analysis of risk factors for central venous port failure in cancer patients
Hsieh, Ching-Chuan; Weng, Hsu-Huei; Huang, Wen-Shih; Wang, Wen-Ke; Kao, Chiung-Lun; Lu, Ming-Shian; Wang, Chia-Siu
2009-01-01
AIM: To analyze the risk factors for central port failure in cancer patients administered chemotherapy, using univariate and multivariate analyses. METHODS: A total of 1348 totally implantable venous access devices (TIVADs) were implanted into 1280 cancer patients in this cohort study. A Cox proportional hazard model was applied to analyze risk factors for failure of TIVADs. Log-rank test was used to compare actuarial survival rates. Infection, thrombosis, and surgical complication rates (χ2 test or Fisher’s exact test) were compared in relation to the risk factors. RESULTS: Increasing age, male gender and open-ended catheter use were significant risk factors reducing survival of TIVADs as determined by univariate and multivariate analyses. Hematogenous malignancy decreased the survival time of TIVADs; this reduction was not statistically significant by univariate analysis [hazard ratio (HR) = 1.336, 95% CI: 0.966-1.849, P = 0.080)]. However, it became a significant risk factor by multivariate analysis (HR = 1.499, 95% CI: 1.079-2.083, P = 0.016) when correlated with variables of age, sex and catheter type. Close-ended (Groshong) catheters had a lower thrombosis rate than open-ended catheters (2.5% vs 5%, P = 0.015). Hematogenous malignancy had higher infection rates than solid malignancy (10.5% vs 2.5%, P < 0.001). CONCLUSION: Increasing age, male gender, open-ended catheters and hematogenous malignancy were risk factors for TIVAD failure. Close-ended catheters had lower thrombosis rates and hematogenous malignancy had higher infection rates. PMID:19787834
Guo, How-Ran
2011-10-20
Despite its limitations, ecological study design is widely applied in epidemiology. In most cases, adjustment for age is necessary, but different methods may lead to different conclusions. To compare three methods of age adjustment, a study on the associations between arsenic in drinking water and incidence of bladder cancer in 243 townships in Taiwan was used as an example. A total of 3068 cases of bladder cancer, including 2276 men and 792 women, were identified during a ten-year study period in the study townships. Three methods were applied to analyze the same data set over the ten-year study period. The first (Direct Method) applied direct standardization to obtain the standardized incidence rate and then used it as the dependent variable in the regression analysis. The second (Indirect Method) applied indirect standardization to obtain the standardized incidence ratio and used it as the dependent variable instead. The third (Variable Method) used the proportions of residents in different age groups as part of the independent variables in the multiple regression models. All three methods showed a statistically significant positive association between arsenic exposure above 0.64 mg/L and incidence of bladder cancer in men and women, but different results were observed for the other exposure categories. In addition, the risk estimates obtained by the different methods for the same exposure category all differed. Using an empirical example, the current study confirmed the argument made previously by other researchers that, whereas the three methods of age adjustment may lead to different conclusions, only the third approach can obtain unbiased estimates of the risks. The third method can also generate estimates of the risk associated with each age group, but the other two are unable to evaluate the effects of age directly.
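The first two age-adjustment methods compared above follow standard formulas, sketched below for a single hypothetical township: direct standardization applies the township's age-specific rates to a standard population, while indirect standardization divides observed cases by those expected under reference rates (the standardized incidence ratio, SIR). All populations and rates are invented.

```python
# Direct vs. indirect age standardization for one township (made-up data).

# age group -> (township population, township cases,
#               standard-population share, reference rate per person)
data = {
    "<40": (8000, 2, 0.55, 0.0002),
    "40-64": (6000, 9, 0.30, 0.0012),
    "65+": (2000, 12, 0.15, 0.0050),
}

# Direct: weight the township's age-specific rates by the standard population
direct_rate = sum(share * (cases / pop)
                  for pop, cases, share, _ in data.values())

# Indirect: observed cases over those expected from the reference rates
observed = sum(cases for _, cases, _, _ in data.values())
expected = sum(pop * ref for pop, _, _, ref in data.values())
sir = observed / expected

assert direct_rate > 0 and sir > 0
```

In the study, one of these summary figures per township became the dependent variable of the regression; the third method instead keeps the raw age-group proportions as covariates.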
1990-12-01
related to presumed mechanisms of TCE-induced liver tumors, especially as they relate to peroxisome proliferation. [Fisher, J., Gargas, M., Allen, B., et...] Risk Analysis, Oak Ridge National Laboratory, Oak Ridge, Tennessee. Prepared under Contract No. DE -ACO5-94)R21O0 for the U.S. Department of Energy... (Dekant et al., 1989). In the liver, DCVC may be acetylated to form N-Ac-DCVC, which can then be transported back to the kidney, where it may be excreted in
Ahmed, Ruhi; Baseman, Harold; Ferreira, Jorge; Genova, Thomas; Harclerode, William; Hartman, Jeffery; Kim, Samuel; Londeree, Nanette; Long, Michael; Miele, William; Ramjit, Timothy; Raschiatore, Marlene; Tomonto, Charles
2008-01-01
In July 2006, the Parenteral Drug Association's Risk Management Task Force for Aseptic Processes conducted an electronic survey of PDA members to determine current industry practices regarding implementation of Quality Risk Management in their organizations. This electronic survey was open and publicly available via the PDA website and targeted professionals in our industry who are involved in initiating, implementing, or reviewing risk management programs or decisions in their organizations. One hundred twenty-nine members participated, and their demographics are presented in the sidebar "Correspondents Profile". Among the major findings are: * The "Aseptic Processing/Filling" operation is the functional area identified as having the greatest need for risk assessment and quality risk management. * The most widely used methodology in industry to identify risk is Failure Mode and Effects Analysis (FMEA); this tool was most widely applied in assessing change control and in adverse event, complaint, or failure investigations. * Despite the fact that personnel training was identified as the strategy most used for controlling/minimizing risk, the largest contributors to sterility failure in operations are still "Personnel". * Most companies still rely on "Manufacturing Controls" to mitigate risk and deemed the utilization of Process Analytical Technology (PAT) least important in this respect. * A majority of correspondents reported that they did not periodically assess their risk management programs. * A majority of correspondents desired to see case studies or examples of risk analysis implementation (as applicable to aseptic processing) in future PDA technical reports on risk management.
A balanced hazard ratio for risk group evaluation from survival data.
Branders, Samuel; Dupont, Pierre
2015-07-30
Common clinical studies assess the quality of prognostic factors, such as gene expression signatures, clinical variables or environmental factors, and cluster patients into various risk groups. Typical examples include cancer clinical trials where patients are clustered into high- or low-risk groups. Whenever applied to survival data analysis, such groups are intended to represent patients with similar survival odds and to select the most appropriate therapy accordingly. The relevance of such risk groups, and of the related prognostic factors, is typically assessed through the computation of a hazard ratio. We first stress three limitations of assessing risk groups through the hazard ratio: (1) it may promote the definition of arbitrarily unbalanced risk groups; (2) an apparently optimal group hazard ratio can be largely inconsistent with the p-value commonly associated with it; and (3) some marginal changes between risk group proportions may lead to highly different hazard ratio values. These issues could lead to inappropriate comparisons between various prognostic factors. Next, we propose the balanced hazard ratio to solve these issues. This new performance metric keeps an intuitive interpretation and is just as simple to compute. We also show how the balanced hazard ratio leads to a natural cut-off choice to define risk groups from continuous risk scores. The proposed methodology is validated through controlled experiments for which a prescribed cut-off value is defined by design. Further results are also reported on several cancer prognosis studies, and the proposed methodology could be applied more generally to assess the quality of any prognostic marker. Copyright © 2015 John Wiley & Sons, Ltd.
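The balanced hazard ratio itself is not defined in this abstract, so it is not reproduced here. As a hedged illustration of limitation (1) only, the sketch below uses simulated survival data and a crude events-per-person-time hazard ratio (not a Cox model) to show how an extreme, unbalanced cut-off on a continuous risk score can inflate the apparent group contrast relative to a balanced median split:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
score = rng.normal(size=n)                            # continuous risk score
# survival times: hazard increases with the score (simulated, fully observed)
times = rng.exponential(1.0 / np.exp(0.7 * score))

def crude_hr(cutoff):
    """Events per unit person-time in the high-risk group over the low-risk group."""
    hi = score >= cutoff
    rate_hi = hi.sum() / times[hi].sum()
    rate_lo = (~hi).sum() / times[~hi].sum()
    return rate_hi / rate_lo

hr_median = crude_hr(np.median(score))            # balanced 50/50 split
hr_extreme = crude_hr(np.quantile(score, 0.99))   # tiny "high-risk" group
```

On these simulated data the extreme cut-off yields the larger ratio, even though the underlying risk model is unchanged, which is the kind of artifact a balance-aware metric is meant to discourage.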
Spatial Analysis of Geohazards using ArcGIS--A web-based Course.
NASA Astrophysics Data System (ADS)
Harbert, W.; Davis, D.
2003-12-01
As part of the Environmental Systems Research Institute (ESRI) Virtual Campus program, a course was designed to present the benefits of Geographical Information Systems (GIS) based spatial analysis as applied to a variety of geohazards. We created this on-line ArcGIS 8.x-based course to aid the motivated student or professional in his or her efforts to use GIS to determine where geohazards are likely to occur and to assess their potential impact on the human community. Our course is broadly designed for earth scientists, public sector professionals, students, and others who want to apply GIS to the study of geohazards. Participants work with ArcGIS software and diverse datasets to display, visualize and analyze a wide variety of data sets and to map a variety of geohazards, including earthquakes, volcanoes, landslides, tsunamis, and floods. Following the GIS-based methodology of posing a question, decomposing it into specific criteria, applying the criteria to spatial or tabular geodatasets, and then analyzing feature relationships, the course content was designed from the beginning to enable the motivated student to answer questions: for example, to explain the relationship between earthquake location, earthquake depth, and plate boundaries; use a seismic hazard map to identify population and features at risk from an earthquake; import data from an earthquake catalog and visualize these data in 3D; explain the relationship between earthquake damage and local geology; use a flood scenario map to identify features at risk for forecast river discharges; use a tsunami inundation map to identify population and features at risk from tsunami; use a hurricane inundation map to identify the population at risk for any given category of hurricane; estimate accumulated precipitation by integrating time-series Doppler radar data; and model a real-life landslide event.
The six on-line modules for our course are Earthquakes I, Earthquakes II, Volcanoes, Floods, Coastal Geohazards and Landslides. Earthquake I can be viewed and accessed for no cost at http://campus.esri.com.
Changes in Cross-Correlations as an Indicator for Systemic Risk
NASA Astrophysics Data System (ADS)
Zheng, Zeyu; Podobnik, Boris; Feng, Ling; Li, Baowen
2012-11-01
The 2008-2012 global financial crisis began with the global recession in December 2007 and intensified in September 2008, by which point the U.S. stock markets had lost 20% of their value from the October 11, 2007 peak. Various studies have reported that financial crises are associated with increases both in cross-correlations among stocks and stock indices and in the level of systemic risk. In this paper, we study 10 different Dow Jones economic sector indexes, and applying principal component analysis (PCA) we demonstrate that the rate of increase in the first principal component (PC1) over short 12-month time windows can be effectively used as an indicator of systemic risk: the larger the change in PC1, the greater the increase in systemic risk. Clearly, the higher the level of systemic risk, the more likely a financial crisis is to occur in the near future.
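A minimal sketch of the underlying idea, on simulated sector returns rather than Dow Jones data: when a common factor strengthens (a "crisis" regime), the share of variance captured by PC1 of the correlation matrix rises, so its change over rolling windows can flag growing systemic risk. The regime split and loadings below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t, n_idx = 500, 10
common = rng.normal(size=t)
# simulated sector index returns: common-factor loading rises in the second half
load = np.where(np.arange(t) < 250, 0.3, 0.9)[:, None]
returns = load * common[:, None] + rng.normal(size=(t, n_idx))

def pc1_share(window):
    """Fraction of total variance explained by the first principal component."""
    c = np.corrcoef(window.T)            # correlation matrix of the indexes
    eig = np.linalg.eigvalsh(c)          # ascending eigenvalues
    return eig[-1] / eig.sum()

calm = pc1_share(returns[:250])
crisis = pc1_share(returns[250:])
```

In the paper the windows are 12 months of returns; here two long halves are used only to keep the contrast between regimes obvious.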
Bogen, Kenneth T
2016-03-01
To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless, and the second claim is false. Any dose- and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel's claims, and its recommendation that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear-MOA carcinogens, are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
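The central mathematical point can be sketched numerically. The two-component mixture below uses Hill (sigmoid) curves with parameters chosen purely for illustration, not taken from the paper: over an "observed" dose range the mixture is statistically close to linear, yet linear extrapolation to low dose overestimates the true mixture response by orders of magnitude:

```python
import numpy as np

def hill(d, ec50, n=4):
    """A quasi-threshold (S-shaped) dose-response component."""
    return d**n / (ec50**n + d**n)

dose = np.linspace(1.0, 10.0, 50)                      # "observed" dose range
resp = 0.5 * hill(dose, 0.5) + 0.5 * hill(dose, 6.0)   # additive two-component mixture

# A straight-line fit over the observed range looks statistically acceptable
slope, intercept = np.polyfit(dose, resp, 1)
fit = slope * dose + intercept
r2 = 1 - ((resp - fit) ** 2).sum() / ((resp - resp.mean()) ** 2).sum()

# ...but LNT-style extrapolation grossly overestimates the response at low dose
d_low = 0.1
lnt_pred = slope * d_low + intercept
true_low = 0.5 * hill(d_low, 0.5) + 0.5 * hill(d_low, 6.0)
```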
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-06-01
There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested, and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. The temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed identification of the hypothetically impacted geographical regions and territories. For cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of high and medium potential risk levels, based on a unit hypothetical release (e.g. 1 Bq), were performed. The analysis showed that possible deposition fractions of 10^-11 (Bq/m^2) over the Kola Peninsula, and 10^-12 - 10^-13 (Bq/m^2) for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. 
All rights reserved.
Hallett, Timothy B; Gregson, Simon; Mugurungi, Owen; Gonese, Elizabeth; Garnett, Geoff P
2009-06-01
Determining whether interventions to reduce HIV transmission have worked is essential, but complicated by the potential for generalised epidemics to evolve over time without individuals changing risk behaviour. We aimed to develop a method to evaluate evidence for changes in risk behaviour altering the course of an HIV epidemic. We developed a mathematical model of HIV transmission, incorporating the potential for natural changes in the epidemic as it matures and the introduction of antiretroviral treatment, and applied a Bayesian Melding framework, in which the model and observed trends in prevalence can be compared. We applied the model to Zimbabwe, using HIV prevalence estimates from antenatal clinic surveillance and household-based surveys, and basing model parameters on data from sexual behaviour surveys. There was strong evidence for reductions in risk behaviour stemming HIV transmission. We estimate these changes occurred between 1999 and 2004 and averted 660,000 (95% credible interval: 460,000-860,000) infections by 2008. The model and associated analysis framework provide a robust way to evaluate the evidence for changes in risk behaviour affecting the course of HIV epidemics, avoiding confounding by the natural evolution of HIV epidemics.
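Bayesian melding is commonly implemented by sampling model inputs from priors, running the model, and weighting each run by its likelihood against observed prevalence (sampling-importance-resampling). The toy sketch below is a hedged caricature of that idea: the one-line "epidemic model", the surveillance series, and the priors are all invented for illustration, not taken from the Zimbabwe analysis:

```python
import numpy as np

rng = np.random.default_rng(0)

def prevalence(peak, decline, t):
    """Toy epidemic curve: saturating rise, then decline after t=10 (hypothetical)."""
    return peak * np.exp(-decline * np.maximum(t - 10, 0)) * (1 - np.exp(-0.5 * t))

t_obs = np.array([5.0, 10.0, 15.0, 20.0])
obs = np.array([0.23, 0.25, 0.17, 0.12])   # made-up surveillance prevalence

# Sample model inputs from priors; weight each run by its fit to observations
peaks = rng.uniform(0.1, 0.4, 5000)
declines = rng.uniform(0.0, 0.2, 5000)     # decline = 0 means "no behaviour change"
sims = prevalence(peaks[:, None], declines[:, None], t_obs)
loglik = -0.5 * (((sims - obs) / 0.02) ** 2).sum(axis=1)
w = np.exp(loglik - loglik.max())          # importance weights
post_decline = np.average(declines, weights=w)
```

A posterior for `decline` concentrated away from zero is the melding-style evidence that the observed trend requires behaviour change beyond the epidemic's natural course.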
Risk analysis theory applied to fishing operations: A new approach on the decision-making problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cunha, J.C.S.
1994-12-31
In the past, decisions concerning whether to continue or interrupt a fishing operation were based primarily on the operator's previous experience. This procedure often led to wrong decisions and unnecessary loss of money and time. This paper describes a decision-making method based on risk analysis theory and previous operation results from a field under study. The method leads to more accurate decisions on a daily basis, allowing the operator to verify, for each day of the operation, whether the decision being carried out is the one with the highest probability of leading to the best economic result. An example of the method's application is provided at the end of the paper.
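The paper's actual decision rule is not reproduced in the abstract; the sketch below is only a generic expected-monetary-value comparison of the same daily continue-or-abandon choice, with all probabilities and costs hypothetical:

```python
def expected_cost_continue(p_success, day_cost, remaining_if_fail):
    """Expected cost of fishing one more day: the day's cost is paid regardless,
    and the remaining cost is incurred only if the fishing attempt fails."""
    return day_cost + (1 - p_success) * remaining_if_fail

# Hypothetical field statistics: success probability of one more fishing day,
# daily rig cost, and the cost of abandoning (e.g., sidetracking the well)
p_success, day_cost, sidetrack = 0.25, 40_000, 500_000

cost_continue = expected_cost_continue(p_success, day_cost, sidetrack)
decision = "continue" if cost_continue < sidetrack else "abandon"
```

With daily-updated success probabilities from field history, the rule can be re-evaluated each morning, which is the spirit of the day-by-day verification described above.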
NASA Astrophysics Data System (ADS)
Pingel, N.; Liang, Y.; Bindra, A.
2016-12-01
More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. 
It will serve as a foundation for other flood studies for years to come and has already been applied successfully to inundation mapping in support of emergency response, reservoir operation analysis, and other efforts.
In search of robust flood risk management alternatives for the Netherlands
NASA Astrophysics Data System (ADS)
Klijn, F.; Knoop, J. M.; Ligtvoet, W.; Mens, M. J. P.
2012-05-01
The Netherlands' policy for flood risk management is being revised in view of a sustainable development against a background of climate change, sea level rise and increasing socio-economic vulnerability to floods. This calls for a thorough policy analysis, which can only be adequate when there is agreement about the "framing" of the problem and about the strategic alternatives that should be taken into account. In support of this framing, we performed an exploratory policy analysis, applying future climate and socio-economic scenarios to account for the autonomous development of flood risks, and defined a number of different strategic alternatives for flood risk management at the national level. These alternatives, ranging from flood protection by brute force to reduction of the vulnerability by spatial planning only, were compared with continuation of the current policy on a number of criteria, comprising costs, the reduction of fatality risk and economic risk, and their robustness in relation to uncertainties. We found that a change of policy away from conventional embankments towards gaining control over the flooding process by making the embankments unbreachable is attractive. By thus influencing exposure to flooding, the fatality risk can be effectively reduced at even lower net societal costs than by continuation of the present policy or by raising the protection standards where cost-effective.
Wang, Jinghong; Lo, Siuming; Wang, Qingsong; Sun, Jinhua; Mu, Honglin
2013-08-01
Crowd density is a key factor that influences the moving characteristics of a large group of people during a large-scale evacuation. In this article, the macro features of crowd flow and subsequent rescue strategies were considered, and a series of characteristic crowd densities that affect large-scale people movement, as well as the maximum bearing density when the crowd is extremely congested, were analyzed. On the basis of these characteristic crowd densities, queuing theory was applied to simulate crowd movement. Accordingly, the moving characteristics of the crowd and the effects on rescue strategies of typical crowd density, which is viewed as representing the crowd's arrival intensity in front of the evacuation passageways, were studied. Furthermore, a "risk axle of crowd density" is proposed to determine the efficiency of rescue strategies in a large-scale evacuation, i.e., whether the rescue strategies are able to effectively maintain or improve evacuation efficiency. Finally, through some rational hypotheses about the value of evacuation risk, a three-dimensional distribution of the evacuation risk is established to illustrate the risk axle of crowd density. This work aims to offer a macro-level, but original, analysis of the risk of large-scale crowd evacuation from the perspective of the efficiency of rescue strategies. © 2012 Society for Risk Analysis.
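The specific queuing model used in the article is not stated in the abstract; as a hedged illustration of why arrival intensity at a passageway matters, the standard M/M/1 mean-queue-length formula shows congestion growing without bound as arrivals approach the passageway's service capacity:

```python
def mm1_queue_length(arrival_rate, service_rate):
    """Mean number waiting in an M/M/1 queue at a passageway; diverges as the
    arrival intensity (a proxy for crowd density) approaches capacity."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        return float("inf")           # over capacity: the queue keeps growing
    return rho**2 / (1 - rho)         # mean queue length, excluding the person in service

# service_rate = passageway capacity (persons/s); arrival rates are hypothetical
for lam in (1.0, 1.8, 1.99, 2.0):
    print(lam, mm1_queue_length(lam, 2.0))
```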
Gigrich, James; Sarkani, Shahryar; Holzer, Thomas
2017-03-01
There is an increasing backlog of potentially toxic compounds that cannot be evaluated with current animal-based approaches in a cost-effective and expeditious manner, thus putting human health at risk. Extrapolation of animal-based test results to human risk assessment often leads to different physiological outcomes. This article introduces the use of quantitative tools and methods from systems engineering to evaluate the risk of toxic compounds by analyzing the amount of stress that human hepatocytes undergo in vitro when metabolizing GW7647 over extended times and concentrations. Hepatocytes are exceedingly interconnected systems, which makes it challenging to interpret high-dimensional genomics data to determine the risk of exposure. Gene expression data on peroxisome proliferator-activated receptor-α (PPARα) binding were measured over multiple concentrations and varied times of GW7647 exposure, and Mahalanobis distance was leveraged to establish toxicity risk threshold levels. The application of these novel systems engineering tools provides new insight into the intricate workings of human hepatocytes to determine risk threshold levels from exposure. This approach is beneficial to decision makers and scientists, and it can help reduce the backlog of untested chemical compounds caused by the high cost and inefficiency of animal-based models.
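A minimal sketch of the Mahalanobis-distance thresholding idea on simulated expression profiles (the control data, the five-gene dimensionality, and the 95th-percentile threshold are all assumptions for illustration, not the study's actual pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)
control = rng.normal(size=(50, 5))        # simulated control-condition expression profiles
mu = control.mean(axis=0)
VI = np.linalg.inv(np.cov(control.T))     # inverse covariance of the control state

def m_dist(x):
    """Mahalanobis distance of a profile from the control centroid."""
    d = x - mu
    return float(np.sqrt(d @ VI @ d))

# Risk threshold taken from the control distribution itself (assumed 95th percentile)
thr = np.percentile([m_dist(x) for x in control], 95)

exposed = mu + 3.0   # hypothetical strongly shifted profile after exposure
print(m_dist(exposed) > thr)
```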
NASA Astrophysics Data System (ADS)
Khotimah, Bain Khusnul; Irhamni, Firli; Kustiyahningsih, Yenny
2017-08-01
Business competition is one risk factor that requires Small and Medium Enterprises (SMEs) to set up good management for handling the risk of loss. This research looks for the criteria that influence the occurrence of losses, based on Cooperative and SME data on Batik Madura. The approach uses the Fuzzy Analytic Network Process (FANP) to weight the criteria of interest in the decision support system. Analysis of the factors behind the level of losses will influence performance in the business sector. SWOT analysis is combined with the FANP method to determine the most appropriate development strategy for the industry. From the results of the SWOT analysis and FANP, the best development strategy to apply was found. According to the SWOT test results for Batik Madura SMEs, the available raw materials and human resources can increase production capacity. The measurements place the SMEs in a favourable position, because production value is good and revenue is stable, putting them in the first quadrant, where existing strengths can take advantage of business opportunities. The SWOT trial results for Batik Madura SMEs in January and March, however, fall in the second quadrant, because a considerable number of defective products were produced, putting the SMEs under threat. Even under these threats, the SMEs still have strengths in production volume and timely delivery.
Meta-analysis on night shift work and risk of metabolic syndrome.
Wang, F; Zhang, L; Zhang, Y; Zhang, B; He, Y; Xie, S; Li, M; Miao, X; Chan, E Y Y; Tang, J L; Wong, M C S; Li, Z; Yu, I T S; Tse, L A
2014-09-01
This study aims to quantitatively summarize the association between night shift work and the risk of metabolic syndrome (MetS), with special reference to the dose-response relationship with years of night shift work. We systematically searched all observational studies published in English on PubMed and Embase from 1971 to 2013. We extracted effect measures (relative risk, RR; or odds ratio, OR) with 95% confidence intervals (CI) from individual studies to generate pooled results using a meta-analysis approach. Pooled RR was calculated using a random- or fixed-effect model. The Downs and Black scale was applied to assess the methodological quality of included studies. A total of 13 studies were included. The pooled RR for the association between 'ever exposed to night shift work' and MetS risk was 1.57 (95% CI = 1.24-1.98, p-heterogeneity = 0.001), while a higher risk was indicated in workers with longer exposure to night shifts (RR = 1.77, 95% CI = 1.32-2.36, p-heterogeneity = 0.936). Further stratification analysis demonstrated a higher pooled effect of 1.84 (95% CI = 1.45-2.34) for studies using the NCEP-ATPIII criteria, among female workers (RR = 1.61, 95% CI = 1.10-2.34), and in countries outside Asia (RR = 1.65, 95% CI = 1.39-1.95). Sensitivity analysis confirmed the robustness of the results. No evidence of publication bias was detected. The present meta-analysis suggests that night shift work is significantly associated with the risk of MetS, and a positive dose-response relationship with duration of exposure was indicated. © 2014 The Authors. Obesity Reviews © 2014 World Obesity.
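The random-effects pooling step used in meta-analyses of this kind is typically the DerSimonian-Laird estimator. A self-contained sketch with three hypothetical studies (the RRs and CIs below are invented, not the studies in this review):

```python
import numpy as np

def pool_random_effects(rr, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of relative risks."""
    y = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE from the 95% CI
    w = 1.0 / se**2
    y_fixed = (w * y).sum() / w.sum()
    q = (w * (y - y_fixed) ** 2).sum()                     # Cochran's Q
    tau2 = max(0.0, (q - (len(y) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
    wr = 1.0 / (se**2 + tau2)                              # random-effects weights
    y_pool = (wr * y).sum() / wr.sum()
    se_pool = np.sqrt(1.0 / wr.sum())
    return tuple(np.exp([y_pool, y_pool - 1.96 * se_pool, y_pool + 1.96 * se_pool]))

# Three hypothetical studies: RR with 95% CI bounds
pooled, lo, hi = pool_random_effects(
    np.array([1.3, 1.6, 1.9]), np.array([1.0, 1.2, 1.3]), np.array([1.69, 2.13, 2.78]))
```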
Tsunamis: Global Exposure and Local Risk Analysis
NASA Astrophysics Data System (ADS)
Harbitz, C. B.; Løvholt, F.; Glimsdal, S.; Horspool, N.; Griffin, J.; Davies, G.; Frauenfelder, R.
2014-12-01
The 2004 Indian Ocean tsunami led to a better understanding of the likelihood of tsunami occurrence and potential tsunami inundation, and the Hyogo Framework for Action (HFA) was one direct result of this event. The United Nations International Strategy for Disaster Risk Reduction (UN-ISDR) adopted the HFA in January 2005 in order to reduce disaster risk. As an instrument to compare the risk due to different natural hazards, an integrated worldwide study was implemented and published in several Global Assessment Reports (GAR) by UN-ISDR. The results of the global earthquake-induced tsunami hazard and exposure analysis for a return period of 500 years are presented. Both deterministic and probabilistic (PTHA) methods are used. The resulting hazard levels for both methods are compared quantitatively for selected areas. The comparison demonstrates that the analysis is rather rough, as expected for a study aiming at average trends on a country level across the globe. It is shown that populous Asian countries account for the largest absolute number of people living in tsunami-prone areas; more than 50% of the total exposed population lives in Japan. Smaller nations like Macao and the Maldives are among the most exposed by population count. Exposed nuclear power plants are limited to Japan, China, India, Taiwan, and the USA. In contrast, a local tsunami vulnerability and risk analysis applies information on population, building types, infrastructure, inundation, and flow depth for a certain tsunami scenario with a corresponding return period, combined with empirical data on tsunami damages and mortality. Results and validation of a GIS tsunami vulnerability and risk assessment model are presented. The GIS model is adapted for optimal use of the data available for each study. Finally, the importance of including landslide sources in the tsunami analysis is also discussed.
Wang, F; Yeung, K L; Chan, W C; Kwok, C C H; Leung, S L; Wu, C; Chan, E Y Y; Yu, I T S; Yang, X R; Tse, L A
2013-11-01
This study aimed to conduct a systematic review summing up the evidence on associations between different aspects of night shift work and female breast cancer, using a dose-response meta-analysis approach. We systematically searched all cohort and case-control studies published in English on MEDLINE, Embase, PSYCInfo, APC Journal Club and Global Health from January 1971 to May 2013. We extracted effect measures (relative risk, RR; odds ratio, OR; or hazard ratio, HR) from individual studies to generate pooled results using meta-analysis approaches. A log-linear dose-response regression model was used to evaluate the relationship between various indicators of exposure to night shift work and breast cancer risk. The Downs and Black scale was applied to assess the methodological quality of included studies. Ten studies were included in the meta-analysis. The pooled adjusted relative risk for the association between 'ever exposed to night shift work' and breast cancer was 1.19 [95% confidence interval (CI) 1.05-1.35]. Further meta-analyses of the dose-response relationship showed that every 5-year increase in exposure to night shift work would correspondingly increase a woman's risk of breast cancer by 3% (pooled RR = 1.03, 95% CI 1.01-1.05; p-heterogeneity < 0.001). Our meta-analysis also suggested that an increase of 500 night shifts would result in a 13% (RR = 1.13, 95% CI 1.07-1.21; p-heterogeneity = 0.06) increase in breast cancer risk. This systematic review updated the evidence that a positive dose-response relationship is likely present for breast cancer with increasing years of employment and cumulative shifts involved in night work.
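A worked reading of the log-linear dose-response estimate: under that model the per-5-year RR compounds multiplicatively, so the risk after, say, 20 years of exposure (a duration chosen here only for illustration) is the 5-year RR raised to the number of 5-year increments:

```python
rr_per_5_years = 1.03          # pooled dose-response estimate from the review
years = 20                     # hypothetical exposure duration
rr_total = rr_per_5_years ** (years / 5)   # log-linear model: RRs multiply
print(round(rr_total, 3))
```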
Bamford, Adrian; Nation, Andy; Durrell, Susie; Andronis, Lazaros; Rule, Ellen; McLeod, Hugh
2017-02-03
The Keele stratified care model for management of low back pain comprises use of the prognostic STarT Back Screening Tool to allocate patients into one of three risk-defined categories leading to associated risk-specific treatment pathways, such that high-risk patients receive enhanced treatment and more sessions than medium- and low-risk patients. The Keele model is associated with economic benefits and is being widely implemented. The objective was to assess the use of the stratified model following its introduction in an acute hospital physiotherapy department setting in Gloucestershire, England. Physiotherapists recorded data on 201 patients treated using the Keele model in two audits in 2013 and 2014. To assess whether implementation of the stratified model was associated with the anticipated range of treatment sessions, regression analysis of the audit data was used to determine whether high- or medium-risk patients received significantly more treatment sessions than low-risk patients. The analysis controlled for patient characteristics, year, physiotherapists' seniority and physiotherapist. To assess the physiotherapists' views on the usefulness of the stratified model, audit data on this were analysed using framework methods. To assess the potential economic consequences of introducing the stratified care model in Gloucestershire, published economic evaluation findings on back-related National Health Service (NHS) costs, quality-adjusted life years (QALYs) and societal productivity losses were applied to audit data on the proportion of patients by risk classification and estimates of local incidence. When the Keele model was implemented, patients received significantly more treatment sessions as the risk-rating increased, in line with the anticipated impact of targeted treatment pathways. Physiotherapists were largely positive about using the model. 
The potential annual impact of rolling out the model across Gloucestershire is a gain in approximately 30 QALYs, a reduction in productivity losses valued at £1.4 million and almost no change to NHS costs. The Keele model was implemented and risk-specific treatment pathways successfully used for patients presenting with low back pain. Applying published economic evidence to the Gloucestershire locality suggests that substantial health and productivity outcomes would be associated with rollout of the Keele model while being cost-neutral for the NHS.
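The audit's central regression question (do higher-risk patients receive more treatment sessions?) can be illustrated with a minimal group-summary sketch; the `sessions_by_risk` helper and the sample records are hypothetical stand-ins, not the audit data or its full covariate-adjusted model:

```python
from statistics import mean

def sessions_by_risk(records):
    """Mean treatment sessions per STarT Back risk category.

    records: list of (risk_category, n_sessions) tuples, where
    risk_category is 'low', 'medium' or 'high'. Illustrative only;
    the published analysis also controlled for patient characteristics,
    year, and physiotherapist.
    """
    groups = {}
    for risk, n in records:
        groups.setdefault(risk, []).append(n)
    return {risk: mean(ns) for risk, ns in groups.items()}
```

Under the Keele model one would expect the 'high' mean to exceed the 'medium' and 'low' means, mirroring the audit finding.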
An agent based architecture for high-risk neonate management at neonatal intensive care unit.
Malak, Jaleh Shoshtarian; Safdari, Reza; Zeraati, Hojjat; Nayeri, Fatemeh Sadat; Mohammadzadeh, Niloofar; Farajollah, Seide Sedighe Seied
2018-01-01
In recent years, the use of new tools and technologies has decreased the neonatal mortality rate. Despite the positive effect of using these technologies, decisions are complex and uncertain in critical conditions, when the neonate is preterm or has a low birth weight or malformations. There is a need to automate the high-risk neonate management process by creating real-time and more precise decision support tools. The aim was to create a collaborative, real-time environment to manage neonates in critical condition at the NICU (Neonatal Intensive Care Unit) and to overcome weaknesses in high-risk neonate management by applying a multi-agent based analysis and design methodology as a new solution for NICU management. This study was basic research in medical informatics method development, carried out in 2017. The requirement analysis was done by reviewing articles on NICU decision support systems. PubMed, Science Direct, and IEEE databases were searched. Only English articles published after 1990 were included; a needs assessment was also done by reviewing the extracted features and current processes in the NICU environment where the research was conducted. We analyzed the requirements and identified the main system roles (agents) and interactions through a comparative study of existing NICU decision support systems. The Universal Multi Agent Platform (UMAP) was applied to implement a prototype of our multi-agent based high-risk neonate management architecture. Local environment agents interacted inside a container, and each container interacted with external resources, including other NICU systems and consultation centers. In the NICU container, the main identified agents were reception, monitoring, NICU registry, and outcome prediction, which interacted with human agents including nurses and physicians. Managing patients at NICU units requires online data collection, real-time collaboration, and management of many components.
Multi-agent systems are applied as a well-known solution for the management, coordination, modeling, and control of NICU processes. We are currently working on an outcome prediction module using artificial intelligence techniques for neonatal mortality risk prediction. Full implementation and evaluation of the proposed architecture are considered future work.
Estimating causal contrasts involving intermediate variables in the presence of selection bias.
Valeri, Linda; Coull, Brent A
2016-11-20
An important goal across the biomedical and social sciences is the quantification of the role of intermediate factors in explaining how an exposure exerts an effect on an outcome. Selection bias has the potential to severely undermine the validity of inferences on direct and indirect causal effects in observational as well as randomized studies. The phenomenon of selection may arise through several mechanisms, and we here focus on instances of missing data. We study the sign and magnitude of selection bias in the estimates of direct and indirect effects when data on any of the factors involved in the analysis are either missing at random or not missing at random. Under some simplifying assumptions, the bias formulae can lead to nonparametric sensitivity analyses. These sensitivity analyses can be applied to causal effects on the risk-difference and risk-ratio scales irrespective of the estimation approach employed. To incorporate parametric assumptions, we also develop a sensitivity analysis for selection bias in mediation analysis in the spirit of the expectation-maximization algorithm. The approaches are applied to data from a health disparities study investigating the role of stage at diagnosis in racial disparities in colorectal cancer survival. Copyright © 2016 John Wiley & Sons, Ltd.
Code of Federal Regulations, 2011 CFR
2011-10-01
... a nationally average risk profile for the factors described in § 422.308(c), and this amount is... risk profile for the risk factors CMS applies to payment calculations as set forth at § 422.308(c) of... eligible beneficiary with a nationally average risk profile for the risk factors CMS applies to payment...
Schrem, Harald; Schneider, Valentin; Kurok, Marlene; Goldis, Alon; Dreier, Maren; Kaltenborn, Alexander; Gwinner, Wilfried; Barthold, Marc; Liebeneiner, Jan; Winny, Markus; Klempnauer, Jürgen; Kleine, Moritz
2016-01-01
The aim of this study is to identify independent pre-transplant cancer risk factors after kidney transplantation and to assess the utility of G-chart analysis for clinical process control. This may contribute to the improvement of cancer surveillance processes in individual transplant centers. 1655 patients after kidney transplantation at our institution with a total of 9,425 person-years of follow-up were compared retrospectively to the general German population using site-specific standardized-incidence-ratios (SIRs) of observed malignancies. Risk-adjusted multivariable Cox regression was used to identify independent pre-transplant cancer risk factors. G-chart analysis was applied to determine relevant differences in the frequency of cancer occurrences. Cancer incidence rates were almost three times higher as compared to the matched general population (SIR = 2.75; 95%-CI: 2.33-3.21). Significantly increased SIRs were observed for renal cell carcinoma (SIR = 22.46), post-transplant lymphoproliferative disorder (SIR = 8.36), prostate cancer (SIR = 2.22), bladder cancer (SIR = 3.24), thyroid cancer (SIR = 10.13) and melanoma (SIR = 3.08). Independent pre-transplant risk factors for cancer-free survival were age <52.3 years (p = 0.007, Hazard ratio (HR): 0.82), age >62.6 years (p = 0.001, HR: 1.29), polycystic kidney disease other than autosomal dominant polycystic kidney disease (ADPKD) (p = 0.001, HR: 0.68), high body mass index in kg/m2 (p<0.001, HR: 1.04), ADPKD (p = 0.008, HR: 1.26) and diabetic nephropathy (p = 0.004, HR = 1.51). G-chart analysis identified relevant changes in the detection rates of cancer during aftercare with no significant relation to identified risk factors for cancer-free survival (p<0.05). 
Risk-adapted cancer surveillance combined with prospective G-chart analysis likely improves cancer surveillance schemes by adapting processes to identified risk factors and by using G-chart alarm signals to trigger Kaizen events and audits for root-cause analysis of relevant detection rate changes. Further, comparative G-chart analysis would enable benchmarking of cancer surveillance processes between centers.
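The site-specific SIRs reported above are observed-over-expected case ratios. A hedged sketch of the calculation, using Byar's approximation to the exact Poisson confidence limits (the function name and inputs are illustrative, not the study's code):

```python
import math

def sir(observed, expected, z=1.96):
    """Standardized incidence ratio with Byar's approximate 95% CI.

    observed: malignancies seen in the cohort; expected: cases expected
    from general-population rates applied to the cohort's person-years.
    """
    s = observed / expected
    # Byar's approximation to the exact Poisson limits on the observed count
    lo = observed * (1 - 1/(9*observed) - z/(3*math.sqrt(observed)))**3
    hi = (observed + 1) * (1 - 1/(9*(observed+1)) + z/(3*math.sqrt(observed+1)))**3
    return s, lo/expected, hi/expected
```

An SIR of 1 means the transplant cohort matches the general population; the paper's overall SIR = 2.75 with CI excluding 1 indicates a significantly elevated incidence.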
Environmental Drivers and Predicted Risk of Bacillary Dysentery in Southwest China.
Zhang, Han; Si, Yali; Wang, Xiaofeng; Gong, Peng
2017-07-14
Bacillary dysentery has long been a considerable health problem in southwest China; however, the quantitative relationship between anthropogenic and physical environmental factors and the disease is not fully understood. It is also not clear where exactly the bacillary dysentery risk is potentially high. Based on the result of hotspot analysis, we generated training samples to build a spatial distribution model. Univariate analyses, autocorrelation and multi-collinearity examinations, and stepwise selection were then applied to screen the potential causative factors. Multiple logistic regressions were finally applied to quantify the effects of key factors. A bootstrapping strategy was adopted while fitting models. The model was evaluated by the area under the receiver operating characteristic curve (AUC), Kappa, and independent validation samples. Hotspot counties were mainly mountainous lands in southwest China. Higher risk of bacillary dysentery was found to be associated with an underdeveloped socio-economy, proximity to farmland or water bodies, higher environmental temperature, medium relative humidity and the distribution of the Tibeto-Burman ethnicity. A predictive risk map with high accuracy (88.19%) was generated. The high-risk areas are mainly located in the mountainous lands where the Tibeto-Burman people live, especially in the basins, river valleys or other flat places in the mountains with relatively lower elevation and a warmer climate. In the high-risk areas predicted by this study, improving economic development, investment in health care, the construction of infrastructure for safe water supply, waste treatment and sewage disposal, and health-related education could reduce the disease risk.
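The AUC used above to evaluate the risk model can be computed directly via the Mann-Whitney U statistic: the probability that a randomly chosen positive (hotspot) unit is scored higher than a randomly chosen negative one. A small illustrative sketch (hypothetical function, not the study's implementation):

```python
def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic.

    scores_pos: model scores for positive (e.g. hotspot) units;
    scores_neg: scores for negative units. Ties count as half a win.
    """
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

An AUC of 1.0 means perfect ranking of hotspots above non-hotspots; 0.5 is no better than chance.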
Akata, Kentaro; Yatera, Kazuhiro; Yamasaki, Kei; Kawanami, Toshinori; Naito, Keisuke; Noguchi, Shingo; Fukuda, Kazumasa; Ishimoto, Hiroshi; Taniguchi, Hatsumi; Mukae, Hiroshi
2016-05-11
Aspiration pneumonia has been of growing interest in an aging population. Anaerobes are important pathogens; however, the etiology of aspiration pneumonia is not fully understood. In addition, the relationship between patients' clinical characteristics and the causative pathogens in pneumonia patients with aspiration risk factors is unclear. To evaluate the relationship between the clinical characteristics of pneumonia patients with risk factors for aspiration and the bacterial flora in bronchoalveolar lavage fluid (BALF), bacterial floral analysis of the 16S ribosomal RNA gene was applied in addition to cultivation methods in BALF samples. From April 2010 to February 2014, BALF samples were obtained from the affected lesions of pneumonia via bronchoscopy and evaluated by bacterial floral analysis of the 16S rRNA gene in addition to cultivation methods in patients with community-acquired pneumonia (CAP) and healthcare-associated pneumonia (HCAP). Factors associated with aspiration risks in these patients were analyzed. A total of 177 (CAP 83, HCAP 94) patients were enrolled. According to the results of the bacterial floral analysis, the detection rate of oral streptococci as the most frequently detected bacterial phylotypes in BALF was significantly higher in patients with aspiration risks (31.0 %) than in patients without aspiration risks (14.7 %) (P = 0.009). In addition, the percentages of oral streptococci in each BALF sample were significantly higher in patients with aspiration risks (26.6 ± 32.0 %) than in patients without aspiration risks (13.8 ± 25.3 %) (P = 0.002). A multiple linear regression analysis showed that an Eastern Cooperative Oncology Group (ECOG) performance status (PS) of ≥3, the presence of comorbidities, and a history of pneumonia within the previous year were significantly associated with the detection of oral streptococci in BALF.
The bacterial floral analysis of the 16S rRNA gene revealed that oral streptococci were the most frequently detected bacterial phylotypes in BALF samples from CAP and HCAP patients with aspiration risks, especially those with a poor ECOG-PS or a history of pneumonia.
Genre, Ludivine; Roché, Henri; Varela, Léonel; Kanoun, Dorra; Ouali, Monia; Filleron, Thomas; Dalenc, Florence
2017-02-01
Survival of patients with metastatic breast cancer (MBC) suffering from brain metastasis (BM) is limited and this event is usually fatal. In 2010, the Graesslin's nomogram was published in order to predict subsequent BM in patients with breast cancer (BC) with extra-cerebral metastatic disease. This model aims to select a patient population at high risk for BM and thus will facilitate the design of prevention strategies and/or the impact of early treatment of BM in prospective clinical studies. Nomogram external validation was retrospectively applied to patients with BC and later BM between January 2005 and December 2012, treated in our institution. Moreover, risk factors of BM appearance were studied by Fine and Gray's competing risk analysis. Among 492 patients with MBC, 116 developed subsequent BM. Seventy of them were included for the nomogram validation. The discrimination is good (area under curve = 0.695 [95% confidence interval, 0.61-0.77]). Risk factors of BM appearance are: human epidermal growth factor receptor 2 (HER2) overexpression/amplification, triple-negative BC and number of extra-cerebral metastatic sites (>1). With a competing risk model, we highlight the nomogram interest for HER2+ tumour subgroup exclusively. Graesslin's nomogram external validation demonstrates exportability and reproducibility. Importantly, the competing risk model analysis provides additional information for the design of prospective trials concerning the early diagnosis of BM and/or preventive treatment on high risk patients with extra-cerebral metastatic BC. Copyright © 2016 Elsevier Ltd. All rights reserved.
WE-B-BRC-02: Risk Analysis and Incident Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraass, B.
Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as the scope and complexity of clinical practice continue to grow, making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for the use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology-specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty, all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods.
Learning Objectives: Description of risk assessment methodologies used in healthcare and industry; discussion of radiation oncology-specific risk assessment strategies and issues; evaluation of risk in the context of medical imaging and image quality. E. Samei: Research grants from Siemens and GE.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-26
... ENVIRONMENTAL PROTECTION AGENCY [FRL-9311-4] Problem Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids AGENCY: Environmental Protection Agency (EPA). ACTION: Notice... Formulation for Human Health Risk Assessments of Pathogens in Land-Applied Biosolids'' EPA/600/R-08/035F...
A Quantitative Risk Analysis of Deficient Contractor Business System
2012-04-30
Mathematically, Jorion's concept of VaR looks like this: P(L > VaR) ≤ 1 − c (2), where L is the portfolio loss and c is the confidence level. …presents three models for calculating VaR. The local-valuation method determines the value of a portfolio once and uses mathematical derivatives…management. In the insurance industry, actuarial data is applied to model risk and risk capital reserves are "held" to cover the expected values for
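Jorion's VaR, the loss threshold exceeded only with small probability at a given confidence level, can be estimated nonparametrically by historical simulation, one standard approach alongside the valuation models discussed above. A minimal sketch, assuming a list of past profit-and-loss observations (the function and data are illustrative):

```python
import math

def historical_var(pnl, confidence=0.95):
    """Historical-simulation VaR: the smallest observed loss threshold L
    such that the empirical P(loss > L) <= 1 - confidence.

    pnl: past profit-and-loss observations (gains positive, losses negative).
    """
    losses = sorted(-x for x in pnl)            # convert P&L to losses, ascending
    k = math.ceil(confidence * len(losses)) - 1  # index of the empirical quantile
    return losses[k]
```

For example, with 100 past losses of 1 through 100 units, the 95% VaR is 95: only 5 of the 100 observed losses exceed it, so the empirical exceedance probability is 0.05.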
NASA Astrophysics Data System (ADS)
Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori
This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for drug manufacturing computerized systems (DMCS). The quality of a drug must not be affected by failures or operational mistakes of the DMCS. To avoid such situations, sufficient risk assessment must be conducted for the DMCS and precautions taken. We propose an operational RM method using FMEA for DMCS. To develop the method, we gathered and compared FMEA results for DMCS and developed a list containing failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find failures that have not yet been identified.
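The list-driven FMEA described above ultimately prioritizes failure modes; a common way to do so is the Risk Priority Number (RPN = severity × occurrence × detectability, each typically scored 1-10). A hedged sketch with hypothetical field names and scores, not the paper's actual list:

```python
def rpn_ranking(failure_modes):
    """Rank failure modes by Risk Priority Number, highest risk first.

    failure_modes: dicts with 'mode' plus 'severity', 'occurrence' and
    'detectability' scores (1-10 each). Returns (mode, RPN) pairs.
    """
    def rpn(fm):
        return fm["severity"] * fm["occurrence"] * fm["detectability"]
    return [(fm["mode"], rpn(fm))
            for fm in sorted(failure_modes, key=rpn, reverse=True)]
```

The highest-RPN modes are the ones for which design-phase countermeasures are drafted first.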
Rath, Frank
2008-01-01
This article examines the concepts of quality management (QM) and quality assurance (QA), as well as the current state of QM and QA practices in radiotherapy. A systematic approach incorporating a series of industrial engineering-based tools is proposed, which can be applied in health care organizations proactively to improve process outcomes, reduce risk and/or improve patient safety, improve through-put, and reduce cost. This tool set includes process mapping and process flowcharting, failure modes and effects analysis (FMEA), value stream mapping, and fault tree analysis (FTA). Many health care organizations do not have experience in applying these tools and therefore do not understand how and when to use them. As a result there are many misconceptions about how to use these tools, and they are often incorrectly applied. This article describes these industrial engineering-based tools and also how to use them, when they should be used (and not used), and the intended purposes for their use. In addition the strengths and weaknesses of each of these tools are described, and examples are given to demonstrate the application of these tools in health care settings.
Endogenous fluorescence emission of the ovary
NASA Astrophysics Data System (ADS)
Utzinger, Urs; Kirkpatrick, Nathaniel D.; Drezek, Rebekah A.; Brewer, Molly A.
2005-03-01
Epithelial ovarian cancer has the highest mortality rate among the gynecologic cancers. Early detection would significantly improve survival and quality of life for women at increased risk of developing ovarian cancer. We have constructed a device to investigate endogenous signals of the ovarian tissue surface in the UV-C to visible range and describe our initial investigation of the use of optical spectroscopy to characterize the condition of the ovary. We have acquired data from more than 33 patients. A table-top spectroscopy system was used to collect endogenous fluorescence with a fiberoptic probe that is compatible with endoscopic techniques. Samples were divided into four groups: Normal-Low Risk (for developing ovarian cancer), Normal-High Risk, Benign, and Cancer. Rigorous statistical analysis was applied to the data using variance tests for direct intensity versus diagnostic group comparisons and principal component analysis (PCA) to study the variance of the whole data set. We conclude that the diagnostically most useful excitation wavelengths are located in the UV. Furthermore, our results indicate that UV-B and UV-C are most useful. A safety analysis indicates that UV-C imaging can be conducted at exposure levels below safety thresholds. We found that fluorescence excited in the UV-C and UV-B range increases from benign to normal to cancerous tissues. This is in contrast to the emission created with UV-A excitation, which decreased in the same order. We hypothesize that an increase in protein production and a decrease in fluorescence contributions of the extracellular matrix could explain this behavior. Variance analysis also identified fluctuation of fluorescence at 320/380 nm, which is associated with collagen cross-link residues. Small differences were observed between the groups at high and normal risk for ovarian cancer. High-risk samples deviated towards the cancer group and low-risk samples towards the benign group.
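The PCA step mentioned above extracts the dominant direction of variance across the recorded spectra. A pure-Python sketch using power iteration on the covariance matrix (illustrative only; the study's actual analysis pipeline is not specified):

```python
import math

def first_principal_component(spectra, iters=200):
    """First PCA loading vector of a set of spectra.

    spectra: list of equal-length intensity vectors. Uses power iteration
    on C = X^T X of the mean-centered data, so only the dominant
    component is recovered.
    """
    n, d = len(spectra), len(spectra[0])
    means = [sum(row[j] for row in spectra) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in spectra]
    v = [1.0] * d
    for _ in range(iters):
        # One power-iteration step: w = X^T (X v), then normalize
        Xv = [sum(X[i][j] * v[j] for j in range(d)) for i in range(n)]
        w = [sum(X[i][j] * Xv[i] for i in range(n)) for j in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v
```

Projecting each spectrum onto this loading vector gives the PC1 score used to compare diagnostic groups; further components would require deflation or a full eigendecomposition.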
Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidab...
Analysis of the Special Studies Program Based on the Interviews of Its Students.
ERIC Educational Resources Information Center
Esp, Barbarann; Torelli, Alexis
The special studies program at Hofstra University is designed for high school graduates applying to the university whose educational backgrounds require a more personalized approach to introductory college work. An attempt is made to minimize the risk of poor academic performance during the first year in college. A random sample of 24 students in…
A Quantitative Literature Review of the Effectiveness of Suicide Prevention Centers.
ERIC Educational Resources Information Center
Dew, Mary Amanda; And Others
1987-01-01
Applied meta-analysis to several series of studies to evaluate the effectiveness of prevention centers. Results indicate that centers do attract a high-risk population; center clients were more likely to commit suicide than were members of the general population, and individuals who committed suicide were more likely to have been clients than were…
ERIC Educational Resources Information Center
Odgers, Candice L.; Moffitt, Terrie E.; Tach, Laura M.; Taylor, Alan; Caspi, Avshalom; Matthews, Charlotte L.; Sampson, Robert J.
2009-01-01
This article reports on the influence of neighborhood-level deprivation and collective efficacy on children's antisocial behavior between the ages of 5 and 10 years. Latent growth curve modeling was applied to characterize the developmental course of antisocial behavior among children in the E-Risk Longitudinal Twin Study, an epidemiological…
Research in Modeling and Simulation for Airspace Systems Innovation
NASA Technical Reports Server (NTRS)
Ballin, Mark G.; Kimmel, William M.; Welch, Sharon S.
2007-01-01
This viewgraph presentation provides an overview of some of the applied research and simulation methodologies at the NASA Langley Research Center that support aerospace systems innovation. Risk assessment methodologies, complex systems design and analysis methodologies, and aerospace operations simulations are described. Potential areas for future research and collaboration using interactive and distributed simulations are also proposed.
Remediation Strategies for Learners at Risk of Failure: A Course Based Retention Model
ERIC Educational Resources Information Center
Gajewski, Agnes; Mather, Meera
2015-01-01
This paper presents an overview and discussion of a course based remediation model developed to enhance student learning and increased retention based on literature. This model focuses on course structure and course delivery in a compressed semester format. A comparative analysis was applied to a pilot study of students enrolled in a course…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-06
... accordance with 7 CFR part 305 with a minimum absorbed dose of 400 Gy; If the irradiation treatment is... accompanied by a phytosanitary certificate attesting that the fruit received the required irradiation... nephelii; If irradiation is applied upon arrival in the United States, each consignment of fresh fruit of...
Bagnasco, A; Sobrero, M; Sperlinga, L; Tibaldi, L; Sasso, L
2010-06-01
This study stemmed from data gathered through research conducted by the coordinator of the Department of Healthcare Services and a group of nurses investigating accidental falls in hospitalized children at the "G. Gaslini" Children's Hospital and Scientific Research Institute in Genoa, Italy. The first, retrospective study evaluated accidental falls in hospitalized children over the three-year period 2003-2006, while the second, prospective study, covering the trimester March-May 2007, found that the main cause of falls in children was parents' distraction. The method adopted in the first phase of our study was a proactive risk analysis (The Basics of Healthcare Failure Mode and Effect Analysis), first developed by the VA National Center for Patient Safety and applied to the "Child and parent hospital admission process". This proactive risk analysis proved very effective in preventing the risk of accidental falls in hospitalized children through effective communication and educational interventions. The second phase of our study consisted of two Focus Groups on accidental traumatic events. Analysis of the study results showed how effective communication is instrumental not only in achieving better awareness of the children and their parents during their hospital stay, but also in implementing educational sessions on prevention to reduce the risk of accidental traumatic events. The present study contributes to improving safety and the quality of care by motivating nurses to keep their attention on falls in hospitalized children high, through monitoring and the development of new risk assessment tools.
De Reu, Paul; Smits, Luc J; Oosterbaan, Herman P; Snijders, Rosalinde J; De Reu-Cuppens, Marga J; Nijhuis, Jan G
2007-01-01
To determine fetal growth in low risk pregnancies at the beginning of the third trimester and to assess the relative importance of fetal gender and maternal parity. Dutch primary care midwifery practice. Retrospective cohort study on 3641 singleton pregnancies seen at a primary care midwifery center in the Netherlands. Parameters used for analysis were fetal abdominal circumference (AC), fetal head circumference (HC), gestational age, fetal gender and maternal parity. Regression analysis was applied to describe variation in AC and HC with gestational age. Means and standard deviations in the present population were compared with commonly used reference charts. Multiple regression analysis was applied to examine whether gender and parity should be taken into account. The fetal AC and HC increased significantly between the 27th and the 33rd week of pregnancy (AC r2=0.3652, P<0.0001; HC r2=0.3301, P<0.0001). Compared to some curves, our means and standard deviations were significantly smaller (at 30+0 weeks AC mean=258+/-13 mm; HC mean=281+/-14 mm), but corresponded well with other curves. Fetal gender was a significant determinant for both AC (P<0.0001) and HC (P<0.0001). Parity contributed significantly to AC only but the difference was small (beta=0.00464). At the beginning of the third trimester, fetal size is associated with fetal gender and, to a lesser extent, with parity. Some fetal growth charts (e.g., Chitty et al.) are more suitable for the low-risk population in the Netherlands than others.
FU, Chung-Jung; KAO, Cheng-Yan; LEE, Yueh-Lun; LIAO, Chien-Wei; CHEN, Po-Ching; CHUANG, Ting-Wu; WANG, Ying-Chin; CHOU, Chia-Mei; HUANG, Ying-Chie; NAITO, Toshio; FAN, Chia-Kwung
2015-01-01
Background: Infection by Toxocara spp. is known to be significantly associated with partial epilepsy. It has become popular for people to raise dogs/cats as pets and consume roasted meat/viscera, and the status of Toxocara spp. infection, epilepsy awareness, and associated risk factors among the general population are currently unknown in Taiwan. Methods: A seroepidemiological investigation among 203 college students (CSs), consisting of 110 males and 93 females with an average age of 21.5 ± 1.2 years, was conducted in 2009 in Taipei City. A Western blot analysis based on excretory-secretory antigens derived from Toxocara canis larvae (TcESs) was applied to determine the positivity of serum immunoglobulin G antibodies. A self-administered questionnaire was also given to obtain information about demographic characteristics, epilepsy awareness, and risk factors. A logistic regression model was applied for the statistical analysis using SPSS software. Results: The overall seropositive rate of Toxocara spp. infection was 8.4% (17/203). As to epilepsy awareness, a non-significantly higher seroprevalence was found in CSs who claimed to "know" about epilepsy compared to those who did not know (P > 0.05). Conclusions: It appears that appropriate educational programs are urgently needed to provide correct knowledge related to the prevention and control measures against Toxocara spp. infections to avoid potential threats by this parasite to the general population in Taiwan. PMID:26622304
NASA Astrophysics Data System (ADS)
Andersson-sköld, Y. B.; Tremblay, M.
2011-12-01
Climate change is expected to result in increased precipitation and rising sea levels in most parts of Sweden, causing flooding, erosion, slope instability and related secondary consequences. Landslide risks are expected to increase with climate change in large parts of Sweden due to increased annual precipitation, more intense precipitation and increased flows combined with drier summers. In response to the potential climate-related risks, and on commission of the Ministry of Environment, the Swedish Geotechnical Institute (SGI) is at present performing a risk analysis project for the most prominent landslide risk area in Sweden: the Göta river valley. As part of this, a methodology for ex-ante analysis of landslide consequences, today and in a future climate, has been developed and applied in the Göta river valley. Human life, settlements, industry, contaminated sites and infrastructure of national importance are inventoried and assessed as important elements at risk. The goal of the consequence analysis is to produce a map of geographically distributed expected losses, which can be combined with a corresponding map displaying landslide probability to describe the risk (the combination of probability and consequence of a negative event). The risk analysis is GIS-aided, presenting and visualising the risk and using existing databases to quantify the consequences, represented by ex-ante estimated monetary losses. The results will be used at national and regional levels, and as an indication of the risk at local level, to assess the need for measures to mitigate the risk. The costs and the environmental and social impacts of mitigating the risk are expected to be very high, but the costs and impacts of a severe landslide are expected to be even higher. Civil servants have therefore expressed a need for tools to assess both the vulnerability and a more holistic picture of the impacts of climate change adaptation measures. 
At SGI a tool for the inclusion of sustainability aspects in the decision-making process on adaptation measures has been developed and is currently being tested in municipalities including central Gothenburg and smaller municipalities in Sweden and Norway. The tool is a matrix-based decision support tool (MDST) aiming to encourage discussion among experts and stakeholders. The first steps in the decision process include identification, inventory and assessment of the potential impacts of climate change such as landslides (or other events or actions). These steps are also included in general technical/physical risk and vulnerability analyses such as the risk analysis of the Göta älv valley. The MDST also includes the subsequent steps of the risk management process, and the full sequence of the MDST comprises risk identification, risk specification, risk assessment, identification of measures, impact analysis of measures including an assessment of environmental, social and economic costs and benefits, a weighting process and visualisation of the result. Here the MDST will be presented, with some examples from the Göta river valley methodology and from the risk mitigation analyses in Sweden and Norway.
Clinical risk analysis with failure mode and effect analysis (FMEA) model in a dialysis unit.
Bonfant, Giovanna; Belfanti, Pietro; Paternoster, Giuseppe; Gabrielli, Danila; Gaiter, Alberto M; Manes, Massimo; Molino, Andrea; Pellu, Valentina; Ponzetti, Clemente; Farina, Massimo; Nebiolo, Pier E
2010-01-01
The aim of clinical risk management is to improve the quality of care provided by health care organizations and to assure patients' safety. Failure mode and effect analysis (FMEA) is a tool employed for clinical risk reduction. We applied FMEA to chronic hemodialysis outpatients. FMEA steps: (i) Process study: we recorded phases and activities. (ii) Hazard analysis: we listed activity-related failure modes and their effects; described control measures; assigned severity, occurrence and detection scores for each failure mode; and calculated the risk priority numbers (RPNs) by multiplying the 3 scores. The total RPN is calculated by adding the single failure mode RPNs. (iii) Planning: we performed an RPN prioritization on a priority matrix taking into account the 3 scores, and we analyzed failure mode causes, made recommendations and planned new control measures. (iv) Monitoring: after failure mode elimination or reduction, we compared the resulting RPN with the previous one. Our failure modes with the highest RPNs came from communication and organization problems. Two tools were created to improve information flow: "dialysis agenda" software and nursing datasheets. We scheduled nephrological examinations, and we changed both medical and nursing organization. The total RPN decreased from 892 to 815 (8.6%) after reorganization. Employing FMEA, we worked on a few critical activities and reduced patients' clinical risk. A priority matrix also takes into account the weight of the control measures: we believe this evaluation is quick, because of simple priority selection, and that it decreases action times.
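The RPN arithmetic described above (severity × occurrence × detection per failure mode, summed for a total) can be sketched in a few lines. The failure-mode names and scores below are hypothetical, not the dialysis unit's actual data:

```python
# Illustrative FMEA sketch: each score is on a 1-10 scale, as is conventional.
failure_modes = [
    {"name": "mislabeled sample", "severity": 8, "occurrence": 3, "detection": 4},
    {"name": "missed lab result", "severity": 6, "occurrence": 5, "detection": 5},
    {"name": "schedule conflict", "severity": 3, "occurrence": 6, "detection": 2},
]

# RPN = severity x occurrence x detection for each failure mode.
for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# Total RPN is the sum over all failure modes; ranking drives prioritization.
total_rpn = sum(fm["rpn"] for fm in failure_modes)
ranked = sorted(failure_modes, key=lambda fm: fm["rpn"], reverse=True)

print(total_rpn)           # 282
print(ranked[0]["name"])   # highest-priority failure mode
```

After a control measure is introduced, the same computation is repeated and the new total compared with the old one, mirroring the monitoring step of the study.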
Sherrod, Charles W.; Casey, George; Dubro, Robert E.; Johnson, Dale F.
2013-01-01
Objective This report describes the case management of musculoskeletal disorders for an employee in a college work environment using both chiropractic care and applied ergonomics. Clinical Findings A 54-year-old male office worker presented with decreased motor function in both wrists; intermittent moderate-to-severe headaches; and pain or discomfort in the neck, both shoulders, left hand and wrist, and lumbosacral region resulting from injuries sustained during recreational soccer and from excessive forces and awkward postures when interacting with his home and office computer workstations. Intervention and Results Ergonomic training, surveillance, retrofitted equipment with new furniture, and an emphasis on adopting healthy work-style behaviors were applied in combination with regular chiropractic care. Baseline ergonomic job task analysis identified risk factors and delineated appropriate control measures to improve the subject's interface with his office workstation. Serial reevaluations at 3-month, 1-year, and 2-year periods recorded changes to the participant's pain, discomfort, and work-style behaviors. At end of study and relative to baseline, pain scale improved from 4/10 to 2/10; general disability improved from 4 to 0; and hand grip strength (pounds) increased from 20 to 105 (left) and 45 to 100 (right). Healthy work habits and postures adopted in the 3-month to 1-year period regressed to baseline exposures for 3 of 6 risk priorities identified in the ergonomic job task analysis. Conclusion The patient responded positively to the intervention of chiropractic care and applied ergonomics. PMID:23997724
Applying the partitioned multiobjective risk method (PMRM) to portfolio selection.
Reyes Santos, Joost; Haimes, Yacov Y
2004-06-01
The analysis of risk-return tradeoffs and their practical applications to portfolio analysis paved the way for Modern Portfolio Theory (MPT), which won Harry Markowitz a Nobel Prize in Economics in 1990. A typical approach in measuring a portfolio's expected return is based on the historical returns of the assets included in the portfolio. On the other hand, portfolio risk is usually measured using volatility, which is derived from the historical variance-covariance relationships among the portfolio assets. This article focuses on assessing portfolio risk, with emphasis on extreme risks. To date, volatility has been the major measure of risk owing to its simplicity and its validity for relatively small asset price fluctuations. Volatility is a justified measure for stable market performance, but it is weak in addressing portfolio risk under aberrant market fluctuations. Extreme market crashes such as that of October 19, 1987 ("Black Monday") and catastrophic events such as the terrorist attack of September 11, 2001, which led to a four-day suspension of trading on the New York Stock Exchange (NYSE), are a few examples where measuring risk via volatility can lead to inaccurate predictions. Thus, there is a need for a more robust metric of risk. By invoking the principles of the extreme-risk-analysis method through the partitioned multiobjective risk method (PMRM), this article contributes to the modeling of extreme risks in portfolio performance. A measure of extreme portfolio risk, denoted by f(4), is defined as the conditional expectation for a lower-tail region of the distribution of possible portfolio returns. This article presents a multiobjective problem formulation consisting of optimizing expected return and f(4), whose solution is determined using Evolver, a software package that implements a genetic algorithm. Under business-as-usual market scenarios, the results of the proposed PMRM portfolio selection model are found to be compatible with those of the volatility-based model. 
However, under extremely unfavorable market conditions, results indicate that f(4) can be a more valid measure of risk than volatility.
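The lower-tail conditional expectation that defines f(4) can be estimated directly from a sample of returns. A minimal sketch on synthetic data (the Gaussian return parameters and the 5% partition point are illustrative assumptions, not the article's values):

```python
import random

random.seed(7)
# Hypothetical daily portfolio returns: mean 0.05%, volatility 2%.
returns = [random.gauss(0.0005, 0.02) for _ in range(10_000)]

def f4(returns, beta=0.05):
    """Conditional expectation of the lower tail: the mean return given
    that the return falls in the worst `beta` fraction of outcomes."""
    tail = sorted(returns)[: max(1, int(beta * len(returns)))]
    return sum(tail) / len(tail)

expected_return = sum(returns) / len(returns)
extreme_risk = f4(returns, beta=0.05)
print(expected_return, extreme_risk)  # the tail measure sits far below the mean
```

Optimizing expected return against f(4) jointly, as the article does, then trades average performance against this tail measure instead of against volatility.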
DellaValle, Curt T; Hoppin, Jane A; Hines, Cynthia J; Andreotti, Gabriella; Alavanja, Michael C R
2012-01-01
Pesticide exposures can be reduced by use of personal protective equipment as well as proper mixing and application practices. The authors examined the effects of risk-accepting personality on personal protective equipment (PPE) use and mixing and application practices among private pesticide applicators and their spouses within the Agricultural Health Study (AHS) in Iowa and North Carolina and commercial applicators in Iowa. The AHS follow-up questionnaire included four questions designed to assess attitudes toward risk. Analysis was limited to those who were currently working on a farm or registered as a commercial applicator and indicated current pesticide use (n=25,166). Respondents who answered three or more questions in the affirmative (private applicators: n=4160 [21%]; commercial applicators: n=199 [14%]; spouses: n=829 [23%]) were classified as having a risk-accepting personality. Logistic regression was used to evaluate specific work practices associated with risk-accepting attitudes. Among private applicators, the likelihood of using any PPE when mixing or loading pesticides was lower among risk-acceptors compared to risk-averse individuals (odds ratio [OR] = 0.72, 95% confidence interval [CI]: 0.65-0.79). A similar relationship was observed among commercial applicators (OR = 0.77, 95% CI: 0.34-1.77) but not among spouses (OR = 1.09, 95% CI: 0.90-1.33). Among private applicators, risk-acceptors were more likely than the risk-averse to apply pesticides within 50 feet of the home (OR = 1.21, 95% CI: 1.01-1.44), compared to further than ¼ mile. These findings suggest that the decisions to use personal protective equipment and properly handle/apply pesticides may be driven by risk-accepting personality traits.
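Odds ratios of the kind reported above, with Wald-style 95% confidence intervals, can be computed from a 2x2 table. The counts below are hypothetical, invented for illustration only (not the AHS data); the CI uses the standard log-odds-ratio standard error:

```python
import math

# Hypothetical 2x2 table: rows are risk-accepting vs risk-averse applicators,
# columns are "used PPE" vs "did not".
a, b = 1200, 2960    # risk-accepting: PPE yes / PPE no
c, d = 9800, 11206   # risk-averse:    PPE yes / PPE no

# Odds ratio and 95% CI on the log scale (Woolf method).
or_ = (a * d) / (b * c)
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

An OR below 1 with a CI excluding 1, as for the private applicators in the study, indicates lower odds of PPE use among the risk-accepting group.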
von Rosen, P; Frohm, A; Kottorp, A; Fridén, C; Heijne, A
2017-12-01
Many risk factors for injury are presented in the literature; few of these are consistent, however, and most relate to adult rather than adolescent elite athletes. The aim was to identify risk factors for injury in adolescent elite athletes by applying a biopsychosocial approach. A total of 496 adolescent elite athletes (age range 15-19), participating in 16 different sports, were monitored repeatedly over 52 weeks using a valid questionnaire about injuries, training exposure, sleep, stress, nutrition, and competence-based self-esteem. Univariate and multiple Cox regression analyses were used to calculate hazard ratios (HR) for risk factors for the first reported injury. The main finding was that increasing training load and training intensity while decreasing sleep volume resulted in a higher risk for injury compared to no change in these variables (HR 2.25, 95% CI, 1.46-3.45, P<.01); this was the strongest risk factor identified. In addition, an increase of one point in competence-based self-esteem increased the hazard for injury by a factor of 1.02 (95% CI, 1.00-1.04, P=.01). Based on the multiple Cox regression analysis, an athlete with the identified risk factors (Risk Index, competence-based self-esteem) and an average competence-based self-esteem score had more than a threefold increased risk for injury (HR 3.35), compared to an athlete with low competence-based self-esteem and no change in sleep or training volume. Our findings confirm injury occurrence as a result of multiple risk factors interacting in complex ways. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Collaborative development of land use change scenarios for analysing hydro-meteorological risk
NASA Astrophysics Data System (ADS)
Malek, Žiga; Glade, Thomas
2015-04-01
Simulating future land use changes remains a difficult task, due to uncontrollable and uncertain driving forces of change. Scenario development emerged as a tool to address these limitations. Scenarios offer the exploration of possible futures and environmental consequences, and enable the analysis of possible decisions. Therefore, there is increasing interest of both decision makers and researchers to apply scenarios when studying future land use changes and their consequences. The uncertainties related to generating land use change scenarios are among others defined by the accuracy of data, identification and quantification of driving forces, and the relation between expected future changes and the corresponding spatial pattern. To address the issue of data and intangible driving forces, several studies have applied collaborative, participatory techniques when developing future scenarios. The involvement of stakeholders can lead to incorporating a broader spectrum of professional values and experience. Moreover, stakeholders can help to provide missing data, improve detail, uncover mistakes, and offer alternatives. Thus, collaborative scenarios can be considered as more reliable and relevant. Collaborative scenario development has been applied to study a variety of issues in environmental sciences on different spatial and temporal scales. Still, these participatory approaches are rarely spatially explicit, making them difficult to apply when analysing changes to hydro-meteorological risk on a local scale. Spatial explicitness is needed to identify potentially critical areas of land use change, leading to locations where the risk might increase. In order to allocate collaboratively developed scenarios of land change, we combined participatory modeling with geosimulation in a multi-step scenario generation framework. 
We propose a framework able to develop scenarios that are plausible, can overcome data inaccessibility, address intangible and external driving forces of land change, and is transferable to other case study areas with different land use change processes and consequences. The framework starts with the involvement of stakeholders, whose views on the driving forces of land use change are gathered through interviews and group discussions. To bridge the gap between qualitative methods and conventional geospatial techniques, we applied cognitive mapping and the Drivers-Pressures-State-Impact-Response (DPSIR) framework to develop a conceptual land use change model. This was later transformed into a spatially explicit land use change model based on remote sensing data, GIS and cellular automata spatial allocation. The methodology was developed and applied in a study area in the eastern Italian Alps, where the uncertainties regarding future urban expansion are high. Later, we transferred it to a study area in the Romanian Carpathians, where the identified prevailing process of land use change is deforestation. Both areas are subject to hydro-meteorological risk, posing a need to analyse the possible future spatial patterns and locations of land use change. The resulting scenarios enabled us to identify hot-spots of land use change, serving as a possible input for a risk assessment.
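Cellular-automata spatial allocation of the kind mentioned above can be sketched with a toy transition rule. Everything here is hypothetical for illustration: a forest cell converts to urban when at least k of its four neighbours are already urban, loosely mimicking scenario-driven urban expansion:

```python
FOREST, URBAN = 0, 1

def step(grid, k=2):
    """One cellular-automaton step: forest cells with >= k urban
    4-neighbours convert to urban; all other cells are unchanged."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == FOREST:
                urban_neighbours = sum(
                    grid[nr][nc]
                    for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1))
                    if 0 <= nr < rows and 0 <= nc < cols
                )
                if urban_neighbours >= k:
                    new[r][c] = URBAN
    return new

grid = [
    [1, 1, 0],
    [1, 0, 0],
    [0, 0, 0],
]
grid = step(grid)
print(grid)  # the centre cell converts: it has two urban neighbours
```

In a real application the transition rules would be calibrated from remote sensing data and the stakeholder-derived driving forces, and iterated over many time steps.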
[Economic effects of integrated RIS-PACS solution in the university environment].
Kröger, M; Nissen-Meyer, S; Wetekam, V; Reiser, M
1999-04-01
The goal of the current article is to demonstrate how qualitative and monetary effects resulting from an integrated RIS/PACS installation can be evaluated. First, the system concept of a RIS/PACS solution for a university hospital is defined and described. Based on this example, a generic method for the evaluation of qualitative and monetary effects as well as associated risks is depicted and demonstrated. To this end, qualitative analyses, investment calculations and risk analysis are employed. The sample analysis of a RIS/PACS solution specially designed for a university hospital demonstrates positive qualitative and monetary effects of the system. Under ideal conditions the payoff time of the investment is reached after 4 years of an assumed 8-year effective life of the system. Furthermore, under conservative assumptions, the risk analysis shows a probability of 0% of realising a negative net present value at the end of the payoff time period. It should be pointed out that the positive result of this sample analysis will not necessarily apply to other clinics or hospitals. However, the same methods may be used for the individual evaluation of the qualitative and monetary effects of a RIS/PACS installation in any clinic.
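The payoff-time and net-present-value reasoning used in such investment calculations can be sketched briefly. All monetary figures, the discount rate, and the even cash-flow pattern below are hypothetical, not taken from the study:

```python
initial_cost = 2_000_000   # one-off RIS/PACS investment (hypothetical)
annual_saving = 550_000    # yearly monetary benefit (hypothetical)
life_years = 8             # assumed effective life of the system
rate = 0.05                # assumed discount rate

def npv(rate, initial_cost, cash_flows):
    """Net present value: discounted cash flows minus the initial outlay."""
    return -initial_cost + sum(
        cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1)
    )

# Payback: first year in which cumulative (undiscounted) savings cover the cost.
cumulative, payback = 0, None
for year in range(1, life_years + 1):
    cumulative += annual_saving
    if cumulative >= initial_cost:
        payback = year
        break

print(payback)  # 4
print(round(npv(rate, initial_cost, [annual_saving] * life_years)))
```

A risk analysis like the one in the article would then vary these inputs (e.g. by Monte Carlo sampling) and report the probability of the NPV falling below zero.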
The true meaning of 'exotic species' as a model for genetically engineered organisms.
Regal, P J
1993-03-15
The exotic or non-indigenous species model for deliberately introduced genetically engineered organisms (GEOs) has often been misunderstood or misrepresented. Yet proper comparisons of ecologically competent GEOs to the patterns of adaptation of introduced species have been highly useful among scientists in attempting to determine how to apply biological theory to specific GEO risk issues, and in attempting to define the probabilities and scale of ecological risks with GEOs. In truth, the model predicts that most projects may be environmentally safe, but a significant minority may be very risky. The model includes a history of institutional follies that also should remind workers of the danger of oversimplifying biological issues, and warn against repeating the sorts of professional misjudgements that have too often been made in introducing organisms to new settings. We once expected that the non-indigenous species model would be refined by more analysis of species eruptions, ecological genetics, and the biology of select GEOs themselves, as outlined. But there has been political resistance to the effective regulation of GEOs, and a bureaucratic tendency to focus research agendas on narrow data collection. Thus there has been too little promotion by responsible agencies of studies to provide the broad conceptual base for truly science-based regulation. In its presently unrefined state, the non-indigenous species comparison would overestimate the risks of GEOs if it were (mis)applied to genetically disrupted, ecologically crippled GEOs, but in some cases of wild-type organisms with novel engineered traits, it could greatly underestimate the risks. Further analysis is urgently needed.
Assessing privacy risks in population health publications using a checklist-based approach.
O'Keefe, Christine M; Ickowicz, Adrien; Churches, Tim; Westcott, Mark; O'Sullivan, Maree; Khan, Atikur
2017-11-10
Recent growth in the number of population health researchers accessing detailed datasets, either on their own computers or through virtual data centers, has the potential to increase privacy risks. In response, a checklist for identifying and reducing privacy risks in population health analysis outputs has been proposed for use by researchers themselves. In this study we explore the usability and reliability of such an approach by investigating whether different users identify the same privacy risks on applying the checklist to a sample of publications. The checklist was applied to a sample of 100 academic population health publications distributed among 5 readers. Cohen's κ was used to measure interrater agreement. Of the 566 instances of statistical output types found in the 100 publications, the most frequently occurring were counts, summary statistics, plots, and model outputs. Application of the checklist identified 128 outputs (22.6%) with potential privacy concerns. Most of these were associated with the reporting of small counts. Among these identified outputs, the readers found no substantial actual privacy concerns when context was taken into account. Interrater agreement for identifying potential privacy concerns was generally good. This study has demonstrated that a checklist can be a reliable tool to assist researchers with anonymizing analysis outputs in population health research. This further suggests that such an approach may have the potential to be developed into a broadly applicable standard providing consistent confidentiality protection across multiple analyses of the same data. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
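The interrater agreement measure used above, Cohen's κ, corrects observed agreement for the agreement expected by chance. A minimal sketch with hypothetical yes/no privacy-concern ratings (not the study's data):

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over the same items:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n            # observed agreement
    cats = set(r1) | set(r2)
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance
    return (po - pe) / (1 - pe)

reader1 = ["ok", "ok", "risk", "ok", "risk", "ok", "ok", "risk", "ok", "ok"]
reader2 = ["ok", "ok", "risk", "ok", "ok",   "ok", "ok", "risk", "ok", "ok"]
print(round(cohens_kappa(reader1, reader2), 3))  # → 0.737
```

Values in roughly the 0.6-0.8 range are conventionally read as substantial agreement, consistent with the "generally good" interrater agreement reported.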
Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Yannan; Hou, Zhangshuan; Meng, Da
2016-07-17
In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas. They are applied to cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
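The core of the ARIMA step can be illustrated with the simplest member of the family, an AR(1) model fitted by least squares to a synthetic load series. This is a hedged sketch only: it uses invented data and omits the study's multi-area correlation, sequential Gaussian simulation, and PCA stages:

```python
import math
import random

random.seed(0)
# Synthetic (mean-removed) load series following x_t = 0.8 x_{t-1} + noise.
phi_true, sigma = 0.8, 5.0
load, x = [], 0.0
for _ in range(2000):
    x = phi_true * x + random.gauss(0, sigma)
    load.append(x)

# Least-squares estimate of the AR(1) coefficient.
num = sum(load[t - 1] * load[t] for t in range(1, len(load)))
den = sum(load[t - 1] ** 2 for t in range(1, len(load)))
phi_hat = num / den

# Residual spread gives a one-step-ahead 95% forecast interval,
# i.e. a simple quantification of forecast uncertainty.
resid = [load[t] - phi_hat * load[t - 1] for t in range(1, len(load))]
s = math.sqrt(sum(e * e for e in resid) / (len(resid) - 1))
forecast = phi_hat * load[-1]
interval = (forecast - 1.96 * s, forecast + 1.96 * s)
print(round(phi_hat, 2), interval)
```

In the multi-area setting, such models are fitted jointly to correlated series (or to principal components of them) so that generated realizations preserve inter-area dependency.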
Continuous Risk Management at NASA
NASA Technical Reports Server (NTRS)
Hammer, Theodore F.; Rosenberg, Linda
1999-01-01
NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions. This risk management structure of functions has been taught to projects at all NASA Centers and is being successfully implemented on many projects. This presentation will give project managers the information they need to understand if risk management is to be effectively implemented on their projects at a cost they can afford.
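The six CRM functions listed above amount to a lifecycle for a risk record. A minimal data-structure sketch (field names and values are hypothetical, not from the SATC course materials):

```python
from dataclasses import dataclass, field

@dataclass
class Risk:
    statement: str                       # 1. Identify: the risk in a specific format
    probability: str = "unrated"         # 2. Analyze: probability ...
    impact: str = "unrated"              #    ... impact/severity ...
    timeframe: str = "unrated"           #    ... and timeframe
    approach: str = "undecided"          # 3. Plan: e.g. accept/watch/mitigate
    history: list = field(default_factory=list)  # 4. Track / 6. Communicate, document

risk = Risk("Given limited test facilities, integration tests may slip the schedule.")
risk.probability, risk.impact, risk.timeframe = "medium", "high", "near-term"
risk.approach = "mitigate"
risk.history.append("mitigation plan approved; re-review monthly")  # 5. Control

print(risk.approach)  # mitigate
```

Tracking and controlling then consist of appending status entries and re-rating the probability/impact/timeframe fields as data comes in.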
Pérez-Rodríguez, F; van Asselt, E D; Garcia-Gimeno, R M; Zurera, G; Zwietering, M H
2007-05-01
The risk assessment study of Listeria monocytogenes in ready-to-eat foods conducted by the U.S. Food and Drug Administration is an example of an extensive quantitative microbiological risk assessment that could be used by risk analysts and other scientists to obtain information and by managers and stakeholders to make decisions on food safety management. The present study was conducted to investigate how detailed sensitivity analysis can be used by assessors to extract more information on risk factors and how results can be communicated to managers and stakeholders in an understandable way. The extended sensitivity analysis revealed that the extremes at the right side of the dose distribution (at consumption, 9 to 11.5 log CFU per serving) were responsible for most of the cases of listeriosis simulated. For concentration at retail, values below the detection limit of 0.04 CFU/g and the often used limit for L. monocytogenes of 100 CFU/g (also at retail) were associated with a high number of annual cases of listeriosis (about 29 and 82%, respectively). This association can be explained by growth of L. monocytogenes at both average and extreme values of temperature and time, indicating that a wide distribution can lead to high risk levels. Another finding is the importance of the maximal population density (i.e., the maximum concentration of L. monocytogenes assumed at a certain temperature) for accurately estimating the risk of infection by opportunistic pathogens such as L. monocytogenes. According to the obtained results, mainly concentrations corresponding to the highest maximal population densities caused risk in the simulation. However, sensitivity analysis applied to the uncertainty parameters revealed that prevalence at retail was the most important source of uncertainty in the model.
Giardina, M; Castiglia, F; Tomarchio, E
2014-12-01
Failure mode, effects and criticality analysis (FMECA) is a safety technique extensively used in many different industrial fields to identify and prevent potential failures. In the application of traditional FMECA, the risk priority number (RPN) is determined to rank the failure modes; however, the method has been criticised for having several weaknesses. Moreover, it is unable to adequately deal with human errors or negligence. In this paper, a new versatile fuzzy rule-based assessment model is proposed to evaluate the RPN index to rank both component failure and human error. The proposed methodology is applied to potential radiological over-exposure of patients during high-dose-rate brachytherapy treatments. The critical analysis of the results can provide recommendations and suggestions regarding safety provisions for the equipment and procedures required to reduce the occurrence of accidental events.
Use of Model-Based Design Methods for Enhancing Resiliency Analysis of Unmanned Aerial Vehicles
NASA Astrophysics Data System (ADS)
Knox, Lenora A.
The most common traditional non-functional requirement analysis is reliability. With systems becoming more complex, networked, and adaptive to environmental uncertainties, system resiliency has recently become the non-functional requirement analysis of choice. Analysis of system resiliency has challenges, which include defining resilience for domain areas, identifying resilience metrics, determining resilience modeling strategies, and understanding how to best integrate the concepts of risk and reliability into resiliency. Formal methods that integrate all of these concepts do not currently exist in specific domain areas. Leveraging RAMSoS, a model-based reliability analysis methodology for Systems of Systems (SoS), we propose an extension that accounts for resiliency analysis through evaluation of mission performance, risk, and cost using multi-criteria decision-making (MCDM) modeling and design trade study variability modeling evaluation techniques. This proposed methodology, coined RAMSoS-RESIL, is applied to a case study in the multi-agent unmanned aerial vehicle (UAV) domain to investigate the potential benefits of a mission architecture where the functionality to complete a mission is disseminated across multiple UAVs (distributed) as opposed to being contained in a single UAV (monolithic). The case-study-based research demonstrates proof of concept for the proposed model-based technique and provides sufficient preliminary evidence to conclude which architectural design (distributed vs. monolithic) is most resilient, based on insight into mission resilience performance, risk, and cost in addition to the traditional analysis of reliability.
Wealth inhomogeneity applied to crash rate theory.
Shuler, Robert L
2015-11-01
A crash rate theory based on corporate economic utility maximization is applied to individual behavior in U.S. and German motorway death rates, by using wealth inhomogeneity data in ten-percentile bins to account for variations of utility maximization in the population. Germany and the U.S. have similar median wealth figures, a well-known indicator of accident risk, but different motorway death rates. It is found that inhomogeneity in roughly the 10th to 30th percentile, not revealed by popular measures such as the Gini index which focus on differences at the higher percentiles, provides a satisfactory explanation of the data. The inhomogeneity analysis reduces data disparity from a factor of 2.88 to 1.75 as compared with median wealth assumed homogeneity, and further to 1.09 with average wealth assumed homogeneity. The first reduction from 2.88 to 1.75 is attributable to inequality at lower percentiles and suggests it may be as important in indicating socioeconomic risk as extremes in the upper percentile ranges, and that therefore the U.S. socioeconomic risk may be higher than generally realized.
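Why decile-binned wealth data can matter more than the median can be illustrated with a toy convexity argument. The risk-vs-wealth relation r(w) = k/w and all numbers below are hypothetical, chosen only to show the mechanism, not the article's fitted model:

```python
# Hypothetical mean wealth per ten-percentile bin (arbitrary units),
# and a median lying between the 5th and 6th bins.
k = 1.0
deciles = [8, 15, 25, 40, 60, 85, 115, 160, 230, 400]
median = 72.5

# With a convex risk function r(w) = k / w, averaging risk over the bins
# exceeds the risk evaluated at the median (Jensen's inequality):
rate_inhomogeneous = sum(k / w for w in deciles) / len(deciles)
rate_median = k / median
print(rate_inhomogeneous > rate_median)  # True: low-wealth bins dominate the average
```

This is the sense in which inhomogeneity at the lower percentiles, invisible to a median-based comparison, can raise the aggregate rate.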
2017-01-01
Abstract Objective: To analyse the metric properties of the Timed Get Up and Go-Modified Version Test (TGUGM) in assessing the risk of falls in a group of physically active women. Methods: A sample of 202 women over 55 years of age was assessed through a cross-sectional study. The TGUGM was applied to assess their fall risk. The test was analysed by comparing the qualitative and quantitative information and by factor analysis. A logistic regression model was developed to explain the risk of falls according to the test components. Results: The TGUGM was useful for assessing the risk of falls in the studied group. The test revealed two factors: the Get Up and the Gait with dual task. Fewer than twelve points in the evaluation or run times longer than 35 seconds were associated with high risk of falling. More than 35 seconds in the test indicated a fall risk probability greater than 0.50. Also, scores of less than 12 points were associated with a delay of 7 seconds more in the execution of the test (p=0.0016). Conclusions: Factor analysis of the TGUGM revealed two dimensions that can be independent predictors of the risk of falling: the Get Up, which explains between 64% and 87% of the risk of falling, and the Gait with dual task, which explains between 77% and 95%. PMID:28559642
Assessing the risk posed by natural hazards to infrastructures
NASA Astrophysics Data System (ADS)
Eidsvig, Unni; Kristensen, Krister; Vidar Vangelsten, Bjørn
2015-04-01
Modern society is increasingly dependent on infrastructures to maintain its functions, and disruption in one infrastructure system may have severe consequences. Norwegian municipalities have a statutory duty to carry out a risk and vulnerability analysis and to plan and prepare for emergencies in a short- and long-term perspective. Vulnerability analysis of the infrastructures and their interdependencies is an important part of this analysis. This paper proposes a model for assessing the risk posed by natural hazards to infrastructures. The model prescribes a three-level analysis with an increasing level of detail, moving from qualitative to quantitative analysis. This paper focuses on the second level, a semi-quantitative analysis. Its purpose is to screen the scenarios of natural hazards threatening the infrastructures identified in the level 1 analysis and to investigate the need for further, level 3 quantitative analyses. The proposed level 2 analysis considers the frequency of the natural hazard and different aspects of vulnerability, including the physical vulnerability of the infrastructure itself and the societal dependency on the infrastructure. An indicator-based approach is applied, ranking the indicators on a relative scale. The proposed indicators characterize the robustness of the infrastructure, its importance, and the interdependencies between society and infrastructure that affect the potential for cascading effects. Each indicator is ranked on a 1-5 scale based on pre-defined ranking criteria. The aggregated risk estimate combines the semi-quantitative vulnerability indicators with quantitative estimates of the frequency of the natural hazard and the number of users of the infrastructure.
Case studies for two Norwegian municipalities are presented, in which the risk to a primary road, the water supply and the power network threatened by storm and landslide is assessed. The application examples show that the proposed model provides a useful tool for screening undesirable events, with the ultimate goal of reducing societal vulnerability.
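The level 2 aggregation described above can be sketched roughly as follows (the combination rule, indicator values, hazard frequency and user count are illustrative assumptions, not the authors' exact formula; only the 1-5 indicator scale follows the abstract):

```python
import math

def screening_score(indicators, annual_frequency, users):
    """Hypothetical level-2 aggregation: mean 1-5 vulnerability indicator
    score, scaled by hazard frequency and a log-damped user count."""
    if not indicators or any(not 1 <= s <= 5 for s in indicators):
        raise ValueError("each indicator must be ranked on a 1-5 scale")
    vulnerability = sum(indicators) / len(indicators)   # between 1 and 5
    exposure = math.log10(1 + users)                    # damp large user counts
    return vulnerability * annual_frequency * exposure

# Example: a road segment with four indicator scores, a 1-in-20-year storm,
# and 5,000 daily users (all invented numbers).
score = screening_score([4, 3, 5, 2], annual_frequency=0.05, users=5000)
```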
Residues in Beeswax: A Health Risk for the Consumer of Honey and Beeswax?
Wilmart, Olivier; Legrève, Anne; Scippo, Marie-Louise; Reybroeck, Wim; Urbain, Bruno; de Graaf, Dirk C; Steurbaut, Walter; Delahaut, Philippe; Gustin, Pascal; Nguyen, Bach Kim; Saegerman, Claude
2016-11-09
A scenario analysis of the risk of chronic consumer exposure to residues through the consumption of contaminated honey and beeswax was conducted. Twenty-two plant protection products and veterinary substances whose residues have already been detected in beeswax in Europe were selected. The potential chronic exposure was assessed by applying a worst-case scenario based on adding a "maximum" daily intake through the consumption of honey and beeswax to the theoretical maximum daily intake through other foodstuffs. For each residue, the total exposure was then compared to the acceptable daily intake. It is concluded that consumption of honey and beeswax contaminated with these residues, considered separately, does not compromise the consumer's health, provided the proposed action limits are met. For residues of flumethrin in honey and in beeswax, "zero tolerance" should be applied.
Ruschioni, Angela; Montesi, Simona; Spagnuolo, Loreta Maria; Rinaldi, Lucia; Fantozzi, Lucia; Fanti, M
2011-01-01
Beekeeping is a common activity in the two regions in this study, Marche and Tuscany: in both regions the numbers of beekeepers, both amateur and professional, and honey production are high. The aim was to study, through the application of simple tools, the organization of beekeeping activity so as to identify hazardous situations in the work process. We followed the production cycle of two businesses that differed in size and work organization over a period of twelve months. Subsequently, each homogeneous period was assessed via increasingly complex levels of intervention, which made it possible to identify the work phases where preventive measures could be applied. The results revealed risk situations for the musculoskeletal system of beekeepers. Organizational analysis of the two enterprises showed that it is possible to apply simple solutions to improve safety and health in the workplace.
Zhang, Bo; Chen, Zhen; Albert, Paul S
2012-01-01
High-dimensional biomarker data are often collected in epidemiological studies when assessing the association between biomarkers and human disease is of interest. We develop a latent class modeling approach for joint analysis of high-dimensional semicontinuous biomarker data and a binary disease outcome. To model the relationship between complex biomarker expression patterns and disease risk, we use latent risk classes to link the 2 modeling components. We characterize complex biomarker-specific differences through biomarker-specific random effects, so that different biomarkers can have different baseline (low-risk) values as well as different between-class differences. The proposed approach also accommodates data features that are common in environmental toxicology and other biomarker exposure data, including a large number of biomarkers, numerous zero values, and complex mean-variance relationships in the biomarker levels. A Monte Carlo EM (MCEM) algorithm is proposed for parameter estimation. Both the MCEM algorithm and model selection procedures are shown to work well in simulations and applications. In applying the proposed approach to an epidemiological study that examined the relationship between environmental polychlorinated biphenyl (PCB) exposure and the risk of endometriosis, we identified a highly significant overall effect of PCB concentrations on the risk of endometriosis.
Barfoed, Benedicte Lind; Jarbøl, Dorte Ejg; Paulsen, Maja Skov; Christensen, Palle Mark; Halvorsen, Peder Andreas; Nielsen, Jesper Bo; Søndergaard, Jens
2015-01-01
Objective. General practitioners' (GPs') perception of risk is a cornerstone of preventive care. The aims of this interview study were to explore GPs' professional and personal attitudes and experiences regarding treatment with lipid-lowering drugs and their views on patient compliance. Methods. The material was drawn from semistructured qualitative interviews. We sampled GPs purposively from ten selected practices, ensuring diversity of demographic, professional, and personal characteristics. The GPs were encouraged to describe examples from their own practices and reflect on them and were informed that the focus was their personal attitudes and experiences. Systematic text condensation was applied for analysis in order to uncover the concepts and themes. Results. The analysis revealed the following 3 main themes: (1) use of cardiovascular guidelines and risk assessment tools, (2) strategies for managing patient compliance, and (3) GPs' own risk management. There were substantial differences in the attitudes concerning all three themes. Conclusions. The substantial differences in the GPs' personal and professional risk perceptions may be a key to understanding why GPs do not always follow cardiovascular guidelines. The impact on daily clinical practice, personal consultation style, and patient behaviour with regard to prevention is worth studying further. PMID:26495143
Failure mode and effects analysis drastically reduced potential risks in clinical trial conduct.
Lee, Howard; Lee, Heechan; Baik, Jungmi; Kim, Hyunjung; Kim, Rachel
2017-01-01
Failure mode and effects analysis (FMEA) is a risk management tool for proactively identifying and assessing the causes and effects of potential failures in a system, thereby preventing them from happening. The objective of this study was to evaluate the effectiveness of FMEA applied to an academic clinical trial center in a tertiary care setting. A multidisciplinary FMEA focus group at the Seoul National University Hospital Clinical Trials Center selected 6 core clinical trial processes, for which potential failure modes were identified and their risk priority numbers (RPNs) assessed. Remedial action plans for high-risk failure modes (RPN >160) were devised, and a follow-up RPN scoring was conducted a year later. A total of 114 failure modes were identified, with RPN scores ranging from 3 to 378, driven mainly by the severity score. Fourteen failure modes were high risk, 11 of which were addressed by remedial actions. Rescoring showed a dramatic improvement, attributed to reductions in the occurrence and detection scores by >3 and >2 points, respectively. FMEA is a powerful tool for improving quality in clinical trials. The Seoul National University Hospital Clinical Trials Center is expanding its FMEA capability to other core clinical trial processes.
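The RPN arithmetic behind this kind of scoring can be sketched as follows (the failure modes and individual scores are invented for illustration; only the RPN >160 remedial-action threshold and the conventional 1-10 scales follow the abstract):

```python
def rpn(severity, occurrence, detection):
    # Risk priority number: each factor is conventionally rated 1-10,
    # so RPN ranges from 1 to 1000.
    return severity * occurrence * detection

# Hypothetical failure modes with (severity, occurrence, detection) scores.
failure_modes = {
    "consent form outdated": (9, 6, 7),
    "dose miscalculation":   (9, 2, 3),
    "late query resolution": (4, 5, 4),
}

# Modes scoring above 160 are flagged for a remedial action plan.
high_risk = {name: rpn(*s) for name, s in failure_modes.items() if rpn(*s) > 160}
```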
Dadaev, Tokhir; Saunders, Edward J; Newcombe, Paul J; Anokian, Ezequiel; Leongamornlert, Daniel A; Brook, Mark N; Cieza-Borrella, Clara; Mijuskovic, Martina; Wakerell, Sarah; Olama, Ali Amin Al; Schumacher, Fredrick R; Berndt, Sonja I; Benlloch, Sara; Ahmed, Mahbubl; Goh, Chee; Sheng, Xin; Zhang, Zhuo; Muir, Kenneth; Govindasami, Koveela; Lophatananon, Artitaya; Stevens, Victoria L; Gapstur, Susan M; Carter, Brian D; Tangen, Catherine M; Goodman, Phyllis; Thompson, Ian M; Batra, Jyotsna; Chambers, Suzanne; Moya, Leire; Clements, Judith; Horvath, Lisa; Tilley, Wayne; Risbridger, Gail; Gronberg, Henrik; Aly, Markus; Nordström, Tobias; Pharoah, Paul; Pashayan, Nora; Schleutker, Johanna; Tammela, Teuvo L J; Sipeky, Csilla; Auvinen, Anssi; Albanes, Demetrius; Weinstein, Stephanie; Wolk, Alicja; Hakansson, Niclas; West, Catharine; Dunning, Alison M; Burnet, Neil; Mucci, Lorelei; Giovannucci, Edward; Andriole, Gerald; Cussenot, Olivier; Cancel-Tassin, Géraldine; Koutros, Stella; Freeman, Laura E Beane; Sorensen, Karina Dalsgaard; Orntoft, Torben Falck; Borre, Michael; Maehle, Lovise; Grindedal, Eli Marie; Neal, David E; Donovan, Jenny L; Hamdy, Freddie C; Martin, Richard M; Travis, Ruth C; Key, Tim J; Hamilton, Robert J; Fleshner, Neil E; Finelli, Antonio; Ingles, Sue Ann; Stern, Mariana C; Rosenstein, Barry; Kerns, Sarah; Ostrer, Harry; Lu, Yong-Jie; Zhang, Hong-Wei; Feng, Ninghan; Mao, Xueying; Guo, Xin; Wang, Guomin; Sun, Zan; Giles, Graham G; Southey, Melissa C; MacInnis, Robert J; FitzGerald, Liesel M; Kibel, Adam S; Drake, Bettina F; Vega, Ana; Gómez-Caamaño, Antonio; Fachal, Laura; Szulkin, Robert; Eklund, Martin; Kogevinas, Manolis; Llorca, Javier; Castaño-Vinyals, Gemma; Penney, Kathryn L; Stampfer, Meir; Park, Jong Y; Sellers, Thomas A; Lin, Hui-Yi; Stanford, Janet L; Cybulski, Cezary; Wokolorczyk, Dominika; Lubinski, Jan; Ostrander, Elaine A; Geybels, Milan S; Nordestgaard, Børge G; Nielsen, Sune F; Weisher, Maren; Bisbjerg, Rasmus; Røder, Martin Andreas; 
Iversen, Peter; Brenner, Hermann; Cuk, Katarina; Holleczek, Bernd; Maier, Christiane; Luedeke, Manuel; Schnoeller, Thomas; Kim, Jeri; Logothetis, Christopher J; John, Esther M; Teixeira, Manuel R; Paulo, Paula; Cardoso, Marta; Neuhausen, Susan L; Steele, Linda; Ding, Yuan Chun; De Ruyck, Kim; De Meerleer, Gert; Ost, Piet; Razack, Azad; Lim, Jasmine; Teo, Soo-Hwang; Lin, Daniel W; Newcomb, Lisa F; Lessel, Davor; Gamulin, Marija; Kulis, Tomislav; Kaneva, Radka; Usmani, Nawaid; Slavov, Chavdar; Mitev, Vanio; Parliament, Matthew; Singhal, Sandeep; Claessens, Frank; Joniau, Steven; Van den Broeck, Thomas; Larkin, Samantha; Townsend, Paul A; Aukim-Hastie, Claire; Gago-Dominguez, Manuela; Castelao, Jose Esteban; Martinez, Maria Elena; Roobol, Monique J; Jenster, Guido; van Schaik, Ron H N; Menegaux, Florence; Truong, Thérèse; Koudou, Yves Akoli; Xu, Jianfeng; Khaw, Kay-Tee; Cannon-Albright, Lisa; Pandha, Hardev; Michael, Agnieszka; Kierzek, Andrzej; Thibodeau, Stephen N; McDonnell, Shannon K; Schaid, Daniel J; Lindstrom, Sara; Turman, Constance; Ma, Jing; Hunter, David J; Riboli, Elio; Siddiq, Afshan; Canzian, Federico; Kolonel, Laurence N; Le Marchand, Loic; Hoover, Robert N; Machiela, Mitchell J; Kraft, Peter; Freedman, Matthew; Wiklund, Fredrik; Chanock, Stephen; Henderson, Brian E; Easton, Douglas F; Haiman, Christopher A; Eeles, Rosalind A; Conti, David V; Kote-Jarai, Zsofia
2018-06-11
Prostate cancer is a polygenic disease with a large heritable component. A number of common, low-penetrance prostate cancer risk loci have been identified through GWAS. Here we apply the Bayesian multivariate variable selection algorithm JAM to fine-map 84 prostate cancer susceptibility loci, using summary data from a large European ancestry meta-analysis. We observe evidence for multiple independent signals at 12 regions and 99 risk signals overall. Only 15 original GWAS tag SNPs remain among the catalogue of candidate variants identified; the remainder are replaced by more likely candidates. Biological annotation of our credible set of variants indicates significant enrichment within promoter and enhancer elements, and transcription factor-binding sites, including AR, ERG and FOXA1. In 40 regions at least one variant is colocalised with an eQTL in prostate cancer tissue. The refined set of candidate variants substantially increase the proportion of familial relative risk explained by these known susceptibility regions, which highlights the importance of fine-mapping studies and has implications for clinical risk profiling.
Mikolai, Júlia; Kulu, Hill
2018-02-01
This study investigates the effect of marital and nonmarital separation on individuals' residential and housing trajectories. Using rich data from the British Household Panel Survey (BHPS) and applying multilevel competing-risks event history models, we analyze the risk of a move of single, married, cohabiting, and separated men and women to different housing types. We distinguish moves due to separation from moves of separated people and account for unobserved codeterminants of moving and separation risks. Our analysis shows that many individuals move due to separation, as expected, but that the likelihood of moving is also relatively high among separated individuals. We find that separation has a long-term effect on individuals' residential careers. Separated women exhibit high moving risks regardless of whether they moved out of the joint home upon separation, whereas separated men who did not move out upon separation are less likely to move. Interestingly, separated women are most likely to move to terraced houses, whereas separated men are equally likely to move to flats (apartments) and terraced (row) houses, suggesting that family structure shapes moving patterns of separated individuals.
Wang, Hetang; Li, Jia; Wang, Deming; Huang, Zonghou
2017-01-01
Coal dust explosions (CDE) are one of the main threats to the occupational safety of coal miners. Aiming to identify and assess the risk of CDE, this paper proposes a novel method of fuzzy fault tree analysis combined with the Visual Basic (VB) program. In this methodology, various potential causes of the CDE are identified and a CDE fault tree is constructed. To overcome drawbacks from the lack of exact probability data for the basic events, fuzzy set theory is employed and the probability data of each basic event is treated as intuitionistic trapezoidal fuzzy numbers. In addition, a new approach for calculating the weighting of each expert is also introduced in this paper to reduce the error during the expert elicitation process. Specifically, an in-depth quantitative analysis of the fuzzy fault tree, such as the importance measure of the basic events and the cut sets, and the CDE occurrence probability is given to assess the explosion risk and acquire more details of the CDE. The VB program is applied to simplify the analysis process. A case study and analysis is provided to illustrate the effectiveness of this proposed method, and some suggestions are given to take preventive measures in advance and avoid CDE accidents. PMID:28793348
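Fuzzy gate arithmetic of the kind used in such analyses can be sketched with ordinary trapezoidal fuzzy numbers, a simplification of the paper's intuitionistic trapezoidal formulation (the basic-event probabilities below are invented):

```python
# A fuzzy probability is a trapezoid (a, b, c, d) with a <= b <= c <= d.

def and_gate(*events):
    # AND gate for independent events: componentwise product of the bounds.
    out = (1.0, 1.0, 1.0, 1.0)
    for e in events:
        out = tuple(x * y for x, y in zip(out, e))
    return out

def or_gate(*events):
    # OR gate: 1 - product of complements; complementing reverses the bounds.
    comp = and_gate(*[tuple(1 - v for v in reversed(e)) for e in events])
    return tuple(1 - v for v in reversed(comp))

spark = (0.01, 0.02, 0.03, 0.04)       # hypothetical ignition-source event
dust_cloud = (0.10, 0.15, 0.20, 0.25)  # hypothetical dust-concentration event

explosion = and_gate(spark, dust_cloud)  # both conditions required
either = or_gate(spark, dust_cloud)
```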
Environmental risk, precaution, and scientific rationality in the context of WTO/NAFTA trade rules.
Crawford-Brown, Douglas; Pauwelyn, Joost; Smith, Kelly
2004-04-01
This article considers the role of scientific rationality in understanding statements of risk produced by a scientific community. An argument is advanced that, while scientific rationality does impose constraints on valid scientific justifications for restrictions on products and practices, it also provides flexibility in the judgments needed to both develop and apply characterizations of risk. The implications of this flexibility for the understanding of risk estimates in WTO and NAFTA deliberations are explored, with the goal of finding an intermediate ground between the view that science unambiguously justifies or rejects a policy, and the view that science is yet another cultural tool that can be manipulated in support of any decision. The result is a proposal for a dialogical view of scientific rationality in which risk estimates are depicted as confidence distributions that follow from a structured dialogue of scientific panels focused on judgments of evidence, evidential reasoning, and epistemic analysis.
Cohen, Elizabeth L
2010-12-01
The decision to become an organ donor involves considering both self-relevant risks and the needs of others. This study applied prospect theory to examine how news exemplar message frames that focus on the possible survival or death of a potential organ transplant recipient affect participants' willingness to become organ donors. Perceived personal risk and ambivalence were examined as moderating variables. Results indicate that risk, rather than ambivalence, played an instrumental role in participants' decisions to donate. Although no main effects or interactions related to message frame emerged in initial analyses, a supplemental analysis revealed an interaction such that there was a modest persuasive advantage for the loss-framed message among low-risk participants. Findings suggest that vivid exemplar message frames, compared to other types of more explicit organ donor appeals, may be associated with unique decisions about organ donation.
Classifying Nanomaterial Risks Using Multi-Criteria Decision Analysis
NASA Astrophysics Data System (ADS)
Linkov, I.; Steevens, J.; Chappell, M.; Tervonen, T.; Figueira, J. R.; Merad, M.
There is rapidly growing interest by regulatory agencies and stakeholders in the potential toxicity and other risks associated with nanomaterials throughout the different stages of the product life cycle (e.g., development, production, use and disposal). Risk assessment methods and tools developed and applied to chemical and biological materials may not be readily adaptable to nanomaterials because of the current uncertainty in identifying the relevant physico-chemical and biological properties that adequately describe the materials. Such uncertainty is further driven by the substantial variations in the properties of the original material arising from the variable manufacturing processes employed in nanomaterial production. To guide scientists and engineers in nanomaterial research and application, as well as to promote the safe use and handling of these materials, we propose a decision support system for classifying nanomaterials into different risk categories. The classification system is based on a set of performance metrics that measure both the toxicity and physico-chemical characteristics of the original materials and the expected environmental impacts through the product life cycle. Stochastic multicriteria acceptability analysis (SMAA-TRI), a formal decision analysis method, was used as the foundation for this task. This method allowed us to cluster various nanomaterials into different risk categories based on our current knowledge of nanomaterials' physico-chemical characteristics, variation in the produced material, and best professional judgement. SMAA-TRI uses Monte Carlo simulations to explore all feasible values for weights, criteria measurements, and other model parameters to assess the robustness of nanomaterial grouping for risk management purposes.
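The Monte Carlo exploration at the heart of this approach can be illustrated with a toy acceptability computation in the SMAA spirit (the materials, criteria scores, category thresholds and weighting scheme are all hypothetical, not SMAA-TRI's actual procedure):

```python
import random

def category(score):
    # Hypothetical risk categories on a 0-1 aggregate score.
    return "high" if score > 0.66 else "medium" if score > 0.33 else "low"

def acceptability(materials, n_draws=10_000, seed=1):
    """For each material, estimate how often random criterion weights place
    it in each risk category: a robustness check on the grouping."""
    rng = random.Random(seed)
    counts = {name: {"low": 0, "medium": 0, "high": 0} for name in materials}
    for _ in range(n_draws):
        raw = [rng.random() for _ in range(3)]
        weights = [w / sum(raw) for w in raw]       # random normalized weights
        for name, criteria in materials.items():    # criteria scores in [0, 1]
            score = sum(w * c for w, c in zip(weights, criteria))
            counts[name][category(score)] += 1
    return {name: {k: v / n_draws for k, v in c.items()}
            for name, c in counts.items()}

shares = acceptability({"nano-A": (0.9, 0.8, 0.7), "nano-B": (0.1, 0.2, 0.1)})
```

Because nano-A scores above 0.66 on every criterion and nano-B below 0.33, their category assignments are stable under any weighting, which is the kind of robustness SMAA-style analysis is meant to reveal.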
Essays in financial economics and econometrics
NASA Astrophysics Data System (ADS)
La Spada, Gabriele
Chapter 1 (my job market paper) asks the following question: do asset managers reach for yield because of competitive pressures in a low-rate environment? I propose a tournament model of money market funds (MMFs) to study this issue. I show that funds with different costs of default respond differently to changes in interest rates, and that it is important to distinguish the role of risk-free rates from that of risk premia. An increase in the risk premium leads funds with lower default costs to increase risk-taking, while funds with higher default costs reduce risk-taking. Without changes in the premium, low risk-free rates reduce risk-taking. My empirical analysis shows that these predictions are consistent with the risk-taking of MMFs during the 2006-2008 period. Chapter 2, co-authored with Fabrizio Lillo and published in Studies in Nonlinear Dynamics and Econometrics (2014), studies the effect of round-off error (or discretization) on stationary Gaussian long-memory processes. For large lags, the autocovariance is rescaled by a factor smaller than one, and we compute this factor exactly. Hence, the discretized process has the same Hurst exponent as the underlying one. We show that in the presence of round-off error, two common estimators of the Hurst exponent, the local Whittle (LW) estimator and detrended fluctuation analysis (DFA), are severely negatively biased in finite samples. We derive conditions for consistency and asymptotic normality of the LW estimator applied to discretized processes and compute the asymptotic properties of the DFA for generic long-memory processes that encompass discretized processes. Chapter 3, co-authored with Fabrizio Lillo, studies the effect of round-off error on integrated Gaussian processes with possibly correlated increments. We derive the variance and kurtosis of the realized increment process in the limit of both "small" and "large" round-off errors, and its autocovariance for large lags. 
We propose novel estimators for the variance and lag-one autocorrelation of the underlying, unobserved increment process. We also show that for fractionally integrated processes, the realized increments have the same Hurst exponent as the underlying ones, but the LW estimator applied to the realized series is severely negatively biased in medium-sized samples.
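The attenuation effect can be illustrated numerically with a toy short-memory AR(1) series standing in for the long-memory processes studied in the chapters (the AR coefficient, grid width and sample size below are all illustrative): coarse round-off leaves the series on a grid and pulls the sample lag-1 autocorrelation below that of the underlying process.

```python
import random

rng = random.Random(42)
n = 20_000
y = [0.0] * n
for i in range(1, n):
    # AR(1) series with coefficient 0.7, so the true lag-1 autocorrelation is 0.7.
    y[i] = 0.7 * y[i - 1] + rng.gauss(0, 1)

# Round-off: observe the process only on a coarse grid of width 2.
z = [2 * round(v / 2) for v in y]

def acorr1(s):
    # Sample lag-1 autocorrelation.
    m = sum(s) / len(s)
    var = sum((v - m) ** 2 for v in s) / len(s)
    cov = sum((a - m) * (b - m) for a, b in zip(s, s[1:])) / len(s)
    return cov / var

r_underlying = acorr1(y)   # close to 0.7
r_rounded = acorr1(z)      # attenuated by the round-off error
```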
Villa-Mancera, Abel; Pastelín-Rojas, César; Olivares-Pérez, Jaime; Córdova-Izquierdo, Alejandro; Reynoso-Palomar, Alejandro
2018-05-01
This study investigated the prevalence, production losses, spatial clustering, and predictive risk mapping of Ostertagia ostertagi in different climate zones in five states of Mexico. Bulk tank milk samples obtained between January and April 2015 were analyzed for antibodies against O. ostertagi using the Svanovir ELISA. A total of 1204 farm owners or managers answered the questionnaire. The overall herd prevalence and mean optical density ratio (ODR) were 61.96% and 0.55, respectively. The production loss was approximately 0.542 kg of milk per parasitized cow per day (mean ODR = 0.92, 142 farms, 11.79%). A spatial disease cluster analysis using SaTScan software identified two high-risk clusters. In the multivariable analysis, three models were tested for potential association with the ELISA results based on climatic, environmental, and management factors. The final logistic regression model, built on both climatic/environmental and management variables, included rainfall, elevation, daytime land surface temperature (LST), and parasite control program, which were significantly associated with an increased risk of infection. Geostatistical kriging was applied to generate a risk map for the presence of the parasite in dairy cattle herds in Mexico. The results indicate that climatic and meteorological factors had a greater potential impact on the spatial distribution of O. ostertagi than the management factors.
Pulleyblank, Ryan; Chuma, Jefter; Gilbody, Simon M; Thompson, Carl
2013-09-01
For a test to be considered useful for making treatment decisions, it is necessary that making treatment decisions based on the results of the test be a preferable strategy to making treatment decisions without the test. Decision curve analysis is a framework for assessing when a test would be expected to be useful, which integrates evidence of a test's performance characteristics (sensitivity and specificity), condition prevalence among at-risk patients, and patient preferences for treatment. We describe decision curve analysis generally and illustrate its potential through an application to tests for prodromal psychosis. Clinical psychosis is often preceded by a prodromal phase, but not all those with prodromal symptoms proceed to develop full psychosis. Patients identified as at risk for developing psychosis may be considered for proactive treatment to mitigate development of clinically defined psychosis. Tests exist to help identify those at-risk patients most likely to develop psychosis, but it is uncertain when these tests would be considered useful for making proactive treatment decisions. We apply decision curve analysis to results from a systematic review of studies investigating clinical tests for predicting the development of psychosis in at-risk populations, and present resulting decision curves that illustrate when the tests may be expected to be useful for making proactive treatment decisions.
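The net-benefit quantity that decision curves plot against the threshold probability can be sketched as follows (the cohort counts are invented for illustration; the formula itself is the standard decision-curve net benefit, which weighs true positives against false positives at the chosen threshold):

```python
def net_benefit(tp, fp, n, p_t):
    # Net benefit of treating test-positives at threshold probability p_t:
    # each false positive is discounted by the odds p_t / (1 - p_t).
    return tp / n - (fp / n) * (p_t / (1 - p_t))

def treat_all_net_benefit(prevalence, p_t):
    # "Treat everyone" baseline: tp/n = prevalence, fp/n = 1 - prevalence.
    return prevalence - (1 - prevalence) * (p_t / (1 - p_t))

# Hypothetical at-risk cohort: n = 200, 60 eventual transitions to psychosis;
# the test flags 45 true positives at the cost of 35 false positives.
nb_test = net_benefit(tp=45, fp=35, n=200, p_t=0.20)
nb_all = treat_all_net_benefit(prevalence=60 / 200, p_t=0.20)
```

At this (invented) threshold the test-based strategy has the higher net benefit, which is exactly the comparison a decision curve makes across a range of thresholds.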
Giannopoulos, G; Larcher, M; Casadei, F; Solomos, G
2010-01-15
The terrorist attacks in New York shocked the world community, clearly showing the vulnerability of air transport to such events. However, the attacks in Madrid and London showed that land mass-transport infrastructure is equally vulnerable. The lack of substantial investment in risk analysis and in evaluating the possible effects of such events on land mass-transport infrastructure leaves large room for new developments that could eventually fill this gap. In the present work, the finite element code EUROPLEXUS was used in a large effort to perform a complete study of land mass-transport infrastructure under explosion events. The study covers a train station, a metro station and a metro carriage, providing valuable simulation data for a variety of situations. For the analysis of these structures it was necessary to apply a laser scanning method to acquire geometrical data, to improve the simulation capabilities of EUROPLEXUS by adding failure capabilities for specific finite elements, to implement new material models (e.g. glass), and to add new post-processing modules that calculate the risk of fatal and non-fatal injuries. These improvements are explained in the present work, with emphasis on the newly developed risk analysis features of EUROPLEXUS.
Kim, Minjae; Wall, Melanie M; Li, Guohua
2016-07-01
Perioperative risk stratification is often performed using individual risk factors without consideration of the syndemic of these risk factors. We used latent class analysis (LCA) to identify the classes of comorbidities and risk factors associated with perioperative mortality in patients presenting for intraabdominal general surgery. The 2005 to 2010 American College of Surgeons National Surgical Quality Improvement Program was used to obtain a cohort of patients undergoing intraabdominal general surgery. Risk factors and comorbidities were entered into LCA models to identify the latent classes, and individuals were assigned to a class based on the highest posterior probability of class membership. Relative risk regression was used to determine the associations between the latent classes and 30-day mortality, with adjustments for procedure. A 9-class model was fit using LCA on 466,177 observations. After combining classes with similar adjusted mortality risks, 5 risk classes were obtained. Compared with the class with average mortality risk (class 4), the risk ratios (95% confidence interval) ranged from 0.020 (0.014-0.027) in the lowest risk class (class 1) to 6.75 (6.46-7.02) in the highest risk class. After adjusting for procedure and ASA physical status, the latent classes remained significantly associated with 30-day mortality. The addition of the risk class variable to a model containing ASA physical status and surgical procedure demonstrated a significant increase in the area under the receiver operator characteristic curve (0.892 vs 0.915; P < 0.0001). Latent classes of risk factors and comorbidities in patients undergoing intraabdominal surgery are predictive of 30-day mortality independent of the ASA physical status and improve risk prediction with the ASA physical status.
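The class-assignment step described above, placing each patient in the latent risk class with the highest posterior probability of membership, can be sketched as follows (the posterior values are invented; only the five-class structure follows the abstract):

```python
def assign_class(posteriors):
    # posteriors: mapping of class label -> posterior membership probability
    # for one patient; the patient is assigned to the most probable class.
    return max(posteriors, key=posteriors.get)

# Hypothetical posterior probabilities for one patient across 5 risk classes.
patient = {"class1": 0.05, "class2": 0.15, "class3": 0.10,
           "class4": 0.60, "class5": 0.10}

risk_class = assign_class(patient)   # -> "class4"
```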
The application of decision analysis to life support research and technology development
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1994-01-01
Applied research and technology development is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Decision making regarding which technologies to advance and what resources to devote to them is a challenging but essential task. In the application of life support technology to future manned space flight, new technology concepts typically are characterized by nonexistent data and rough approximations of technology performance, uncertain future flight program needs, and a complex, time-intensive process to develop technology to a flight-ready status. Decision analysis is a quantitative, logic-based discipline that imposes formalism and structure on complex problems. It also accounts for the limits of knowledge available at the time a decision is needed. The utility of decision analysis for life support technology R & D was evaluated by applying it to two case studies. The methodology was found to provide insight that is not possible with more traditional analysis approaches.
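The core computation underlying a decision-analysis comparison of technology options — ranking alternatives by their probability-weighted payoff — can be sketched as below. The option names, probabilities and payoffs are hypothetical and are not taken from the case studies:

```python
# Expected-value comparison of two hypothetical R&D options.
# Each outcome is a (probability, payoff) pair; payoffs in arbitrary units.
options = {
    "advance_tech_A": [(0.6, 10.0), (0.4, -4.0)],
    "advance_tech_B": [(0.9, 3.0), (0.1, -1.0)],
}

# Expected value of each option: sum of probability-weighted payoffs.
expected = {name: sum(p * v for p, v in outcomes)
            for name, outcomes in options.items()}

# Pick the alternative with the highest expected value.
best = max(expected, key=expected.get)
print(expected, best)
```

A full decision analysis would layer utility functions and value-of-information calculations on top of this, but the expected-value ranking is the basic building block.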
A Synthetic Vision Preliminary Integrated Safety Analysis
NASA Technical Reports Server (NTRS)
Hemm, Robert; Houser, Scott
2001-01-01
This report documents efforts to analyze a sample of aviation safety programs, using the LMI-developed integrated safety analysis tool to determine the change in system risk resulting from Aviation Safety Program (AvSP) technology implementation. Specifically, we have worked to modify existing system safety tools to address the safety impact of synthetic vision (SV) technology. Safety metrics include reliability, availability, and resultant hazard. This analysis of SV technology is intended to be part of a larger effort to develop a model that is capable of "providing further support to the product design and development team as additional information becomes available". The reliability analysis portion of the effort is complete and is fully documented in this report. The simulation analysis is still underway; it will be documented in a subsequent report. The specific goal of this effort is to apply the integrated safety analysis to SV technology. This report also contains a brief discussion of data necessary to expand the human performance capability of the model, as well as a discussion of human behavior and its implications for system risk assessment in this modeling environment.
Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Richard Yorg
2011-03-01
The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which are discussed in the full paper.
Feng, Yinling; Wang, Xuefeng
2017-03-01
In order to investigate commonly disturbed genes and pathways in various brain regions of patients with Parkinson's disease (PD), microarray datasets from previous studies were collected and systematically analyzed. Different normalization methods were applied to microarray datasets from different platforms. A strategy combining gene co‑expression networks and clinical information was adopted, using weighted gene co‑expression network analysis (WGCNA) to screen for commonly disturbed genes in different brain regions of patients with PD. Functional enrichment analysis of commonly disturbed genes was performed using the Database for Annotation, Visualization, and Integrated Discovery (DAVID). Co‑pathway relationships were identified with Pearson's correlation coefficient tests and a hypergeometric distribution‑based test. Common genes in pathway pairs were selected and regarded as risk genes. A total of 17 microarray datasets from 7 platforms were retained for further analysis. Five gene co‑expression modules were identified, containing 9,745, 736, 233, 101 and 93 genes, respectively. One module was significantly correlated with PD samples, and thus the 736 genes it contained were considered candidate PD‑associated genes. Functional enrichment analysis demonstrated that these genes were implicated in oxidative phosphorylation and PD. A total of 44 pathway pairs and 52 risk genes were revealed, and a risk gene-pathway relationship network was constructed. Eight modules were identified and revealed to be associated with PD, cancers and metabolism. A number of disturbed pathways and risk genes were unveiled in PD, and these findings may help advance understanding of PD pathogenesis.
Xu, J-L; Xia, R; Sun, Z-H; Sun, L; Min, X; Liu, C; Zhang, H; Zhu, Y-M
2016-12-01
This meta-analysis aimed to assess the prophylactic effects of honey use on the management of radio/chemotherapy-induced mucositis. PubMed, Cochrane Library, Science Direct, China National Knowledge Infrastructure (CNKI), VIP (Chinese scientific journal database), and China Biology Medicine (CBM) were searched for relevant articles without language restriction. Two reviewers searched and evaluated the related studies independently. Statistical analyses were performed using Stata 11.0, calculating the pooled risk ratio (RR) with the corresponding 95% confidence interval (CI). Begg's funnel plot was used together with Egger's test to detect publication bias. A total of seven randomized controlled trials were finally included. Quality assessment showed one article to have a low risk of bias, two to have a moderate risk, and four to have a high risk. Meta-analysis showed that, compared with blank control, honey treatment could reduce the incidence of oral mucositis after radio/chemotherapy (RR 0.35, 95% CI 0.18-0.70, P=0.003). No meta-analysis was applied for honey vs. lidocaine or honey vs. golden syrup. The sensitivity analysis showed no significant change when any one study was excluded. No obvious publication bias (honey vs. blank control) was detected. In conclusion, honey can effectively reduce the incidence of radio/chemotherapy-induced oral mucositis; however, further multi-centre randomized controlled trials are needed to support the current evidence. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
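The pooling step of such a meta-analysis — combining per-study risk ratios on the log scale with inverse-variance weights — can be sketched as follows. The three studies, their RRs and CIs are invented for illustration, and this is a fixed-effect sketch rather than a reproduction of the Stata analysis above:

```python
import math

# Hypothetical per-study risk ratios with 95% CIs: (RR, lower, upper).
studies = [
    (0.30, 0.12, 0.75),
    (0.45, 0.20, 1.00),
    (0.28, 0.10, 0.80),
]

num = den = 0.0
for rr, lo, hi in studies:
    log_rr = math.log(rr)
    # Back out the standard error of log(RR) from the CI width.
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    w = 1.0 / se ** 2        # inverse-variance weight
    num += w * log_rr
    den += w

# Pooled RR and its 95% CI, exponentiated back from the log scale.
pooled = math.exp(num / den)
se_pooled = math.sqrt(1.0 / den)
ci = (math.exp(num / den - 1.96 * se_pooled),
      math.exp(num / den + 1.96 * se_pooled))
print(round(pooled, 2), tuple(round(x, 2) for x in ci))
```

A random-effects model (as used in the abstract above) would additionally estimate between-study heterogeneity and inflate each study's variance accordingly.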
Liao, Zhi-Heng; Sun, Jia-Ren; Wu, Dui; Fan, Shao-Jia; Ren, Ming-Zhong; Lü, Jia-Yang
2014-06-01
The CALPUFF model was applied to simulate ground-level atmospheric concentrations of Pb and Cd from municipal solid waste incineration (MSWI) plants, and a soil concentration model was used to estimate soil concentration increments after atmospheric deposition based on Monte Carlo simulation; ecological risk assessment was then conducted using the potential ecological risk index method. The results showed that the largest atmospheric concentrations of Pb and Cd were 5.59 x 10(-3) microg x m(-3) and 5.57 x 10(-4) microg x m(-3), respectively, while the maxima of the median soil concentration increments of Pb and Cd were 2.26 mg x kg(-1) and 0.21 mg x kg(-1), respectively. High-risk areas were located next to the incinerators; Cd contributed the most to the ecological risk, while Pb was essentially free of pollution risk. A higher ecological hazard level was predicted at the most polluted point in urban areas with a 55.30% probability, while in rural areas the most polluted point was assessed at a moderate ecological hazard level with a 72.92% probability. In addition, a sensitivity analysis of the calculation parameters in the soil concentration model was conducted, which showed that the simulated results for urban and rural areas were most sensitive to soil mixing depth and dry deposition rate, respectively.
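The potential ecological risk index method referenced above follows Hakanson's formulation: each metal's risk factor is its toxic-response factor times the ratio of measured to background concentration, and the overall index is the sum over metals. The toxic-response factors below are the commonly cited Hakanson values for Pb and Cd, but the background and measured concentrations are hypothetical:

```python
# Hakanson potential ecological risk index:
#   E_r(metal) = T_r * (C_measured / C_background);  RI = sum of E_r.
toxic_response = {"Pb": 5, "Cd": 30}    # Hakanson toxic-response factors
background = {"Pb": 20.0, "Cd": 0.10}   # mg/kg, assumed reference values
measured = {"Pb": 26.0, "Cd": 0.31}     # mg/kg, hypothetical soil sample

E_r = {m: toxic_response[m] * measured[m] / background[m] for m in measured}
RI = sum(E_r.values())
print(E_r, RI)
```

Because Cd's toxic-response factor is six times Pb's, even modest Cd enrichment dominates the index — consistent with the abstract's finding that Cd contributed most of the ecological risk.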
2018-01-01
Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869
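A minimal sketch of a PSA-style score, assuming one common convention in which productivity and susceptibility are each scored on a 1-3 scale and overall vulnerability is the Euclidean distance of the (productivity, susceptibility) point from the lowest-risk corner of the plot; scoring conventions vary between PSA implementations, so treat this as illustrative only:

```python
import math

def psa_vulnerability(productivity: float, susceptibility: float) -> float:
    # Low productivity and high susceptibility -> high vulnerability.
    # Distance from the lowest-risk corner (productivity=3, susceptibility=1).
    return math.hypot(3.0 - productivity, susceptibility - 1.0)

low_risk = psa_vulnerability(productivity=3.0, susceptibility=1.0)
high_risk = psa_vulnerability(productivity=1.0, susceptibility=3.0)
print(low_risk, high_risk)  # 0.0 and ~2.83
```

The abstract's critique is precisely that collapsing a fishery to this single distance discards dynamics that an age-structured operating model would retain.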
Furlan, L; Contiero, B; Chiarini, F; Colauzzi, M; Sartori, E; Benvegnù, I; Fracasso, F; Giandon, P
2017-01-01
A survey of maize fields was conducted in northeast Italy from 1986 to 2014, resulting in a dataset of 1296 records including information on wireworm damage to maize, plant-attacking species, agronomic characteristics, landscape and climate. Three wireworm species, Agriotes brevis Candeze, A. sordidus Illiger and A. ustulatus Schäller, were identified as the dominant pest species in maize fields. Over the 29-year period surveyed, no yield reduction was observed when wireworm plant damage was below 15 % of the stand. A preliminary univariate risk analysis was applied to identify the main factors influencing the occurrence of damage. A multifactorial model was then built from the significant factors identified. This model made it possible to highlight the strongest factors and to analyse how the main factors jointly influenced damage risk. The strongest factors were: A. brevis as the prevalent damaging species, soil organic matter content >5 %, rotation including meadows and/or double crops, A. sordidus as the prevalent damaging species, and surrounding landscape consisting mainly of meadows, uncultivated grass and double crops. The multifactorial model also showed how the simultaneous occurrence of two or more of the aforementioned risk factors can conspicuously increase the risk of wireworm damage to maize crops, while the probability of damage to a field with no risk factors is always low (<1 %). These results make it possible to draw risk maps identifying low-risk and high-risk areas, a first step in implementing bespoke IPM procedures in an attempt to significantly reduce the impact of soil insecticides.
Incidence and Residual Risk of HIV, HBV and HCV Infections Among Blood Donors in Tehran.
Saber, Hamid Reza; Tabatabaee, Seyed Morteza; Abasian, Ali; Jamali, Mostafa; SalekMoghadam, Ebadollah; Hajibeigi, Bashir; Alavian, Seyed Moayed; Mirrezaie, Seyed Mohammad
2017-09-01
Estimation of residual risk is essential to monitor and improve blood safety. Our epidemiologic knowledge of the Iranian donor population regarding transfusion-transmitted viral infections (TTIs) is confined to a few studies based on prevalence rates, and there are no reports on the residual risk of TTIs in Iran. In the present survey, a software database of donor records of the Tehran Blood Transfusion Center (TBTC) was used to estimate the incidence and residual risk of hepatitis B virus (HBV), hepatitis C virus (HCV) and human immunodeficiency virus (HIV) infections by applying the incidence rate/window period (IR-WP) model. A total of 1,207,155 repeat donations were included in the analysis, representing a mean of 8.4 donations per donor over 6 years. The incidence amongst repeat donors was estimated by dividing the number of confirmed seroconverting donors by the total number of person-years at risk; the residual risk was then calculated using the incidence/window period model. Incidence rates and residual risks for HBV, HCV and HIV infections were calculated for the total study period (2005-2010) and for two consecutive periods (2005-2007 and 2008-2010). According to the IR-WP model, the overall residual risk for HIV and HCV in the total study period was 0.4 and 12.5 per million units, respectively, and for HBV 4.57 per 100,000 donations. The incidence and residual risk of TTIs calculated for TBTC's blood supply were low and comparable with developed countries for HIV infection but high for HCV and HBV infections. Blood safety may therefore be better managed by applying other techniques such as nucleic acid amplification tests.
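The incidence rate/window period (IR-WP) model used above reduces to multiplying the seroconversion incidence rate among repeat donors by the length of the infectious window. A minimal sketch, with illustrative numbers rather than the TBTC figures:

```python
# IR-WP model: residual risk per donation is approximately the incidence
# rate (per person-year) times the window period expressed in years.
def residual_risk(seroconversions: int, person_years: float,
                  window_days: float) -> float:
    incidence_per_py = seroconversions / person_years
    return incidence_per_py * (window_days / 365.0)

# e.g. 25 seroconverters over 500,000 person-years at risk,
# with an assumed 59-day serologic window period.
risk = residual_risk(25, 500_000, 59)
print(f"{risk * 1e6:.2f} per million donations")
```

The window period depends on the marker and assay generation, which is why residual risks shift when a blood service moves to more sensitive screening such as nucleic acid testing.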
Annotation analysis for testing drug safety signals using unstructured clinical notes
2012-01-01
Background The electronic surveillance for adverse drug events is largely based upon the analysis of coded data from reporting systems. Yet, the vast majority of electronic health data lies embedded within the free text of clinical notes and is not gathered into centralized repositories. With the increasing access to large volumes of electronic medical data—in particular the clinical notes—it may be possible to computationally encode and to test drug safety signals in an active manner. Results We describe the application of simple annotation tools on clinical text and the mining of the resulting annotations to compute the risk of getting a myocardial infarction for patients with rheumatoid arthritis that take Vioxx. Our analysis clearly reveals elevated risks for myocardial infarction in rheumatoid arthritis patients taking Vioxx (odds ratio 2.06) before 2005. Conclusions Our results show that it is possible to apply annotation analysis methods for testing hypotheses about drug safety using electronic medical records. PMID:22541596
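The elevated-risk computation reported above is, at its core, an odds ratio over a 2x2 exposure/outcome table built from the mined annotations. A minimal sketch with invented counts (chosen only to land near the reported OR of 2.06; these are not the study's actual counts):

```python
import math

# 2x2 table: rows = Vioxx exposure, columns = myocardial infarction outcome.
a, b = 120, 580   # exposed:   MI events, no MI
c, d = 90, 896    # unexposed: MI events, no MI

# Odds ratio and its Wald 95% CI on the log scale.
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci = (math.exp(math.log(odds_ratio) - 1.96 * se_log_or),
      math.exp(math.log(odds_ratio) + 1.96 * se_log_or))
print(round(odds_ratio, 2), tuple(round(x, 2) for x in ci))
```

In the study itself the counts come from patients whose notes were annotated with both the drug mention and the outcome, rather than from coded adverse-event reports.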
System for decision analysis support on complex waste management issues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shropshire, D.E.
1997-10-01
A software system called the Waste Flow Analysis has been developed and applied to complex environmental management processes for the United States Department of Energy (US DOE). The system can evaluate proposed methods of waste retrieval, treatment, storage, transportation, and disposal. Analysts can evaluate various scenarios to see the impacts on waste flows and schedules, costs, and health and safety risks. Decision analysis capabilities have been integrated into the system to help identify preferred alternatives based on specific objectives; objectives may be to maximize the waste moved to final disposition during a given time period, minimize health risks, minimize costs, or combinations of these. The decision analysis capabilities can support evaluation of large and complex problems rapidly, and under conditions of variable uncertainty. The system is being used to evaluate environmental management strategies to safely disposition wastes in the next ten years and reduce the environmental legacy resulting from nuclear material production over the past forty years.
6C.04: INTEGRATED SNP ANALYSIS AND METABOLOMIC PROFILES OF METABOLIC SYNDROME.
Marrachelli, V; Monleon, D; Morales, J M; Rentero, P; Martínez, F; Chaves, F J; Martin-Escudero, J C; Redon, J
2015-06-01
Metabolic syndrome (MS) has become a health and financial burden worldwide. The susceptibility of a genetically determined metabotype to MS has not yet been investigated. We aimed to identify a distinctive blood serum metabolic profile that might enable early detection of MS development associated with genetic polymorphisms. We applied high resolution NMR spectroscopy to profile blood serum from patients without MS (n = 945) or with MS (n = 291). Principal component analysis (PCA) and projection to latent structures for discriminant analysis (PLS-DA) were applied to the NMR spectral datasets. Results were cross-validated using the Venetian blinds approach. Additionally, five SNPs previously associated with MS were genotyped with SNPlex and tested for associations between the metabolic profiles and the genetic variants. Statistical analysis was performed using in-house MATLAB scripts and the PLS Toolbox statistical multivariate analysis library. Our analysis provided a PLS-DA metabolic syndrome discrimination model based on the NMR metabolic profile (AUC = 0.86) with 84% sensitivity and 72% specificity. The model identified 11 metabolites differentially regulated in patients with MS; among others, fatty acids, glucose, alanine, hydroxyisovalerate, acetone, trimethylamine, 2-phenylpropionate, isobutyrate and valine contributed significantly to the model. The combined analysis of metabolomics and SNP data revealed an association between the metabolic profile of MS and gene polymorphisms involved in adiposity regulation and fatty acid metabolism: rs2272903_TT (TFAP2B), rs3803_TT (GATA2), rs174589_CC (FADS2) and rs174577_AA (FADS2). In addition, individuals with the rs2272903-TT genotype seem to develop MS earlier than the general population. Our study provides new insights into the metabolic alterations associated with a high-risk MS genotype.
These results could help in future development of risk assessment and predictive models for subclinical cardiovascular disease.
Use of acetaminophen and risk of endometrial cancer: evidence from observational studies.
Ding, Yuan-Yuan; Yao, Peng; Verma, Surya; Han, Zhen-Kai; Hong, Tao; Zhu, Yong-Qiang; Li, Hong-Xi
2017-05-23
Previous meta-analyses suggested that aspirin was associated with a reduced risk of endometrial cancer. However, no study has comprehensively summarized the evidence from observational studies on acetaminophen use and risk of endometrial cancer. We systematically searched electronic databases (PubMed, EMBASE, Web of Science, and Cochrane Library) for relevant cohort or case-control studies up to February 28, 2017. Two independent authors performed the eligibility evaluation and data extraction; all differences were resolved by discussion. A random-effects model was applied to estimate summary relative risks (RRs) with 95% CIs. All statistical tests were two-sided. Seven observational studies, comprising four prospective cohort studies and three case-control studies with 3874 endometrial cancer cases, were included in the final analysis. Compared with never use of acetaminophen, ever use of the drug was not associated with risk of endometrial cancer (summarized RR = 1.02; 95% CI: 0.93-1.13, I2 = 0%). A similar null association was observed when comparing the highest category of frequency/duration with never use (summarized RR = 0.88; 95% CI: 0.70-1.11, I2 = 15.2%). Additionally, the finding was robust in subgroup analyses stratified by study characteristics and by adjustment for potential confounders and risk factors. There was no evidence of publication bias on visual inspection of a funnel plot or in formal statistical tests. In summary, the present meta-analysis reveals no association between acetaminophen use and risk of endometrial cancer. More large-scale prospective cohort studies are warranted to confirm our findings and to carry out a dose-response analysis of this association.